
Our energy hunger is tethered to our economic past

Just as a living organism continually needs food to maintain itself, an economy consumes energy to do work and keep things going. That consumption comes with the cost of greenhouse gas emissions and climate change, though. So, how can we use energy to keep the economy alive without burning out the planet in the process?

In a paper in PLOS ONE, University of Utah professor of atmospheric sciences Tim Garrett, with mathematician Matheus Grasselli of McMaster University and economist Stephen Keen of University College London, report that current world energy consumption is tied to unchangeable past economic production. And the way out of an ever-increasing rate of carbon emissions may not necessarily be ever-increasing energy efficiency--in fact it may be the opposite.

"How do we achieve a steady-state economy where economic production exists, but does not continually increase our size and add to our energy demands?" Garrett says. "Can we survive only by repairing decay, simultaneously switching existing fossil infrastructure to a non-fossil appetite? Can we forget the flame?"

Thermoeconomics

Garrett is an atmospheric scientist. But he recognizes that atmospheric phenomena, including rising carbon dioxide levels and climate change, are tied to human economic activity. "Since we model the earth system as a physical system," he says, "I wondered whether we could model economic systems in a similar way."

He's not alone in thinking of economic systems in terms of physical laws. There's a field of study, in fact, called thermoeconomics. Just as thermodynamics describes how heat and entropy (disorder) flow through physical systems, thermoeconomics explores how matter, energy, entropy and information flow through human systems.

Many of these studies looked at correlations between energy consumption and current production, or gross domestic product. Garrett took a different approach; his concept of an economic system begins with the centuries-old idea of a heat engine. A heat engine consumes energy at high temperatures to do work and emits waste heat. But it only consumes. It doesn't grow.

Now envision a heat engine that, like an organism, uses energy to do work not just to sustain itself but also to grow. Due to past growth, it requires an ever-increasing amount of energy to maintain itself. For humans, the energy comes from food. Most goes to sustenance and a little to growth. And from childhood to adulthood our appetite grows. We eat more and exhale an ever-increasing amount of carbon dioxide.

"We looked at the economy as a whole to see if similar ideas could apply to describe our collective maintenance and growth," Garrett says. While societies consume energy to maintain day-to-day living, a small fraction of consumed energy goes to producing more and growing our civilization.

"We've been around for a while," he adds. "So it is an accumulation of this past production that has led to our current size, and our extraordinary collective energy demands and CO2 emissions today."

Growth as a symptom

To test this hypothesis, Garrett and his colleagues used economic data from 1980 to 2017 to quantify the relationship between past cumulative economic production and the current rate at which we consume energy. Regardless of the year examined, they found that every trillion inflation-adjusted 2010 U.S. dollars of cumulative worldwide economic production corresponded with an enlarged civilization that required an additional 5.9 gigawatts of power production to sustain itself. In a fossil economy, that's equivalent to around 10 coal-fired power plants, Garrett says, leading to about 1.5 million tons of CO2 emitted to the atmosphere each year. Our current energy usage, then, is the natural consequence of our cumulative previous economic production.
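The reported scaling lends itself to a simple arithmetic sketch. The 5.9 GW-per-trillion-dollar figure is from the study; the function name and the example input are illustrative assumptions, not the authors' code:

```python
# Illustrative sketch of the paper's reported scaling: each additional
# trillion (inflation-adjusted 2010 US$) of *cumulative* world economic
# production corresponds to roughly 5.9 GW of extra sustaining power demand.
GW_PER_TRILLION_USD = 5.9  # figure reported in the study

def extra_power_demand_gw(cumulative_production_trillion_usd):
    """Extra sustaining power (GW) implied by cumulative past production."""
    return GW_PER_TRILLION_USD * cumulative_production_trillion_usd

# Hypothetical example: 10 trillion 2010 US$ of additional cumulative production
print(extra_power_demand_gw(10.0))  # roughly 59 GW
```

The point of the linear form is that the power demand depends on the accumulated total, not on any single year's output.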

They came to two surprising conclusions. First, although improving efficiency through innovation is a hallmark of efforts to reduce energy use and greenhouse gas emissions, efficiency has the side effect of making it easier for civilization to grow and consume more.

Second, current rates of world population growth may not be the cause of rising rates of energy consumption, but rather a symptom of past efficiency gains.

"Advocates of energy efficiency for climate change mitigation may seem to have a reasonable point," Garrett says, "but their argument only works if civilization maintains a fixed size, which it doesn't. Instead, an efficient civilization is able to grow faster. It can more effectively use available energy resources to make more of everything, including people. Expansion of civilization accelerates rather than declines, and so do its energy demands and CO2 emissions."

A steady-state decarbonized future?

So what do those conclusions mean for the future, particularly in relation to climate change? We can't just stop consuming energy today any more than we can erase the past, Garrett says. "We have inertia. Pull the plug on energy consumption and civilization stops emitting but it also becomes worthless. I don't think we could accept such starvation."

But is it possible to undo the economic and technological progress that has brought civilization to this point? Can we, the species who harnessed the power of fire, now "forget the flame," in Garrett's words, and curb efficiency-driven growth?

"It seems unlikely that we will forget our prior innovations, unless collapse is imposed upon us by resource depletion and environmental degradation," he says, "which, obviously, we hope to avoid."

So what kind of future, then, does Garrett's work envision? It's one in which the economy manages to hold at a steady state--where the energy we use is devoted to maintaining our civilization and not expanding it.

It's also one where the energy of the future can't be based on fossil fuels. Those have to stay in the ground, he says.

"At current rates of growth, just to maintain carbon dioxide emissions at their current level will require rapidly constructing renewable and nuclear facilities, about one large power plant a day. And somehow it will have to be done without inadvertently supporting economic production as well, in such a way that fossil fuel demands also increase."

It's a "peculiar dance," he says, between eliminating the prior fossil-based innovations that accelerated civilization expansion, while innovating new non-fossil fuel technologies. Even if this steady-state economy were to be implemented immediately, stabilizing CO2 emissions, the pace of global warming would be slowed--not eliminated. Atmospheric levels of CO2 would still reach double their pre-industrial level before equilibrating, the research found.

Looking at the global economy through a thermodynamic lens, Garrett acknowledges, reveals unchangeable realities. Any form of economy or civilization needs energy to do work and survive. The trick is balancing that need with its climate consequences.

"Climate change and resource scarcity are defining challenges of this century," Garrett says. "We will not have a hope of surviving our predicament by ignoring physical laws."

Future work

This study marks the beginning of the collaboration between Garrett, Grasselli and Keen. They're now working to connect the results of this study with a full model for the economy, including a systematic investigation of the role of matter and energy in production.

"Tim made us focus on a pretty remarkable empirical relationship between energy consumption and cumulative economic output," Grasselli says. "We are now busy trying to understand what this means for models that include notions that are more familiar to economists, such as capital, investment and the always important question of monetary value and inflation."

Credit: 
University of Utah

Meteorite study suggests Earth may have been wet since it formed

A new study finds that Earth's water may have come from materials that were present in the inner solar system at the time the planet formed -- instead of far-reaching comets or asteroids delivering such water. The findings published Aug. 28 in Science suggest that Earth may have always been wet.

Researchers from the Centre de Recherches Petrographiques et Geochimiques (CRPG, CNRS/Universite de Lorraine) in Nancy, France, including one who is now a postdoctoral fellow at Washington University in St. Louis, determined that a type of meteorite called an enstatite chondrite contains sufficient hydrogen to deliver at least three times the amount of water contained in the Earth's oceans, and probably much more.

Enstatite chondrites are entirely composed of material from the inner solar system -- essentially the same stuff that made up the Earth originally.

"Our discovery shows that the Earth's building blocks might have significantly contributed to the Earth's water," said lead author Laurette Piani, a researcher at CRPG. "Hydrogen-bearing material was present in the inner solar system at the time of the rocky planet formation, even though the temperatures were too high for water to condense."

The findings from this study are surprising because the Earth's building blocks are often presumed to be dry. They come from inner zones of the solar system where temperatures would have been too high for water to condense and come together with other solids during planet formation.

The meteorites provide a clue that water didn't have to come from far away.

"The most interesting part of the discovery for me is that enstatite chondrites, which were believed to be almost 'dry,' contain an unexpectedly high abundance of water," said Lionel Vacher, a postdoctoral researcher in physics in Arts & Sciences at Washington University in St. Louis.

Vacher prepared some of the enstatite chondrites in this study for water analysis while he was completing his PhD at Universite de Lorraine. At Washington University, Vacher is working on understanding the composition of water in other types of meteorites.

Enstatite chondrites are rare, making up only about 2 percent of known meteorites in collections.

But their isotopic similarity to Earth makes them particularly compelling. Enstatite chondrites have oxygen, titanium and calcium isotopes similar to Earth's, and this study showed that their hydrogen and nitrogen isotopes are similar to Earth's, too. In the study of extraterrestrial materials, the abundances of an element's isotopes are used as a distinctive signature to identify where that element originated.

"If enstatite chondrites were effectively the building blocks of our planet -- as strongly suggested by their similar isotopic compositions -- this result implies that these types of chondrites supplied enough water to Earth to explain the origin of Earth's water, which is amazing!" Vacher said.

The paper also proposes that a large amount of the atmospheric nitrogen -- the most abundant component of the Earth's atmosphere -- could have come from the enstatite chondrites.

"Only a few pristine enstatite chondrites exist: ones that were altered neither on their parent asteroid nor on Earth," Piani said. "In our study we carefully selected the enstatite chondrite meteorites and applied a special analytical procedure to avoid being biased by the input of terrestrial water."

Coupling two analytical techniques -- conventional mass spectrometry and secondary ion mass spectrometry (SIMS) -- allowed researchers to precisely measure the content and composition of the small amounts of water in the meteorites.

Prior to this study, "it was commonly assumed that these chondrites formed close to the sun," Piani said. "Enstatite chondrites were thus commonly considered 'dry,' and this frequently reasserted assumption has probably prevented exhaustive analyses from being carried out for hydrogen."

Credit: 
Washington University in St. Louis

A spatial regime shift to stickleback dominance

image: A male stickleback (Gasterosteus aculeatus) during spawning time.

Image: 
Photo: Joakim Hansen/Azote

Large numbers of three-spined stickleback have gradually taken over large parts of the Baltic Sea's coastal ecosystem, a new scientific study shows. The stickleback is a small prey fish common in aquatic food webs across temperate Europe. It contributes to local ecosystem 'regime shifts', in which young-of-the-year pike and perch decline in individual bays, and these shifts gradually spread like a wave from the outer archipelago toward the mainland coast.

This disturbing change is shown in a unique study conducted by researchers from Stockholm University, the Swedish University of Agricultural Sciences and the University of Groningen (Netherlands), which is now published in the scientific journal Communications Biology.

Regime shifts across space, as well as time?

Regime shifts are large, sudden and long-lasting changes in the structure and function of ecosystems, often triggered by human pressures on the environment. Most scientific studies of regime shifts focus on change over time. Yet ecological theory predicts that regime shifts in large, heterogeneous ecosystems may start locally, in the most vulnerable areas, and then spread gradually across space. So far, however, empirical evidence of such dynamics has been very rare.

The stickleback 'explosion'

The three-spined stickleback is a small fish and an important part of aquatic food webs across the Northern Hemisphere. It's eaten by both birds and larger fish. It's also a common model organism in studies of fish behavior and evolution.

"In the Baltic Sea, the number of stickleback has increased dramatically since the 1990s", says Ulf Bergström, the researcher in the project who first documented the increase. Eutrophication, climate change and less predatory fish are at least part of the explanation for the sharp rise in sticklebacks. However, research is underway to investigate the causes further.

At the same time, the recruitment of pike and perch - two large predatory fish that eat stickleback - has decreased in the outer archipelagos and open coastal stretches, where the sticklebacks often reside during spring and summer. Previous studies have shown that the stickleback can eat the eggs and larvae of pike and perch, but it has not been known to what extent this has actually affected their reproduction on a larger scale.

A wave of sticklebacks

To get a better picture of the extent and effects of the stickleback increase, researchers in the project PlantFish compiled and analyzed data from nearly 40 years of fish surveys in almost 500 shallow archipelago bays, along the entire Swedish Baltic Sea coast.

The results showed a large-scale change - a so-called regime shift - from predatory fish to stickleback dominance, which gradually spread from the outer archipelagos and in towards the mainland coast.
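As a toy illustration of what detecting such a per-bay shift can involve (the numbers below are invented, and this threshold rule is a simplification, not the authors' statistical method):

```python
# Toy sketch: find the "regime shift year" for one bay, defined here as the
# first year from which the stickleback share of the catch stays above 50%.
# The time series below is invented for illustration, not survey data.
def shift_year(years, stickleback_share, threshold=0.5):
    """First year from which the share stays above the threshold, else None."""
    for i in range(len(stickleback_share)):
        if all(s > threshold for s in stickleback_share[i:]):
            return years[i]
    return None

years = list(range(2000, 2010))
share = [0.2, 0.3, 0.25, 0.4, 0.6, 0.55, 0.7, 0.8, 0.75, 0.9]
print(shift_year(years, share))  # -> 2004
```

Repeating such a calculation for each bay and mapping the shift years against distance from the open sea is one simple way to visualize a shift spreading across space.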

"One by one, individual bays shifted to being dominated by sticklebacks instead of predatory fish", says the main author Johan Eklöf. "We spontaneously began to call the pattern the 'stickleback wave', as the shift to stickleback dominance seen from the larger perspective wells up like a slow tsunami towards the coast."

Detailed studies of major ecosystem components in 32 of the bays showed that sticklebacks reduce the number of perch and pike young-of-the-year through predation on eggs and larvae. This is likely to diminish the local predatory fish stock, and further benefit the stickleback.

"We also see the reduction of perch and pike juveniles along the entire coast, and not just in the outer archipelagos and the open coastal stretches. These changes are serious, as they suggest that local stocks of predatory fish can gradually become extinct, which in turn affects the ecosystem as a whole", Johan Eklöf continues.

"Since the stickleback also eats small grazing animals, the shift also intensifies the eutrophication symptoms, as filamentous 'nuisance' algae, which suffocate bladderwrack and other important aquatic plants, are allowed to grow uncontrolled", Johan Eklöf points out.

A symptom revealing deeper problems

The researchers now believe that the large-scale nature of the stickleback wave shows a great need for increased monitoring so that we can detect and counteract this type of change in the future.

"The problem has previously gone under the radar or been judged to be very local, as existing monitoring programs have not focused on small fish and have not made any spatial analyses", says Ulf Bergström. A better understanding of the connections between the open sea and the coast is now needed, where the stickleback's yearly migrations link the systems. The researchers have already started gaining such knowledge in a new project called "The stickleback wave", funded by the Swedish Research Council Formas.

The new project will also provide knowledge about what measures can be taken to try to reverse the shift. Various measures to strengthen the stocks of predatory fish are already being implemented, but targeted fishing for large sticklebacks could also serve as an environmental protection measure. The researchers will evaluate whether stickleback fishing can benefit perch and pike recruitment, but at the same time emphasize that the problem is more complex than the stickleback itself.

"Stickleback fishing may be inevitable to slow down the 'wave', but we must better understand and address the root causes of the major changes in the ecosystem. The stickleback wave is probably just a symptom and not the problem itself", Johan Eklöf concludes.

Credit: 
Stockholm University

The Newtonian gravitational constant: Latest advances of the measurements

image: Schematic diagram of four experimental devices used by the co-author's research team.

Image: 
©Science China Press

Newton's law of universal gravitation, which describes the attractive force between two masses separated by a distance, is one of the greatest achievements of the 17th century. The strength of this force is set by the constant of proportionality G, called the gravitational constant, which is independent of the size, shape, and composition of the objects. G is one of the earliest fundamental constants introduced by human beings; it is used extensively in cosmology and astrophysics and plays an important role in many other fields of physics.

However, the measurement precision of G has improved by only about two orders of magnitude over more than two centuries of effort. To this day it remains the least precisely known of all the fundamental physical constants, owing to the extreme weakness and unshieldability of gravity and to the fact that G has no known quantitative relationship to any other fundamental constant.

The latest high-precision values of G, obtained after the year 2000, differ from one another by more than 500 ppm (parts per million). Such inconsistency among measured values has recurred throughout almost the entire history of G measurement and has puzzled many scientists. Most likely, some undiscovered systematic errors remain in the G measurements.
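For a sense of scale, a discrepancy in parts per million is just the spread of the measurements relative to their mean. The sketch below uses illustrative placeholder numbers, not the published values reviewed in the paper:

```python
# Rough sketch of what a "ppm discrepancy" between measured G values means.
# The values below are illustrative placeholders, not published data.
def discrepancy_ppm(values):
    """Spread of the measurements relative to their mean, in parts per million."""
    mean = sum(values) / len(values)
    return (max(values) - min(values)) / mean * 1e6

# Hypothetical G measurements in units of 1e-11 m^3 kg^-1 s^-2
g_values = [6.6740, 6.6715, 6.6754]
print(round(discrepancy_ppm(g_values)))  # several hundred ppm for this spread
```

A spread of a few parts in the fourth decimal place already amounts to hundreds of ppm, which is why discrepancies of ~550 ppm dwarf individual uncertainties below 50 ppm.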

In a new overview published in the Beijing-based National Science Review, scientists at Sun Yat-sen University (Zhuhai Campus) in Zhuhai, China, and Huazhong University of Science and Technology in Wuhan, China, present the latest advances in the measurement of G. Co-authors Chao Xue, Jian-Ping Liu, Qing Li, Jun-Fei Wu, Shan-Qing Yang, Qi Liu, Cheng-Gang Shao, Liang-Cheng Tu, Zhong-Kun Hu, and Jun Luo trace the history of G measurements; they also review the post-2000 values of G adopted in the Committee on Data for Science and Technology recommended value (CODATA-2014), along with their own latest two values, published in 2018 using two independent methods. The scientists likewise outline directions for future G experiments.

"Since Cavendish's first laboratory measurement of the G value using a torsion balance over 200 years ago, experimenters have devoted tremendous efforts to investigating many possible contributions to the measurement uncertainty, but the relative uncertainty of G has not been greatly improved," they state in an article titled "Precision Measurement of the Newtonian Gravitational Constant."

"With the development of science and technology in recent years, experimenters have applied novel techniques to improve the sensitivity of experiments," they add. "Unfortunately, there is still a large discrepancy of about 550 ppm among the thirteen values of G reviewed in this paper, even though the relative standard uncertainties of many results are below 50 ppm."

The measurement processes, results, and advantages and disadvantages of the eleven values of G adopted in CODATA-2014 after the year 2000 are described and analyzed in detail. The values of G obtained by the co-authors' research team, especially the latest two published in 2018, are described systematically and comprehensively; schematic diagrams of the four experimental devices are shown in the image above. "An improved experiment with high accuracy and a high confidence level needed to be carried out," they state. "These two G values from the HUST-18 experiment, obtained with different methods, have the smallest uncertainties reported to date, and they agree with each other within a 3σ range. S. Schlamminger of the National Institute of Standards and Technology published a commentary emphasizing that our study was an example of excellent craftsmanship in precision measurement."

"Why is the scatter of the G values so large? In principle, there are two possibilities that could explain the obvious inconsistency," they add. "The first is that systematic errors that are not fully understood exist in some or all of the experiments. The second is that some unknown physical mechanism might explain the discrepancy of the G values."

"For the future development of G measurement, the main target should be to reduce the discrepancy among the values of G," the scientists conclude. "Each group needs to repeat its experiments with the same method and the same devices, and should put much more effort into estimating the potential systematic errors. After that, the different groups should strengthen international cooperation to identify possible undiscovered systematic errors among the different methods." Finally, they hope that "more and more scientists could be involved in G measurement and the problem of 'Big G' can be solved in the near future."

Credit: 
Science China Press

Dealing a blow to monetarism

This year's third issue of the Financial Journal opens with an article by Marina Malkina, Professor at the Department of Economic Theory and Methodology of the UNN Institute of Economics and Entrepreneurship, and Igor Moiseev, research assistant at the Center for Macroeconomics and Microeconomics of the same Institute. Their article entitled "Endogeneity of Money Supply in the Russian Economy in the Context of the Monetary Regime Change" is published in the "Monetary policy" section.

Endogenous money (from Greek "endon", meaning "internal"), according to the established definition, is money created inside the economy as a response to its impulses; the economy in this case is considered a closed and autonomous unit.

"The article deals with the endogeneity of money supply in the Russian economy in the context of the changes made to the rules of monetary regulation. We summarized and analyzed the basic concepts of the modern theory of endogenous money, and described the approaches of various researchers to studying the impact of financial innovations and changes in the principles of monetary regulation on money supply endogeneity," Marina Malkina notes.

In the empirical part of the research, in order to test the hypothesis on the endogenous origin of the money supply in the Russian economy in 2010-2018, the authors applied the Granger causality test and the Johansen cointegration test as well as VAR and VECM models. The study was based on the monthly data for the monetary sphere (M2 monetary aggregate, monetary base, and money multiplier), the banking sphere (loans, deposits, and interest rate) and the transactional sector of the economy (wholesale and retail turnover).
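To illustrate the intuition behind one of these tools: a Granger-style test asks whether lagged values of one series improve the prediction of another. The sketch below is a minimal one-lag version on simulated data, hand-rolled for clarity; it is not the full Granger/Johansen/VAR/VECM battery the authors applied:

```python
import numpy as np

def rss(y, X):
    """Residual sum of squares from an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

def granger_f_stat(y, x):
    """F-statistic comparing y_t ~ y_{t-1} against y_t ~ y_{t-1} + x_{t-1}."""
    yt, ylag, xlag = y[1:], y[:-1], x[:-1]
    ones = np.ones_like(ylag)
    rss_r = rss(yt, np.column_stack([ones, ylag]))        # restricted model
    rss_u = rss(yt, np.column_stack([ones, ylag, xlag]))  # unrestricted model
    n, q = len(yt), 1  # q = number of restrictions
    return (rss_r - rss_u) / q / (rss_u / (n - 3))

# Simulated data where x clearly leads y: y_t = 0.8 * x_{t-1} + small noise
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = np.empty_like(x)
y[0] = 0.0
y[1:] = 0.8 * x[:-1] + 0.1 * rng.normal(size=199)
print(granger_f_stat(y, x) > 10)  # -> True: lagged x helps predict y
```

In the study's setting, the same logic is applied to series such as the money multiplier, the monetary base, and turnover, to ask which variables respond to which.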

The time series were split into two intervals: 2010-2013 and 2014-2018, when a change in the monetary regime occurred (introduction of the key rate, change in the refinancing system, transition to inflation targeting and floating exchange rate of the rouble).

"As a result of our study, we have obtained evidence of the endogenous origin of the money supply in the Russian economy for both periods of time, and the hypotheses of structuralism and preference for liquidity have been confirmed. In the short term of 2010-2013, it was mainly commercial banks that reacted to an increase in money demand (through money multiplier); however, in the long run the Central Bank of Russia prevailed (through changing the monetary base)", Igor Moiseev continues.

In 2014-2018, the Bank of Russia demonstrated an efficient adjustment of money supply toward money demand, which was reflected in the response of the monetary base in the short term.

Meanwhile, in this period the role of commercial banks in lending increased through operational management of their own resources, which was reflected in the reaction of the money multiplier to the growth of business activity in the country.

These changes indicated an amplification of money supply endogeneity in the Russian economy. In addition to passive adaptation, the Lobachevsky University researchers revealed a particular kind of activism in the monetary sphere: the initiative for money emission came not so much from the monetary authorities as from commercial banks fighting for market share in the face of intensified banking competition and reduced bank margins.

Credit: 
Lobachevsky University

New genetic markers of glucosinolates in rapeseed may help improve oil composition

image: Population structure of Russian rapeseed lines. Population structure assessed using principal component analysis for the whole cohort (A), spring (B), and winter (C) types separately. Red dots correspond to spring rapeseed accessions. Blue dots correspond to winter rapeseed accessions. Yellow dots correspond to yellow-seeded winter rapeseed accessions. (D) Population clustering of rapeseed lines based on the admixture component of each accession, the bar colors correspond to the dot colors in panel A.

Image: 
Rim Gubaev et al./Genes

A group of scientists from Skoltech and the Pustovoit All-Russian Research Institute of Oil Crops in Krasnodar performed a genetic analysis of the Russian rapeseed collection. The scientists described the genetic diversity of Russian rapeseed lines and discovered new candidate genes that are potentially involved in controlling the content of glucosinolates, toxic secondary metabolites in rapeseed oil. Their findings can be used by crop breeders to improve rapeseed oil composition. The research was published in the journal Genes.

Rapeseed is the world's second-largest oilseed crop after soybeans. Glucosinolates are secondary metabolites of rapeseed and related cruciferous plants. The content of these glucose-derived sulfur-containing organic substances strongly influences oil quality: if present in large amounts, glucosinolates spoil the taste of rapeseed oil and affect the quality of rapeseed meal, compelling crop breeders to look for ways of reducing their content.

The scientists performed genome-wide genotyping of 90 rapeseed lines and compared the results with the glucosinolate content data for these lines collected over 3 growing seasons. This helped identify both the genetic markers of glucosinolate content in oil and the linked candidate genes potentially involved in regulating the biosynthesis of glucosinolates. Once verified on an independent set of plants, the markers can be readily used for breeding new varieties and hybrids with low glucosinolate content.
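The core idea of linking markers to a trait can be sketched as a per-marker association scan. The code below uses simulated genotypes and a simple squared-correlation score; it is an illustration of the general approach, not the authors' actual genome-wide analysis pipeline:

```python
import numpy as np

# Toy sketch of marker-trait association: score each marker (genotype coded
# 0/1/2) by how strongly it correlates with the trait (e.g. glucosinolate
# content), then rank markers. All data here are simulated.
rng = np.random.default_rng(1)
n_lines, n_markers = 90, 50
genotypes = rng.integers(0, 3, size=(n_lines, n_markers)).astype(float)

causal = 7  # pretend marker 7 influences the trait
trait = 2.0 * genotypes[:, causal] + rng.normal(scale=0.5, size=n_lines)

def marker_r2(g, y):
    """Squared correlation between one marker's genotypes and the trait."""
    r = np.corrcoef(g, y)[0, 1]
    return r * r

scores = [marker_r2(genotypes[:, j], trait) for j in range(n_markers)]
print(int(np.argmax(scores)))  # the simulated causal marker (7) ranks first
```

Real genome-wide association studies add corrections for population structure and multiple testing, which is why the study's population-structure analysis (see the image caption above) matters.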

"Our research aims to foster marker-assisted crop breeding in Russia by using genetic markers to control the characteristics relevant to cross-breeding processes and progeny analysis. This approach can make the breeding of new varieties much faster. Measuring glucosinolate content is an arduous task that can be made much easier by using the markers we have identified", says Rim Gubaev, the first author of the paper and a PhD student at Skoltech.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Two discoveries boost next-generation organoid development

image: During foregut organogenesis, the research team performed single-cell RNA sequencing (scRNA-seq) of the mouse embryonic foregut at three time points that spanned the period of early patterning and lineage induction. This confocal microscope image shows one of those time points.

Image: 
Cincinnati Children's and RIKEN

In back-to-back reports published Aug. 27, 2020, in Nature Communications, a team of scientists from Cincinnati Children's and Japan report discoveries that will be vital to a new wave of more-complex organoid development.

Their findings advance efforts to use human stem cells to grow organs from the fetal foregut including the trachea, esophagus, stomach, liver, gallbladder, bile ducts and pancreas.

"With single-cell analysis of mouse embryos, we defined the complex signaling networks controlling development of mesenchyme cells, which form the smooth muscle and fibroblast tissues that are essential for organ function," says senior author Aaron Zorn, PhD, who leads organoid development at Cincinnati Children's. "We then used this information from the mouse to differentiate the equivalent human tissue in the lab. This is important because up until now all of the liver, lung, stomach and esophagus organoids that we make mostly lack these mesenchyme cell types."

Zorn directs the Center for Stem Cell & Organoid Medicine (CuSTOM) at Cincinnati Children's, which has made groundbreaking advances in stomach, intestine, liver and esophageal organoid development. In 2019, the CuSTOM group launched a formal collaboration with RIKEN, Japan's largest comprehensive research institution, to pursue further organoid innovation.

The papers published in Nature Communications represent the first results of that collaboration.

Decoding foregut development cell-by-cell

In this study, the scientists report detecting a set of signals within the foregut--a proto-organ in very early-stage embryos--that trigger how and when the other organs form. Specifically, they found that the signals are driven by the genes Wnt and SHH, which travel between cells in the endoderm and mesoderm layers of very early embryos.

To define these signals, co-first authors Lu Han, PhD, and Keishi Kishimoto, PhD, collaborated with organoid experts James Wells, PhD, and Takanori Takebe, MD, to develop a high-resolution map of foregut development in mice. They detected an unexpected variety of cells sending a chorus of master signals that trigger formation of the various organs that branch out from the foregut.

This study is the first to pin down the dynamics at play in the embryonic mesoderm, co-authors say.

The action happens very early--between embryonic days 8.5 and 9.5 in mice, which roughly corresponds to days 17 to 23 in human gestation. During this brief window of development, groups of cells at certain spots along the simple foregut tube begin transforming into the sprouts of organs that become the trachea, esophagus, liver and pancreas.

By studying the molecular signaling activity during this period, at a cell-by-cell level, the researchers produced a roadmap that shows how and why the organs sprout where they do. They then used these signals to grow tissue from different organs from human pluripotent stem cells.

In September 2019, Takebe and colleagues reported the world's first success at growing a three-organoid system that included the liver, pancreas and biliary ducts. That breakthrough took five years to achieve and the organoids produced did not possess all the cell types needed for full-sized function.

The new roadmap will enable CuSTOM scientists to grow more complete interconnected organs, Zorn says.

A deep dive into trachea development

In a parallel paper also appearing in Nature Communications, the RIKEN and CuSTOM teams extended these studies with extensive experiments in mice to further define the mechanisms of trachea formation.

This study, led by lung development expert Mitsuru Morimoto, PhD, in Japan, used genetically modified mice to learn which cell signals were most important to trachea formation. When these signals fail, the developing embryo does not properly form the cartilage rings and smooth muscle tissues that the trachea needs to pipe air to the lungs.

Cincinnati Children's, which developed the first human esophagus organoid in 2018, has been working with the RIKEN team on this project as part of its involvement with the CLEAR Consortium (Congenital Esophageal and Airway Defect Research).

"This work helps explain what happens when birth defects like esophageal atresia, tracheoesophageal fistula, and tracheomalacia occur," Zorn says. "This work also opens the door to one day generating esophagus and trachea tissue for tissue replacement."

Implications for tissue engineering

The sheer complexity of the new signaling roadmap helps explain why it took so long to make the initial three-organoid breakthrough. For example, the map revealed five distinct populations of mesenchymal cells involved in liver formation alone.

Now, co-authors say the new roadmap will make the process faster, could expand the types of organs that can be grown together, and will allow researchers to grow sets of organoids engineered to mimic conditions that lead to birth defects or increased disease risk--including some forms of cancer.

"One important outcome of our study was to use the signaling roadmap to direct the development of stem cells into different organ cell types," Takebe says. "This approach may have important applications for tissue engineering."

Short term, such organoid systems can be used to test new medications with far less dependence on animal models, or to evaluate the harms caused by pollution, unhealthy diets, allergens, and so on. Longer term, once experts learn ways to grow organoids to significantly larger sizes, customized lab-grown tissues could be used to repair damaged organs and someday even replace failing ones.

The paper lays out the protocols for other scientists to use to make their own organoid systems. Detailed data collected during the project also can be explored via the interactive website https://research.cchmc.org/ZornLab-singlecell.

Credit: 
Cincinnati Children's Hospital Medical Center

DNA repair - Locating and severing lethal links

Chemical lesions in the genetic material DNA can have catastrophic consequences for cells, and even for the organism concerned. This explains why the efficient identification and rapid repair of DNA damage is vital for survival. DNA-protein crosslinks (DPCs), which are formed when proteins are adventitiously attached to DNA, are particularly harmful. DPCs are removed by the action of a dedicated enzyme - the protease SPRTN - which cleaves the bond between the protein and the DNA. Up to now, how SPRTN recognizes such crosslinks, which can differ significantly in structure, has remained unclear. Now a team led by Professor Julian Stingele (LMU Gene Center), in cooperation with Professor Michael Sattler (Helmholtz Zentrum München and Technical University of Munich), has shown that the enzyme utilizes a modular recognition mechanism to detect such sites, such that it is activated only under highly specific conditions. The new findings appear in the journal Molecular Cell.

DPCs can be created by interactions with highly reactive products of normal metabolism or with synthetic chemotherapeutic agents. These lesions are extremely toxic because they block the replication of DNA - and therefore inhibit cell division. Timely and effective repair of these crosslinks by SPRTN is crucial for cell viability and the suppression of tumorigenesis. In humans, mutations that reduce the activity of the enzyme are associated with a high incidence of liver cancer in early life and markedly accelerate the aging process. "SPRTN has a difficult job to do because, depending on the protein and the DNA subunit involved, the structure of the crosslink can vary widely. So the enzyme has to be able to identify many different structures as aberrant," explains Hannah Reinking, first author of the study. "We therefore asked ourselves what sorts of properties a DPC should have in order to be recognized and cleaved."

To answer this question, Reinking and colleagues constructed model substrates consisting of proteins attached to defined positions within DNA strands, and examined whether the SPRTN protease could repair them in the test-tube. This approach revealed that SPRTN interacts with structures that are frequently found in the vicinity of DPCs. With the aid of nuclear magnetic resonance spectroscopy, they went on to show that SPRTN contains two recognition domains. One binds to double-stranded, and the other to single-stranded DNA. "So the protein uses a modular system for substrate recognition. Only when both domains are engaged is the enzyme active - and DNA in which double-stranded and single-stranded regions occur in close proximity is often found in the vicinity of crosslinks," says Stingele.

These results are also of clinical relevance. The action of many chemotherapeutic drugs depends on their ability to form crosslinks with DNA. Since tumor cells divide more frequently than non-malignant cells, they are particularly sensitive to this type of DNA damage. DNA repair enzymes like SPRTN are therefore of great interest as potential drug targets for use in the context of personalized cancer therapies, and agents that specifically inhibit the protease could eventually be employed to boost the efficacy of chemotherapy. "Our work now makes it possible to conceptualize such therapeutic strategies," says Stingele.

Credit: 
Ludwig-Maximilians-Universität München

UVA-developed artificial pancreas effective for children ages 6-13, study finds

image: The Control-IQ artificial pancreas system was derived from research done at the Center for Diabetes Technology at the University of Virginia.

Image: 
Tandem Diabetes Care

An artificial pancreas originally developed at the University of Virginia Center for Diabetes Technology safely and effectively manages blood sugar levels in children ages 6 to 13 with type 1 diabetes, a national clinical trial has found. Data from this and other studies has prompted the U.S. Food and Drug Administration to approve the device for use by children ages 6 and older.

The Control-IQ system, manufactured by Tandem Diabetes Care, is an "all-in-one" diabetes management device that automatically monitors and regulates blood glucose. The artificial pancreas system has an insulin pump that is programmed with advanced control algorithms based on a mathematical model using the person's glucose monitoring information to automatically adjust the insulin dose as needed.

"After the resounding success of the system in adolescents and adults in an earlier study, it is very rewarding to see younger participants in this study benefit as well, and to the same extent," said Marc D. Breton, PhD, a UVA School of Medicine researcher who served as the trial's principal investigator. "We are excited to see the outcome of 15 years of research that led to these results acknowledged in the New England Journal of Medicine."

A Real-World Study

The randomized clinical trial enrolled 101 children ages 6 to 13 at four U.S. sites (UVA, Stanford, Yale and the University of Colorado) and assigned them to either the experimental group, which used the artificial pancreas system, or to the control group, which used a standard continuous glucose monitor and separate insulin pump. Check-ins and data collection were conducted every other week for four months. To provide the best possible real-life test of the artificial pancreas, study participants were instructed to continue their typical daily routines.

The study found that the artificial pancreas did a better job keeping the children's blood glucose in the target range: The average percentage of time in the target range during the day was 7 percentage points higher using the artificial pancreas, while nighttime control was 26 percentage points higher. Nighttime control is particularly important, as severe, unchecked hypoglycemia (very low blood-glucose levels) can lead to seizure, coma or even death.

The average amount of time overall where participants' blood-glucose levels were within the target range was 11 percentage points higher than in the control group, which equals 2.6 more hours per day in range. No cases of severe hypoglycemia or diabetic ketoacidosis (a complication caused by very high blood-glucose levels) occurred during the study.
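The hours-per-day figure follows directly from applying the percentage-point gain to a 24-hour day, as a quick back-of-the-envelope check (not part of the study's own analysis) confirms:

```python
# Convert a gain in time-in-range, expressed in percentage points,
# into extra hours per day (i.e., a fraction of a 24-hour day).
def pct_points_to_hours_per_day(points):
    return points / 100 * 24

# The study's 11-percentage-point overall gain:
print(round(pct_points_to_hours_per_day(11), 1))  # -> 2.6
```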

"We are thrilled with the benefits observed in this study in school-aged children with type 1 diabetes, a population that often struggles with diabetes management for a variety of reasons," said R. Paul Wadwa, MD, the protocol chair for this trial who serves as associate professor of pediatrics at the Barbara Davis Center for Diabetes, University of Colorado Anschutz Medical Campus. "Control-IQ technology proved very easy to use for children and their parents and led to improved glucose control during both the day and night."

"We look forward to continue to enable access to this technology to even younger children with type 1 diabetes, and to develop even more advanced systems," Breton said.

Parents interested in the artificial pancreas should discuss whether it is appropriate for their children with their pediatricians.

Credit: 
University of Virginia Health System

Scientists establish first lethal mouse model for COVID-19

image: Image shows SARS-CoV-2 viral RNA detected in the nasal epithelium, olfactory bulbs, and eyes of transgenic mice expressing human ACE2, the receptor for SARS-CoV-2, by RNA in situ hybridization.

Image: 
Joseph W. Golden and Xiankun Zeng, USAMRIID

Army scientists have developed the first lethal mouse model of SARS-CoV-2, the virus that causes COVID-19, using mice that were genetically engineered to express the human ACE2 gene--a key mechanism by which the virus enters human cells. In addition to shedding light on the pathogenesis of COVID-19, this work directly contributes to the advancement of medical countermeasures against the virus.

A preview of the paper, authored by Joseph W. Golden, Ph.D. and colleagues at the U.S. Army Medical Research Institute of Infectious Diseases, appears online this week in JCI Insight, a journal published by the American Society for Clinical Investigation.

Developing a small animal model that mirrors the course of human disease is an essential step in advancing vaccines, diagnostics, and treatments to combat this international health crisis. Previous research on SARS-CoV, the virus responsible for the 2003 global outbreak of Severe Acute Respiratory Syndrome, revealed that the virus binds to target cells by means of an interaction between the 139 kDa viral spike protein and the host angiotensin-converting enzyme 2, or ACE2, protein.

The novel coronavirus, SARS-CoV-2, uses the same mechanism to infect cells. In normal mice, however, the virus does not easily bind to ACE2, making it difficult to study the course of infection. So Golden and the USAMRIID team used a special mouse strain called K18-hACE2, bred to express the human ACE2 protein instead of the mouse version. This mouse strain, developed for the original SARS-CoV outbreak by the Stanley Perlman laboratory at the University of Iowa in the early 2000s, was revived and recently produced at the Jackson Laboratory.

USAMRIID's study used two groups of 14 K18-hACE2 mice, infected with two different doses of SARS-CoV-2 administered by the intranasal route, to simulate how humans are exposed to the virus. Each group contained seven male and seven female mice. On day 3, four mice per group were euthanized to assess disease severity, while the remaining ten mice per group were monitored up to 28 days. Overall, the K18-hACE2 mice developed acute disease, including weight loss, lung injury, and brain infection, and ultimately succumbed to the disease, according to the authors.

In addition, the team infected three other types of mice--C57BL/6, BALB/c and RAG2 deficient mice--with a challenge dose of the virus. These animals did not lose weight and none succumbed to the disease.

"Given that the K18-hACE2 animal model is commercially available, it provides an important platform for evaluation of medical countermeasures across multiple laboratories," said Golden.

Credit: 
US Army Medical Research Institute of Infectious Diseases

Genomes published for major agricultural weeds

image: Amaranthus species are among the most troublesome agricultural weeds in the world. Now, scientists have assembled genomes for waterhemp, smooth pigweed, and Palmer amaranth, paving the way for more effective control strategies in the future.

Image: 
Steve Bowe, BASF

URBANA, Ill. - Representing some of the most troublesome agricultural weeds, waterhemp, smooth pigweed, and Palmer amaranth impact crop production systems across the U.S. and elsewhere with ripple effects felt by economies worldwide. In a landmark study, scientists have published the most comprehensive genome information to date for all three species, marking a new era of scientific discovery toward potential solutions.

"These genome assemblies will greatly foster further research on these difficult weed species, including better understanding the ways in which they evade damage from herbicides," says Pat Tranel, professor and associate head of the Department of Crop Sciences at the University of Illinois and co-author on the Genome Biology and Evolution study.

Draft genomes had already been published for waterhemp and Palmer amaranth, but techniques used in the Genome Biology and Evolution study provide a much clearer and richer picture of the species' gene sequences, a requisite for many genomic studies.

All three genomes were assembled using advanced long-read sequencing, which maintains the integrity and continuity of the genome similar to the way large puzzle pieces provide a clearer picture of the whole than small pieces. In Palmer amaranth, an additional sequencing technology (chromatin conformation capture sequencing) was used to further order pieces of the genome that were assembled using the long-read information.

"The goal of any genome assembly is to reveal the complete arrangement of genes in the genome, broken into chromosome-sized fragments. Unfortunately, until recently, quality genome assemblies have been very labor intensive and expensive. The previously published draft genomes for these species reported the genome broken into thousands of pieces, while the assemblies we report are down to hundreds. The vast majority of the sequence is now assembled into very large fragments," says Jacob Montgomery, a graduate student working with Tranel and first author on the study.

To further improve the assembly of the genomes for waterhemp and smooth pigweed, the team used an innovative approach known as trio binning, developed in cattle. Not only had this technique never before been fully utilized in plants, it had also not been used with parents from different species.

In normal reproduction, male and female parents each contribute one copy of every gene to their offspring. In this case, offspring are diploid, meaning they have two copies of every gene. In the study, the team created hybrid offspring from two separate species: waterhemp and smooth pigweed. These offspring are still diploid, but the trio binning technique allowed the researchers to pull apart and isolate the two copies from each parent species, resulting in haploid (single copy) genomes for each.

"This approach resolved a problem in the previous waterhemp genome assembly. When parent alleles (copies of each gene) are very different from each other, as is often the case in outcrossing species such as waterhemp, the genome assembly program interprets them to be different genes," Tranel says. "With only one allele from each species, we were able to obtain a much cleaner assembly of their gene sequences."

Detlef Weigel, director of the Max Planck Institute for Developmental Biology and co-author on the study, adds, "I am a big fan of the new advanced sequencing techniques, but even though they should theoretically be sufficient to sort out the arrangement of genes, in practice they are not. This is where genetics can help out, using information on whether genes were inherited from mom or dad. This allowed us to assign each gene to either a maternal or paternal chromosome."

The researchers specifically chose waterhemp as the male parent in the smooth pigweed × waterhemp cross because the previously published waterhemp genome was from a female plant. Tranel is pursuing research to understand the genetic basis for maleness and femaleness in waterhemp and Palmer amaranth, with potential applications toward introducing female sterility as a future control method.

"The genomes of the male waterhemp and Palmer amaranth already have enabled my group to make rapid progress on identifying the potential genes that could be responsible for the determination of sex (male or female) in both species," Tranel says.

Importantly, the genomes for all three species could start to chip away at the problem of herbicide resistance in these weeds. More and more, scientists are uncovering evidence of non-target-site or metabolic resistance in waterhemp and Palmer amaranth, allowing the weeds to detoxify herbicides before they can cause damage. Unfortunately, it is usually very difficult to determine which specific enzyme, among hundreds, is responsible for detoxifying the herbicide.

Now, researchers will essentially be able to sort through a list to find the culprit with the hope of either knocking out the enzyme responsible or modifying the herbicide molecule to evade detoxification.

"Innovation is essential for the future of agriculture. We at BASF are working continuously on improving our products and services including sustainable solutions for the management of herbicide-resistant weeds. We want to better understand the amaranth biochemical resistance mechanisms in order to offer farmers new products and solutions for optimal control of key weeds," says Jens Lerchl, head of early biology research on herbicides at BASF and study co-author. Lerchl coordinated the Palmer amaranth genome work with KeyGene/Wageningen -The Netherlands.

"The area of genome sequencing is highly dynamic. That is why BASF chose KeyGene as the partner for both latest sequencing technology and bioinformatics. Together with the expertise of the University of Illinois and Max Planck Society, we were able to compare genomes and address specific biological topics," Lerchl says. In addition to collaborating on this research, BASF is also a founding member of the International Weed Genomics Consortium, led by Colorado State University aiming at the sequencing and analysis of ten high priority key weeds.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Vertebral body tethering shows clinical success as treatment for scoliosis

image: Dan Hoernschemeyer, MD, MU Health Care pediatric orthopaedic surgeon visits with a pediatric patient and her parent. Hoernschemeyer was the principal investigator of a retrospective study examining the outcomes of vertebral body tethering patients.

Image: 
Justin Kelley, MU Health Care

Scoliosis is the most common spinal deformity affecting pediatric patients. A posterior spinal fusion (PSF) is the gold standard treatment for patients with curves exceeding 45 degrees, but the procedure's drawbacks include the loss of spinal mobility, persistent pain and adjacent segment disc disease. However, a new retrospective study from the University of Missouri School of Medicine and MU Health Care shows an alternative to PSF called vertebral body tethering (VBT) yields promising results with fewer long-term consequences for a specific group of scoliosis patients.

VBT is an alternative to PSF for scoliosis patients who still have growth remaining and flexibility of their spine. Screws are attached in a minimally invasive fashion to the thoracic or lumbar vertebrae in the curved area of the spine. A polyethylene cord connects the screws, and tension straightens the spine, correcting the scoliosis. MU Health Care is one of only a handful of medical centers in the U.S. offering the procedure. Pediatric orthopaedic surgeon Dan Hoernschemeyer, MD, is the principal investigator for the study and has performed more than 85 VBT surgeries on patients from around the country.

"If the child's spine has more than 45 degrees of scoliosis and still has some growth remaining, VBT is a way to correct the scoliosis, preserve motion and modulate normal growth instead of fusing it," Hoernschemeyer said. "Our study examined the radiographic and clinical outcomes of idiopathic scoliosis patients with various curve patterns treated with VBT."

Hoernschemeyer and his team conducted a retrospective review of 29 consecutive patients with 2-5 year follow-ups. Successful outcomes were defined by a curve of less than 30 degrees in patients who reached skeletal maturity and had not received a PSF. At the latest follow-up, 27 patients had reached skeletal maturity and 74% achieved clinical success.

"Despite our patient population being slightly more mature at the time of surgery than when compared to previous studies, we found a higher success rate and a lower revision rate," Hoernschemeyer said. "Our overall revision rate was 21% and a PSF was avoided in 93% of patients, indicating that VBT may be a reliable treatment option for adolescent scoliosis."

The most common complication that led to revision involved tether breakage. Fourteen patients experienced a broken tether, and half required either a revision surgery or PSF.

"While the exact role of VBT in the management of adolescent scoliosis continues to be defined, the data from this study supports the fact that this surgical procedure should be considered as a treatment option for children with scoliosis and an alternative to fusion," Hoernschemeyer said.

Credit: 
University of Missouri-Columbia

South African wildlife management/conservation models do not protect carnivores equally

image: Chris Sutherland at the University of Massachusetts Amherst, with first author and doctoral student Gonçalo Curveira-Santos of the Centre for Ecology, Evolution and Environmental Changes at the University of Lisbon, used a large network of camera traps to study occupancy of free-ranging carnivore species including leopards, hyenas, jackals and mongooses in different habitats and levels of protection in northeast South Africa.

Image: 
Gonçalo Curveira-Santos

AMHERST, Mass. - In results released this week, an international team of wildlife ecologists reports that the trend toward more reliance on private game farms and reserves to manage and conserve free-ranging carnivores in South Africa is more complicated than it appears - "a mosaic" of unequal protection across different land management types.

Curveira-Santos says, "Widespread conversion of agricultural and livestock areas for commercial wildlife industry, ecotourism and hunting is a major component of conservation in South Africa. Management initiatives and conservation outcomes are typically focused on the large charismatic species like lions or cheetahs, but we know very little about how unmanaged, free-ranging carnivores respond to landscapes defined by varying management and conservation models."

He adds, "Our results support the notion that the private reserves or game ranches play a complementary role to formal protected areas, but that it's also important to recognize they do not play the same role, and may not be a conservation panacea. For governments, it's attractive to move conservation to the private sector, but for us to assess the conservation benefits of doing so, we need some benchmarks, and protected areas under long-term formal protection are important references to a "natural state."

Writing in the Journal of Applied Ecology, the researchers explain how they conducted a survey to explore the relative conservation role of the private and formal protected areas for South African free-ranging carnivores.

With reserve rangers and staff, Curveira-Santos established a camera trap network to survey a natural quasi-experimental setting in northern KwaZulu-Natal, South Africa. The area offers a "protection gradient" from a provincial protected area, the 108-year-old uMkhuze Game Reserve, part of iSimangaliso Wetland Park UNESCO World Heritage Site, to a private ecotourism reserve, Mun-ya-wana Private Game Reserve, to commercial game ranches and traditional communal areas with villages as a disturbance reference.

Sutherland points out that this was "a phenomenal field effort" that used 294 trail cameras for an average 75 days each. The motion-activated cameras generated 7,224 images of 13 free-ranging carnivores from small mongooses to much larger leopards and spotted hyenas. The researchers analyzed multi-species site occupancy data, stratifying by four protection levels, and formally compared community patterns at several scales.
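Conceptually, each camera yields a per-species detection history across survey occasions, and occupancy summarizes how many sites a species uses. A toy sketch with hypothetical data shows the naive version of this summary (the study itself fit hierarchical multi-species occupancy models, which additionally correct for imperfect detection):

```python
# Toy camera-trap data: for each species, a site x occasion matrix of
# detections (1) and non-detections (0). Naive occupancy is the share of
# sites with at least one detection; formal occupancy models also estimate
# detection probability, so animals present but never photographed are
# accounted for.
detections = {
    "leopard":       [[0, 1, 0], [0, 0, 0], [1, 1, 0], [0, 0, 0]],
    "spotted_hyena": [[1, 0, 1], [0, 1, 0], [0, 0, 0], [1, 0, 0]],
}

for species, sites in detections.items():
    occupied = sum(1 for history in sites if any(history))
    print(f"{species}: naive occupancy = {occupied}/{len(sites)} sites")
```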

Overall, they found species number and identity were similar in the protected area, private reserve and game ranches, and markedly lower in the communal area. However, they observed "important variation in species occupancy rates - as a proxy for abundance - that was mainly driven by the level and nature of protection."

They say findings provide "support for the added value of multi-tenure conservation estates augmenting and connecting South Africa's protected areas." Further, similar carnivore richness between the private reserve and game ranches and higher occupancy compared to communal lands shows that carnivores can thrive in private wildlife areas.

However, for most species, occupancy rates were highest in the formal protected area, Sutherland notes, "but clearly more research is needed to understand what factors may be hindering species recovery to the levels observed in the formal protected area, and importantly, what the ecological consequences of such patterns are."

Curveira-Santos adds, "Our point is that the formal old protected areas may play a key role that cannot be replicated easily," especially often-overlooked free-ranging carnivores outside protected areas that seem to respond differently to different management approaches. "In general terms," he adds, "our work adds to the call for a more holistic perspective of wildlife for effective conservation planning. In the meantime, ensuring the long-term maintenance of formal protected areas is probably our safest bet."

Credit: 
University of Massachusetts Amherst

Evidence of hibernation-like state in Antarctic animal

image: Life restoration of Lystrosaurus in a state of torpor

Image: 
Crystal Shin

Among the many winter survival strategies in the animal world, hibernation is one of the most common. With limited food and energy sources during winters - especially in areas close to or within polar regions - many animals hibernate to survive the cold, dark winters. Though much is known about hibernation in living animals from behavioral studies, it is difficult to study in fossils.

According to new research, this type of adaptation has a long history. In a paper published Aug. 27 in the journal Communications Biology, scientists at Harvard University and the University of Washington report evidence of a hibernation-like state in an animal that lived in Antarctica during the Early Triassic, some 250 million years ago.

The creature, a member of the genus Lystrosaurus, was a distant relative of mammals. Lystrosaurus were common during the Permian and Triassic periods and are characterized by their turtle-like beaks and ever-growing tusks. During Lystrosaurus' time, Antarctica lay largely within the Antarctic Circle and experienced extended periods without sunlight each winter.

"Animals that live at or near the poles have always had to cope with the more extreme environments present there," said lead author Megan Whitney, a postdoctoral researcher at Harvard University in the Department of Organismic and Evolutionary Biology, who conducted this study as a UW doctoral student in biology. "These preliminary findings indicate that entering into a hibernation-like state is not a relatively new type of adaptation. It is an ancient one."

The Lystrosaurus fossils are the oldest evidence of a hibernation-like state in a vertebrate animal and indicate that torpor -- a general term for hibernation and similar states in which animals temporarily lower their metabolic rate to get through a tough season -- arose in vertebrates even before mammals and dinosaurs evolved.

Lystrosaurus arose before Earth's largest mass extinction at the end of the Permian Period - which wiped out 70% of vertebrate species on land - and somehow survived. It went on to live another 5 million years into the Triassic Period and spread across swathes of Earth's then-single continent, Pangea, which included what is now Antarctica. "The fact that Lystrosaurus survived the end-Permian mass extinction and had such a wide range in the early Triassic has made them a very well-studied group of animals for understanding survival and adaptation," said co-author Christian Sidor, a UW professor of biology and curator of vertebrate paleontology at the Burke Museum.

Today, paleontologists find Lystrosaurus fossils in India, China, Russia, parts of Africa and Antarctica. The creatures grew to be 6 to 8 feet long, had no teeth, but bore a pair of tusks in the upper jaw. The tusks made Whitney and Sidor's study possible because, like elephants, Lystrosaurus tusks grew continuously throughout their lives. Taking cross-sections of the fossilized tusks revealed information about Lystrosaurus metabolism, growth and stress or strain. Whitney and Sidor compared cross-sections of tusks from six Antarctic Lystrosaurus to cross-sections of four Lystrosaurus from South Africa. During the Triassic, the collection sites in Antarctica were roughly 72 degrees south latitude -- well within the Antarctic Circle. The collection sites in South Africa were more than 550 miles north, far outside the Antarctic Circle.

The tusks from the two regions showed similar growth patterns, with layers of dentine deposited in concentric circles like tree rings. The Antarctic fossils, however, held an additional feature that was rare or absent in tusks farther north: closely-spaced, thick rings, which likely indicate periods of less deposition due to prolonged stress, according to the researchers. "The closest analog we can find to the 'stress marks' that we observed in Antarctic Lystrosaurus tusks are stress marks in teeth associated with hibernation in certain modern animals," said Whitney.

The researchers cannot definitively conclude that Lystrosaurus underwent true hibernation. The stress could have been caused by another hibernation-like form of torpor, such as a more short-term reduction in metabolism. Lystrosaurus in Antarctica likely needed some form of hibernation-like adaptation to cope with life near the South Pole, said Whitney. Though Earth was much warmer during the Triassic than today -- and parts of Antarctica may have been forested -- plants and animals below the Antarctic Circle would still experience extreme annual variations in the amount of daylight, with the sun absent for long periods in winter.

Many other ancient vertebrates at high latitudes may also have used torpor, including hibernation, to cope with the strains of winter, Whitney said. But many famous extinct animals, including the dinosaurs that evolved and spread after Lystrosaurus died out, don't have teeth that grow continuously.

"To see the specific signs of stress and strain brought on by hibernation, you need to look at something that can fossilize and was growing continuously during the animal's life," said Sidor. "Many animals don't have that, but luckily Lystrosaurus did."

If analysis of additional Antarctic and South African Lystrosaurus fossils confirms this discovery, it may also settle another debate about these ancient, hardy animals.
"Cold-blooded animals often shut down their metabolism entirely during a tough season, but many endothermic or 'warm-blooded' animals that hibernate frequently reactivate their metabolism during the hibernation period," said Whitney. "What we observed in the Antarctic Lystrosaurus tusks fits a pattern of small metabolic 'reactivation events' during a period of stress, which is most similar to what we see in warm-blooded hibernators today." If so, this distant cousin of mammals is a reminder that many features of life today may have been around for hundreds of millions of years before humans evolved to observe them.

Credit: 
Harvard University, Department of Organismic and Evolutionary Biology

Decoded: The structure of the barrier between three cells

Organs in animals and in humans have one thing in common: they are bounded by so-called epithelial cells. These, along with muscle, connective and nervous tissues, belong to the basic types of tissue. Epithelial cells form special connections with one another in order to prevent substances or pathogens from passing between the cells, i.e. they have a protective and sealing function for the body. Researchers at the Institute of Animal Physiology at the University of Münster have now found out how two proteins called Anakonda and M6 interact in epithelial cells in fruit flies in order to produce a functioning barrier at so-called tricellular contacts.

These corner points between three cells - so-called tricellular junctions (TCJs) - are a preferred route for migrating cells as well as for bacterial pathogens entering the body. Although the formation of the barrier between two epithelial cells has already been well examined, much less is known about the biology of tricellular contacts. The working group headed by Prof. Dr. Stefan Luschnig aims to gain a better understanding of the structure and dynamics of epithelial barriers, hoping that this can contribute in the long term to developing more effective forms of diagnosis and treatment, for example for bacterial infections or inflammatory reactions. The study has been published in the journal "Current Biology".

Background and method

TCJs play an essential role in the functioning of the barrier between epithelial cells and in the migration of cells across tissue boundaries. Special protein complexes at the tricellular contacts are responsible for the sealing properties of these structures. Despite the fundamental roles of tricellular contacts in epithelial biology, their molecular structure and the dynamics of their assembly and remodelling have so far been insufficiently understood.

In order to study this process, the researchers visualized the M6 protein in embryos of the fruit fly Drosophila with a fluorescent marker and, using a high-resolution confocal microscope, they observed the processes taking place in the tricellular contacts in the living cells. As a result of their studies, Stefan Luschnig and his team discovered that the M6 protein is responsible for keeping the Anakonda protein stable in its place at the cell membrane of the TCJs.

When the researchers removed the M6 protein, the Anakonda protein - though it still reached its destination at the cell membrane - was not anchored stably there. The consequence was a permeable tricellular junction. These and other findings led the researchers to conclude that the two proteins depend on each other and form a complex, which is of crucial importance for the sealing properties of cell contacts and consequently for the survival of the animal. "On the basis of these results obtained from the model organism Drosophila", says Stefan Luschnig, "we can gain fundamental insights into the structure and development of epithelial tissues in more complex animals, as well as in humans."

Credit: 
University of Münster