Tech

Aluminium alloy research could benefit manned space missions

image: The cover of Advanced Science referencing the work of Tunes and Greaves

Image: 
University of Huddersfield

The MIAMI-2 - Microscopes and Ion Accelerators for Materials Investigations - facility has helped Dr Matheus Tunes investigate a new alloy that hardens aluminium without significantly increasing its weight.

Spacecraft launched from Earth need to be light yet still carry enough fuel to reach orbit; if they are too heavy, the amount of fuel required becomes prohibitive. Once outside the Earth's protective magnetic field, a vehicle may then be exposed to potentially destructive amounts of solar radiation, a concern that grows for any long-duration mission, such as one to Mars.

Making spacecraft from aluminium is one solution, as aluminium is a light yet strong material. Alloys help aluminium become harder via precipitation strengthening, but the radiation encountered in space can dissolve the hardening precipitates with potentially disastrous and fatal consequences for astronauts.

But the research carried out at MIAMI-2 in partnership with Montanuniversitaet Leoben (MUL) in Austria has shown that a particular hardening precipitate of a new aluminium alloy - developed by a group of metallurgists led by Professor Stefan Pogatscher (MUL) - does not dissolve when bombarded with particle radiation, in contrast to what existing irradiation data show for conventional aluminium alloys.

The result is an alloy with a radiation-resistant hardening phase called the T-phase, which has a complex crystal structure of Mg32(Zn,Al)49. The research led to a paper published in the prestigious journal Advanced Science, together with an eye-catching cover.

"The idea of the paper was testing these new alloys using the MIAMI facilities, because we can subject the alloy to energetic particle radiation and, at the same time, monitor the effect of this radiation on the alloy microstructure with a transmission electron microscope", says Matheus.

"We monitored the crystallographic signal of the T-phase as the radiation increased and observed that compared with other conventional aluminium alloys, the alloy we developed was radiation tolerant - meaning that the hardening phase does not dissolve under high radiation doses.

"It sheds light on a very exciting new field of research we call 'prototypic space materials for stellar-radiation environments'. A nuclear reactor is also an extreme environment, as is the sun with solar cycles, but dynamic instabilities on the sun such as solar flares and coronal mass ejections are more extreme than anything on Earth. The sun is a very efficient nuclear fusion reactor and high-energy particle accelerator."

Dr Graeme Greaves, Senior Research Fellow at the MIAMI Facility, adds, "When Matt first came to us from Brazil as a postgraduate student he was always looking for new projects and created a number of new collaborations, and I'm very happy that as he is starting the next part of his career in Austria and expanding into new areas, he is continuing to collaborate with us here at the MIAMI facility, with this aluminium alloys project being just one example."

With manned missions to the moon and Mars currently being planned, the advantages of spacecraft that are light enough to launch and withstand radiation to protect their crews are clear. Next on the agenda for Matheus, Graeme and colleagues is to find out why the alloy behaves the way it does and what further benefits there could be.

"I am particularly proud that I finished my PhD in Huddersfield, I've now moved to Austria but still continue to work with Graeme," Matheus adds. "We have an active collaboration and 2021 will be a busy year for the joint Huddersfield-Leoben space materials research project".

"We discovered the T-phase is radiation-tolerant, but we haven't discovered why that is. We have an idea which involves the chemical complexity of the phase that we believe could lead to some very interesting research. We hope that we can make an important contribution to further human exploration of space."

Credit: 
University of Huddersfield

Scientists get the lowdown on sun's super-hot atmosphere

image: Images of the sun captured by the IRIS mission show new details of how low-lying loops of plasma are energized and may also reveal how the hot corona is created.

Image: 
Rice University/NASA

HOUSTON - (Dec. 7, 2020) - A phenomenon first detected in the solar wind may help solve a long-standing mystery about the sun: why the solar atmosphere is millions of degrees hotter than the surface.

Images from the Earth-orbiting Interface Region Imaging Spectrograph, aka IRIS, and the Atmospheric Imaging Assembly, aka AIA, show evidence that low-lying magnetic loops are heated to millions of kelvins.

Researchers at Rice University, the University of Colorado Boulder and NASA's Marshall Space Flight Center make the case that heavier ions, such as silicon, are preferentially heated in both the solar wind and in the transition region between the sun's chromosphere and corona.

There, loops of magnetized plasma arc continuously, not unlike their cousins in the corona above. They're much smaller and hard to analyze, but have long been thought to harbor the magnetically driven mechanism that releases bursts of energy in the form of nanoflares.

Rice solar physicist Stephen Bradshaw and his colleagues were among those who suspected as much, but none had sufficient evidence before IRIS.

The high-flying spectrometer was built specifically to observe the transition region. In the NASA-funded study, which appears in Nature Astronomy, the researchers describe "brightenings" in the reconnecting loops that contain strong spectral signatures of oxygen and, especially, heavier silicon ions.

The team of Bradshaw, his former student and lead author Shah Mohammad Bahauddin, now a research faculty member at the Laboratory for Atmospheric and Space Physics at Colorado, and NASA astrophysicist Amy Winebarger studied IRIS images able to resolve details of these transition region loops and detect pockets of super-hot plasma. The images allow them to analyze the movements and temperatures of ions within the loops via the light they emit, read as spectral lines that serve as chemical "fingerprints."

"It's in the emission lines where all the physics is imprinted," said Bradshaw, an associate professor of physics and astronomy. "The idea was to learn how these tiny structures are heated and hope to say something about how the corona itself is heated. This might be a ubiquitous mechanism that operates throughout the solar atmosphere."

The images revealed hot-spot spectra where the lines were broadened by thermal and Doppler effects, indicating not only the elements involved in nanoflares but also their temperatures and velocities.

At the hot spots, they found reconnecting jets containing silicon ions moved toward (blue-shifted) and away from (red-shifted) the observer (IRIS) at speeds up to 100 kilometers per second. No Doppler shift was detected for the lighter oxygen ions.
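
For readers who want to see how such speeds are read off the spectra: the line-of-sight velocity follows from the measured shift of a spectral line relative to its rest wavelength via the standard, non-relativistic Doppler relation. The worked number below is purely illustrative and assumes a line near 1400 Å, roughly where the Si IV lines observed by IRIS sit; it is not the paper's own calibration.

```latex
% Standard Doppler relation; the numerical example is illustrative, not taken from the paper.
\[
  v_{\mathrm{LOS}} = c\,\frac{\Delta\lambda}{\lambda_{0}}
  \qquad\Longrightarrow\qquad
  \Delta\lambda \approx \lambda_{0}\,\frac{v_{\mathrm{LOS}}}{c}
  \approx 1400~\text{\AA} \times \frac{100~\mathrm{km\,s^{-1}}}{3\times 10^{5}~\mathrm{km\,s^{-1}}}
  \approx 0.47~\text{\AA}.
\]
```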

The researchers studied two components of the mechanism: how the energy gets out of the magnetic field, and then how it actually heats the plasma.

The transition region is only about 10,000 degrees Fahrenheit, but convection on the sun's surface affects the loops, twisting and braiding the thin magnetic strands that make them up and adding energy to the magnetic fields that ultimately heat the plasma, Bradshaw said. "The IRIS observations showed that process taking place, and we're reasonably sure at least one answer to the first part is through magnetic reconnection, of which the jets are a key signature," he said.

In that process, the magnetic fields of the plasma strands break and reconnect at braiding sites into lower energy states, releasing stored magnetic energy. Where this takes place, the plasma becomes superheated.

But how plasma is heated by the released magnetic energy has remained a puzzle until now. "We looked at the regions in these little loop structures where reconnection was taking place and measured the emission lines from the ions, chiefly silicon and oxygen," he said. "We found the spectral lines of the silicon ions were much broader than those of the oxygen."

That indicated preferential heating of the silicon ions. "We needed to explain it," Bradshaw said. "We had a look and a think and it turns out there's a kinetic process called ion cyclotron heating that favors heating heavy ions over lighter ones."

He said ion cyclotron waves are generated at the reconnection sites. The waves carried by the heavier ions are more susceptible to an instability that causes the waves to "break" and generate turbulence, which scatters and energizes the ions. This broadens their spectral lines beyond what would be expected from the local temperature of the plasma alone. In the case of the lighter ions, there might be insufficient energy left over to heat them. "Otherwise, they don't exceed the critical velocity needed to trigger the instability, which is faster for lighter ions," he said.

"In the solar wind, heavier ions are significantly hotter than lighter ions," Bradshaw said. "That's been definitively measured. Our study shows for the first time that this is also a property of the transition region, and might therefore persist throughout the entire atmosphere due to the mechanism we have identified, including heating the solar corona, particularly since the solar wind is a manifestation of the corona expanding into interplanetary space."

The next question, Bahauddin said, is whether such phenomena are happening at the same rate all over the sun. "Most probably the answer is no," he said. "Then the question is, how much do they contribute to the coronal heating problem? Can they supply sufficient energy to the upper atmosphere so that it can maintain a multimillion-degree corona?

"What we've shown for the transition region was a solution to an important piece of the puzzle, but the big picture requires more pieces to fall in the right place," Bahauddin said. "I believe IRIS will be able to tell us about the chromospheric pieces in the near future. That will help us build a unified and global theory of the sun's atmosphere."

Credit: 
Rice University

Image-based navigation could help spacecraft safely land on the moon

TROY, N.Y. -- In order for future lunar exploration missions to be successful and land more precisely, engineers must equip spacecraft with technologies that allow them to "see" where they are and travel to where they need to be. Finding specific locations amid the moon's complicated topography is not a simple task.

In research recently published in the AIAA Journal of Spacecraft and Rockets, a multidisciplinary team of engineers demonstrated how a series of lunar images can be used to infer the direction that a spacecraft is moving. This technique, sometimes called visual odometry, allows navigation information to be gathered even when a good map isn't available. The goal is to allow spacecraft to more accurately target and land at a specific location on the moon without requiring a complete map of its surface.

"The issue is really precision landing," said John Christian, an associate professor of aerospace engineering at Rensselaer Polytechnic Institute and first author on the paper. "There's been a big drive to make the landing footprint smaller so we can go closer to places of either scientific interest or interest for future human exploration."

In this research, Christian was joined by researchers from Utah State University and Intuitive Machines, LLC (IM) in Houston, Texas. NASA has awarded IM multiple task orders under the agency's Commercial Lunar Payload Services (CLPS) initiative. IM's inaugural IM-1 mission will deliver six CLPS payloads and six commercial payloads to Oceanus Procellarum in the fourth quarter of 2021. Their IM-2 commercial mission will deliver a NASA drill and other payloads to the lunar south pole in the fourth quarter of 2022.

"The interdisciplinary industry/academia team follows in the footsteps of the NASA Autonomous Hazard Avoidance and Landing Technology (ALHAT) project which was a groundbreaking multi-center NASA/industry/academia effort for precision landing," said Timothy Crain, the Vice President of Research and Development at IM. "Using the ALHAT paradigm and technologies as a starting point, we identified a map-free visual odometry technology as being a game-changer for safe and affordable precision landing."

In this paper, the researchers demonstrated how, with a sequence of images, they can determine the direction a spacecraft is moving. Those direction-of-motion measurements, combined with data from other spacecraft sensors and information that scientists already know about the moon's orientation, can be substituted into a series of mathematical relationships to help the spacecraft navigate.

"This is information that we can feed into a computer, again in concert with other measurements, that all gets put together in a way that tells the spacecraft where it is, where's it's going, how fast it's going, and what direction it's pointed," Christian said.

Credit: 
Rensselaer Polytechnic Institute

Risk of vine-to-vine spread of Xylella fastidiosa is greatest in July and August

image: Missing vines due to X. fastidiosa

Image: 
Mark Sisterson

The bacterial plant pathogen Xylella fastidiosa is a worldwide threat to perennial tree and vine crops and has been linked to Pierce's disease of grapevine in California, olive quick decline in Italy, and citrus variegated chlorosis in South America.

Scientists know that seasonality plays an important role in the spread of X. fastidiosa, but there are limited field data available. Scientists also know that epidemics of Pierce's disease in the southern San Joaquin Valley of California are associated with high abundance of the invasive glassy-winged sharpshooter that transmits X. fastidiosa.

"Managing the spread of X. fastidiosa is challenging due to a lack of field data on seasonal changes in vector abundance, proportion of vector population carrying the pathogen, and probability of acquisition from infected plants," explained Mark Sisterson, a vector entomologist with the Agricultural Research Service-USDA.

To gather more data, Sisterson and colleagues conducted a field study in the San Joaquin Valley to determine the time of year that vine-to-vine transmission of X. fastidiosa was most likely to occur. They found that grapevines were more likely to test positive for the pathogen in July and August than in spring. They also found that more glassy-winged sharpshooters tested positive for X. fastidiosa in July and August than in the spring.

"Accordingly, risk of vine-to-vine spread of X. fastidiosa is greatest in July and August," said Sisterson. "These findings will improve timing of insecticide applications to reduce glassy-winged sharpshooter populations, thereby reducing spread of X. fastidiosa."

Sisterson also notes that in other parts of California, studies have shown higher rates of vine recovery from infection over the winter when vines become infected during late summer, compared to vines infected earlier in the season. Results from this study suggest that late-season infections in the southern San Joaquin Valley are more likely to persist into the following year and highlight the need for regionally specific data to inform vineyard management decisions.

Credit: 
American Phytopathological Society

Pupils can learn more effectively through stories than activities

image: Students used salt dough to study either the differences between trilobites or the bones in human and animal limbs.

Image: 
University of Bath

Storytelling - the oldest form of teaching - is the most effective way of teaching primary school children about evolution, say researchers at the Milner Centre for Evolution at the University of Bath.

A randomised controlled trial found that children learn about evolution more effectively when engaged through stories read by the teacher than when doing tasks to demonstrate the same concept.

The scientists investigated several different methods of teaching evolution in primary schools, to test whether a pupil-centred approach (where pupils took part in an activity) or a teacher-centred approach (where pupils were read a story by the teacher), led to a greater improvement in understanding of the topic.

They also looked at whether using human-based examples of evolution (comparing arm bones in humans with those in animals), or more abstract examples that were harder to emotionally engage with (comparing the patterns of trilobites), produced better results in terms of the children's understanding of evolution.

Whilst all the methods improved the pupils' understanding of evolution, the study, published in the journal Science of Learning, found that the story-based approach combined with the abstract examples of evolution was the most effective lesson.

This goes against educational orthodoxy that states that a pupil-centred approach to learning, using human-based examples with which children can easily identify, should yield the best results.

The study recruited 2500 primary school students who were tested for understanding of evolutionary concepts before and after the lessons.

Professor Laurence Hurst, Director of the Milner Centre for Evolution at the University of Bath, led the study.

He said: "We were really surprised by the results - we expected that pupils would be more engaged with an activity rather than listening to a story, and that children would identify more strongly with the human-based examples of evolution than the somewhat abstract example of trilobites, but in fact the opposite was true.

"This is the first large randomised controlled trial that is evaluating the effectiveness of different methods of teaching, using similar scientific methods to those used in drug interaction trials to test whether a new treatment works.

"Our results show that we should be careful about our preconceptions of what works best.

"We only tested the teaching of evolution in this way - it would be interesting to see if these findings also applied to other subjects of the curriculum."

Professor Momna Hejmadi, Associate Dean of the University's Faculty of Science, helped to design the study and co-authored the paper. She said: "Evolution was introduced to the national curriculum for primary schools in 2014.

"It's a really important subject as it forms the foundation for many parts of biology. However, many primary school teachers, if they don't have a science background, are less confident about teaching it.

"At the Milner Centre for Evolution, we've developed a range of free lesson plans using really cheap teaching materials, as well as a free online course for teachers to help them engage their pupils with this important subject.

"We'd like to thank the schools who took part in the study, especially the teachers who delivered the lessons. We hope they can continue to successfully use these resources in future years."

Credit: 
University of Bath

New transistor design disguises key computer chip hardware from hackers

image: The four transistors on this chip were built out of a 2D material that disguises them from hackers.

Image: 
Purdue University photo/John Underwood

WEST LAFAYETTE, Ind. -- A hacker can reproduce a circuit on a chip by discovering what key transistors are doing in a circuit - but not if the transistor "type" is undetectable.

Purdue University engineers have demonstrated a way to disguise which transistor is which by building them out of a sheet-like material called black phosphorus. This built-in security measure would prevent hackers from getting enough information about the circuit to reverse engineer it.

The findings appear in a paper published Monday (Dec. 7) in Nature Electronics.

Reverse engineering chips is a common practice - both for hackers and companies investigating intellectual property infringement. Researchers also are developing x-ray imaging techniques that wouldn't require actually touching a chip to reverse engineer it.

The approach that Purdue researchers have demonstrated would increase security on a more fundamental level. How chip manufacturers choose to make this transistor design compatible with their processes would determine the availability of this level of security.

A chip computes using millions of transistors in a circuit. When a voltage is applied, two distinct types of transistors - an N type and a P type - perform a computation. Replicating the chip would begin with identifying these transistors.

"These two transistor types are key since they do different things in a circuit. They are at the heart of everything that happens on all our chips," said Joerg Appenzeller, Purdue's Barry M. and Patricia L. Epstein Professor of Electrical and Computer Engineering. "But because they are distinctly different, the right tools could clearly identify them - allowing you to go backwards, find out what each individual circuit component is doing and then reproduce the chip."

If these two transistor types appeared identical upon inspection, a hacker wouldn't be able to reproduce a chip by reverse engineering the circuit.

Appenzeller's team showed in their study that camouflaging the transistors by fabricating them from a material such as black phosphorus makes it impossible to know which transistor is which. Because a voltage toggles each transistor's type, the two types appear exactly the same to a hacker.

While camouflaging is already a security measure that chip manufacturers use, it is typically done at the circuit level and doesn't attempt to obscure the functionality of individual transistors - leaving the chip potentially vulnerable to reverse engineering hacking techniques with the right tools.

The camouflaging method that Appenzeller's team demonstrated would build a security key directly into the transistors.

"Our approach would make N and P type transistors look the same on a fundamental level. You can't really distinguish them without knowing the key," said Peng Wu, a Purdue Ph.D. student of electrical and computer engineering who built and tested a prototype chip with black phosphorus-based transistors in the Birck Nanotechnology Center of Purdue's Discovery Park.

Not even the chip manufacturer would be able to extract this key after the chip is produced.

"You could steal the chip, but you wouldn't have the key," Appenzeller said.

Current camouflaging techniques always require more transistors in order to hide what's going on in the circuit. But hiding the transistor type using a material like black phosphorus - a material as thin as an atom - requires fewer transistors, taking up less space and power in addition to creating a better disguise, the researchers said.

The idea of obscuring the transistor type to protect chip intellectual property originally came from a theory by University of Notre Dame professor Sharon Hu and her collaborators. Typically, what gives N and P type transistors away is how they carry a current. N type transistors carry a current by transporting electrons while P type transistors use the absence of electrons, called holes.

Black phosphorus is so thin, Appenzeller's team realized, that it would enable electron and hole transport at a similar current level, making the two types of transistors appear more fundamentally the same per Hu's proposal.

Appenzeller's team then experimentally demonstrated the camouflaging abilities of black phosphorus-based transistors. These transistors are also known to operate at the low voltages of a computer chip at room temperature due to their smaller dead zone for electron transport, described as a small "band gap."

But despite the advantages of black phosphorus, the chip manufacturing industry would more likely use a different material to achieve this camouflage effect.

"The industry is starting to consider ultrathin, 2D materials because they would allow more transistors to fit on a chip, making them more powerful. Black phosphorus is a little too volatile to be compatible with current processing techniques, but showing experimentally how a 2D material could work is a step toward figuring out how to implement this security measure," Appenzeller said.

Credit: 
Purdue University

Study finds large-scale expansion of stem rust resistance gene in barley and oat lineages

image: Asyraf Hatta (top left) and corresponding authors - Guru Radhakrishnan (top right), Sambasivam Periyannan (bottom right) and Brande Wulff (bottom left).

Image: 
John Innes Centre

Stem rust is one of the most devastating fungal diseases of wheat and historically has caused dramatic, widespread crop failures resulting in significant yield losses around the world. Stem rust epidemics in major wheat-growing areas could pose a major threat to global food security. Scientists have identified a resistance gene, Sr22, as one of the few characterized genes that protect against a large array of stem rust races.

Given its effectiveness against stem rust, Sr22 is an important gene. It was recently incorporated into a multi-Sr transgene stack and found to achieve complete field-immunity to stem rust. As a result of this success, scientists are looking for ways to deploy the gene in the field.

A new study in the MPMI journal describes the functional and evolutionary characterization of Sr22, based on a comprehensive search of the genomes and transcriptomes of 80 plant species. The study found that the gene is conserved among grasses in the Triticeae and Poeae lineages.

"We originally set out to mine Sr22 alleles and their function then expanded the work to include a large-scale comparison of the Sr22 locus across monocot species," explained Dr. Guru Radhakrishnan who works for the John Innes Center in the United Kingdom. "This is when we discovered the surprising large-scale expansion of the Sr22 locus in the barley and oat lineages."

This study also describes the sequence variation between different Sr22 alleles, which may be due to intra-allelic recombination. Three of the alleles were functionally characterized in transgenic wheat and two of these were found to confer resistance to the notorious Ug99 isolate of the wheat stem rust pathogen.

"To our knowledge, this is the first study to comprehensively explore the evolution of a resistance gene across a broad range of monocot lineages in addition to exploring allelic variation between accessions of monocot species," added first author Dr. M. Asyraf Md. Hatta. "With more high-quality monocot genome and transcriptome assemblies becoming available, such studies are expected to provide valuable insights on the evolution of resistance genes in this agriculturally important group of plants."

Their study contributes valuable knowledge on plant disease resistance gene function and evolution, which can facilitate the improvement of crops against agriculturally important diseases, such as stem rust. To learn more, read "Extensive Genetic Variation at the Sr22 Wheat Stem Rust Resistance Gene Locus in the Grasses Revealed Through Evolutionary Genomics and Functional Analyses" published in the November issue of the MPMI journal.

Credit: 
American Phytopathological Society

Researchers call for renewed focus on thermoelectric cooling

image: Zhifeng Ren, right, director of the Texas Center for Superconductivity at the University of Houston, and researcher Jun Mao are calling for renewed emphasis on developing new materials for thermoelectric cooling.

Image: 
University of Houston

Almost 200 years after French physicist Jean Peltier discovered that electric current flowing through the junction of two different metals could be used to produce a heating or cooling effect, scientists continue to search for new thermoelectric materials that can be used for power generation.

Researchers writing in Nature Materials, however, say it is time to step up efforts to find new materials for thermoelectric cooling.

Bismuth telluride compounds have been used for thermoelectric cooling for more than 60 years, and the researchers say the fact that there is already a commercial demand for the technology suggests better materials can expand the market.

"Most work is focused on high temperature materials for power generation, but there's no market there yet," said Zhifeng Ren, director of the Texas Center for Superconductivity at the University of Houston and corresponding author for the paper. "Cooling is an existing market, a billion dollar market, and there has not been much progress on materials."

He and co-authors Jun Mao, a researcher at TcSUH, and Gang Chen, a mechanical engineer and nanotechnologist at the Massachusetts Institute of Technology, call for increased focus on the development of new advanced materials that work at or near room temperature.

The three were part of a group that in 2019 reported in the journal Science a new material that works efficiently at room temperature while requiring almost no costly tellurium, a major component of the current state-of-the-art material.

The material, composed of magnesium and bismuth, was almost as efficient as the traditional bismuth telluride material. Work to improve the material is ongoing, Ren said.

Thermoelectric materials work by exploiting the flow of heat current from a warmer area to a cooler area, providing an emission-free source of energy. The materials can be used to turn waste heat - from power plants, automobile tailpipes and other sources - into electricity, and a number of new materials have been reported for that application, which requires materials to perform at higher temperatures.

Thermoelectric cooling modules have posed a greater challenge because they have to work near room temperature, making it more difficult to achieve a high thermoelectric figure-of-merit, a metric used to determine how efficiently a material works. Thermoelectric materials used for power generation more easily achieve a high figure-of-merit because they operate at higher temperatures - often around 500 degrees Celsius, or about 930 degrees Fahrenheit.
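
For context, the figure-of-merit referred to here is the standard dimensionless quantity ZT - a textbook definition rather than anything specific to this paper - which combines the Seebeck coefficient, electrical conductivity, thermal conductivity and absolute temperature:

```latex
% Dimensionless thermoelectric figure of merit (standard textbook definition)
\[
  ZT \;=\; \frac{S^{2}\,\sigma}{\kappa}\,T
\]
% S: Seebeck coefficient, sigma: electrical conductivity,
% kappa: total thermal conductivity, T: absolute temperature.
% A good cooling material needs a large ZT near room temperature:
% large S and sigma, low kappa.
```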

But there are also advantages to thermoelectric cooling devices: they are compact, operate silently and can switch almost instantaneously between heating and cooling, allowing precise temperature control. They also operate without refrigerants that can damage the ozone layer or act as greenhouse gases.

They are used mainly for small applications, including the transport of medical supplies and cooling laser diodes.

"For large-scale cooling devices, a compressor is still more efficient," said Ren, who is also M.D. Anderson Chair Professor of Physics. "For smaller systems or for any cooling application requiring very precise temperature control, regular compressor-driven cooling is not as good."

But the discovery of new and better materials could expand the market.

"If you can find materials with a higher figure-of-merit, you can have a very competitive performance for refrigerators or even air conditioning," Ren said. "It's not there yet, but I don't see why it cannot be in the future."

Credit: 
University of Houston

Hard and fast emission cuts slow warming in the next 20 years

A new study shows that strong and rapid action to cut emissions of carbon dioxide and other greenhouse gases will help to slow down the rate of global warming over the next twenty years.

This highlights that immediate action on climate change can bring benefits within current lifetimes, and not just far into the future.

Scientists already agree that rapid and deep emissions reductions made now will limit the rise in global temperatures during the second half of the century.

However, pinpointing shorter-term benefits over the next few decades has been more challenging, particularly as natural cycles in global atmosphere and ocean systems can cause slow ups and downs in temperature that temporarily mask human influence on the climate.

But, by using a novel approach that combines large amounts of data from different sources, a new study from the University of Leeds has untangled human-induced warming from natural variability on much shorter timescales than previously thought possible.

The study, published in Nature Climate Change, used thousands of simulations from different climate models alongside multiple estimates of observed natural climate variability to investigate how various levels of emissions cuts could affect the speed of global warming over the next two decades.

The findings show that reducing emissions in line with the Paris Agreement, and in particular with its aim to pursue efforts to stabilise global warming at 1.5°C above pre-industrial levels, has a substantial effect on warming rates over the next 20 years, even after natural variability is taken into account.

In fact, the risk of experiencing warming rates that are stronger than anything previously seen would be 13 times lower with rapid and deep emissions cuts, compared to an "average" future that continues to rely heavily on fossil fuels. A fossil-fuel heavy future could see temperatures rise by up to 1-1.5°C in the next 20 years - meaning the Paris Agreement temperature limits will be breached well before 2050.
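
As a toy illustration of the kind of ensemble-based risk comparison described above - with synthetic numbers only; the assumed forced warming rates, noise level and threshold are invented, not taken from the study - one can generate many simulated 20-year temperature series under two pathways, fit a trend to each, and compare how often each pathway produces unusually fast warming:

```python
# Toy illustration (synthetic data, not the study's models): compare the risk
# of unusually fast 20-year warming under a low- vs a high-emissions pathway.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(20)

def ensemble_trends(forced_rate_per_decade, n_members=1000, noise_sd=0.08):
    """Return 20-year warming trends (deg C/decade): forced signal + internal variability."""
    trends = []
    for _ in range(n_members):
        forced = forced_rate_per_decade / 10.0 * years          # assumed forced warming
        internal = np.cumsum(rng.normal(0.0, noise_sd, years.size)) * 0.1  # toy variability
        slope_per_year = np.polyfit(years, forced + internal, 1)[0]
        trends.append(slope_per_year * 10.0)                    # convert to per decade
    return np.array(trends)

low = ensemble_trends(0.10)    # rapid, deep emissions cuts (assumed rate)
high = ensemble_trends(0.30)   # fossil-fuel-heavy pathway (assumed rate)

threshold = 0.35               # "stronger than anything previously seen" (illustrative)
risk_low, risk_high = (low > threshold).mean(), (high > threshold).mean()
print(f"P(trend > {threshold} C/decade): low={risk_low:.3f}, high={risk_high:.3f}, "
      f"ratio ~ {risk_high / max(risk_low, 1e-9):.0f}x")
```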

The study's lead author, Dr Christine McKenna, is a Postdoctoral Research Fellow at Leeds's School of Earth and Environment, working on the EU-funded CONSTRAIN project.

Dr McKenna said: "Our results show that it's not only future generations that will feel the benefits of rapid and deep cuts in emissions. Taking action now means we can prevent global warming from accelerating in the next few decades, as well as get closer to the goal of limiting warming in the longer term.

"It will also help us to avoid the impacts that more rapid and extreme temperature changes could bring.

"With global temperatures currently rising at around 0.2?C per decade, without urgent action on climate change we are clearly in danger of breaching the Paris Agreement. These findings are further motivation for both governments and non-state actors to set stringent greenhouse gas mitigation targets, combining a green recovery from the economic impacts of coronavirus with reaching net-zero emissions as soon as possible."

Credit: 
University of Leeds

Research brief: Global trends in nature's contributions to people

image: Lettuce rows

Image: 
Kate Brauman from the University of Minnesota

In a new study published today in the Proceedings of the National Academy of Sciences, a research team co-led by the University of Minnesota examined the risks to human well-being and prosperity stemming from ongoing environmental degradation.

"There are many ways that nature provides benefits to people -- from the production of material goods to non-material benefits, and the benefits of natural ecology that regulate environmental conditions," said Kate Brauman, lead author and a lead scientist at the U of M Institute on the Environment (IonE). "We are in a much better position to identify the problems in the way we are managing nature, and that gives us a path forward to manage it better."

The study looked at a variety of peer-reviewed papers addressing wide-ranging elements of trends in nature and associated impacts on people. The study found that:
most of nature's contributions to people, such as the natural regulation of water pollutants, have declined globally over the past 50 years;
negative impacts on people's well-being are already occurring, including reductions in crop yields from declining pollinator populations and soil productivity, and increased exposure to flooding and storms as coastal ecosystems are degraded; and
understanding and tracking nature's contributions to people provides critical feedback that can improve our ability to manage earth systems effectively, equitably and sustainably.

"This paper highlights the value of nature's contributions to our well-being," said co-author Steve Polasky, an IonE fellow and a professor in the College of Biological Sciences. "By making these values more visible, we hope that actions are taken to protect nature, so that nature can continue to provide benefits for future generations."

Credit: 
University of Minnesota

Deep rooted -- mother's empathy linked to 'epigenetic' changes to the oxytocin gene

image: Methylation of the oxytocin gene (OXT) is positively correlated with personal distress, a negative emotional response to others' negative emotions and an element of empathy (A-D).

Image: 
Akemi Tomoda from University of Fukui, Japan

Our ability to feel and understand the emotions of others, or "empathy," is at the core of our prosocial behaviors such as cooperation and caregiving. Scientists have recognized two types of empathy: cognitive and affective. Cognitive empathy involves understanding another person's emotions on an intellectual level, taking into consideration someone's situation and how they would react (for example, "putting yourself in someone else's shoes"). Affective empathy, on the other hand, is a kind of emotional contagion, where you feel someone's emotion instinctively after observing their expression or other mood indicators. Both these types strongly predict how parents behave with their children and can subsequently influence child psychological development. Therefore, understanding how empathy is shaped can help us to decipher parental behavior.

When it comes to biological mechanisms of empathy, scientists are particularly interested in oxytocin, the so-called "love hormone." High oxytocin levels predict sensitive parenting, but it isn't clear how the oxytocin-related gene might generate variation in empathy and parental behavior. One possible explanation is epigenetic changes to the gene--a way of altering gene function without changing the actual DNA sequence. Specifically, "DNA methylation"--the addition of a chemical group called the "methyl" group at specific locations--in the oxytocin gene (called OXT) has been associated with personality traits and brain structure in humans. This raises a question: can methylation of OXT influence empathy in mothers? A team of scientists at University of Fukui in Japan, led by Prof. Akemi Tomoda, decided to find out, in a study published in Psychoneuroendocrinology.

Specifically, the scientists wanted to investigate how methylation of OXT, brain structure, and empathy are related in mothers. For this, they measured OXT methylation through analyses of saliva samples from 57 Japanese mothers who were caring for at least one young child. Moreover, they used an MRI technique called "voxel-based morphometry" to examine the size of brain regions related to OXT methylation, aiming to identify any connections between brain morphology and DNA methylation. This is part of an exciting new field called "imaging epigenetics" that seeks to explain behavior through linking epigenetic changes with brain structures and/or functions. Finally, they used a well-established psychology questionnaire to measure the mothers' levels of cognitive and affective empathy.

The findings showed that OXT methylation was positively correlated with a mother's "personal distress," relating to harsh parenting. Additionally, OXT methylation was negatively correlated with the volume of gray matter in the right inferior temporal gyrus. In other words, higher methylation of the oxytocin gene was associated with lower brain volume in the inferior temporal gyrus and with greater personal distress. "This is the first study to find a correlation between DNA methylation of the oxytocin gene and empathy, and the first to link that methylation with both empathy and variation in brain structure," Prof. Tomoda commented. "So, we've gained very important insight into the relationship between this gene and the phenotype--or the physical manifestation of gene expression."

The researchers also used statistical analyses to find out whether DNA methylation affected changes to brain structure, or vice versa. But they did not find a significant effect of gray matter volume of the inferior temporal gyrus on OXT methylation and empathy. This means that brain structure did not appear to mediate the relationship between epigenetic changes to the OXT gene and empathy.
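
For readers unfamiliar with this kind of test, the sketch below shows a generic, Baron-Kenny-style mediation check in Python on synthetic data; the variable names, effect sizes and the use of statsmodels are illustrative assumptions, not the study's actual analysis.

```python
# Hypothetical mediation sketch (synthetic data, not the study's measurements):
# does gray-matter volume (M) mediate the link between OXT methylation (X)
# and personal distress (Y)?
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 57                                       # sample size matching the study
x = rng.normal(size=n)                       # OXT methylation (standardised)
m = -0.4 * x + rng.normal(size=n)            # gray-matter volume, right ITG
y = 0.5 * x + 0.0 * m + rng.normal(size=n)   # personal distress (no true mediation)

def fit(dep, *predictors):
    X = sm.add_constant(np.column_stack(predictors))
    return sm.OLS(dep, X).fit()

total = fit(y, x)       # path c : X -> Y
a_path = fit(m, x)      # path a : X -> M
direct = fit(y, x, m)   # paths c' and b : X, M -> Y

print("total effect c  :", round(total.params[1], 3))
print("path a (X->M)   :", round(a_path.params[1], 3))
print("path b (M->Y)   :", round(direct.params[2], 3))
print("direct effect c':", round(direct.params[1], 3))
# If c' stays close to c and b is negligible, M does not mediate X -> Y,
# mirroring the null mediation result reported here.
```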

These findings shed light on the complex processes involved in maternal empathy, which could have a real contribution in understanding psychological development in children. As Prof. Tomoda explains, "Our study really helps to clarify the link between oxytocin gene methylation and parental empathy, as well as the effects on empathy-related parts of the brain. This understanding augments efforts to better understand maltreated children and contributes to their healthy development."

Credit: 
University of Fukui

ASH: Off-the-shelf immune drug shows promise in aggressive multiple myeloma

PHILADELPHIA-- A subcutaneous injection of the immune-boosting drug teclistamab was found to be safe and to elicit responses in a majority of patients with relapsed or refractory multiple myeloma, according to findings from a multi-institutional phase I study being presented by Alfred L. Garfall, MD, an assistant professor of Medicine in the division of Hematology-Oncology in the Perelman School of Medicine at the University of Pennsylvania, at the American Society of Hematology (ASH) Annual Meeting & Exposition on Dec. 5 (abstract #180).

Of the 22 patients treated with the injection form of teclistamab at the dose chosen for the phase II study, 74 percent experienced a partial response or better.

Teclistamab is a bispecific antibody that activates T cells to attack multiple myeloma cells expressing BCMA, or B cell maturation antigen. These updated results from the phase I study show, for the first time, the safety and efficacy of the more convenient injection form of the drug, which had previously been reported, in May 2020, to be safe and to elicit responses when administered intravenously.

"These are exciting results for multiple myeloma patients," Garfall said. "To have a single, subcutaneous-injectable drug that is effective in patients whose disease had become resistant to so many prior therapies, is well tolerated, and often yields long-lasting responses is a promising achievement."

Patients who received both the injection (n=65) and infusion (n=84) experienced side effects commonly associated with immune-boosting drugs. Grade 1-2 cytokine release syndrome occurred in 54 percent and 57 percent of patients with intravenous and subcutaneous dosing, respectively, and typically occurred one to two days after the drugs were administered. Other side effects included anemia (55 percent), neutropenia (57 percent), thrombocytopenia (40 percent), and leukopenia (28 percent). Side effects, the authors reported, significantly subsided after the first two weeks. All events were considered within a manageable safety profile.

The overall response rate among the 68 patients treated with the most active intravenous and subcutaneous doses was 69 percent, including 59 percent with very good partial responses or better and 26 percent with complete responses or better.

Penn serves as the leading site in the global study, enrolling more than 35 of the trial's 149 participants.

The results offer up promise for relapsed or refractory multiple myeloma patients, who tend to have poor prognoses once they've exhausted other treatment avenues, with a median overall survival of about eight months.

Among 47 patients who responded to teclistamab at the most active dose levels, 94 percent remain on teclistamab after a median follow-up of 6.5 months, with some ongoing responses up to 21 months in duration.

"Teclistamab takes a similar approach to cellular therapies, which genetically engineer a patient's T cells to find and destroy cancer cells," Garfall said. "Except, this is jumpstarting the immune system with a single, off-the-shelf drug that takes 15 minutes to administer, in contrast to cellular therapies that take several weeks to manufacture for each patient."

Credit: 
University of Pennsylvania School of Medicine

Study reveals surprising benefit of clonal hematopoiesis in allogeneic transplants

image: Christopher Gibson, MD

Image: 
Dana-Farber Cancer Institute

Clonal hematopoiesis (CH) is a recently identified condition in which mutations associated with blood cancers are detected in the blood of some healthy, usually older, individuals who don't have cancer. People with CH, while asymptomatic, have an elevated risk of developing blood cancers and other negative health outcomes, including heart attacks and strokes.

In a surprising twist, a study by Dana-Farber Cancer Institute scientists has revealed for the first time that CH can - in the right context - confer a health benefit. That context is in the setting of allogeneic stem cell or bone marrow transplants. The researchers report today at the virtual 62nd American Society of Hematology (ASH) Annual Meeting that patients who received transplants from older donors with CH had a lower risk of relapse and longer survival compared with patients who got transplants from donors without CH.

"Because clonal hematopoiesis in the non-transplant setting is associated with adverse outcomes, we initially expected to see something similar in recipients of transplants from donors with CH," said Dana-Farber's Christopher Gibson, MD, who co-led the study with R. Coleman Lindsley, MD, PhD. "However, we largely found the opposite: donor CH is actually associated with better survival in most transplant recipients due to a reduced risk of relapse from their underlying cancer."

The term clonal hematopoiesis refers to a genetically distinct subpopulation, or clone, of blood cells that share a unique mutation. Its prevalence is low in younger people but is estimated to occur in 10-20% of the population over age 70.

The Dana-Farber scientists had previously shown that CH could be unknowingly passed from donor to recipient during transplantation. "The only way to detect CH is to perform genetic sequencing of blood, which is not a routine part of the workup for prospective transplant donors," said Gibson. "We were the first to show that passing CH from donor to recipient can occur without causing a new leukemia to arise in donor cells, but our study was not powered to assess the impact on other outcomes. We've been working on the follow-up study ever since."

They evaluated the impact of CH in donors aged 40 years or older on recipient clinical outcomes in 1,727 donor-recipient pairs. The investigators identified CH in 388 of the 1,727 donor samples. The most common mutations found in the donor samples were in the gene DNMT3A. Those mutations were specifically associated with the improved overall survival and reduced risk of relapse in transplant recipients. Other gene mutations found in the samples were not associated with the survival benefit.

"We are not yet sure why donor DNMT3A mutations reduce the risk of relapse, but our data suggest that they improve the immune activity of donor T cells, which are one of the most critical determinants of transplant efficacy," said Gibson. This theory fits with data from the trial showing that transplant recipients who received the drug cyclophosphamide to prevent graft-versus-host disease did not benefit from transplants from CH donors. That was likely because cyclophosphamide eliminates donor T cells from the graft as a means of preventing chronic graft-versus-host disease. In all other patients, on balance, despite the higher risk of chronic graft-versus-host disease, the reduction in relapse outweighed that negative outcome and yielded better survival with the CH donors.

"Our findings are exciting because they have the potential for an immediate impact on clinical care," said Gibson. For example, some transplant centers have been excluding potential donors found to have CH during their pre-transplant workup. Gibson said the new findings show that this is not necessary, and, in certain circumstances, donors with DNMT3A mutations could be preferable to similar donors without that mutation.

Gibson will present findings on this study during Session 732, Abstract 80 on Saturday, Dec. 5 at 11:45 a.m. EST.

Credit: 
Dana-Farber Cancer Institute

Green energy transition: Early and steady wins the race

image: Assistant Professor Marta Victoria

Image: 
Ida Jensen, AU Photo

Researchers from Aarhus University have modelled the decarbonisation of the sector-coupled European energy system using very high-resolution data. The results are clear: To reach climate-neutrality by 2050 we need solar energy. And lots of it.

What's the cheapest, easiest way to honour the Paris Agreement goal of limiting global warming to 1.5 degrees Celsius? A clear and strong investment in wind and solar power. Starting now.

That's the message in a new scientific paper published in Nature Communications, in which Aarhus University researchers have modelled the decarbonisation of the sector-coupled European energy system using uninterrupted, high-resolution hourly data for every European country and the network interconnections between them.

Using the university's supercomputer, PRIME, the researchers have modelled how to modify the production of electricity, heating and transport-sector energy to make sure that there is enough of everything for every possible hour, even in the coldest weeks of winter.

"We ask the question of which energy strategy to employ in order to reach the 2050 goal. We have a 'carbon budget' - a maximum amount of CO2 we can emit - and how do we make sure, that by 2050 we reach climate-neutrality in the cheapest and most feasible way?" asks Assistant Professor Marta Victoria, an expert in photovoltaics (PV) and energy systems at the Department of Engineering, Aarhus University.

She continues:

"There are two scenarios: Early and steady or late and rapid. Our model clearly shows that the cost optimised solution is to act now. To be ambitious in the short term. And we find solar energy and onshore and offshore wind to be the cost optimised cornerstone in a fully decarbonised 2050 energy system."

Marta Victoria highlights that both paths require a massive deployment of wind and solar PV during the next 30 years.

The required installation rates are similar to historical maxima, making the transition challenging yet possible.

"It's not an easy task," she emphasises:

"In some years, we will have to install more than a 100 Gigawatts of solar PV and wind power, and to achieve full decarbonisation the CO2 prices will have to be a lot higher than today."

The paper illustrates a slowly rising CO2 price that peaks at around 400 €/ton in the year 2050 - around 20 times higher than today's prices - which, Marta Victoria states, is needed in order to favour the renewable transition.
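
To give a feel for the kind of optimisation behind such statements, here is a deliberately tiny, hypothetical cost-minimisation using SciPy's linprog - not the authors' model, which runs at hourly resolution with many technologies, countries and sectors; every number below is invented. The core idea is to minimise total system cost while meeting demand and staying within a cumulative carbon budget.

```python
# Toy cost-optimisation under a carbon budget (invented numbers, two periods,
# two technologies) -- a sketch of the principle, not the authors' model.
from scipy.optimize import linprog

demand = [100.0, 100.0]                        # electricity demand per period (TWh)
cost = {"gas": 40.0, "renewables": 60.0}       # relative cost per TWh (assumed)
co2 = {"gas": 0.4, "renewables": 0.0}          # MtCO2 per TWh (assumed)
budget = 50.0                                  # cumulative carbon budget (MtCO2)

# Decision variables: [gas_1, ren_1, gas_2, ren_2] generation in TWh
c = [cost["gas"], cost["renewables"], cost["gas"], cost["renewables"]]

A_eq = [[1, 1, 0, 0],                          # supply meets demand, period 1
        [0, 0, 1, 1]]                          # supply meets demand, period 2
b_eq = demand

A_ub = [[co2["gas"], 0, co2["gas"], 0]]        # total emissions <= carbon budget
b_ub = [budget]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
gas_1, ren_1, gas_2, ren_2 = res.x
print(f"Period 1: gas {gas_1:.0f} TWh, renewables {ren_1:.0f} TWh")
print(f"Period 2: gas {gas_2:.0f} TWh, renewables {ren_2:.0f} TWh")
print(f"Minimum total cost (relative units): {res.fun:.0f}")
```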

The model also includes hydro power and - to account for so-called 'nightmare weeks' - a small amount of gas-based electricity and heating production plus energy storage facilities:

"District heating systems are efficient for very cold and critical periods where electricity demand and heating demand is high, but wind and solar energy production is low. Large hot water tanks discharge during those weeks. This way we make sure, that the future energy systems works for every possible scenario."

Credit: 
Aarhus University

Baby's first breath triggers life-saving changes in the brain

image: Douglas A. Bayliss, PhD, and colleagues have discovered a signaling system within the brainstem that activates almost immediately at birth to support early breathing.

Image: 
Dan Addison | UVA Communications

There are few moments in life as precious, as critical and as celebrated as baby's first breath. New research from the University of Virginia School of Medicine sheds light on the lifelong changes in breathing systems that occur precisely with that first breath - and may offer important insights into Sudden Infant Death Syndrome (SIDS).

A team of researchers led by UVA's Yingtang Shi, MD; Patrice Guyenet, PhD; and Douglas A. Bayliss, PhD, has discovered a signaling system within the brainstem that activates almost immediately at birth to support early breathing. That first gasp that every parent cherishes appears to trigger this support system.

"Birth is traumatic for the newborn, as the baby has to independently take control over various important body functions, including breathing," said Bayliss, chairman of UVA's Department of Pharmacology. "We think that activation of this support system at birth provides an extra safety factor for this critical period."

Regulating Baby's Breathing

The new findings help researchers understand how breathing transitions from a fragile state susceptible to brain-damaging and potentially deadly pauses early in development to a stable and robust physiological system that flawlessly supplies the body with oxygen for the rest of our lives. Before a baby is born, breathing is not required and breathing movements occur only intermittently, so the transition at birth can be a highly vulnerable time.

Bayliss and his colleagues at UVA, working with researchers at the University of Alberta and Harvard University, found that, in mice, a specific gene is turned on immediately at birth in a cluster of neurons that selectively regulates breathing. This gene produces a peptide neurotransmitter - a chain of amino acids that relays information between neurons. This transmitter, called PACAP, starts to be released by these neurons just as the baby emerges into the world.

The scientists determined that suppressing the peptide in mice caused breathing problems and increased the frequency of apneas, which are potentially dangerous pauses in breathing. These apneas further increased with changes in environmental temperature. These observations suggest that problems with the neuropeptide system may contribute to SIDS.

Understanding SIDS

SIDS, also known as crib death, is the sudden unexplained death of a child less than a year of age. It is the leading cause of infant mortality in Western countries. SIDS is attributed to a combination of genetic and environmental factors, including temperature. UVA's new research suggests that problems with the neuropeptide system may increase babies' susceptibility to SIDS and other breathing problems.

PACAP is the first signaling molecule shown to be massively and specifically turned on at birth by the breathing network, and it has been linked genetically to SIDS in babies. The causes of SIDS likely are complex, and there may be other important factors to discover, the researchers note.

"These finding raise the interesting possibility that additional birth-related changes may occur in the control systems for breathing and other critical functions," Bayliss said. "We wonder if this could be a general design principle in which fail-safe support systems are activated at this key transition period, and that understanding those may help us better treat disorders of the newborn."

Credit: 
University of Virginia Health System