Tech

Photonics researchers report breakthrough in miniaturizing light-based chips

image: A schematic drawing shows an electro-optical modulator developed in the lab of Qiang Lin, professor of electrical and computer engineering. The smallest such component yet developed, it takes advantage of lithium niobate, a "workhorse" material used by researchers to create advanced photonic integrated circuits.

Image: 
University of Rochester illustration / Michael Osadciw

Photonic integrated circuits, which use light instead of electricity for computing and signal processing, promise greater speed, increased bandwidth, and better energy efficiency than traditional electronic circuits.

But they're not yet small enough to compete in computing and other applications where electric circuits continue to reign.

Electrical engineers at the University of Rochester believe they've taken a major step in addressing the problem. Using a material widely adopted by photonics researchers, the Rochester team has created the smallest electro-optical modulator yet. The modulator is a key component of a photonics-based chip, controlling how light moves through its circuits.

In Nature Communications, the lab of Qiang Lin, professor of electrical and computer engineering, describes using a thin film of lithium niobate (LN) bonded on a silicon dioxide layer to create not only the smallest LN modulator yet, but also one that operates at high speed and is energy efficient.

This "paves a crucial foundation for realizing large-scale LN photonic integrated circuits that are of immense importance for broad applications in data communication, microwave photonics, and quantum photonics," writes lead author Mingxiao Li, a graduate student in Lin's lab.

Because of its outstanding electro-optic and nonlinear optic properties, lithium niobate has "become a workhorse material system for photonics research and development," Lin says. "However, current LN photonic devices, made upon either bulk crystal or thin-film platforms, require large dimensions and are difficult to scale down in size, which limits the modulation efficiency, energy consumption, and the degree of circuit integration. A major challenge lies in making high-quality nanoscopic photonic structures with high precision."

The modulator project builds upon the lab's previous use of lithium niobate to create a photonic nanocavity--another key component in photonic chips. At only about a micron in size, the nanocavity can tune wavelengths using only two to three photons at room temperature--"the first time we know of that even two or three photons have been manipulated in this way at room temperatures," Lin says. That device was described in a paper in Optica.

The modulator could be used in conjunction with a nanocavity in creating a photonic chip at the nanoscale.

Credit: 
University of Rochester

Call of the wild: Individual dolphin calls used to estimate population size and movement

video: A spectrogram video of whistles recorded from the bottlenose dolphin population of Walvis Bay

Image: 
Emma Longden, Namibian Dolphin Project

An international team of scientists has succeeded in using the signature whistles of individual bottlenose dolphins to estimate the size of the population and track their movement.

The research, led by Stellenbosch University and the University of Plymouth, marks the first time that acoustic monitoring has been used in place of photographs to generate abundance estimates of dolphin populations.

Writing in the Journal of Mammalogy, researchers say they are excited by the method's positive results, as the number of dolphins estimated almost exactly matched the estimate obtained through the more traditional photographic mark-recapture method.

They are now working to refine the technique, in the hope it can be used to track other species - with a current focus on endangered species such as humpback dolphins.

Quicker information processing and advances in statistical analysis mean that, in the future, automated detection of individually distinctive calls could be possible. This could generate important information on individual animals and would be particularly useful for small, threatened populations where every individual counts.

"The capture-recapture of individually distinctive signature whistles has not been attempted before," says the paper's senior author Dr Tess Gridley, Co-Director of Sea Search and the Namibian Dolphin Project and a postdoctoral fellow in the Department of Botany and Zoology at SU. "The dolphins use these sounds throughout life and each has its own unique whistle. Therefore, by recording signature whistles over time and in different places we can calculate where animals are moving to and how many animals there are in a population."

Working with Dr Simon Elwen of Stellenbosch University, the Namibian Dolphin Project has been researching Namibia's resident bottlenose dolphins for the past 12 years, and built up a catalogue of more than 55 signature whistles dating back to 2009.

This particular study was led by Emma Longden, who began the project during her BSc (Hons) Marine Biology degree at the University of Plymouth. As an undergraduate, Emma completed an internship with the Namibian Dolphin Project for a month in 2016, and returned in 2018 to complete work on the mark-recapture project.

She analysed more than 4000 hours of acoustic data from four hydrophones positioned along the coast south and north of Walvis Bay, Namibia, during the first six months of 2016.

All in all, they identified 204 acoustic encounters, 50 of which contained signature whistle types. From these encounters, 53 signature whistle types were identified; 40 were in an existing catalogue developed in 2014 for the Walvis Bay bottlenose dolphin population, and 13 were newly identified. Of the 53 signature whistle types, 43% were captured only once, whereas the majority (57%) were captured two or more times.
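To make the capture-recapture logic concrete, here is a minimal sketch of the classic Lincoln-Petersen (Chapman) estimator, the textbook two-occasion version of the framework the study applies to whistle encounter histories. The whistle counts in the example are hypothetical, not figures from the paper.

```python
# Illustrative only: a minimal Lincoln-Petersen (Chapman) abundance estimate.
# The counts below are made-up placeholders, not data from the study.

def chapman_estimate(n1, n2, m):
    """Abundance estimate from two sampling occasions.

    n1: whistle types detected on the first occasion
    n2: whistle types detected on the second occasion
    m:  whistle types detected on both occasions ("recaptures")
    """
    n_hat = (n1 + 1) * (n2 + 1) / (m + 1) - 1            # Chapman's bias-corrected estimator
    var = ((n1 + 1) * (n2 + 1) * (n1 - m) * (n2 - m)
           / ((m + 1) ** 2 * (m + 2)))                    # approximate variance
    return n_hat, var ** 0.5

if __name__ == "__main__":
    # e.g. 30 whistle types in the first survey period, 28 in the second, 18 shared
    estimate, se = chapman_estimate(30, 28, 18)
    print(f"Estimated number of whistle types (proxy for dolphins): {estimate:.0f} +/- {se:.0f}")
```

The published analysis fits fuller capture-recapture models to the acoustic encounter histories, but the underlying logic is the same: because each dolphin carries its own signature whistle, the proportion of whistle types "recaptured" on later occasions constrains how many animals the population can plausibly contain.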

"One of the great things about bioacoustics is that you can leave a hydrophone in the water for weeks at a time and collect so much data without interfering with the lives of the animals you are studying," says Emma, whose work on the project was also supervised by Dr Clare Embling, Associate Professor of Marine Ecology at the University of Plymouth.

Dr Embling added: "This work is incredibly important as it allows us to track and count the number of dolphins in small vulnerable populations. It builds on our previous research looking at the impacts of noise on marine organisms and monitoring vulnerable marine mammal populations. It also showcases the fantastic level of research that our marine biology students are able to achieve, and the opportunities available to them through our partnerships with conservation organisations such as the Namibia Dolphin Project and the Ocean Giants Trust."

Future research includes the work undertaken by PhD student Sasha Dines from Stellenbosch University to further refine the technique to better understand the population of endangered humpback dolphins in South Africa. Another PhD student, Jack Fearey from the University of Cape Town, is continuing to conduct research along the Namibian Coast.

Credit: 
University of Plymouth

Atlantic sturgeon in the king's pantry -- unique discovery in Baltic Sea wreck from 1495

image: The wooden barrel, with parts of the sturgeon (in orange)

Image: 
Brett Seymour

Researchers at Lund University in Sweden can now reveal what the Danish King Hans had planned to offer when laying claim to the Swedish throne in 1495: a two-metre-long Atlantic sturgeon. The well-preserved fish remains were found in a wreck on the bottom of the Baltic Sea last year, and species identification was made possible through DNA analysis.

At midsummer in 1495, the Danish King Hans was en route from Copenhagen to Kalmar, Sweden, on the royal flagship Gribshunden. Onboard were the most prestigious goods the Danish royal court could provide, fitting for such an important voyage: King Hans was going to meet Sten Sture the Elder (he hoped) to lay claim to the Swedish throne. It was important to demonstrate both power and grandeur.

However, when the ship was off Ronneby in Blekinge, which was Danish territory at the time, a fire broke out on board and Gribshunden sank. The King himself was not on board that night, but both crew and cargo sank with the ship to the sea floor, where it has lain ever since.

Thanks to the unique environment of the Baltic Sea - with oxygen-free seabeds, low salinity and an absence of shipworms - the wreck was particularly well preserved when it was discovered approximately fifty years ago, and has provided researchers with a unique insight into life on board a royal ship in the late Middle Ages. In addition, researchers now also know what was in the royal pantry - the wooden barrel discovered last year, with fish remains inside.

"It is a really thrilling discovery, as you do not ordinarily find fish in a barrel in this way. For me, as an osteologist, it has been very exciting to work with", says Stella Macheridis, researcher at the Department of Archaeology and Ancient History at Lund University.

When the remains were discovered, it was clear early on that they came from a sturgeon, thanks to its distinctive bony plates, the scutes. However, researchers were unsure which species it was. Until relatively recently, the sturgeon found in the Baltic Sea at the time was believed to be the European sturgeon. The DNA analysis, however, revealed it was the Atlantic variety with which King Hans planned on impressing the Swedes. Researchers have also been able to estimate the length of the sturgeon - two metres - as well as demonstrate how it was cut.

For Maria C Hansson, molecular biologist at Lund University, and the researcher who carried out the DNA analysis, the discovery is of major significance, particularly for her own research on the environment of the Baltic Sea.

"For me, this has been a glimpse of what the Baltic Sea looked like before we interfered with it. Now we know that the Atlantic sturgeon was presumably part of the ecosystem. I think there could be great potential in using underwater DNA in this way to be able to recreate what it looked like previously", she says.

The Atlantic sturgeon is today an endangered species and virtually extinct in the region.

The discovery on Gribshunden is unique in both the Scandinavian and European contexts - such well-preserved and old sturgeon remains have only been discovered a few times at underwater archaeological sites.

It is now possible, in a very specific way, to link the sturgeon to a royal environment - the discovery confirms the high status it had at the time. The fish was coveted for its roe, flesh and swim bladder - the latter could be used to produce a kind of glue (isinglass) that, among other things, was used to produce gold paint.

"The sturgeon in the King's pantry was a propaganda tool, as was the entire ship. Everything on that ship served a political function, which is another element that makes this discovery particularly interesting. It provides us with important information about this pivotal moment for nation-building in Europe, as politics, religion and economics - indeed, everything - was changing", says Brendan P. Foley, marine archaeologist at Lund University, and project coordinator for the excavations.

Gribshunden will become the subject of further archaeological excavations and scientific analyses in the coming years.

Credit: 
Lund University

A topography of extremes

image: False-color electron microscopic image of a microstructure (violet) contacted via gold tracks (yellow) after reopening the diamond anvil cell. Ruby spheres (red) are used to sense the pressure in the sample chamber via laser fluorescence spectroscopy. Debris particles are remnants of the pressure medium and pressure device.

Image: 
Toni Helm/HZDR

An international team of scientists from the Helmholtz-Zentrum Dresden-Rossendorf (HZDR), the Max Planck Institute for Chemical Physics of Solids, and colleagues from the USA and Switzerland has combined various extreme experimental conditions in a unique way, revealing exciting insights into the mysterious conducting properties of the crystalline metal CeRhIn5. In the journal Nature Communications (DOI: 10.1038/s41467-020-17274-6), they report on their exploration of previously uncharted regions of the phase diagram of this metal, which is considered a promising model system for understanding unconventional superconductors.

"First, we apply a thin layer of gold to a microscopically small single crystal. Then we use an ion beam to carve out tiny microstructures. At the ends of these structures, we attach ultra-thin platinum tapes to measure resistance along different directions under extremely high pressures, which we generate with a diamond anvil pressure cell. In addition, we apply very powerful magnetic fields to the sample at temperatures near absolute zero."

To the average person, this may sound like an overzealous physicist's whimsical fancy, but in fact, it is an actual description of the experimental work conducted by Dr. Toni Helm from HZDR's High Magnetic Field Laboratory (HLD) and his colleagues from Tallahassee, Los Alamos, Lausanne and Dresden. Well, at least in part, because this description only hints at the many challenges involved in combining such extremes concurrently. This great effort is, of course, not an end in itself: the researchers are trying to get to the bottom of some fundamental questions of solid state physics.

The sample studied is cerium-rhodium-indium-five (CeRhIn5), a metal with surprising properties that are not yet fully understood. Scientists describe it as an unconventional electrical conductor with extremely heavy charge carriers, in which, under certain conditions, electrical current can flow without losses. It is assumed that the key to this superconductivity lies in the metal's magnetic properties. The central issues investigated by physicists working with such correlated electron systems include: How do heavy electrons organize collectively? How can this cause magnetism and superconductivity? And what is the relationship between these physical phenomena?

An expedition through the phase diagram

The physicists are particularly interested in the metal's phase diagram, a kind of map whose coordinates are pressure, magnetic field strength, and temperature. If the map is to be meaningful, the scientists have to uncover as many locations as possible in this system of coordinates, just like a cartographer exploring unknown territory. In fact, the emerging diagram is not unlike the terrain of a landscape.

As they reduce temperature to almost four degrees above absolute zero, the physicists observe magnetic order in the metal sample. At this point, they have a number of options: They can cool the sample down even further and expose it to high pressures, forcing a transition into the superconducting state. If, on the other hand, they solely increase the external magnetic field to 600,000 times the strength of the earth's magnetic field, the magnetic order is also suppressed; however, the material enters a state called "electronically nematic".

This term is borrowed from the physics of liquid crystals, where it describes a certain spatial orientation of molecules with a long-range order over larger areas. The scientists assume that the electronically nematic state is closely linked to the phenomenon of unconventional superconductivity. The experimental environment at HLD provides optimum conditions for such a complex measurement project. The large magnets generate relatively long-lasting pulses and offer sufficient space for complex measurement methods under extreme conditions.

Experiments at the limit afford a glimpse of the future

The experiments have a few additional special characteristics. For example, working with high-pulsed magnetic fields creates eddy currents in the metallic parts of the experimental setup, which can generate unwanted heat. The scientists have therefore manufactured the central components from a special plastic material that suppresses this effect and functions reliably near absolute zero. Through microfabrication with focused ion beams, they produce a sample geometry that guarantees a high-quality measurement signal.

"Microstructuring will become much more important in future experiments. That's why we brought this technology into the laboratory right away," says Toni Helm, adding: "So we now have ways to access and gradually penetrate into dimensions where quantum mechanical effects play a major role." He is also certain that the know-how he and his team have acquired will contribute to research on high-temperature superconductors or novel quantum technologies.

Credit: 
Helmholtz-Zentrum Dresden-Rossendorf

Microbes working together multiply biomass conversion possibilities

image: Fungal biofilm growing on an oxygen permeable, helically coiled tubular membrane in an otherwise anaerobic bioreactor. The biofilm is removed from the reactor at the end of a fermentation run.

Image: 
M. Studer (BFH)

With the race for renewable energy sources in full swing, plants offer one of the most promising candidates for replacing crude oil. Lignocellulose in particular - biomass from non-edible plants like grass, leaves, and wood that don't compete with food crops - is abundant and renewable and offers a great alternative source to petroleum for a whole range of chemicals.

In order to extract useful chemicals from it, lignocellulose is first pretreated to "break it up" and make it easier to process further. Then it is exposed to enzymes that solubilize cellulose, which is a chain of linked-up sugars (glucose). This step can be done by adding to the pretreated lignocellulose a microorganism that naturally produces the necessary cellulose-cleaving enzymes, e.g. a fungus.

The enzymes "crack" the cellulose and turn it into its individual sugars, which can be further processed to produce a key chemical: lactic acid. This second step is also accomplished with a microorganism, a bacterium that "eats" the sugars and produces lactic acid when there's no oxygen around.

In the final step of this microbial assembly line, the lactic acid can then be processed to make a whole host of useful chemicals.

A team of scientists from the Bern University of Applied Sciences (BFH), the University of Cambridge, and EPFL has made this assembly chain possible in a single setup and demonstrated that the conversion can be made more versatile and modular. By easily swapping out the microorganisms in the final, lactic-acid-processing step, they can produce a whole range of useful chemicals.

The breakthrough study is published in Science, and was carried out by Robert Shahab, an EPFL PhD student in Professor Jeremy Luterbacher's lab, while working at the lab of Professor Michael Studer at the BFH, who led the study.

The researchers present what they refer to as a "lactate platform", which is essentially a spatially segregated bioreactor that allows multiple different microorganisms to co-exist, each performing one of the three steps of lignocellulose processing.

The platform consists of a tubular membrane that lets a defined amount of oxygen pass through it. The fungus is grown on the tube's surface, where it consumes all the oxygen that passes through the membrane and provides the enzymes that break cellulose up into sugars. Farther away from the membrane, and therefore in an atmosphere without oxygen, grow the bacteria that "eat" the sugars and turn them into lactic acid.

But the innovation that Shahab made was in the last step. By using different lactic acid-fermenting microorganisms, he was able to produce different useful chemicals. One example was butyric acid, which can be used in bioplastics, while Luterbacher's lab recently showed that it can even be turned into a jet fuel.

The work demonstrates the benefits of mixed microbial cultures in lignocellulose biomass processing: modularity and the ability to convert complex substrates to valuable platform chemicals.

"The results achieved with the lactate platform nicely show the advantages of artificial microbial consortia to form new products from lignocellulose," says Michael Studer. "The creation of niches in otherwise homogeneous bioreactors is a valuable tool to co-cultivate different microorganisms."

"Fermenting lignocellulose to a lot of different products was a significant amount of work but it was important to show how versatile the lactate platform is," says Robert Shahab. "To see the formation of lactate and the conversion into target products was a great experience as it showed that the concept of the lactate platform worked in practice."

Jeremy Luterbacher adds: "The ultimate goal is to rebuild a green manufacturing sector to replace one that produces many products from crude oil. A method that introduces flexibility and modularity is an important step in that direction."

Credit: 
Ecole Polytechnique Fédérale de Lausanne

Binding sites for protein-making machinery

Genome sequencing of bacteria, plants and even humans has become a routine process, yet the genome still poses many unanswered questions. One of these concerns the sites on messenger RNAs (mRNAs) that ribosomes - the cellular structures responsible for protein synthesis - bind to in order to translate genetic information. Currently, the function of these ribosome binding sites is only partly understood.

An interdisciplinary team of researchers from the Department of Biosystems Science and Engineering (D-BSSE) at ETH Zurich in Basel has now developed a new approach that, for the first time, makes it possible to obtain detailed information on an incredibly large number of these binding sites in bacteria. The new approach combines experimental methods of synthetic biology with machine learning.

Precise control over protein production

Ribosome binding sites are short RNA sequences upstream of a gene's coding sequence. In the past, biotechnologists also developed synthetic binding sites. The ribosomes bind extremely well to some of these, and less well to others. The tighter ribosomes are able to bind to a specific variant, the more often they translate the respective gene and the greater the amount of the corresponding protein they produce.

Biotechnologists who use bacteria to produce chemicals of interest such as pharmaceuticals can influence the amounts of the proteins involved in the cell through their choice of ribosome binding sites. "Exerting this kind of control is particularly important and helpful when incorporating complex gene networks comprising multiple proteins at the same time. The key here is to establish an optimal balance amongst the different proteins," says Markus Jeschek, senior scientist and group leader at D-BSSE.

An experiment with 300,000 sequences

Together with ETH professors Yaakov Benenson and Karsten Borgwardt and members of the respective groups, Jeschek has now developed a method to determine how tightly ribosomes bind to hundreds of thousands or more RNA sequences in a single experiment. Previously this was only possible for a few hundred sequences.

The ETH researchers' approach harnesses deep sequencing, the latest technology used to sequence DNA and RNA. In the laboratory, the scientists produced over 300,000 different synthetic ribosome binding sites and fused each of these with a gene for an enzyme that modifies a piece of target DNA. They introduced the resulting gene constructs into bacteria in order to see how tightly the ribosomes bind to RNA in each individual case. The better the function of the binding site, the more enzyme is produced in the cell and the more rapidly the target DNA will be changed. At the end of the experiment, the researchers can read this change together with the binding site's RNA sequence using deep sequencing.

Universally applicable approach

Since 300,000 represents only a small fraction of the many billions of theoretically possible ribosome binding sites, the scientists analysed their data using machine learning algorithms. "These algorithms can detect complex patterns in large datasets. With their help, we can predict how tightly ribosomes will bind to a specific RNA sequence," says Karsten Borgwardt, Professor of Data Mining. The ETH researchers have made this prediction model freely available as software so that other scientists can make use of it, and they will soon be introducing an easy-to-use online service as well.
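The ETH model itself is released as free software; the snippet below is only a hedged sketch of the general idea it embodies: encode each ribosome binding site sequence numerically, fit a regression model on the measured strengths, and predict strengths for unseen sequences. The sequences, labels, and the choice of a random-forest model here are placeholders, not details from the study.

```python
# A minimal sketch (not the ETH group's released software) of sequence-to-strength regression.
# Sequences and "strength" labels below are randomly generated placeholders, so the
# held-out score will be near zero; real deep-sequencing measurements are needed.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

BASES = "ACGU"

def one_hot(seq: str) -> np.ndarray:
    """Encode an RNA sequence as a flat one-hot vector (4 features per position)."""
    m = np.zeros((len(seq), 4))
    for i, b in enumerate(seq):
        m[i, BASES.index(b)] = 1.0
    return m.ravel()

rng = np.random.default_rng(0)
n_sites, length = 5000, 20                      # toy stand-ins for the ~300,000 measured sites
seqs = ["".join(rng.choice(list(BASES), length)) for _ in range(n_sites)]
strength = rng.random(n_sites)                  # placeholder for the measured translation strength

X = np.array([one_hot(s) for s in seqs])
X_train, X_test, y_train, y_test = train_test_split(X, strength, random_state=0)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
print("R^2 on held-out sites:", round(model.score(X_test, y_test), 3))
```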

The approach developed by the scientists is universally applicable, Benenson and Jeschek emphasise, and the team is planning to extend it to other organisms including human cells. "We're also keen to find out how genetic information influences the amount of protein that is produced in a human cell," Benenson says. "This could be particularly useful for genetic diseases."

Credit: 
ETH Zurich

How to make AI trustworthy

One of the biggest impediments to the adoption of new technologies is a lack of trust in AI.

Now, a new tool developed by USC Viterbi Engineering researchers automatically generates indicators of whether the data and predictions produced by AI algorithms are trustworthy. Their research paper, "There Is Hope After All: Quantifying Opinion and Trustworthiness in Neural Networks" by Mingxi Cheng, Shahin Nazarian and Paul Bogdan of the USC Cyber Physical Systems Group, was featured in Frontiers in Artificial Intelligence.

Neural networks are a type of artificial intelligence that are modeled after the brain and generate predictions. But can the predictions these neural networks generate be trusted? One of the key barriers to the adoption of self-driving cars is that the vehicles need to act as independent decision-makers on autopilot and quickly decipher and recognize objects on the road--whether an object is a speed bump, an inanimate object, a pet or a child--and make decisions on how to act if another vehicle is swerving towards them. Should the car hit the oncoming vehicle or swerve and hit what it perceives to be an inanimate object or a child? Can we trust the computer software within the vehicles to make sound decisions within fractions of a second--especially when conflicting information is coming from different sensing modalities such as computer vision from cameras or data from lidar? Knowing which systems to trust and which sensing system is most accurate would help determine what decisions the autopilot should make.

Lead author Mingxi Cheng was driven to work on this project by this thought: "Even humans can be indecisive in certain decision-making scenarios. In cases involving conflicting information, why can't machines tell us when they don't know?"

A tool the authors created, named DeepTrust, can "quantify the amount of uncertainty," says Paul Bogdan, an associate professor in the Ming Hsieh Department of Electrical and Computer Engineering and corresponding author, "and thus, if human intervention is necessary."

Developing this tool took the USC research team almost two years employing what is known as subjective logic to assess the architecture of the neural networks. On one of their test cases, the polls from the 2016 Presidential election, DeepTrust found that the prediction pointing towards Clinton winning had a greater margin for error.
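Subjective logic represents a proposition as an "opinion" with explicit belief, disbelief, and uncertainty components. The sketch below shows the standard binomial-opinion arithmetic from that formalism (Jøsang's formulation); it is illustrative only and is not the USC DeepTrust implementation, whose internals are not described in this article.

```python
# Minimal sketch of a subjective-logic "opinion" -- the textbook formalism DeepTrust builds on.
# Not the USC DeepTrust code; evidence counts below are hypothetical.
from dataclasses import dataclass

@dataclass
class Opinion:
    belief: float       # evidence supporting the proposition
    disbelief: float    # evidence against it
    uncertainty: float  # lack of evidence; belief + disbelief + uncertainty == 1
    base_rate: float    # prior probability used in the absence of evidence

    def projected_probability(self) -> float:
        """Expected probability: belief plus the base rate's share of the uncertainty."""
        return self.belief + self.base_rate * self.uncertainty

def opinion_from_counts(positive: int, negative: int, base_rate: float = 0.5, W: float = 2.0) -> Opinion:
    """Map positive/negative evidence counts to an opinion (W is the non-informative prior weight)."""
    total = positive + negative + W
    return Opinion(positive / total, negative / total, W / total, base_rate)

# Two hypothetical predictions with the same 9:1 evidence ratio but different amounts of evidence:
strong = opinion_from_counts(positive=90, negative=10)
weak = opinion_from_counts(positive=9, negative=1)
print(strong.projected_probability(), strong.uncertainty)   # similar probability, low uncertainty
print(weak.projected_probability(), weak.uncertainty)       # similar probability, much higher uncertainty
```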

The other significance of this study is that it provides insights on how to test the reliability of AI algorithms that are normally trained on thousands to millions of data points. It would be incredibly time-consuming to check whether each one of the data points that inform AI predictions was labeled accurately. More critical, say the researchers, is that the architecture of these neural network systems itself has greater accuracy. Bogdan notes that if computer scientists want to maximize accuracy and trust simultaneously, this work could also serve as a guidepost for how much "noise" can be in testing samples.

The researchers believe this model is the first of its kind. Says Bogdan, "To our knowledge, there is no trust quantification model or tool for deep learning, artificial intelligence and machine learning. This is the first approach and opens new research directions." He adds that this tool has the potential to make "artificial intelligence aware and adaptive."

Credit: 
University of Southern California

A chiral surprise in the rainforest

image: Recently, scientists discovered a reversed ratio of the two chiral forms of alpha-pinene in the Amazon rainforest. Nora Zannoni and her colleagues consider nests of termites as a possible emission source.

Image: 
Nora Zannoni, MPI for Chemistry

Forests such as the Amazon rainforest emit huge amounts of biogenic volatile organic compounds (BVOC) into the atmosphere. These compounds impact the physical and chemical properties of the atmosphere and also our climate. The molecules react rapidly with ambient OH radicals and ozone, thereby influencing the oxidation capacity of the atmosphere for pollutants such as carbon monoxide and greenhouse gases such as methane. Furthermore, BVOC are precursors to secondary organic aerosols, which affect the Earth's radiative budget.

Many BVOCs such as α-pinene are chiral. This means that they exist in two non-superimposable mirror-image forms, just like our left and right hands. Scientists speak of enantiomers, or plus and minus forms. Their physical properties, however, such as boiling point, mass and reaction rate with atmospheric oxidizing agents like OH and ozone, are identical.

Despite the chemical similarity of these chiral pairs, insects and plants can distinguish enantiomeric forms of pheromones and phytochemicals, although little attention has been paid to the mixing ratio of the two separate forms in forests. Previous measurements reported minus α-pinene to be the dominant chiral form in the tropical forest. Scientists from the Max Planck Institute for Chemistry, the Johannes Gutenberg University Mainz and from Brazil have now made a surprising discovery: from the 325-meter-high measuring tower in the Amazon rainforest, they were able to show that the ratio of the α-pinene enantiomers varies vertically by a factor of ten. The team led by Max Planck researcher Nora Zannoni was also able to demonstrate that the concentrations are altitude-dependent and vary with the time of day and between the wet and dry seasons.

While plus α-pinene dominates at 40 meters at all times and at 80 meters during the night, the minus form predominates at 80 meters during the day and at all greater heights at all times. The team also observed that the minus α-pinene concentration depends on temperature at 80 meters, while plus α-pinene does not. "The photosynthetic activity of the vegetation depends on temperature and stomatal opening. It thus drives the emissions of minus α-pinene, demonstrating that leaves are the main source of emission of this isomer, and that the two isomers are released from leaves through different pathways," says Zannoni, who is first author of a study recently published in the science magazine "Communications Earth & Environment".

Termites as unknown source of plus α-pinene in the canopy?

During the dry season, the chiral ratio of the two forms reverses at 80 meters. "This indicates a strong, uncharacterized source of plus α-pinene in the canopy," says Jonathan Williams, group leader at the institute in Mainz and last author of the study. Since the researchers could rule out atmospheric sinks such as the chiral-selective degradation of pinene by OH radicals and ozone or deposition onto aerosols, as well as the influence of wind direction and sunlight, they instead suspect that insect stresses such as herbivore feeding and termite emissions are responsible for the higher plus α-pinene values. In order to test a possible impact of insects, the researchers conducted additional measurements above termite nests, which confirmed that such emissions can overturn the ambient chiral ratio of α-pinene. As termite populations are expected to increase significantly in the future with continued deforestation and climate warming, their influence needs to be considered in forest emission models and forest signaling.

"We also know that plants can release large amounts of plus α-pinene when injured or eaten," Williams adds. This is supported by measurements of volatile compounds associated with leaf wounding that even revealed when the herbivores were most active. The atmospheric chemists Zannoni and Williams conclude that they need to rethink how canopy emissions of volatile organic compounds are simulated, and take the whole ecosystem into account.

Credit: 
Max Planck Institute for Chemistry

Methane: emissions increase and it's not good news

It is the second most important greenhouse gas, with a global warming potential larger than that of CO2. An international study carried out in the framework of the Global Carbon Project provides updated information and data on its increasing concentrations in the atmosphere. The focus is on several sectors, including agriculture, waste and fossil fuels. Among the authors are CMCC researchers Simona Castaldi and Sergio Noce.

Not only CO2. Carbon dioxide, of course, plays a key role in global warming, but among all the greenhouse gases methane deserves special attention because of its larger global warming potential (28 times higher than carbon dioxide on a 100-year time horizon). Moreover, once in the atmosphere, carbon dioxide can continue to affect climate for thousands of years. Methane, by contrast, is mostly removed from the atmosphere by chemical reactions, persisting for about 12 years. Thus, although methane is a potent greenhouse gas, its effect is relatively short-lived and any measures to reduce methane emissions can have a very rapid positive effect.
Methane is therefore becoming an increasingly important component of realistic pathways to mitigate climate change.
After a period of stabilization in the early 2000s, methane concentrations have been rising again since 2007. The increase in methane concentrations follows the trends of future scenarios that do not comply with the objectives of the Paris Agreement.

This is the trend underlined in a study recently published in Earth System Science Data (among the authors, the CMCC researchers Simona Castaldi and Sergio Noce from IAFES – Impacts on Agriculture, Forests and Ecosystem Services Division), complemented by an article in Environmental Research Letters. The study was conducted by an international research team and led by the Laboratoire des Sciences du Climat et de l’Environnement (LSCE, CEA-CNRS-UVSQ) in France, under the umbrella of the Global Carbon Project that initiated the work. It represents an update of the global methane sources and sinks to the atmosphere for the period 2000-2017. This budget shows that global methane emissions have increased by 9% (about 50 million tons) between the 2000-2006 period and 2017. Anthropogenic emissions appear to be the main contributors to this increase, with roughly equal shares from the fossil fuel sector and the agriculture and waste sector.
“We know well”, commented CMCC researchers Simona Castaldi and Sergio Noce, “that carbon dioxide is the major driver of climate change, but methane undoubtedly has an important role in this process. This study recently published in ESSD is the result of the great effort of an international research team of more than 90 co-authors; it represents an update of research previously published in 2016, summing up our current knowledge on methane emissions, their trends and evolution, while combining the knowledge of more than 70 research centers all around the world. Each researcher contributed according to her/his own expertise: at the CMCC we dealt with an estimate of CH4 emissions from termites at the global scale (CH4 is released during the anaerobic decomposition of plant biomass in their gut).”

A likely major driver of the recent rapid rise in global CH4 concentrations is an increase of emissions mostly from agriculture and waste management. Anthropogenic emissions are shared as follows between the main sources of methane: 30% from enteric fermentation and manure management; 22% from oil and gas production and use; 18% from handling solid and liquid wastes; 11% from coal extraction; 8% from rice cultivation; 8% from biomass and biofuel burning. The rest is attributed to transport (e.g. road transport) and industry (see the short consistency check below).
64% of global methane emissions originate from the Tropics, 32% from the Northern mid-latitudes and only 4% from the Northern high latitudes.
Methane emissions from boreal regions have therefore not increased significantly, which means that the high climate sensitivity of boreal regions does not (yet) translate into a large increase in methane emissions.
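Two quick consistency checks, using only the figures quoted in this article (the results are therefore approximate and are not numbers from the study itself):

```latex
% Baseline implied by "a 9% increase of about 50 million tons":
\[
  \text{baseline global emissions} \;\approx\; \frac{50\ \text{Mt CH}_4}{0.09}
  \;\approx\; 5.6 \times 10^{2}\ \text{Mt CH}_4\ \text{yr}^{-1}\ \text{(2000-2006 average)}
\]
% The listed anthropogenic source shares nearly close the budget:
\[
  30\% + 22\% + 18\% + 11\% + 8\% + 8\% \;=\; 97\%,
  \qquad \text{leaving} \approx 3\%\ \text{for transport and industry}
\]
```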

Increasing emissions in Africa, Asia and North America
The three main regions contributing to this increase in methane emissions are likely Africa, China and the rest of Asia, each contributing 10-15 million tons of CH4. North America likely contributes another 5-7 million tons, including 4-5 million tons from the USA.
In Africa and Asia (except China), the agriculture and waste sector contributes the most, followed by the fossil fuel sector. The opposite holds for China and North America, where the increase in the fossil fuel sector is larger than that in the agriculture and waste sector.

Decreasing emissions in Europe
Europe seems to be the only region where emissions have decreased: by between 2 and 4 million tons, depending on the approach used for the estimation. This decrease is mainly related to the agriculture and waste sector.

To meet the objectives of the Paris Agreement, not only CO2 emissions but also methane emissions need to be reduced.
Despite remaining uncertainties in methane sources and sinks, the recent increase in methane concentrations points to a dominant anthropogenic contribution.
Although methane is a potent greenhouse gas, its effect is relatively short-lived, as it remains in the atmosphere for about a decade, so reducing methane emissions would have a rapid positive effect on climate.
Methane therefore might offer growing opportunities for climate change mitigation while providing rapid climate benefits and economic, health and agricultural co-benefits.

Sources:
The LSCE press release, based on the following publications:

Saunois et al. (2020) The Global Methane Budget 2000-2017. Earth System Science Data. https://doi.org/10.5194/essd-12-1561-2020

Jackson et al. (2020). Increasing anthropogenic methane emissions arise equally from agricultural and fossil fuel sources. Environmental Research Letters. https://doi.org/10.1088/1748-9326/ab9ed2

Access to the data:
Data for the global methane budget are available from the Global Carbon Atlas, with budgets by regions and sectors. For the release of the global methane budget, the Global Carbon Atlas includes a new design and new applications related to the Global Carbon Project: CO2 emissions for 343 cities worldwide, and carbon cycle and natural CO2 emissions from rivers and lakes.

http://www.globalcarbonatlas.org/en/content/welcome-carbon-atlas

Data and figures:
http://www.globalcarbonproject.org/methanebudget

http://www.globalcarbonatlas.org

https://www.icos-cp.eu/GCP-CH4/2019

Credit: 
CMCC Foundation - Euro-Mediterranean Center on Climate Change

FEFU scientists are paving the way for future tiny electronics and gadgets

image: Intel Core i5 9400F

Image: 
Christian Wiediger, @christianw; Unsplash

Scientists at the School of Natural Sciences of Far Eastern Federal University (SNS FEFU), together with colleagues from Russia, South Korea, and Australia, propose a groundbreaking approach to managing the spin-electronic properties and functionality of thin-film magnetic nanosystems. The method is important for the development of a new generation of tiny electronics (spin-orbitronics) and superfast, high-capacity computer memory. A related article appears in NPG Asia Materials.

Scientists from the SNS FEFU Laboratory for Thin Film Technologies propose to control the functionality of a magnetic nanosystem through the surface roughness of a magnetic film sandwiched between heavy metal and a capping layer.
By modulating the roughness amplitude of the bottom and top surfaces (interfaces) of the magnetic film in the range of less than a nanometer, which is about the size of atoms, the researchers have maximized useful spin-electronic effects important for the operation of electronics. They also found that the roughness at the bottom and top interfaces of the magnetic film should correlate.

The efficacy of the approach was first demonstrated on a magnetic system consisting of a palladium (Pd) layer with a thickness ranging from 0 to 12 nanometers (nm), covered with a 2-nm-thick platinum (Pt) layer and a 1.5-nm-thick ferromagnet (a CoFeSiB alloy). The multilayer structure was then capped with a layer of magnesium oxide (MgO), tantalum (Ta), or ruthenium (Ru), since the different capping materials expand the possibilities for manipulating the magnetic properties of the nanosystem.

"In contemporary electronics, transistors are constantly getting reduced in size. Along with that, the general development trend is to obtain atomically smooth defect-free surfaces", explains Alexander Samardak, the author of the research idea, FEFU Vice-President for Research, "However, it would be a big mistake to strive for ideal interfaces, because a lot of new and practically interesting physical effects lie outside the atomic ordering and ideally flat surfaces. With permanent miniaturization of the functional elements in electronics, the role of surface roughness is skyrocketing. Mainly due to the development of highly sensitive analytical equipment, only now we have begun to reveal the nature of the discovered phenomena and understand the role of roughness and atomic mixing at interfaces. The main message of our research is that atomic roughness can serve for the benefit of implementation of new spin-orbitronic devices with improved properties".

The scientist explained that a new field of physics, spin-orbitronics, has been actively developing around the world over the past five years. It studies not just the spin of the electron (the intrinsic angular momentum of the electron, a quantum property that is not associated with the motion, i.e. displacement or rotation, of the electron as a whole), but the spin-orbit interaction. This is the interaction between the magnetic field induced by the electron's orbital motion around the atomic nucleus and the electron's own magnetic moment, which arises from its spin.

The advantage of spin-orbitronics is that the functionality of such devices (for example, magnetic memory) is provided directly through the control of the spin-orbit interaction in their constituent nanomaterials, for example, in heavy metals.

Heavy metals of the platinum group (Ru, Rh, Pd, Os, Ir, Pt) have a fairly strong spin-orbit interaction. If one of these metals is brought into contact with an ultrathin magnetic film only a few atomic layers thick (for example, Co, Ni, Fe, Py or their alloys), the electronic and magnetic properties of the system change radically.

"First, it is possible to manage magnetization and prepare nanosystems magnetized perpendicularly to the plane of the film. This is how it implemented in modern hard disk drives and under-development new electronic carriers in order to increase the density of information storage, accelerate the speed of data writing/reading and increase the number of rewriting cycles. Second, the strong spin-orbit interaction in a heavy metal leads to the "deformation" of the electron orbitals of atoms of the magnetic material (film). That results in spin effects such as magnetic damping and the interfacial Dzyaloshinsky-Moriya interaction appearing at the interface between the heavy metal and magnetic layer. This antisymmetric exchange interaction leads to the transformation of the ferromagnetic order and the appearance of nontrivial spin textures such as skyrmions and skyrmioniums. Such spin textures have enormous potential for the electronics of the near future, playing the role of non-volatile information carriers. For example, on their basis, one can make computer memory components that will work with no magnetic heads, where bits will be moved and switched by current pulses due to the manipulation of electron spins. Such devices shall operate at bit moving rates up to several km per second thanks to the electric current only and will store an order of magnitude more data because tiny spin textures," says Alexander Samardak.

For the experiment, researchers grew a series of palladium films with an ideal single-crystal structure (Figure 1(a)) using molecular-beam epitaxy. Since Pd grows as three-dimensional islands, the authors found that the rough surface of the Pd films can be described by a sinusoidal function. By varying the thickness of the Pd films in the range from 0 to 12.6 nm, they managed to control the amplitude and period of the roughness in the ranges from 0 to 2 nm and from 0 to 50 nm, respectively. After that, the palladium surface was coated with thin films of platinum and a magnetic alloy (Pt (2 nm) / CoFeSiB (1.5 nm)) using magnetron sputtering in a vacuum. The system was capped with different materials such as magnesium oxide, tantalum, or ruthenium (Fig. 1(b)). The capping material strongly influenced the magnetic anisotropy, while its influence on the Dzyaloshinsky-Moriya interaction was not as significant. In this case, the deposited Pt and CoFeSiB layers repeated the morphology of the Pd surface.

As a result, the researchers found that, with no change in the composition of the magnetic system, its functional properties can be changed dramatically simply by modulating the surface roughness in the sub-nanometer range through the thickness of the Pd layer. For example, the magnitude of the Dzyaloshinsky-Moriya interaction increased 2.5 times when the Pd layer was 10 nm thick. At this thickness, the roughness of the bottom and top interfaces of the magnetic film was most strongly correlated.

According to Alexander Samardak, about four years were needed to conduct the study and one more year to publish the article in a prestigious Nature-group journal. For a long time, the reviewers could not believe in the possibility of controlling spin-orbit properties by modulating the interface roughness. During the correspondence, the authors managed to convince the reviewers and defend their approach.

At present, the scientists are preparing new magnetic samples to study how interface roughness affects the spin Hall effect and spin-orbit torques. These future findings should bring the implementation of memory cells whose magnetic moment is switched by electric current alone a step closer.

Credit: 
Far Eastern Federal University

Gout treatment may aid patients with congenital heart disease

image: Jack Rubinstein, MD, is shown in the University of Cincinnati College of Medicine.

Image: 
Colleen Kelley/University of Cincinnati Creative + Brand.

A drug used to treat gout, probenecid, may improve heart function in individuals with a particular heart defect, according to results from a small pilot study run by a University of Cincinnati researcher.

Jack Rubinstein, MD, associate professor in the UC College of Medicine and UC Health cardiologist, conducted a randomized double-blind trial which included eight participants who had palliative surgery to correct a condition of the heart known as congenital univentricular circulation. Each participant received probenecid or a placebo during a 12-week period.

As part of the study, Rubinstein and co-investigators at Cincinnati Children's Hospital recruited patients to receive either probenecid or a placebo for four weeks followed by a four-week period without medication. They were then required to undergo another four weeks of alternate treatment. All patients were assessed at baseline immediately preceding the initial use of probenecid or the placebo. This included symptom reporting, heart imaging and exercise testing to determine aerobic capacity and endurance.

The study findings are available online in the scholarly journal Pediatric Cardiology.

"Heart function in participants along with their symptoms improved as a result of the pilot study," says Rubinstein. "Heart contractility was better. It wasn't a huge increase but enough for us to be able to detect it. They ran better and their heart pumped better. We observed a small change, partially because there were a small number of people involved."

"We can repurpose this medicine, long used to treat gout, to improve how the heart works for kids with univentricular circulation without any adverse effects," says Rubinstein. "The next step is a larger study to prove we can make it work safely in the long term."

Probenecid has been shown in recent years to positively influence cardiac function via effects on the Transient Receptor Potential Vanilloid 2 (TRPV2) channel in cardiomyocytes, explains Rubinstein. Researchers observed an improvement in cardiac function and exercise performance with probenecid in patients with a functionally univentricular circulation.

This study also reported work with colleagues at the University of Colorado showing that patients with single-ventricle physiology had higher levels of TRPV2 in their hearts, while collaborators at Oslo University Hospital reported a novel mechanism through which probenecid may be particularly helpful in this patient population.

Univentricular heart (UVH) is a severe congenital cardiac malformation characterized by one functional chamber. The clinical manifestations include congestive heart failure, failure to thrive, cyanosis, hypoxemia and neurodevelopmental disabilities.

Credit: 
University of Cincinnati

When liver cirrhosis is deadly

When the body can no longer compensate for the gradual failure of the liver caused by liver cirrhosis, there is a high risk of acute decompensated liver cirrhosis. In some patients this develops quickly into an often deadly acute-on-chronic liver failure, in which other organs such as the kidneys or brain fail. A study by an international team of researchers headed by Professor Jonel Trebicka of Frankfurt University Hospital, and funded by the foundation EF Clif, has discovered which patients are particularly at risk. With their findings, the scientists have laid the foundation for developing therapies to prevent acute-on-chronic liver failure.

The liver has many functions: it stores nutrients and vitamins, produces dextrose, coagulation factors and hormones, and breaks down toxins, drugs and alcohol. Chronic alcohol abuse, viruses or other diseases can damage the liver and lead to chronic liver disease. Without treatment, chronic liver disease leads in its final stages to liver cirrhosis, in which liver tissue turns into connective tissue, making the liver increasingly unable to carry out its functions. The result: the blood's clotting ability is impaired, toxic metabolic products accumulate, the liver is not adequately supplied with blood, and blood pressure rises in the portal vein that supplies the liver.

The body tries to compensate for the reduced liver function. For example, new veins develop as alternative circulation from the oesophagus, stomach and intestines which expand into varicose veins. When the disease progresses to the point that this kind of compensation is no longer possible - physicians speak of acute decompensated liver cirrhosis - the situation becomes life-threatening: tissue fluid (ascites) collects in the abdominal cavity, leading to bacterial infections and internal bleeding, for example in the oesophagus. Difficulty concentrating, mood swings and sleepiness are signs of a poisoning of the brain (hepatic encephalopathy) that can result in a hepatic coma.

A European clinical study headed by Professor Jonel Trebicka, and carried out under the umbrella of the European Foundation for the Study of Chronic Liver Failure, has for the first time identified three clinical course variations in patients admitted to the hospital with acute decompensated cirrhosis.

1. The first clinical course is characterised by high blood inflammation values, indicating inflammatory reactions throughout the body. Within three months after admission to the hospital, a number of body organs fail: the acute decompensation becomes "acute-on-chronic liver failure" (ACLF). The physicians therefore call this variation Pre-ACLF. More than half of patients die from it; only a third survive after a year.

2. Patients with the second clinical course do not develop ACLF and have moderate inflammation values. They suffer, however, from significant hypertension in the portal vein. Approximately 20 percent die within the following three months, and another 15 percent over the course of the following year. The physicians named this variation "unstable decompensated liver cirrhosis".

3. The patients with the third clinical course exhibit neither high inflammation values nor frequent complications. They do not develop ACLF in the first three months. Within a year, however, one in ten dies. The physicians call this variation "stable decompensated liver cirrhosis."

Lead investigator Professor Jonel Trebicka, gastroenterologist and hepatologist at Medical Clinic I of University Hospital Frankfurt explains: "We are now working intensively on the development of new diagnostic options, especially for the group of pre-ACLF patients, in order to identify this group before admission to the hospital so that preventive measures can be implemented early on. The development of preventive therapies for the often deadly ACLF is one of our most important research goals in this context."

Study co-author Professor Stefan Zeuzem, Dean of the Faculty of Medicine and Director of Medical Clinic I at Frankfurt University Hospital explains: "Liver diseases are one of the main focal points of Medical Clinic I and we offer numerous specialised outpatient departments for patients with acute and chronic liver diseases. So on the one hand we were able to observe patients for the study. On the other hand, the research findings on improving ACLF prevention and therapies will rapidly benefit all of our patients."

Credit: 
Goethe University Frankfurt

NASA sees Hurricane Laura's nighttime landfall

image: From the International Space Station, Astronaut Chris Cassidy took this photo of Hurricane Laura as it neared the Gulf coast on Aug. 26, 2020 at 3:27 p.m. EDT.

Image: 
NASA/Chris Cassidy

Many NASA assets were used to provide forecasters with information to incorporate into their analysis of Hurricane Laura. Satellite imagery, photographs from the International Space Station, and a computer program that produces animations of imagery are all things that NASA used to analyze the storm. NASA-NOAA's Suomi NPP satellite also caught a nighttime image of Laura just after landfall.

Laura's Landfall

Laura made landfall as a powerful Category 4 hurricane along the Louisiana coast. At 2 a.m. EDT (0600 UTC) on Aug. 27, Doppler radar images indicated that the eye of Hurricane Laura made landfall at the coast near Cameron, Louisiana, near latitude 29.8 degrees north and longitude 93.3 degrees west. Air Force reconnaissance and Doppler radar data indicated that the maximum sustained winds were near 150 mph (240 kph) with higher gusts. National Hurricane Center (NHC) Senior Hurricane Specialist John Cangialosi and Hurricane Model Diagnostician and Meteorologist David Zelinsky noted, "At the time of landfall, Laura was a ferocious looking hurricane with a clear circular eye, an intense eyewall, and tightly-coiled surrounding spiral bands."

A Nighttime View of Laura's Landfall

NASA-NOAA's Suomi NPP satellite passed over Hurricane Laura soon after it made landfall in southwestern Louisiana around 2 a.m. EDT on Aug. 27. The Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard Suomi NPP captured a nighttime image. Laura's cloud cover extends from Houston, Texas east to just west of New Orleans, Louisiana.

Weather Station Reports Just After Landfall

A National Ocean Service tide station at Calcasieu Pass, Louisiana observed a water level rise of 9.19 feet Mean Higher High Water at 1 a.m. CDT/2 a.m. EDT. The Lake Charles, Louisiana airport reported a sustained wind of 85 mph (137 kph) with a gust to 128 mph (206 kph) around 2 a.m. CDT/3 a.m. EDT. A University of Florida observing tower near Lake Charles recently reported a sustained wind of 86 mph (138 kph) with a gust to 112 mph (180 kph). A Texas Coastal Ocean Observing Network site at Sabine Pass on the Texas/Louisiana border reported sustained winds of 74 mph (119 kph) with a gust to 90 mph (145 kph).

Watches and Warnings on Aug. 27, 2020

A Storm Surge Warning is in effect for High Island, Texas to the mouth of the Mississippi River.

A Tropical Storm Warning is in effect for High Island, Texas to the mouth of the Mississippi River.

Animating Hurricane Laura's Path to Landfall

The Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard Suomi NPP provided visible imagery of Laura from its birth in the Atlantic. Using the Worldview application at NASA's Goddard Space Flight Center in Greenbelt, Md., an animation was created of Laura's track from Puerto Rico to Louisiana. The imagery spanned Aug. 20 to 26 and showed Tropical Storm Marco moving from the Caribbean Sea to a landfall near the mouth of the Mississippi River while Tropical Storm Laura moved from the east over Puerto Rico and Hispaniola and into the Gulf of Mexico, just before landfall in southwestern Louisiana.

NASA's Earth Observing System Data and Information System (EOSDIS) Worldview application provides the capability to interactively browse over 700 global, full-resolution satellite imagery layers and then download the underlying data. Many of the available imagery layers are updated within three hours of observation, essentially showing the entire Earth as it looks "right now."
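For readers who want to retrieve this kind of imagery programmatically, the sketch below shows one way the daily VIIRS true-color layers behind Worldview can be pulled from NASA's Global Imagery Browse Services (GIBS). It is a minimal illustration, not part of the NASA release: the endpoint pattern, layer identifier, tile matrix set, and tile indices are assumptions based on the public GIBS WMTS documentation and should be verified there before use.

# Minimal sketch (assumed GIBS WMTS REST pattern; verify against the GIBS docs).
from datetime import date, timedelta
import requests

GIBS_WMTS = ("https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
             "{layer}/default/{day}/{matrix_set}/{z}/{row}/{col}.jpg")
LAYER = "VIIRS_SNPP_CorrectedReflectance_TrueColor"  # assumed layer identifier

def fetch_tile(day: date, z: int = 3, row: int = 2, col: int = 2) -> bytes:
    """Download one imagery tile for the given day (tile indices chosen for illustration)."""
    url = GIBS_WMTS.format(layer=LAYER, day=day.isoformat(),
                           matrix_set="250m", z=z, row=row, col=col)
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.content

if __name__ == "__main__":
    start = date(2020, 8, 20)
    for offset in range(7):  # Aug. 20 through Aug. 26, the span of the animation
        day = start + timedelta(days=offset)
        with open(f"laura_{day.isoformat()}.jpg", "wb") as f:
            f.write(fetch_tile(day))

Stepping through the daily timestamps in this way mirrors how an animation such as the one described above could be assembled frame by frame.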

A View from the International Space Station

Astronaut Chris Cassidy aboard the International Space Station took several photos of Hurricane Laura as it neared the Gulf coast on Aug. 26, 2020, at 3:27 p.m. EDT. The photos showed the extent of this large hurricane and revealed a clear eye surrounded by powerful thunderstorms. The photo is one of four taken by Cassidy from the space station.

Laura's Status on Aug. 27 at 8 a.m. EDT

At 8 a.m. EDT (1200 UTC), the center of Hurricane Laura was located near latitude 31.2 degrees north and longitude 93.3 degrees west. That is about 20 miles (30 km) north of Fort Polk, Louisiana.

Laura is moving toward the north near 15 mph (24 kph), and this motion should continue through the day. A northeastward to east-northeastward motion is expected tonight and Friday. Maximum sustained winds are near 100 mph (160 kph) with higher gusts. Based on wind speed, Laura is now a Category 2 hurricane on the Saffir-Simpson Hurricane Wind Scale.
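As a rough illustration of how the advisory's sustained wind speeds map onto hurricane categories, the short sketch below classifies a wind speed using the published Saffir-Simpson Hurricane Wind Scale thresholds. It is a simplified example for readers, not NHC software.

# Published Saffir-Simpson thresholds (1-minute sustained winds, mph).
SAFFIR_SIMPSON_MPH = [(157, 5), (130, 4), (111, 3), (96, 2), (74, 1)]

def mph_to_kph(mph: float) -> float:
    return mph * 1.609344

def saffir_simpson_category(sustained_mph: float):
    """Return the hurricane category, or None below hurricane strength."""
    for threshold, category in SAFFIR_SIMPSON_MPH:
        if sustained_mph >= threshold:
            return category
    return None

print(saffir_simpson_category(150), round(mph_to_kph(150)))  # 4, 241 -> Category 4 at landfall
print(saffir_simpson_category(100), round(mph_to_kph(100)))  # 2, 161 -> Category 2 at 8 a.m. EDT

(Advisory text rounds the metric values to the nearest 5 kph, which is why it quotes 240 and 160 kph.)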

Hurricane-force winds extend outward up to 60 miles (95 km) from the center and tropical-storm-force winds extend outward up to 175 miles (280 km). An observing site in Alexandria, Louisiana, recently reported a wind gust to 74 mph (119 kph). The estimated minimum central pressure is 970 millibars.

Laura's Forecast and Track

NHC warned about dangerous storm surge, heavy rainfall, hurricane-force winds and isolated tornadoes.

NHC said, "The combination of a dangerous storm surge and the tide will cause normally dry areas near the coast to be flooded by rising waters moving inland from the shoreline.  The water could reach the following heights above ground somewhere in the indicated areas if the peak surge occurs at the time of high tide.

Through Friday, forecasters expect Laura to produce the following rainfall totals across portions of Louisiana, Mississippi, and Arkansas: 6 to 12 inches, with isolated totals of 18 inches. This rainfall will cause widespread flash and urban flooding, small streams and creeks to overflow their banks, and minor to moderate freshwater river flooding.

Hurricane-force winds and damaging wind gusts are also expected to spread well inland into portions of eastern Texas and western Louisiana this morning. Tropical storm conditions will spread northward within the warning areas through the day.

In addition, tornadoes are possible today and tonight over parts of Louisiana, Arkansas, and western Mississippi."

Rapid weakening is forecast, and Laura is expected to become a tropical storm later today (Aug. 27). On the forecast track, Laura will move northward across western and northern Louisiana through this afternoon. The center of Laura is forecast to move over Arkansas tonight, the mid-Mississippi Valley on Friday, and the Mid-Atlantic States on Saturday.

NASA Researches Tropical Cyclones

Hurricanes/tropical cyclones are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

For Key Messages and updated forecasts, visit: http://www.nhc.noaa.gov

Credit: 
NASA/Goddard Space Flight Center

Hurricanes could be up to five times more likely in the Caribbean if tougher targets are missed

Global warming is dramatically increasing the risk of extreme hurricanes in the Caribbean, but meeting more ambitious climate change goals could as much as halve the likelihood of such disasters in the region, according to new research.

The study, led by the University of Bristol, analysed future projections of hurricane rainfall in the Caribbean and found it to be particularly vulnerable to climate change, resulting in extreme hurricane rainfall events being as much as five times more likely in a warmer world.

"Hurricane research has previously focused on the United States, so we wanted to look at the Caribbean region, which has fewer resources to recover. The findings are alarming and illustrate the urgent need to tackle global warming to reduce the likelihood of extreme rainfall events and their catastrophic consequences, particularly for poorer countries which take many years to recover," said lead author Emily Vosper, Research Student at the School of Computer Science, at the University of Bristol.

The researchers generated thousands of synthetic hurricanes under three climate scenarios: present-day conditions and the Paris Agreement goals of 1.5°C and 2°C of warming above pre-industrial levels. The main objective of the Paris Agreement, a global framework to tackle climate change, is to hold the global average temperature increase to well below 2°C above pre-industrial levels and endeavour to limit the temperature increase to 1.5°C.

Focusing their analysis on the Caribbean region, the study generated rainfall statistics by applying a physics-based model to the synthetic hurricanes. The model takes into account several factors including the land features and large-scale winds, and has been shown to give realistic results compared to observations of real-life hurricanes.

The study, published in Environmental Research Letters, found that extreme hurricane rainfall events affecting the Caribbean, those which typically happen once every 100 years under the current climate, occur more often under the Paris Agreement scenarios. But a 1.5°C warmer world would see significantly fewer intense Caribbean hurricanes, reducing occurrence by as much as half in the Eastern regions, compared to a 2°C warmer world.

Hurricane Maria brought as much as a quarter of normal annual rainfall to some regions of Puerto Rico when it made landfall in 2017, and storms of this magnitude are roughly once-in-100-year events. The results show that in a 2°C warmer world, an event of similar size to Maria would be more than twice (2.3 times) as likely, occurring once every 43 years. Similarly, a 100-year storm affecting the Bahamas would be 4.5 times as likely under the 2°C Paris Agreement scenario compared to the present day. Under the more ambitious goal of 1.5°C warming, such extreme hurricane rainfall events affecting the Dominican Republic would occur roughly once every 57 years, about half as likely as under the 2°C warming scenario, where they would occur once every 30 years.
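These comparisons follow directly from the quoted return periods: an event's annual chance is the reciprocal of its return period, and the ratio of two such chances gives the relative likelihood. The short sketch below is an illustration based on the figures quoted here, not the study's own code.

def annual_probability(return_period_years: float) -> float:
    """Annual exceedance probability of an event with the given return period."""
    return 1.0 / return_period_years

def likelihood_ratio(baseline_years: float, scenario_years: float) -> float:
    """How many times more likely the event becomes when its return period shortens."""
    return annual_probability(scenario_years) / annual_probability(baseline_years)

# Puerto Rico, Maria-sized rainfall: a 100-year event today vs a 43-year event at 2°C warming.
print(round(likelihood_ratio(100, 43), 1))  # ~2.3

# Dominican Republic: once every 57 years at 1.5°C vs once every 30 years at 2°C.
print(round(likelihood_ratio(57, 30), 1))   # ~1.9, i.e. roughly half as likely at 1.5°C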

Emily said: "We expected extreme hurricanes to be more prevalent in the 2°C global warming scenario, but the scale of the projected increases was surprising and should serve as a stark warning to countries across the globe underscoring the importance of keeping climate change under control."

The projections reinforce the Intergovernmental Panel on Climate Change special report, which concludes that restricting global warming to 1.5°C would limit the risk of climate-related hazards, such as torrential rainfall, drought, and temperature extremes.

Emily said: "Our findings show that the impacts of a 2?C warming above pre-industrial levels are set to disproportionately affect the Caribbean. By focusing efforts to stabilise global warming to the more ambitious 1.5?C goal, we could dramatically reduce the likelihood of extreme hurricane rainfall events in the area, particularly in the Eastern Caribbean region."

It takes at least six years for even the richest of the Caribbean countries to rebuild after a major hurricane hits, stalling economic growth. Building resilient infrastructure throughout the islands is not feasible due to financial and time constraints. The study recommends its findings could be used to inform a multi-hazard, multi-scale approach which identifies the most at-risk areas so resilience funding and strategies can be more effectively targeted.

Emily said: "Resources to mitigate damage are limited, so our findings could help highlight the hotspots in greatest danger and need. An integrated climate risk approach is needed to fully understand the threat of future hurricanes to Caribbean populations.

"Further studies could therefore incorporate factors that directly affect the health and well-being of local populations - such as storm surge, flood and landslide modelling - into the rainfall results to quantify such threats and feed into adaptation and resilience planning.

"Reducing the likelihood of extreme hurricanes should be the overriding priority. Our research clearly illustrates how vital it is to keep striving to meet the lower global warming temperature target, and the collective responsibility all countries, cities, communities, governments and individuals share to make that happen."

Credit: 
University of Bristol

Discovered: Cellular pathway involved in resistance to Ebola virus and SARS-like coronaviruses

Researchers working in human cells have identified a new pathway that targets a common vulnerability in several different pandemic viruses. This pathway can protect cells from infection by Ebola virus and from coronaviruses like SARS-CoV-2, they say. Their new findings, uncovered by an innovative screening approach, may inform future therapies against a broad range of viruses. Recent and ongoing outbreaks of Ebola virus in Africa and the global SARS-CoV-2 pandemic highlight the need to identify additional treatment strategies for viral infections. Therapies that focus on host pathways of cellular resistance to viruses, and that target common vulnerabilities across multiple viruses, are of particular interest, but finding these pathways has been difficult using conventional genetic screens.

Here, Anna Bruchez and colleagues used a novel screening approach based on activation of chromosomal segments called transposons to look for new genes that can prevent infection by Ebola virus. This screening strategy uncovered that the gene MHC class II transactivator (CIITA) induces resistance to Ebola virus in human cell lines by activating the expression of a second gene, CD74. One isoform of CD74, known as p41, disrupts the processing of the Ebola virus coat protein by cellular proteases called cathepsins. This prevents entry of the virus into the cell and subsequent infection. In further research using human cell lines, Bruchez and colleagues showed that CD74 p41 also blocked the cathepsin-dependent entry pathway of coronaviruses, including SARS-CoV-2.

The results reveal a new role for the two genes identified, one that likely predates their better-known role in antigen processing, the authors say. "We anticipate that the application of this transposon screening approach to other models of infection will reveal other mechanisms that have eluded conventional screening strategies," they write.

Credit: 
American Association for the Advancement of Science (AAAS)