New guidelines say breastfeeding is safe after anaesthesia

New guidelines published by the Association of Anaesthetists in the journal Anaesthesia, timed to coincide with the start of World Breastfeeding Week (1-7 August), say that breastfeeding is safe after the mother has had anaesthesia, as soon as she is alert and able to feed.

"The guidelines say there is no need to discard any breast milk due to fear of contamination, since evidence shows that anaesthetic and non-opioid painkiller drugs are transferred to breast milk in only very small amounts," explain the authors who include Dr Mike Kinsella of the Association of Anaesthetists Safety Committee, based at St Michael's Hospital, Bristol, UK, and colleagues. "For almost all of these drugs, there is no evidence of effects on the breastfed infant."

However, they advise that drugs such as opioids and benzodiazepines should be used with caution, especially after multiple doses and in babies up to 6 weeks old (corrected for gestational age). "In this situation, the infant should be observed for signs of abnormal drowsiness and respiratory depression, especially if the woman is also showing signs of sedation," they explain. "Techniques that reduce opioid usage are preferable for the breastfeeding woman. Local and regional anaesthesia have benefits in this regard, and also have the least interference with the woman's ability to care for her infant."

They also add that codeine should not be used by breastfeeding women, following concerns about excessive sedation in some infants related to differences in metabolism.

More generally, the guidelines say that any woman with an infant aged 2 years or younger should routinely be asked during her preoperative assessment whether she is breastfeeding, so that it can be explained to her that breastfeeding will be safe after her surgery. They say: "Where possible, day surgery is preferable to avoid disrupting normal routines. A woman having day surgery should have a responsible adult stay with her for the first 24 hours. She should be cautious with co-sleeping, or sleeping while feeding the infant in a chair, as she may not be as responsive as normal."

They conclude: "In summary, the pharmacological aspects of anaesthesia and sedation require little alteration in breastfeeding women. However, supportive care for the woman in the peri-operative period, and accurate advice, will ensure minimal disruption to this important part of childcare."

Credit: 
AAGBI

Nano-sponges of solid acid transform carbon dioxide to fuel and plastic waste to chemicals

image: Nano solid acids that transform carbon dioxide directly to fuel (dimethyl ether) and plastic waste into chemicals (hydrocarbons).

Image: 
Ayan Maity, TIFR, Mumbai

Solid acids are among the most important heterogeneous catalysts and have the potential to replace environmentally harmful liquid acids in key processes such as hydrocarbon cracking and alkylation, as well as in plastic waste degradation and carbon dioxide-to-fuel conversion.

The two best-known solid acids are crystalline zeolites and amorphous aluminosilicates. Although zeolites are strongly acidic, their inherent microporosity imposes severe diffusion limitations; aluminosilicates, although mesoporous, suffer from low acidity and moderate stability. It is therefore a synthetic challenge to design and synthesize solid acids that combine the strong acidity of zeolites with the textural properties of aluminosilicates, so-called "amorphous zeolites": strongly acidic amorphous aluminosilicates.

Meanwhile, atmospheric carbon dioxide, whose levels are rising every day, is the primary cause of climate change. The effects of global warming, in the form of drastic changes in weather patterns, are already clearly visible and alarming. There is therefore a great need to find ways to reduce carbon dioxide levels, either by sequestering it or by converting it to fuel. At the same time, excessive plastic waste has become a serious environmental problem: most countries generate thousands of tonnes of plastic waste every day.

In this work, researchers dealt with both these problems at one stroke, by developing nano solid acids that transform carbon dioxide directly to fuel (dimethyl ether) and plastic waste to chemicals (hydrocarbons).

Using bicontinuous microemulsion droplets as a soft template, Prof. Vivek Polshettiwar's group at the Tata Institute of Fundamental Research (TIFR), Mumbai, synthesized an acidic amorphous aluminosilicate (AAS), a so-called "amorphous zeolite", with a nano-sponge morphology that exhibits both zeolitic (strong acidity) and amorphous aluminosilicate (mesoporous, high-surface-area) properties. The presence of zeolite-like bridging silanols in AAS was demonstrated by a range of catalytic reactions (styrene oxide ring-opening, vesidryl synthesis, Friedel-Crafts alkylation, jasminaldehyde synthesis, m-xylene isomerization, and cumene cracking) that require strong acidic sites and larger pore sizes, and was confirmed by detailed solid-state NMR studies. The synergy between strong acidity and accessibility was reflected in the fact that AAS outperformed state-of-the-art zeolites and amorphous aluminosilicates. It was thus clear that the material possesses strongly acidic, zeolite-like bridging silanol sites even though it is amorphous rather than crystalline. These materials therefore form a new class at the interface between crystalline zeolites and amorphous aluminosilicates.

The approach may thus allow the development of solid acid catalysis for plastic degradation and carbon dioxide-to-fuel conversion at the rates, scales, and stabilities required to make these processes economically competitive. The protocol has scientific and technological advantages, owing to its superior activity and stability.

Credit: 
Tata Institute of Fundamental Research

Brief wind shifts with a strong cooling effect

image: Microstructure probe at the stern of the Meteor, launched with the instrument's own winch. The rapid paying-out of the orange Kevlar cable allows the turbulence measurements to be carried out with the probe in near free fall through the water. Photo: Marcus Dengler.

Image: 
M. Dengler, GEOMAR.

Tropical sea surface temperatures have a major influence on the climate of the tropics and the adjacent continents. For example, they determine the position of the Intertropical Convergence Zone and the beginning and strength of the West African monsoon. It is therefore important to understand the variability of sea surface temperatures for climate predictions. Until now, the seasonal cycle of sea surface temperature in the tropical North Atlantic could not be sufficiently explained. "More precisely, the sea surface is colder than predicted by the combination of previous direct observations of solar radiation, currents and mixing, especially in the summer months from July to September", explains Dr. Rebecca Hummels from the GEOMAR Helmholtz Centre for Ocean Research Kiel and first author of a study now published in Nature Communications.

Ship-based observations with the German research vessel METEOR in September 2015 provided the first measurements of a strong turbulent mixing event below the sea surface, in which mixing was up to a factor of 100 stronger than previously observed at this location. "When we noticed the greatly enhanced turbulence in the water column during data processing, we at first suspected a malfunction of our sensors," says Dr. Marcus Dengler, co-author of the study. "But when we also noticed strong currents at the ocean surface, we became curious". Precisely such events can explain the lower temperatures at the ocean surface.

"We were able to isolate the process behind this strong mixing event, which lasted only for a few days," explains Dr. Hummels. "It is a so-called inertial wave, which is a very short but intense flow event," Hummels continues. Inertial waves are horizontal wave phenomena in which the current at the surface rotates clockwise with time, whereas the movement rapidly decays with increasing depth. The different velocities at the surface and in the layer below cause instabilities and ultimately mixing between the warm water in the surface layer and the colder water below. Such inertial waves can be caused by brief variations in the near-surface winds. Up to now, generally only weak currents have been observed in this region and the rather steady trade winds at this time of year did not suggest particularly strong mixing events. However, wind variations are crucial to trigger these waves in the upper ocean. The winds do not have to be particularly strong, but ideally should rotate the same way the ocean currents do. Since such wind fluctuations are relatively rare and only last a few days, it has not yet been possible to measure such a strong wave phenomenon with the associated strong mixing in this region.
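The clockwise rotation and the timescale of these inertial waves follow from Earth's rotation: the local inertial period is set by the Coriolis parameter. As a rough illustration, a short sketch of that textbook relationship (the latitude of about 5 degrees N used here is an assumption chosen to represent the tropical North Atlantic study region, not a value taken from the study):

```python
import math

# Inertial period at latitude phi: T = 2*pi / f, where
# f = 2 * Omega * sin(phi) is the Coriolis parameter.
OMEGA = 7.2921e-5  # Earth's rotation rate, rad/s

def inertial_period_days(lat_deg: float) -> float:
    f = 2 * OMEGA * math.sin(math.radians(lat_deg))
    return (2 * math.pi / f) / 86400.0  # seconds -> days

print(round(inertial_period_days(5.0), 1))  # ~5.7 days near 5 N
```

The period of several days at low latitudes is consistent with the article's point that such wave events last only a few days and are easy to miss with sparse shipboard sampling.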

After the discovery of this event during the METEOR cruise in September 2015, the Kiel scientists wanted to know more about the frequency and the actual impact of such events. "Through model-based data analysis, we were able to give a context to the in-situ observations", explains co-author Dr. Willi Rath from the Research Unit Ocean Dynamics at GEOMAR. "Together, we have scanned 20 years of global wind observations looking for similar events triggered by wind fluctuations and described their occurrence in the region and during the course of the year", Dr. Rath adds. This has supported the hypothesis that the temporal and spatial distribution of such events can indeed explain the gap in the heat balance of the upper ocean.

The strong turbulent mixing caused by the inertial waves at the base of the surface layer is also crucial for biology: For example, the cold water that is mixed into the surface layer during such an event also brings nutrients from deeper layers into the upper ocean penetrated by sunlight. "This also explains the hitherto largely unexplained occurrence of chlorophyll blooms in this region, which could now also be attributed to the seasonally increased occurrence of these inertial waves," explains Dr. Florian Schütte, also co-author of the study.

The ship measurements in the tropical Atlantic were carried out in close cooperation with the international PIRATA program. For more than 20 years, the PIRATA surface buoys have been providing valuable data for studies of ocean-atmosphere interaction, which were also used for this study. "Indeed, the intensive mixing measurements resulted from a failure in the hydraulic system of the METEOR, which made other measurements impossible at that time", says Prof. Dr. Peter Brandt, chief scientist of the expedition. Despite the buoys and a series of ship expeditions to this region, new phenomena are still being discovered, sometimes by chance, which decisively advance our understanding of the tropical climate.

Credit: 
Helmholtz Centre for Ocean Research Kiel (GEOMAR)

Tiny plants crucial for sustaining dwindling water supplies: Global analysis

image: A diverse biocrust community in western New South Wales.

Image: 
David Eldridge

A global meta-analysis led by UNSW scientists shows tiny organisms that cover desert soils - so-called biocrusts - are critically important for supporting the world's shrinking water supplies.

Biocrusts are a rich assortment of mosses, lichens, cyanobacteria, and microscopic organisms such as bacteria and fungi that live on the surface of dryland soils. Drylands, collectively, are the world's largest biome.

"Biocrusts are critically important because they fix large amounts of nitrogen and carbon, stabilise surface soils, and provide a home for soil organisms," said lead author Professor David Eldridge from UNSW Science.

"But we still have a poor understanding of just how biocrusts influence hydrological cycles in global drylands.

"Accounting for biocrusts and their hydrological impacts can give us a more accurate picture of the impacts of climate change on dryland ecosystems and improve our capacity to manage those effects," Prof. Eldridge said.

Exploring more than 100 scientific papers

For the study, the team assembled and then analysed the largest ever global database of evidence on the effects of biocrusts on water movement, storage and erosion, focussing on drylands.

"Our emphasis was on dryland soils because biocrusts are often the dominant surface covering on these soils, particularly during dry times," Prof. Eldridge said.

A huge increase in the number of publications on biocrusts over the past decade had prompted the group to critically assess the links between water capture and storage, and landscape stability in drylands.

Co-author Dr Samantha Travers from UNSW Science helped retrieve and analyse data from more than 100 scientific papers published over the past 30 years.

"The global literature on biocrust effects on hydrology has often been conflicting, preventing us from making broadscale recommendations on how to manage them to manage water," Dr Travers said.

Importantly, the researchers showed that globally, the presence of biocrusts on the soil surface reduced water erosion by an average of 68%.

"Cyanobacteria in the crusts secrete organic gels and polysaccharides that help to bind small soil particles into stable surfaces. Mosses in the crusts also trap water and sediment on the soil surface, preventing the removal of soil particles," Dr Travers said.
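To illustrate how a pooled percentage like the 68% reduction is typically derived in a meta-analysis, here is a minimal sketch using the log response ratio, a standard effect size for comparing treatment and control means. The per-study numbers below are entirely hypothetical; the actual analysis pooled data from more than 100 papers:

```python
import math

# Hypothetical (crusted, bare-soil) sediment-loss pairs per study.
# Effect size: lnRR = ln(treatment / control); the pooled percent
# change is (exp(mean lnRR) - 1) * 100.
studies = [(0.30, 1.0), (0.25, 1.0), (0.40, 1.0), (0.35, 1.0)]

ln_rr = [math.log(crust / bare) for crust, bare in studies]
mean_lnrr = sum(ln_rr) / len(ln_rr)
percent_change = (math.exp(mean_lnrr) - 1) * 100
print(f"Pooled change in erosion: {percent_change:.0f}%")
```

A real meta-analysis would also weight studies by variance and compute confidence intervals; this sketch only shows why effects are averaged on the log scale before being back-transformed to a percentage.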

Although biocrusts reduced the infiltration of water into the soil, they tended to increase water storage in the uppermost layers.

"This upper layer is where most of the nutrients and microbes are found - it is a critical zone for plant production and stability in dryland soils," Prof. Eldridge said.

"More water in the upper layers means greater productivity and stability."

Prof. Eldridge said we now had a better understanding of how biocrusts affect water relations in drylands.

"However, the effects depend on factors such as the type of crust and whether it is intact or disturbed," he said.

Three decades of biocrust research

Prof. Eldridge and his team have been studying the role of biocrusts on Australia's soils for more than 30 years.

The focus of the team's research is on drylands because they occupy almost half of Earth's land surface and support almost 40% of the global human population.

"Many people in drylands rely on pastoralism for their livelihoods, so the capture and use of water is critically important in these water-limited environments," Prof. Eldridge said.

"Anything that alters the hydrological balance in drylands has the potential therefore to affect millions of people, hence the importance of these tiny surface communities."

He said a major problem for sustainable management of drylands was overgrazing by livestock.

"Trampling by sheep and cattle breaks up the crust, destabilising the soil surface and leading to increased water erosion - effects that are supported by our global analyses," he said.

"Preventing overgrazing by livestock is critical if we are to prevent the loss of biocrusts, but until recently, the magnitude of the effects have not been known.

"The results of this work will be incorporated into global water balance and soil loss models so that managers and governments have a better understanding of the implications of losing biocrusts on the world's dwindling water supplies," Prof. Eldridge said.

The study, published in Global Change Biology today, was a collaborative effort between UNSW Sydney, and scientists from the United States, Spain, Germany, Mexico and China.

The work is part of a larger global study, supported by the John Wesley Powell Center for Analysis and Synthesis to predict the impacts of climate change on biological crust communities.

The research team is now examining how global land use changes affect biocrust communities, and developing best management practices to restore biocrusts as we move towards a hotter and drier world.

Credit: 
University of New South Wales

A new chemical analysis upends conventional explanation for global cooling

image: Researchers relied upon isotope analysis of sediments collected from Hall's Cave, located in the Texas Hill Country, to determine a new explanation for a dramatic period of global cooling about 13,000 years ago.

Image: 
Nan Sun, University of Houston

Scientists have long known that Earth cooled dramatically about 13,000 years ago, with temperatures dropping by about 3 degrees Celsius. There are several theories about the cause. The leading explanation has been a so-called extraterrestrial event, a massive object slamming into Earth from space or bursting in the atmosphere.

Texas researchers now have reported in Science Advances new evidence for another, more likely explanation - the eruption of a volcano on what is now the European continent, upending thinking about an event that shaped future evolution.

"The cooling period, known as the Younger Dryas, disrupted a general warming trend at the end of the Pleistocene era," said Nan Sun, a doctoral student at the University of Houston's Department of Earth and Atmospheric Sciences and first author for the paper. "That resulted in the extinction of a number of species and coincides with the disappearance of the Clovis culture."

"This work means the actual trigger for this cooling event didn't come from space," said Alan Brandon, professor of isotope geochemistry at UH and corresponding author for the paper. "It was terrestrial. This shows that there are other mechanisms besides space objects that can cause these events. It looks like volcanoes may be much more important than people thought."

In addition to Sun and Brandon, researchers on the project include S.L. Forman and K.S. Befus of the Department of Geosciences at Baylor University and M.R. Waters, director of the Center for the Study of the First Americans at Texas A&M University.

The work involved the isotope analysis of sediments collected from Hall's Cave, located in the Texas Hill Country and known to be associated with the eras encompassing the Younger Dryas. The analysis focused on osmium and on levels of highly siderophile elements (HSE), including iridium, ruthenium, platinum, palladium and rhenium, as well as on the proportion of each element to the others. Sun determined that the elements were not present in the same relative proportions as are found in sediments known to have had material added by a meteor, asteroid or other object from space when it impacted Earth.

That meant the cooling couldn't have been caused by an extraterrestrial impact. It had to have been something happening on Earth.

But what?

Sun said the signature from the osmium isotope analysis and the relative proportion of the elements matched that previously reported in volcanic gases. These signatures were likely the result of major eruptions across the northern hemisphere, including in the Aleutians, Cascades and Europe. For example, the timing fits for the Laacher See volcano, located in what is now Germany. Laacher See is known to have erupted at the onset of the Younger Dryas, which is the most recent dramatic cooling event recorded in the past 15,000 years.

Brandon noted that he wasn't immediately convinced, however. "I was skeptical," he said. "We took every avenue we could to come up with this conclusion."

A volcanic eruption had been considered one possible explanation but was generally dismissed because there was no associated geochemical fingerprint, Brandon said. Instead, many scientists attributed the elements associated with the Younger Dryas to an extraterrestrial impact.

Other potential causes under consideration include a massive ice sheet discharge from the North Atlantic Ocean and a supernova explosion that depleted the ozone layer, resulting in atmospheric and surface changes that led to the cooling. Whether a single major eruption of a volcano could drive the cooling observed, however, is still an open question, the researchers said. The earth may have been at a tipping point, possibly from the ice sheet discharge, and that, combined with the Laacher See eruption, provided just the right mix to drive the cooling.

In addition to identifying a likely major contributor to the cooling, Sun said the research offers powerful validation for using analysis of both HSE and osmium isotope ratios.

Brandon said it also serves as a testament to the value of interdisciplinary work, noting that the research team included isotope geochemists, an anthropologist, geoscientists and a volcanologist.

Credit: 
University of Houston

NASA examines water vapor and structure in Hurricane Isaias

image: On July 31 at 3:20 a.m. EDT (0720 UTC), NASA's Aqua satellite passed over Hurricane Isaias as it departed Hispaniola. Aqua found the highest concentrations of water vapor (brown) and the coldest cloud top temperatures around the center.

Image: 
NASA/NRL

When NASA's Aqua satellite passed over the North Atlantic Ocean, it gathered water vapor data on Isaias, while NASA-NOAA's Suomi NPP satellite provided forecasters with a visible image that showed a more organized tropical cyclone.

A Visible View of Isaias

When NASA-NOAA's Suomi NPP satellite passed over the Atlantic Ocean during the afternoon of July 30, the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument provided forecasters with a visible image of Isaias as it was intensifying. VIIRS revealed that strong thunderstorms had circled the center of circulation. The image showed the center near the northeastern coast of the Dominican Republic and bands of thunderstorms from the eastern quadrant stretching over Puerto Rico. At the time of the image, Isaias had not yet reached the Turks and Caicos. A thick band of thunderstorms from the center of circulation also stretched out in a southwesterly direction over the Caribbean Sea.

Water Vapor Imagery Reveals Heavy Rainfall Potential

Water vapor analysis of tropical cyclones tells forecasters how much potential a storm has to develop. Water vapor releases latent heat as it condenses into liquid. That liquid becomes clouds and thunderstorms that make up a tropical cyclone. Temperature is important when trying to understand how strong storms can be. The higher the cloud tops, the colder and stronger the storms.

NASA's Aqua satellite passed over Hurricane Isaias on July 31 at 3:20 a.m. EDT (0720 UTC), and the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument gathered water vapor content and temperature information. The MODIS image showed that the highest concentrations of water vapor and the coldest cloud top temperatures were around the center of circulation and in a thick band of thunderstorms extending southwest over western Hispaniola and into the Caribbean Sea. Cloud top temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius) in those storms. Storms with cloud top temperatures that cold have the capability to produce heavy rainfall.
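The Fahrenheit-to-Celsius conversion behind that cloud-top threshold is simple arithmetic; a quick check of the minus 70 degrees Fahrenheit figure (the exact result is about minus 56.7 degrees Celsius, which the text rounds to 56.6):

```python
def f_to_c(temp_f: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (temp_f - 32) * 5 / 9

# The minus 70 F threshold cited for heavy-rain-capable storm tops:
print(round(f_to_c(-70), 1))  # -56.7
```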

That rainfall potential is apparent in today's forecast from the National Hurricane Center. NHC forecast 4 to 8 inches of rain for the Dominican Republic and northern Haiti, with isolated maximum totals of 12 inches through Saturday; 4 to 8 inches for the Bahamas and the Turks and Caicos; and 1 to 2 inches for Cuba, with isolated maximum totals of 4 inches.

Warnings and Watches on July 31, 2020

On July 31, NOAA's National Hurricane Center (NHC) issued a Hurricane Warning for the northwestern Bahamas including Andros Island, New Providence, Eleuthera, the Abacos Islands, the Berry Islands, Grand Bahama Island, and Bimini. It is also in effect for the southeastern Bahamas including the Acklins, Crooked Island, Long Cay, the Inaguas, Mayaguana, and the Ragged Islands; and for the central Bahamas, including Cat Island, the Exumas, Long Island, Rum Cay, and San Salvador.

A Tropical Storm Warning is in effect for the entire southern and northern coastlines of the Dominican Republic, and for the Turks and Caicos Islands. In addition, a Tropical Storm Watch is in effect for the east coast of Florida from Ocean Reef to Sebastian Inlet and for Lake Okeechobee.

Isaias' Status on July 31, 2020

At 8 a.m. EDT (1200 UTC) on July 31, NHC reported the center of Hurricane Isaias was located by NOAA and Air Force Reserve Hurricane Hunter aircraft near latitude 21.3 north, longitude 73.9 west. That puts the center about 30 miles (50 km) northwest of Great Inagua Island, and 340 miles (545 km) southeast of Nassau. Reports from the reconnaissance aircraft indicate that the minimum central pressure is 990 millibars.

Isaias was moving toward the northwest near 17 mph (28 kph), and a generally northwestward motion with some decrease in forward speed is expected for the next couple of days followed by a turn toward the north-northwest. Maximum sustained winds are near 80 mph (130 kph) with higher gusts.  Some strengthening is possible today, and Isaias is expected to remain a hurricane for the next few days.
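Distances in advisories like these can be checked with a standard great-circle (haversine) calculation from the reported storm position. A minimal sketch; the Nassau coordinates below are approximate values included only for illustration, not figures from the advisory:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (in degrees), in km."""
    R = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Storm center (21.3 N, 73.9 W) to Nassau (~25.06 N, 77.35 W, approx.):
print(round(haversine_km(21.3, -73.9, 25.06, -77.35)))  # ~547 km
```

The result is close to the 545 km (340 miles) quoted in the advisory, with the small difference attributable to the approximate city coordinates.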

NOAA's NHC Forecast for Isaias

On the forecast track, the center of Isaias will move near or over the southeastern Bahamas today. Isaias is forecast to be near the central Bahamas tonight, and move near or over the northwestern Bahamas and be near or east of the Florida peninsula on Saturday and Sunday.

Interests elsewhere along the southeast coast of the United States should monitor the progress of this system. Additional watches or warnings may be required for a portion of the Florida peninsula later today.

NASA Researches Tropical Cyclones

Hurricanes/tropical cyclones are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting. NASA's Aqua satellite is one in a fleet of NASA satellites that provide data for hurricane research.

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

For updated forecasts, visit: http://www.nhc.noaa.gov

By Rob Gutro
NASA's Goddard Space Flight Center

Credit: 
NASA/Goddard Space Flight Center

Physicists find misaligned carbon sheets yield unparalleled properties

video: This animation shows what happens when two stacked graphene layers are misaligned by a small amount called a twist angle. A new periodic design in the mesh emerges, called a moiré pattern. University of Texas at Dallas physicists are investigating how the twist angle affects the electronic properties of such twisted bilayer graphene.

Image: 
University of Texas at Dallas

A material composed of two one-atom-thick layers of carbon has grabbed the attention of physicists worldwide for its intriguing -- and potentially exploitable -- conductive properties.

Dr. Fan Zhang, assistant professor of physics in the School of Natural Sciences and Mathematics at The University of Texas at Dallas, and physics doctoral student Qiyue Wang published an article in June with Dr. Fengnian Xia's group at Yale University in Nature Photonics that describes how the ability of twisted bilayer graphene to conduct electrical current changes in response to mid-infrared light.

From One to Two Layers

Graphene is a single layer of carbon atoms arranged in a flat honeycomb pattern, where each hexagon is formed by six carbon atoms at its vertices. Since graphene's first isolation in 2004, its unique properties have been intensely studied by scientists for potential use in advanced computers, materials and devices.

If two sheets of graphene are stacked on top of one another, and one layer is rotated so that the layers are slightly out of alignment, the resulting physical configuration, called twisted bilayer graphene, yields electronic properties that differ significantly from those exhibited by a single layer alone or by two aligned layers.

"Graphene has been of interest for about 15 years," Zhang said. "A single layer is interesting to study, but if we have two layers, their interaction should render much richer and more interesting physics. This is why we want to study bilayer graphene systems."

A New Field Emerges

When the graphene layers are misaligned, a new periodic design in the mesh emerges, called a moiré pattern. The moiré pattern is also a hexagon, but it can be made up of more than 10,000 carbon atoms.

"The angle at which the two layers of graphene are misaligned -- the twist angle -- is critically important to the material's electronic properties," Wang said. "The smaller the twist angle, the larger the moiré periodicity."
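The relationship Wang describes can be made quantitative with the standard geometric formula for the moiré superlattice period, L = a / (2 sin(theta/2)), where a is graphene's lattice constant of about 0.246 nm. A short sketch (the formula is textbook twist-bilayer geometry, not code from the study):

```python
import math

# Moire superlattice period for twisted bilayer graphene:
# L = a / (2 * sin(theta / 2)), a = graphene lattice constant.
A_GRAPHENE = 0.246  # nm

def moire_period_nm(twist_deg: float) -> float:
    return A_GRAPHENE / (2 * math.sin(math.radians(twist_deg) / 2))

# Twist angles mentioned in the article:
for theta in (0.5, 1.1, 1.8):
    print(f"{theta:.1f} deg -> {moire_period_nm(theta):.1f} nm")
```

At the 1.1-degree magic angle this gives a moiré period of roughly 13 nm, thousands of times larger than the atomic spacing, which is why a single moiré cell can contain more than 10,000 carbon atoms.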

The unusual effects of specific twist angles on electron behavior were first proposed in a 2011 article by Dr. Allan MacDonald, professor of physics at UT Austin, and Dr. Rafi Bistritzer. Zhang witnessed the birth of this field as a doctoral student in MacDonald's group.

"At that time, others really paid no attention to the theory, but now it has become arguably the hottest topic in physics," Zhang said.

In that 2011 research, MacDonald and Bistritzer predicted that electrons' kinetic energy can vanish in a graphene bilayer misaligned by the so-called "magic angle" of 1.1 degrees. In 2018, researchers at the Massachusetts Institute of Technology confirmed this prediction, finding that offsetting two graphene layers by 1.1 degrees produced a two-dimensional superconductor, a material that conducts electrical current with no resistance and no energy loss.

In a 2019 article in Science Advances, Zhang and Wang, together with Dr. Jeanie Lau's group at The Ohio State University, showed that when offset by 0.93 degrees, twisted bilayer graphene exhibits both superconducting and insulating states, thereby widening the magic angle significantly.

"In our previous work, we saw superconductivity as well as insulation. That's what's making the study of twisted bilayer graphene such a hot field -- superconductivity. The fact that you can manipulate pure carbon to superconduct is amazing and unprecedented," Wang said.

New UT Dallas Findings

In their most recent research, published in Nature Photonics, Zhang and his collaborators at Yale investigated whether and how twisted bilayer graphene interacts with mid-infrared light, which humans can't see but can detect as heat.

"Interactions between light and matter are useful in many devices -- for example, converting sunlight into electrical power," Wang said. "Almost every object emits infrared light, including people, and this light can be detected with devices."

Zhang is a theoretical physicist, so he and Wang set out to determine how mid-infrared light might affect the conductance of electrons in twisted bilayer graphene. Their work involved calculating the light absorption based on the moiré pattern's band structure, a concept that determines how electrons move in a material quantum mechanically.

"There are standard ways to calculate the band structure and light absorption in a regular crystal, but this is an artificial crystal, so we had to come up with a new method," Wang said. Using resources of the Texas Advanced Computing Center, a supercomputer facility on the UT Austin campus, Wang calculated the band structure and showed how the material absorbs light.

The Yale group fabricated devices and ran experiments showing that the mid-infrared photoresponse -- the increase in conductance due to the light shining -- was unusually strong and largest at the twist angle of 1.8 degrees. The strong photoresponse vanished for a twist angle less than 0.5 degrees.

"Our theoretical results not only matched well with the experimental findings, but also pointed to a mechanism that is fundamentally connected to the period of moiré pattern, which itself is connected to the twist angle between the two graphene layers," Zhang said.

Next Step

"The twist angle is clearly very important in determining the properties of twisted bilayer graphene," Zhang added. "The question arises: Can we apply this to tune other two-dimensional materials to get unprecedented features? Also, can we combine the photoresponse and the superconductivity in twisted bilayer graphene? For example, can shining a light induce or somehow modulate superconductivity? That will be very interesting to study."

"This new breakthrough will potentially enable a new class of infrared detectors based on graphene with high sensitivity," said Dr. Joe Qiu, program manager for solid-state electronics and electromagnetics at the U.S. Army Research Office (ARO), an element of the U.S. Army Combat Capabilities Development Command's Army Research Laboratory. "These new detectors will potentially impact applications such as night vision, which is of critical importance for the U.S. Army."

Credit: 
University of Texas at Dallas

How to improve climate modeling and prediction

We are changing the Earth system at an unprecedented speed without knowing the consequences in detail. Increasingly detailed, physics-based models are improving steadily, but an in-depth understanding of the persisting uncertainties is still lacking. The two main challenges have been to obtain the necessary amount of detail in the models and to accurately predict how anthropogenic carbon dioxide disturbs the climate's intrinsic, natural variability. A path to surmounting both of these obstacles is now laid out in a comprehensive review published in Reviews of Modern Physics by Michael Ghil and Valerio Lucarini from the EU Horizon 2020 climate science project TiPES.

- We propose ideas to perform much more effective climate simulations than the traditional approach of relying exclusively on bigger and bigger models allows. And we show how to extract much more information at much higher predictive power from those models. We think it is a valuable, original and much more effective way than a lot of things that are being done, says Valerio Lucarini, professor in mathematics and statistics at the University of Reading, UK, and at CEN, the Institute of Meteorology, University of Hamburg, Germany.

Such an approach is urgently needed, because today's climate models generally fail at two important tasks.

First, they cannot reduce the uncertainty in determining the mean global surface temperature after a doubling of carbon dioxide (CO2) in the atmosphere. This number is called the equilibrium climate sensitivity, and in 1979 it was estimated at 1.5-4 degrees Celsius. Since then the uncertainty has grown: today the range is 1.5-6 degrees, in spite of decades of improvement to numerical models and huge gains in computational power over the same period.
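The arithmetic behind this sensitivity range can be illustrated with the textbook zero-dimensional energy-balance estimate (a sketch using standard round numbers, not the review's own calculation): equilibrium warming is the CO2 forcing divided by the net feedback parameter, so a modest spread in the feedback translates into a large spread in projected warming.

```python
import math

def co2_forcing(concentration_ratio):
    """Radiative forcing (W/m^2) for a CO2 ratio, standard logarithmic fit."""
    return 5.35 * math.log(concentration_ratio)

def equilibrium_warming(forcing, feedback):
    """Equilibrium temperature change dT = F / lambda, in kelvin."""
    return forcing / feedback

f2x = co2_forcing(2.0)  # roughly 3.7 W/m^2 for doubled CO2
# Illustrative net feedback parameters (W/m^2/K): a factor-of-four spread
# in feedback maps onto the 1.5-6 degree sensitivity range quoted above.
for lam in (2.4, 1.2, 0.6):
    print(f"feedback {lam:.1f} W/m^2/K -> warming {equilibrium_warming(f2x, lam):.1f} K")
```

The point of the sketch is that the uncertainty lies almost entirely in the feedback term, which is where chaotic variability and tipping behavior enter, rather than in the well-understood forcing.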

Second, climate models struggle to predict tipping points, which occur when a subsystem (an ocean current, an ice sheet, a landscape, an ecosystem) suddenly and irrevocably shifts from one state to another. These kinds of events are well documented in historical records and pose a major threat to modern societies. Still, they are not predicted by the high-end climate models that the IPCC assessments rely upon.

These difficulties are grounded in the fact that the mathematical methodology used in most high-resolution climate calculations reproduces neither deterministically chaotic behavior nor the associated uncertainties well in the presence of time-dependent forcing.

Chaotic behavior is intrinsic to the Earth system, as very different physical, chemical, geological and biological processes - cloud formation, sedimentation, weathering, ocean currents, wind patterns, moisture, photosynthesis and so on - range in timescales from microseconds to millions of years. Apart from that, the system is forced mainly by solar radiation, which varies naturally over time, but also by anthropogenic changes to the atmosphere. Thus, the Earth system is highly complex, deterministically chaotic, stochastically perturbed and never in equilibrium.

- What we are doing is essentially extending deterministic chaos to a much more general mathematical framework, which provides the tools to determine the response of the climate system to all sorts of forcings, deterministic as well as stochastic, explains Michael Ghil, professor at Ecole Normale Supérieure and PSL University in Paris, France and at the University of California, Los Angeles, USA.

The fundamental ideas are not that new. The theory was developed decades ago, but it is mathematically demanding and calls for cooperation between experts in different fields before it can be implemented in climate models. Such interdisciplinary approaches, involving the climate science community as well as experts in applied mathematics, theoretical physics and dynamical systems theory, have been slowly emerging. The authors hope the review paper will accelerate this tendency, as it describes the mathematical tools needed for such work.

- We present a self-consistent understanding of climate change and climate variability in a well defined coherent framework. I think that is an important step in solving the problem. Because first of all you have to pose it correctly. So the idea is - if we use the conceptual tools we discuss extensively in our paper, we might hope to help climate science and climate modelling make a leap forward, says Valerio Lucarini.

Credit: 
University of Copenhagen

Unusual electron sharing found in cool crystal

image: When CsW2O6 is cooled below -58°C, molecular triangles of tungsten atoms form, bonded together by only two electrons. Similar bonding had previously been demonstrated only in trihydrogen ions in outer space.

Image: 
Yoshihiko Okamoto

A team of scientists led by Nagoya University in Japan has detected a highly unusual atomic configuration in a tungsten-based material. Until now, the atomic configuration had only been seen in trihydrogen, an ion that exists in between star systems in space. The findings, published in the journal Nature Communications, suggest further studies could reveal compounds with interesting electronic properties.

Atoms that make up humans and trees and kitchen tables generally bond together by sharing electrons - think of electrons as the atomic glue of life. Nagoya University applied physicist Yoshihiko Okamoto and colleagues have found a highly unusual version of this glue: a regular triangular molecule was formed of three atoms bonded together by two electrons.

"This type of bond had only previously been seen in the trihydrogen ions found in interstellar material," says Okamoto. "We were excited to see this configuration in a cooled tungsten-based crystal."

The so-called tritungsten molecules were discovered in single crystals of caesium tungsten oxide (CsW2O6) cooled below -58°C. CsW2O6 conducts electricity at room temperature but changes into an insulating material when it is cooled below -58°C. It has been a challenge to study how the atomic structure of this type of material changes in response to temperature. To overcome this, Okamoto and his colleagues in Japan synthesized very pure single crystals of CsW2O6 and bombarded them with X-ray beams at room temperature and -58°C.

The tungsten molecules in the conducting crystal form three-dimensional networks of tetrahedral pyramids connected at their corners, known as a pyrochlore structure. The bonds between the molecules form due to a symmetrical sharing of electrons between them.

However, when the compound is cooled, the electrons re-arrange and two types of tungsten atoms emerge within the tetrahedra, each with a different 'valence', or bonding power with other atoms. This, in turn, distorts the lengths of tungsten bonds with oxygen atoms in the compound, leading to a more compressed shape. Importantly, the tungsten atoms with lower valence form small and large triangles on the sides of the tungsten tetrahedra, with the highly unusual tritungsten molecules forming on the small triangles. The three tungsten atoms forming the points of these triangles share only two electrons between them to keep them bonded together.

"To our knowledge, CsW2O6 is the only example where this type of bond formation, where several atoms share only a few electrons, appears as a phase transition," says Okamoto.

The team aims to further investigate compounds with pyrochlore structures, with the ultimate goal of discovering materials with new and interesting properties.

Credit: 
Nagoya University

Ancient mountain formation and monsoons helped create a modern biodiversity hotspot

image: A plant press used by the researchers doing fieldwork in the Hengduan Mountains.

Image: 
Deren Eaton

One of the big questions in biology is why certain plants and animals are found in some places and not others. Figuring out how species evolve and spread, and why some places are richer in species than others, is key to understanding and protecting the world around us. Mountains make a good laboratory for scientists tackling these questions: mountains are home to tons of biodiversity, in part due to all the different habitats at different elevations. In a new study in Science, researchers examined the plant life in China's Hengduan Mountains, the Himalaya Mountains, and the Qinghai-Tibet Plateau. Using DNA to build family trees of species, they learned that the diversity of plants in that region today can be traced back to newly-formed mountain ranges 30 million years ago, and monsoons that came later. It's a concrete example of how climatic and environmental changes influence life on Earth.

"This paper addresses the fundamental question of why there are so many species in some parts of the world and not others," says Rick Ree, a curator at Chicago's Field Museum and corresponding author of the Science study. "The formation of this very species-rich community was fueled by ancient mountain-building and then subsequent effects of the monsoon. The biodiversity that we see today has been profoundly shaped by geology and climate."

The paper focuses on plants growing above the treeline (called the alpine zone) in the Hengduan Mountains of southwestern China. "It's an incredibly interesting part of the world, it's a relatively small area that harbors one-third of all the plant species in China," says Ree. "In the Hengduan Mountains, you can see coniferous forests, rushing glacial streams, craggy valleys, and meadows just teeming with wildflowers." Some of the flowers, Ree notes, might be familiar to Western gardeners, including rhododendrons and delphiniums.

Ree and his colleagues wanted to find out how plants are distributed in the alpine regions of the Hengduan Mountains, Himalaya, and Qinghai-Tibet Plateau, and how they got there in the first place. To figure it out, they turned to phylogenetic reconstructions: essentially, using DNA and key pieces of fossil evidence to piece together the plants' family trees, going back tens of millions of years.

The researchers compared the DNA of different plant species that live in the region, determining how closely related they were to each other and how they evolved. If you have DNA sequences for a bunch of different plants, by looking at the differences in their DNA and using fossil plants as benchmarks for how long it takes new species to arise, you can make an educated guess as to how long ago their common ancestor lived and figure out the family tree that makes the most sense.
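The core dating logic can be sketched with a minimal strict molecular clock; the actual phylogenetic reconstructions are far more sophisticated, and the sequences and substitution rate below are purely illustrative:

```python
def pairwise_divergence(seq_a, seq_b):
    """Fraction of aligned sites at which two DNA sequences differ."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned"
    diffs = sum(1 for a, b in zip(seq_a, seq_b) if a != b)
    return diffs / len(seq_a)

def divergence_time_mya(p_distance, rate_per_site_per_myr):
    """Strict-clock age of the common ancestor, in millions of years.

    Differences accumulate along both descendant lineages,
    hence the factor of two.
    """
    return p_distance / (2 * rate_per_site_per_myr)

# Toy aligned sequences differing at 2 of 15 sites:
p = pairwise_divergence("ATGGCGTACGTTAGC", "ATGACGTACGATAGC")
# The substitution rate would be calibrated against fossils; 0.002
# substitutions per site per million years is a made-up placeholder.
print(f"p-distance {p:.3f} -> common ancestor ~{divergence_time_mya(p, 0.002):.0f} Myr ago")
```

Fossil calibrations pin the rate at a few points in the tree, and the same logic, applied across hundreds of species simultaneously, yields the dated family trees described above.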

In this study, Ree and his colleagues were able to trace the origins of alpine plants in the Hengduan, Himalaya, and Qinghai-Tibet Plateau. Many of the plants first evolved in the Hengduan Mountains. Then, as the Indian tectonic plate collided with Asia, slowly creating new mountains, a bunch of new habitats formed up the mountains' sides and in the valleys below. And as the new mountains formed, the region began to experience more intense monsoons, possibly because the mountains altered the prevailing winds, creating new weather conditions.

"The combined effect of mountain-building and monsoons was like pouring jet fuel onto this flame of species origination," says Ree. "The monsoon wasn't simply giving more water for plants to grow, it had this huge role in creating a more rugged topography. It caused erosion, resulting in deeper valleys and more incised mountain ranges."

"The theory is, if you increase the ruggedness of a landscape, you're more likely to have populations restricted in their movement because it's harder to cross a deeper valley than a shallow valley. So any time you start increasing the patchiness and barriers between populations, you expect evolution to accelerate," says Ree.

And that's exactly what the team found in reconstructing the plants' genetic family tree: as the landscape grew more rugged over time, the now-isolated populations of plants veered off into their own separate species, resulting in the biodiversity we see today.

In addition to showing how geological and climate changes over the last 30 million years affect today's spread of plants, Ree notes that the study has implications for better understanding the climate change the Earth is currently experiencing.

"This study sheds light on the conditions under which we get rich versus poor biodiversity," says Ree. "Mountain ecosystems tend to be very sensitive to things like global warming, because the organisms that live there are dependent on a tight range of elevation and temperature. Understanding how historical environmental change affected alpine plants twenty million years ago can help us predict how today's climate change will affect their descendants."

Credit: 
Field Museum

Loss of adaptive immunity helps deep sea anglerfish fuse with their mates

The discovery of altered adaptive immunity in anglerfish helps explain how the creatures are able to temporarily or permanently fuse with their mates without experiencing immune rejection. For most vertebrates, the loss of the adaptive immune arm - orchestrator of protective T and B cell responses that are considered hallmarks of vertebrate immunity - could be fatal. But for these deep-sea denizens, the protective immune branch could actually stand in the way of reproductive success in the abyss, where finding a mate is so difficult that the animals have evolved a way to fuse tissues - and eventually, circulatory systems - with viable mates once they've found them. To allow for sexual parasitism without harmful effects, it's possible adaptive immune functions were suppressed in anglerfish and replaced by an alternative form of innate immunity to compensate for the loss, the researchers postulate.

This unique immunological shift suggests vertebrate immune systems can be more flexible over time, contrary to the common belief that, once established, neither innate nor adaptive immune systems can be eliminated without catastrophic consequences. The phenomenon of sexual parasitism in anglerfish has been difficult to investigate, mainly because of the challenge of obtaining either live or dead specimens.

Jeremy Swann and colleagues tackled the mystery by grinding up frozen tissue and sequencing DNA from 31 preserved specimens, covering a broad spectrum of non-attaching, temporarily-attaching, and permanently-attaching anglerfish species. The researchers found that fusion-undergoing species exhibited dramatic changes in the composition and structure of key immune system genes, compared to non-attached species. Species with temporary attaching males lacked functional aicda genes that underpin the maturation of antibodies - a critical process in adaptive immunity. Species known to permanently attach exhibited additional alterations, such as the loss of rag genes, which are essential for the assembly of T cell receptor and antibody genes. The results suggest that co-evolution of innate and adaptive immunity has been disentangled in anglerfishes to support adoption of sexual parasitism. In the absence of adaptive immunity, modified innate immunity might have helped facilitate the evolutionary success of anglerfish, which remain the most species-rich vertebrate taxon in the deep sea. In the far future, greater insights into this unique immune system might help inform clinical efforts, such as therapies to enhance innate immunity in immunodeficient patients.

Credit: 
American Association for the Advancement of Science (AAAS)

New understanding of CRISPR-Cas9 tool could improve gene editing

image: The 3D structure of a base editor, comprised of the Cas9 protein (white and gray), which binds to a DNA target (teal and blue helix) complementary to the RNA guide (purple), and the deaminase proteins (red and pink), which switch out one nucleotide for another.

Image: 
UC Berkeley image by Gavin Knott and Audrone Lapinaite

Within a mere eight years, CRISPR-Cas9 has become the go-to genome editor for both basic research and gene therapy. But CRISPR-Cas9 also has spawned other potentially powerful DNA manipulation tools that could help fix genetic mutations responsible for hereditary diseases.

Researchers at the University of California, Berkeley, have now obtained the first 3D structure of one of the most promising of these tools: base editors, which bind to DNA and, instead of cutting, precisely replace one nucleotide with another.

First created four years ago, base editors are already being used in attempts to correct single-nucleotide mutations in the human genome. Base editors now available could address about 60% of all known genetic diseases -- potentially more than 15,000 inherited disorders -- caused by a mutation in only one nucleotide.

The detailed 3D structure, reported in the July 31 issue of the journal Science, provides a roadmap for tweaking base editors to make them more versatile and controllable for use in patients.

"We were able to observe for the first time a base editor in action," said UC Berkeley postdoctoral fellow Gavin Knott. "Now we can understand not only when it works and when it doesn't, but also design the next generation of base editors to make them even better and more clinically appropriate."

A base editor is a type of Cas9 fusion protein that employs a partially deactivated Cas9 -- its snipping shears are disabled so that it cuts only one strand of DNA -- and an enzyme that, for example, activates or silences a gene, or modifies adjacent areas of DNA. Because the new study reports the first structure of a Cas9 fusion protein, it could help guide the invention of myriad other Cas9-based gene-editing tools.

"We actually see for the first time that base editors behave as two independent modules: You have the Cas9 module that gives you specificity, and then you have a catalytic module that provides you with the activity," said Audrone Lapinaite, a former UC Berkeley postdoctoral fellow who is now an assistant professor at Arizona State University in Tempe. "The structures we got of this base editor bound to its target really give us a way to think about Cas9 fusion proteins, in general, giving us ideas which region of Cas9 is more beneficial for fusing other proteins."

Lapinaite and Knott, who recently accepted a position as a research fellow at Monash University in Australia, are co-first authors of the paper.

Editing one base at a time

In 2012, researchers first showed how to reengineer a bacterial enzyme, Cas9, and turn it into a gene-editing tool in all types of cells, from bacterial to human. The brainchild of UC Berkeley biochemist Jennifer Doudna and her French colleague, Emmanuelle Charpentier, CRISPR-Cas9 has transformed biological research and brought gene therapy into the clinic for the first time in decades.

Scientists quickly co-opted Cas9 to produce a slew of other tools. Basically a mash-up of protein and RNA, Cas9 precisely targets a specific stretch of DNA and then precisely snips it, like a pair of scissors. The scissors function can be broken, however, allowing Cas9 to target and bind DNA without cutting. In this way, Cas9 can ferry different enzymes to targeted regions of DNA, allowing the enzymes to manipulate genes.

In 2016, David Liu of Harvard University combined a Cas9 with another bacterial protein to allow the surgically precise replacement of one nucleotide with another: the first base editor.

While the early adenine base editor was slow, the newest version, called ABE8e, is blindingly fast: It completes nearly 100% of intended base edits in 15 minutes. Yet, ABE8e may be more prone to edit unintended pieces of DNA in a test tube, potentially creating what are known as off-target effects.

The newly revealed structure was obtained with a high-powered imaging technique called cryo-electron microscopy (cryoEM). Activity assays showed why ABE8e is prone to create more off-target edits: The deaminase protein fused to Cas9 is always active. As Cas9 hops around the nucleus, it binds and releases hundreds or thousands of DNA segments before it finds its intended target. The attached deaminase, like a loose cannon, doesn't wait for a perfect match and often edits a base before Cas9 comes to rest on its final target.

Knowing how the effector domain and Cas9 are linked can lead to a redesign that makes the enzyme active only when Cas9 has found its target.

"If you really want to design truly specific fusion protein, you have to find a way to make the catalytic domain more a part of Cas9, so that it would sense when Cas9 is on the correct target and only then get activated, instead of being active all the time," Lapinaite said.

The structure of ABE8e also pinpoints two specific changes in the deaminase protein that make it work faster than the early version of the base editor, ABE7.10. Those two point mutations allow the protein to grip the DNA tighter and more efficiently replace A with G.

"As a structural biologist, I really want to look at a molecule and think about ways to rationally improve it. This structure and accompanying biochemistry really give us that power," Knott added. "We can now make rational predications for how this system will behave in a cell, because we can see it and predict how it's going to break or predict ways to make it better."

Credit: 
University of California - Berkeley

Transcranial stimulation to prevent fear memories from returning

What if we were able to modify the negative effect of a returning memory that makes us afraid? A research group from the University of Bologna succeeded in this and developed a new non-invasive experimental protocol. The result of this study, published in the journal Current Biology, is an innovative protocol that combines fear conditioning - a stimulus paired with something unpleasant so that it induces a negative memory - with the neurostimulation of a specific site of the prefrontal cortex.

This process alters the perception of an unpleasant (aversive) event so that it will no longer induce fear. "This experimental protocol combining transcranial stimulation and memory reconsolidation allowed us to modify an aversive memory that the participants had learned the day before", explains Sara Borgomaneri, researcher at the University of Bologna and first author of the study. "This result has relevant repercussions for understanding how memory works. It might even lead to the development of new therapies to deal with traumatic memories".

CAN MEMORIES BE ALTERED?

The primary focus of the research group is the process of reconsolidation. This process maintains, strengthens, and alters those events that are already stored in our long-term memory. "Every time an event is recalled in our memory, there is a limited period of time in which it can be altered", explains Simone Battaglia, researcher and co-author of this study. "The protocol we developed exploits this short time window and can, therefore, interfere with the reconsolidation process of learned aversive memories".

Researchers used TMS (Transcranial Magnetic Stimulation) to "erase" the fear induced by a negative memory. With an electromagnetic coil placed on the head of the participant, TMS creates magnetic fields that can alter the neural activity of specific brain areas. TMS is a non-invasive procedure that requires no surgery or other intervention on the participant, and for this reason it is widespread in research as well as in clinical and rehabilitation programmes.

"With TMS, we could alter the functioning of the prefrontal cortex, which proved to be fundamental in the reconsolidation process of aversive memories" says Sara Borgomaneri. "Thanks to this procedure, we obtained results that, until now, were only possible by delivering drugs to patients".

THE TRIAL

The research group developed this protocol through a trial involving 98 healthy people. Each participant learned an aversive memory and, the next day, underwent a TMS session over the prefrontal cortex.

"First, we created the aversive memory by combining an unpleasant stimulation with some images", explains Borgomaneri. "The day after, we presented a group of participants with the same stimulus, which, in their memory, was recorded as aversive. Using TMS immediately afterwards, we interfered with their prefrontal cortex activity".

To test the effectiveness of the protocol, other groups of participants underwent TMS without their aversive memory being recalled (so that no reconsolidation was triggered), while further groups received TMS over control brain areas not involved in memory reconsolidation.

At that point, the only thing left to do for researchers was to evaluate the effectiveness of TMS. They waited for another day and once again tested how the participants reacted when the aversive memory was recalled. And they obtained encouraging results. Participants who had their prefrontal cortex activity inhibited by TMS showed a reduced psycho-physiological response to the unpleasant stimulus. They remembered the event (explicit memory) but its negative effect was substantially reduced.

"This trial showed that it is feasible to alter the persistence of potentially traumatic memories. This may have crucial repercussions in the fields of rehabilitation and clinical medicine", says Professor Giuseppe di Pellegrino, who coordinated the study. "We're dealing with a new technique that can be employed in different contexts and can assume a variety of functions, starting from treating PTSD, which will be the focus of our next study".

Credit: 
Università di Bologna

Coastal cities leave up to 75% of seafloor exposed to harmful light pollution

video: Researchers carry out fieldwork on the River Tamar (UK) as part of the study

Image: 
University of Plymouth, Plymouth Marine Laboratory

The global expansion of coastal cities could leave more than three quarters of their neighbouring seafloor exposed to potentially harmful levels of light pollution.

A study led by the University of Plymouth (UK) showed that under both cloudy and clear skies, quantities of light used in everyday street lighting permeated all areas of the water column.

This could pose a significant threat to coastal species, with recent research showing that the presence of artificial skyglow can disrupt the lunar compass that species use when covering long distances.

However, the current study found that the colour of the light shone at the surface made a marked difference to how much biologically important light pollution reached the seafloor.

Many of the white LEDs now being used to illuminate the world's towns and cities use a mixture of green, blue and red wavelengths to generate their brightness.

Green and blue wavelengths left up to 76% and 70% of the three-dimensional seafloor area exposed to light pollution respectively, while the presence of red light was less than 1%.
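This wavelength dependence is consistent with Beer-Lambert exponential attenuation in seawater, where red light is absorbed far more strongly than blue or green. A minimal sketch (the attenuation coefficients are illustrative round numbers for coastal water, not the study's measured values):

```python
import math

def transmitted_fraction(kd_per_m, depth_m):
    """Beer-Lambert law: fraction of surface light reaching a given depth."""
    return math.exp(-kd_per_m * depth_m)

# Illustrative diffuse attenuation coefficients (1/m) for coastal water;
# red light is absorbed far more strongly than blue or green.
kd = {"blue": 0.15, "green": 0.12, "red": 0.60}

depth = 10.0  # metres
for colour, k in kd.items():
    pct = 100 * transmitted_fraction(k, depth)
    print(f"{colour:5s}: {pct:5.1f}% of surface light reaches {depth:.0f} m")
```

With these coefficients, roughly 20-30% of blue and green light survives to 10 m while red falls below 1%, mirroring the pattern of seafloor exposure the study reports.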

The research - which also involved Bangor University, the University of Strathclyde and Plymouth Marine Laboratory - is published in Scientific Reports, an online journal from the publishers of Nature.

It is the first study in the world to quantify the extent to which biologically important artificial light is prevalent on the seafloor and could, in turn, be having a detrimental effect on marine species.

Dr Thomas Davies, Lecturer in Marine Conservation at the University of Plymouth and the paper's lead author, said: "The areas exposed here are not trivial. Our results focused on a busy marine area and demonstrate the light from coastal urban centres is widespread across the sea surface, sub surface and seafloor of adjacent marine habitats. But Plymouth is still just one coastal city with a population of 240,000 people.

"Seventy-five per cent of the world's megacities are now located in coastal regions and coastal populations are projected to more than double by 2060. So unless we take action now it is clear that biologically important light pollution on the seafloor is likely to be globally widespread, increasing in intensity and extent, and putting marine habitats at risk."

The study focussed on Plymouth Sound and the Tamar Estuary which together form a busy waterway and are home to the largest naval port in Western Europe.

It was conducted over four nights in 2018, when there was little or no moonlight, and blue, green, and red artificial light was shone at the sea surface during both clear and cloudy conditions, and at low and high tide.

A combination of mapping and radiative transfer modelling tools were then used to measure exposure at the surface, beneath the surface, and at the seafloor.

The researchers are now calling for a more comprehensive review of the full impacts of coastal light pollution, to try and mitigate against the most harmful effects as coastal cities grow globally.

Dr Tim Smyth, Head of Science of Marine Biogeochemistry and Ocean Observations at Plymouth Marine Laboratory, said: "Light pollution from coastal cities is likely having deleterious impacts on seafloor ecosystems which provide vital ecosystem services. We investigated this by visiting the Tamar, Plym and Plymouth Sound for four successive nights in September 2018. The time-lapse video of our trips really highlights how bright our shorelines are at night. During the fieldwork we measured the above water light field and in-water optics as well as running in-water light modelling simulations, in order for us to map the light field across the whole of the Tamar Estuary network."

Credit: 
University of Plymouth

New study confirms extensive gas leaks in the North Sea

image: On the basis of investigations directly on the seafloor it was possible to determine the amount of escaping gas.

Image: 
Photo: ROV team/GEOMAR

During expeditions to oil and gas reservoirs in the central North Sea in 2012 and 2013, scientists of the GEOMAR Helmholtz Centre for Ocean Research Kiel (Germany) became aware of a phenomenon that had hardly been recognized before. They discovered that methane bubbles emerged from the seabed around abandoned wells. The gas originates from shallow gas pockets, which lie less than 1,000 meters below the seafloor and were not the target of the original drilling operations. An initial assessment showed that these emissions could be the dominant source of methane in the North Sea.

A new study published by GEOMAR scientists today in the International Journal of Greenhouse Gas Control, confirms this initial estimate on a larger data basis. 'We have combined investigations at additional wells with extensive seismic data. The results clearly show that thousands of tons of methane are leaking from old drill holes on the North Sea floor every year,' says Dr. Christoph Böttner, who is the main author of the study, which is part of his doctoral thesis at GEOMAR.

During expeditions with RV POSEIDON in 2017 and 2019, the researchers were able to detect gas leakage at 28 of 43 directly investigated wells. 'The propensity for such leaks increases the closer a borehole is located to shallow gas pockets, which are normally uninteresting for commercial use. Apparently, however, the disturbance of the overburden sediment by the drilling process causes the gas to rise along the borehole,' explains Dr. Matthias Haeckel from GEOMAR, who led the study.

In addition, the team used available industry seismic data from the British sector of the North Sea to draw further conclusions about the boreholes in the area. 'Our study covers 20,000 square kilometres of seafloor, which is approximately the size of Wales. This area contains 1,792 wells for which we have information. Based on our direct measurements, we evaluated a number of factors, such as location, distance to shallow gas pockets, and age, and weighted how these factors promote methane leakage from old wells. The most important factor was indeed the distance of the wells from the gas pockets,' explains Dr. Böttner.

The positions of the boreholes and the location and extent of the gas pockets indicate that this area of the North Sea alone has the potential to emit 900 to 3,700 tonnes of methane every year. 'However, more than 15,000 boreholes have been drilled in the entire North Sea,' adds Dr. Haeckel.
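To put these figures in perspective, here is a rough back-of-envelope extrapolation (a sketch of our own, not a calculation from the study): if the per-well emission range found in the surveyed area were assumed to hold across all roughly 15,000 North Sea boreholes, the basin-wide total would scale accordingly. Note that this simple linear scaling ignores the study's key finding that leakage depends strongly on proximity to shallow gas pockets, so it is illustrative only.

```python
# Illustrative back-of-envelope extrapolation; the study itself does NOT
# scale linearly, since leakage depends on proximity to shallow gas pockets.
wells_surveyed = 1792                  # wells in the 20,000 km^2 study area
low_t, high_t = 900, 3700              # tonnes CH4 per year from that area
wells_north_sea = 15000                # boreholes drilled in the entire North Sea

# Per-well range in the surveyed area, scaled naively to the whole basin.
naive_low = low_t / wells_surveyed * wells_north_sea
naive_high = high_t / wells_surveyed * wells_north_sea

print(f"Naive North-Sea-wide range: {naive_low:.0f}-{naive_high:.0f} t CH4/yr")
# → Naive North-Sea-wide range: 7533-30971 t CH4/yr
```

Even under this crude assumption, the basin-wide potential would be an order of magnitude above the surveyed area's total, which is why the authors call for more independent measurements rather than extrapolation alone.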

In seawater, methane is usually consumed by microbes. This can lead to local seawater acidification. In the North Sea, about half of the boreholes are at such shallow water depths that part of the emitted methane can escape into the atmosphere. Methane is the second most important greenhouse gas after carbon dioxide.

The authors of the study encourage the industry to publish their data and recommend more independent emission measurements from abandoned wells in order to develop stricter guidelines and legally binding regulations for abandonment procedures.

'The sources and sinks of methane, the second most important greenhouse gas after carbon dioxide, are still subject to large uncertainties. This also applies to emissions from the fossil energy sector. In order to better understand the reasons for the continuously increasing methane concentration in the atmosphere, and to be able to take mitigation measures, it is important to have reliable numbers for the individual anthropogenic contributions,' summarizes Dr. Haeckel.

Credit: 
Helmholtz Centre for Ocean Research Kiel (GEOMAR)