Earth

NASA soaks up Tropical Storm Leslie's water vapor concentration

image: NASA's Aqua satellite passed over Tropical Storm Leslie in the Central Atlantic Ocean on Oct. 2 at 1:15 a.m. EDT (0515 UTC). The MODIS instrument showed that the highest concentrations of water vapor (brown) and the coldest cloud top temperatures circled the center, with a gap in the north-northeastern quadrant.

Image: 
Credits: NASA/NRL

When NASA's Aqua satellite passed over the Central Atlantic Ocean on Oct. 2, the MODIS instrument aboard analyzed water vapor within Tropical Storm Leslie.

Water vapor analysis of tropical cyclones tells forecasters how much potential a storm has to develop. Water vapor releases latent heat as it condenses into liquid. That liquid becomes clouds and thunderstorms that make up a tropical cyclone. Temperature is important when trying to understand how strong storms can be: the higher the cloud tops, the colder they are and the stronger the storms.

NASA's Aqua satellite passed over Tropical Storm Leslie in the Central Atlantic Ocean on Oct. 2 at 1:15 a.m. EDT (0515 UTC) and the Moderate Resolution Imaging Spectroradiometer or MODIS instrument gathered water vapor content and temperature information. The storm continues to have a ragged banded eye.

The MODIS image showed that the highest concentrations of water vapor and the coldest cloud top temperatures circled the center, with the exception of the north-northeastern quadrant. MODIS found cloud top temperatures in those areas as cold as minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius). Storms with cloud top temperatures that cold have the capability to produce heavy rainfall.

Leslie remains far enough from land areas so that there are no warnings or watches in effect. At 11 a.m. EDT (1500 UTC), the center of Tropical Storm Leslie was located near latitude 30.9 degrees north and longitude 56.1 degrees west. That's 520 miles (840 km) east of Bermuda. Leslie is moving toward the southwest near 8 mph (13 kph). Maximum sustained winds remain near 65 mph (100 kph) with higher gusts.

The National Hurricane Center or NHC forecast noted "Gradual strengthening is expected during the next day or two, and Leslie is forecast to become a hurricane tonight or on Wednesday, Oct. 3."

Credit: 
NASA/Goddard Space Flight Center

NASA sees a lot of strength in infrared view of cat four Hurricane Walaka

image: On Oct. 2 at 9:30 a.m. EDT (1330 UTC) NASA's Aqua satellite revealed cloud top temperatures in strongest storms around Walaka's center and in a fragmented band of thunderstorms east of the center of circulation. Those temperatures were as cold as or colder than minus 70 degrees (red) Fahrenheit (minus 56.6 degrees Celsius).

Image: 
NASA/NRL

Infrared satellite imagery provides temperature data, and when NASA's Aqua satellite passed over the Central Pacific Ocean, it analyzed Hurricane Walaka. Walaka is a Category 4 Hurricane on the Saffir-Simpson Hurricane Wind Scale.

Cloud top temperatures indicate the strength of the thunderstorms that make up a tropical cyclone. The colder the cloud top, the stronger the uplift in the storm that helps thunderstorm development. Basically, infrared data helps determine where the most powerful storms are within a tropical cyclone.

The Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard Aqua provided that infrared data on Oct. 2 at 9:30 a.m. EDT (1330 UTC). MODIS data showed the coldest cloud top temperatures in the strongest storms around Walaka's center and in a fragmented band of thunderstorms east of the center of circulation. They were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius). NASA research indicates that cloud tops this cold have the potential to generate very heavy rainfall.

On Oct. 2, a Hurricane Warning was in effect for Johnston Atoll and a Hurricane Watch was in effect for Papahanaumokuakea Marine National Monument from Nihoa to French Frigate Shoals to Maro Reef.

At 5 a.m. HST/11 a.m. EDT/1500 UTC, the center of Hurricane Walaka was located near latitude 14.7 degrees north and longitude 170.0 degrees west. Walaka is moving toward the north near 10 mph (17 kph). This general motion is expected to continue through Wednesday night with a steady increase in forward speed. On the forecast track, the center of Walaka is expected to pass just to the west of Johnston later today.

NOAA's Central Pacific Hurricane Center noted "Maximum sustained winds are near 155 mph (250 kph) with higher gusts. Walaka is a category 4 hurricane on the Saffir-Simpson Hurricane Wind Scale. Little change in intensity is expected through tonight, with rapid weakening expected Wednesday and Wednesday night."

Credit: 
NASA/Goddard Space Flight Center

Invasive plants can boost blue carbon storage

image: Invasive Phragmites reeds encroach on a boardwalk in a Maryland marsh. On this marsh, biologists with the Smithsonian Environmental Research Center have been running climate change experiments for more than 30 years.

Image: 
Gary Peresta/Smithsonian Environmental Research Center

When invasive species enter the picture, things are rarely black and white. A new paper has revealed that some plant invaders could help fight climate change by making it easier for ecosystems to store "blue carbon"--the carbon stored in coastal environments like salt marshes, mangroves and seagrasses. But other invaders, most notably animals, can do the exact opposite.

"We were aware of the effects of invasions on other facets of these habitats, but this was the first time we really delved into blue carbon storage," said Ian Davidson, a marine invasions biologist at the Smithsonian Environmental Research Center (SERC) and lead author of the new study. While blue carbon has become a buzz word in climate change circles, it has not appeared in many conversations about invasive species, especially in the marine realm.

The paper, published Monday, Oct. 1, in Global Change Biology, is the first meta-analysis to look exclusively at marine habitats when tackling the issue of invasions and carbon storage. Previous carbon storage research has focused largely on terrestrial environments like forests. But marshes and mangroves can store carbon an estimated 40 times faster than forests. And over the past century, biologists estimate the world has lost 25 to 50 percent of its blue carbon habitats, with an additional 8,000 square kilometers disappearing every year. Understanding these ecosystems is critical as policymakers work to mitigate both climate change and the impacts of invasive species.

"It's now part of global climate change solutions to get carbon credits in forests," said co-author Christina Simkanin, also a marine biologist at SERC. "But for blue-carbon habitats, the marine version, that has been slower to materialize."

Davidson, Simkanin and two Ireland-based biologists (Grace Cott, a wetland ecologist at University College Cork, and John Devaney, a postdoc at Trinity College Dublin) teamed up to run the study. They gathered data from 104 different studies, covering 345 comparisons around the world. Each study compared an invaded blue-carbon ecosystem to a similar uninvaded one. The scientists used the data to calculate how much plant-based biomass or soil carbon changed in each place in the presence of an invader. Over time, plant-based pools of biomass can be converted into valuable blue-carbon storage "sinks" that are locked in the soils beneath these habitats.
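
To illustrate the kind of comparison involved, the sketch below computes a log response ratio -- a common effect-size metric in ecological meta-analyses -- for a few made-up invaded-versus-uninvaded biomass pairs. The numbers and the choice of metric are illustrative assumptions only; the published study's statistical model may differ.

```python
import math

# Hypothetical comparisons: (biomass in invaded plot, biomass in matched uninvaded plot),
# e.g. grams of dry plant matter per square meter. All values are invented for illustration.
comparisons = [
    (1200.0, 550.0),   # ecosystem-engineer plant invading a salt marsh
    (300.0, 480.0),    # dissimilar plant (e.g. algae) invading a seagrass bed
    (210.0, 400.0),    # grazing animal invading a marsh
]

def log_response_ratio(invaded, uninvaded):
    """Effect size widely used in ecological meta-analyses: ln(invaded / uninvaded).
    Positive means the invader increased biomass; negative means it reduced biomass."""
    return math.log(invaded / uninvaded)

effects = [log_response_ratio(i, u) for i, u in comparisons]
mean_effect = sum(effects) / len(effects)

# Convert the mean log ratio back to a percentage change for reporting;
# a +117 percent boost, for example, corresponds to a mean log ratio of ln(2.17).
percent_change = (math.exp(mean_effect) - 1) * 100
print(f"Mean effect: {mean_effect:.2f} (log ratio), {percent_change:+.0f}% biomass change")
```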

But when the researchers crunched the numbers, they discovered invasive species do not fall into a single camp. When the most powerful plants invaded--the ones Davidson called "ecosystem engineers"--biomass skyrocketed. At a 117 percent boost, they more than doubled an ecosystem's biomass and potential to store carbon. The reason, the authors said, is because most of those plants were similar to the species they usurped (a new kind of mangrove tree entering a mangrove forest, for example, or a reed like Phragmites entering a salt marsh). Because the invaders grew larger and faster than the native species, the ecosystem as a whole could store more carbon.

"When you have these essentially 'ecosystem engineers' come into the system, not only are they helping build habitat, they seem to be doing it more aggressively and more efficiently," Davidson said.

However, not all plants were so helpful. When more dissimilar plants took over, like algae invading a seagrass bed, biomass fell by more than a third. And animals cut biomass nearly in half, leaving ecosystems much feebler blue-carbon sinks.

"Introduced animals are essentially going in there eating, trampling, cutting and destroying biomass," Davidson said.

Salt marshes seemed to get the biggest biomass boost from their invaders, about 91 percent on average. This was partly because most salt marsh invaders fell into the "ecosystem engineer" plant category. However, the authors pointed out, salt marshes made up a huge portion of the data they were able to analyze. Seagrasses and mangroves have received much less attention, so the researchers did not have as much information to draw on.

The authors also cautioned against viewing invasive species as unlikely heroes. Carbon storage is one metric that some invaders could enhance, but managers still need to consider the other impacts invaders can have, such as biodiversity loss or habitat shrinking. The real question, the authors said, is how to manage environments where an invasive species has already taken hold and evaluate the true costs and benefits of eradication.

"Nobody's advocating, 'Let's introduce Phragmites, because it grows really fast and great, and let's increase carbon storage here,'" Simkanin said. "We're talking about how to best manage systems that are already impacted by humans, and how to do that in terms of what functions you want to preserve or you find most important."

"Ecosystem managers will be faced with a decision to eradicate or control invasive species," said Cott. "The information contained in this study can help managers make decisions if carbon storage is a function they want to enhance."

Credit: 
Smithsonian

Mobile device by UCLA enables easy prediction and control of harmful algal blooms

image: UCLA's imaging flow cytometer is powered by deep learning

Image: 
Zoltan Gorocs, Miu Tamamitsu, Vittorio Bianco, Patrick Wolf, Shounak Roy, Koyoshi Shindo, Kyrollos Yanny, Yichen Wu, Hatice Ceylan Koydemir, Yair Rivenson & Aydogan Ozcan. Light: Science & Applications (2018) 7:66. DOI 10.1038/s41377-018-0067

In the past 10 years, harmful algal blooms -- sudden increases in the population of algae, typically in coastal regions and freshwater systems -- have become a more serious problem for marine life throughout the U.S. The blooms are made up of phytoplankton, which naturally produce biotoxins, and those toxins can affect not only fish and plant life in the water, but also mammals, birds and humans who live near those areas.

According to the National Oceanic and Atmospheric Administration, the events have become more common and are occurring in more regions around the world than ever before.

The ability to forecast harmful algal blooms and their locations, size and severity could help scientists prevent their harmful effects. But it has been difficult to predict when and where the blooms will occur.

Now, UCLA researchers have developed an inexpensive and portable device that can analyze water samples immediately, which would provide marine biologists with real-time insight about the possibility that the algal blooms could occur in the area they're testing. That, in turn, would allow officials who manage coastal areas to make better, faster decisions about, for example, closing beaches and shellfish beds before algal blooms cause serious damage.

UCLA researchers created a new flow cytometer -- which detects and measures the physical and chemical characteristics of tiny objects within a sample -- based on holographic imaging and artificial intelligence. It can quickly analyze the composition of various plankton species within a matter of seconds, much faster than the current standard method, which involves collecting water samples manually and running them through several steps.

The research, which was published online by Light: Science & Applications and will appear in the journal's print edition, was led by Aydogan Ozcan, the UCLA Chancellor's Professor of Electrical and Computer Engineering and associate director of the California NanoSystems Institute at UCLA.

The growing threat from blooms is being caused in part by higher water temperature due to climate change, and in part by high levels of nutrients (mainly phosphorus, nitrogen and carbon) from fertilizers used for lawns and farmland.

The toxic compounds produced by the blooms can deplete oxygen from the water and can block sunlight from reaching fish and aquatic plants, which causes them to die or migrate elsewhere. In addition, fish and nearby wildlife can even ingest the toxins, and in some rare cases, if they are close enough to the blooms, humans can inhale them, which can affect the nervous system, brain and liver, and eventually lead to death.

Scientists have generally tried to understand algal blooms through manual sampling and traditional light microscopy used to create high-resolution maps that show the phytoplankton composition over extended periods of time. To build those maps, technicians have to collect water samples by hand using plankton nets, and then bring them to a lab for analysis. The process is also challenging because the concentration and composition of algae in a given body of water can change quickly -- even in the time it takes to analyze samples.

The device created by Ozcan and his colleagues speeds up the entire process, and because it does not use lenses or other optical components, it performs the testing at a much lower cost. It images algae samples -- and is capable of scanning a wide range of other substances, too -- using holography and artificial intelligence.

Commercially available imaging flow cytometers used in environmental microbiology can cost from $40,000 to $100,000, which has limited their widespread use. The UCLA cytometer is compact and lightweight and it can be assembled from parts costing less than $2,500.

One challenge the researchers had to overcome was ensuring the device would have enough light to create well-lit, high-speed images without motion blur.

"It's similar to taking a picture of a Formula 1 race car," Ozcan said. "The cameraman needs a very short exposure to avoid motion blur. In our case, that means using a very bright, pulsed light source with a pulse length about one-thousandth the duration of the blink of an eye."

To test the device, the scientists measured ocean samples along the Los Angeles coastline and obtained images of their phytoplankton composition. They also measured the concentration of a potentially toxic alga called Pseudo-nitzschia along six public beaches in the region. The UCLA researchers' measurements were comparable to those in a recent study by the California Department of Public Health's Marine Biotoxin Monitoring Program.

Zoltán Göröcs, a UCLA postdoctoral scholar and the study's first author, said the researchers are in the process of discussing their new device with marine biologists to determine where it would be most useful.

"Our device can be adapted to look at larger organisms with a higher throughput or look at smaller ones with a better image quality while sacrificing some of the throughput," Gӧrӧcs said.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

EU countries unprepared to move future Alzheimer's treatment into rapid clinical use

The health care systems in some European countries lack the capacity to rapidly move a disease-modifying treatment for Alzheimer's disease from approval into widespread clinical use, which could leave 1 million people without access to transformative care if such a breakthrough occurs, according to a new RAND Corporation study.

Researchers examined the health systems of France, Germany, Italy, Spain, Sweden and the United Kingdom and modeled the infrastructure challenges each would face beginning in 2020 if confronted with a surge of patients seeking screening to determine whether they qualify for a treatment that might prevent or delay the development of Alzheimer's.

The study found the primary problem is the need for medical specialists trained to diagnose patients who may have early signs of Alzheimer's and confirm that they would be eligible for therapy to prevent the progression of the disease to full-blown dementia.

Some nations have too few medical specialists and may require additional training of health providers to evaluate early-stage Alzheimer's patients. Another shortcoming is that there are too few facilities with capacity to deliver infusion treatments to patients.

The burden of Alzheimer's disease in high-income countries is expected to nearly double between 2015 and 2050. Recent positive results from clinical trials give hope that a disease-changing treatment could become available for routine use within a few years.

"Although there is continued effort to develop treatments to slow or block the progression of Alzheimer's dementia, less work has been done to prepare nations' medical systems to deliver such an advancement," said Jodi Liu, senior author of the study and a policy researcher at RAND, a U.S.-based nonprofit research organization. "While there is no certainty an Alzheimer's therapy will be approved soon, our work suggests that health care leaders in the European Union should begin thinking about how to respond should such a breakthrough occur."

Researchers analyzed how a new therapy might challenge the current health care system in the six nations examined. The analysis focuses on the supply of dementia specialists, diagnostic tools used to identify Alzheimer's abnormalities in the brain, and access to infusion centers that would deliver the treatment.

The RAND report estimates that as many as 1 million patients with mild cognitive impairment from the six countries could develop Alzheimer's dementia while waiting for evaluation and treatment resources over a two-decade period after approval of an Alzheimer's therapy.

The RAND analysis assumes that a therapy is approved for use beginning in 2020 and screening would begin in 2019, although researchers stress that the date was chosen only as a scenario for the model, not as a prediction of when an Alzheimer's therapy may be approved.

The study estimates that under such a scenario about 7.1 million people with mild cognitive impairment would seek timely diagnosis by a specialist. After follow-up evaluations and biomarker testing, the study estimates that 2.3 million people in the six countries ultimately may be recommended for treatment.

The analysis suggests that the health care systems in some of the European countries have insufficient capacity to diagnose and treat the large number of patients with early-stage Alzheimer's disease. The projected peak wait times range from five months for treatment in Germany to 19 months for evaluation in France. The first year without wait times would be 2030 in Germany, 2033 in France, 2036 in Sweden, 2040 in Italy, 2042 in the United Kingdom and 2044 in Spain.
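
RAND's full model is not reproduced here, but the mechanism behind those wait times can be sketched with a simple capacity model: when the number of patients seeking evaluation exceeds what specialists can handle each year, a backlog builds and then drains only gradually. The demand and capacity figures below are invented for illustration and are not RAND's estimates for any country.

```python
# Minimal backlog sketch: how wait times arise when annual specialist capacity
# falls short of demand. All figures are invented for illustration; they are not
# the RAND estimates for any particular country.

initial_demand = 1_000_000       # patients seeking evaluation when the therapy launches
new_patients_per_year = 150_000  # additional patients entering the queue each year
capacity_per_year = 200_000      # evaluations the specialist workforce can perform per year

backlog = initial_demand
for year in range(2020, 2045):
    backlog = max(0, backlog + new_patients_per_year - capacity_per_year)
    if backlog == 0:
        print(f"{year}: backlog cleared -- first year without wait times")
        break
    wait_years = backlog / capacity_per_year  # rough wait facing a newly referred patient
    print(f"{year}: backlog {backlog:,} patients, wait roughly {wait_years:.1f} years")
```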

In Germany and Sweden, the main infrastructure constraint would be infusion capacity. In the other four countries, wait times caused by both specialist availability and infusion capacity would delay treatment for more significant numbers of patients. Specialist capacity is the primary rate-limiting factor in France, the United Kingdom, and Spain.

"Each of the countries we studied has a unique set of medical system constraints and addressing those issues may turn out to be challenging," Liu said. "So it is important to begin discussions now about how to address these obstacles so that each nation is best prepared if a Alzheimer's breakthrough occurs."

RAND researchers suggest that a combination of reimbursement, regulatory, and workforce planning policies will be needed to address constraints in each medical system. In addition, innovation in diagnosis and treatment delivery would help ensure that sufficient capacity is created to treat patients with early-stage Alzheimer's disease.

Credit: 
RAND Corporation

Unveiling the mechanism protecting replicated DNA from degradation

image: EM images show sections of single-stranded DNA (ssDNA) being produced instead of double-stranded DNA (dsDNA) in the absence of AND-1.

Image: 
Takuya Abe

Researchers from Tokyo Metropolitan University and the FIRC Institute of Molecular Oncology (IFOM) in Italy have succeeded in depleting AND-1, a key protein for DNA replication, by using a recently developed conditional protein degradation system. Consequently, they were able to gain unprecedented access to the mechanism behind how AND-1 works during DNA replication and cell proliferation in vertebrate cells, demonstrating that AND-1 has two different functions during DNA replication mediated by different domains of AND-1.

DNA is often referred to as the "blueprint of life"; in order for living organisms to function, it is vital that all cells share the same blueprint. This is made possible by the process of DNA replication, where the DNA is accurately copied and distributed before the cell multiplies. Replication underpins all biological inheritance, and is supported by a whole range of biochemical pathways designed to ensure that it occurs without error and at the right speed. Failure to do so may have catastrophic consequences, including cancer: understanding the specific mechanisms behind this highly complex procedure is of the utmost importance.

The AND-1/Ctf4 protein is a key player in DNA replication, and is found in a vast range of living organisms, from fungi to vertebrates. Ctf4/AND-1 is essential in some organisms, but whether it is an essential gene for cell proliferation in vertebrates has not been shown experimentally. Moreover, how loss of AND-1 affects cell proliferation is not known.

In order to address this question, a team led by Dr. Dana Branzei from IFOM and Prof. Kouji Hirota from Tokyo Metropolitan University combined the use of two unique systems: DT40 cells, an avian cell line that is particularly suited to genetic engineering, and the auxin-inducible degron (AID) system, a means of achieving selective depletion of a target protein. With these, they successfully established the and-1-aid cell line, in which a modified version of the AND-1 protein is degraded within a few hours of adding auxin, a type of plant hormone. This cell line enabled them to analyze the acute consequences of AND-1 loss, giving unprecedented insight into the role it plays.

When done correctly, DNA replication should result in the formation of new double-stranded DNA helices. The authors used transmission electron microscopy (TEM) to visualize DNA replication intermediates and observed that, in the absence of AND-1, newly synthesized DNA had abnormally long single-stranded stretches at the fork branching point. They hypothesized that this was due to a DNA-cleaving enzyme, a nuclease, degrading the newly synthesized strands. On further addition of a compound that suppresses the action of a particular nuclease, MRE11, they were able to revert the abnormal replication fork phenotype and recover cell division, explicitly demonstrating the key role played by AND-1 in preventing cleavage of nascent DNA by the nuclease during replication. Further analysis revealed that a specific part of the protein, called the WD40 repeats, was responsible for preventing the accumulation of damage to the nascent strands.

Further to these breakthrough findings, the study highlights the successful combination of cutting-edge techniques to realize conditional inactivation of specific proteins; the new cells were in fact developed over a single month. This leaves the exciting prospect of the method being applied to study other genes and processes which are otherwise difficult to target, leading to new insights into how cells work.

Credit: 
Tokyo Metropolitan University

Quantum mechanics work lets oil industry know promise of recovery experiments

image: Clockwise from top left: a schematic diagram of the calcite/brine/oil system; a simulation supercell (color scheme: Ca-indigo, C-brown, O-red, H-white) with ions in brine shown schematically; the oil-in-water contact angle assuming an initial mixed-wet state; and the difference (relative to calcite-water) in the effective charge of the surface.

Image: 
Sokrates Pantelides

With their current approach, energy companies can extract about 35 percent of the oil in each well. Every 1 percent above that, compounded across thousands of wells, can mean billions of dollars in additional revenue for the companies and supply for consumers.

Extra oil can be pushed out of wells by forced water - often inexpensive seawater - but scientists doing experiments in the lab found that sodium in water impedes its ability to push oil out, while other trace elements help. Scientists experiment with various combinations of calcium, magnesium, sulfates and other additives, or "wettability modifiers," in the laboratory first, using the same calcite as is present in the well. The goal is to determine which lead to the most oil recovery from the rock.

Vanderbilt University physicist Sokrates Pantelides and postdoctoral fellow in physics Jian Liu developed detailed quantum mechanical simulations on the atomic scale that accurately predict the outcomes of various additive combinations in the water.

They found that calcium, magnesium and sulfates settle farther from the calcite surface, rendering it more water-wet by modifying the effective charge on the surface, enhancing oil recovery. Their predictions have been backed by experiments carried out by their collaborators at Khalifa University in Abu Dhabi: Saeed Alhassan, associate professor of chemical engineering and director of the Gas Research Center, and his research associate, Omar Wani.
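
For readers unfamiliar with wettability, the macroscopic relationship that such surface-charge changes feed into can be written with Young's equation. This is standard background rather than the paper's first-principles derivation, and the notation below is ours.

```latex
\[
  \cos\theta \;=\; \frac{\gamma_{\mathrm{so}} - \gamma_{\mathrm{sw}}}{\gamma_{\mathrm{ow}}}
\]
% \theta: contact angle of an oil droplet on calcite, measured through the water phase.
% \gamma_{so}, \gamma_{sw}, \gamma_{ow}: calcite-oil, calcite-water and oil-water
% interfacial energies. Adsorbed ions that change the effective surface charge
% lower \gamma_{sw} relative to \gamma_{so}, so \cos\theta increases, \theta
% decreases, and the surface becomes more water-wet -- favoring oil release.
```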

"Now, scientists in the lab will have a procedure by which they can make intelligent decisions on experiments instead of just trying different things," said Pantelides, University Distinguished Professor of Physics and Engineering, William A. & Nancy F. McMinn Professor of Physics, and professor of electrical engineering. "The discoveries also set the stage for future work that can optimize choices for candidate ions."

The team's paper, "Wettability alteration and enhanced oil recovery induced by proximal adsorption of Na+, Cl-, Ca2+, Mg2+, and SO42- ions on calcite," appears today in the journal Physical Review Applied. It builds on Pantelides' previous work on wettability, released earlier this year.

His co-investigators in Abu Dhabi said the work will have a significant impact on the oil industry.

"We are excited to shed light on combining molecular simulations and experimentation in the field of enhanced oil recovery to allow for more concrete conclusions on the main phenomenon governing the process," Alhassan said. "This work showcases a classic approach in materials science and implements it in the oil and gas industry: the combination of modeling and experiment to provide understanding and solutions to underlying problems."

Credit: 
Vanderbilt University

Observing the development of a deep-sea greenhouse gas filter

image: The submersible takes samples in the mud around the Håkon Mosby mud volcano. With this tube, so-called sediment cores can be taken, which provide insight into the community of organisms on the surface and deeper in the sediment.

Image: 
MARUM - Centre for Marine Environmental Sciences, University of Bremen

Large quantities of the greenhouse gas methane are stored in the seabed. Fortunately, only a small fraction of that methane reaches the atmosphere, where it acts as a climate-relevant gas, because it is largely degraded within the sediment. This degradation is carried out by a specialized community of microbes, which removes up to 90 percent of the escaping methane. These microbes are therefore referred to as the "microbial methane filter". If the greenhouse gas were to rise through the water and into the atmosphere, it could have a significant impact on our climate.

But the microbes do not work so efficiently everywhere. At sites on the seafloor that are more turbulent than most others - for example gas seeps or so-called underwater volcanoes - the microbes remove just one tenth to one third of the emitted methane. Why is that? Emil Ruff and his colleagues from the Max Planck Institute for Marine Microbiology and the University of Bremen aimed to answer this question.

Methane consumption around a mud volcano

In the Norwegian Sea off Norway, at 1,250 meters water depth, lies the Håkon Mosby mud volcano. There, warm mud from deeper layers rises to the surface of the seafloor. In a long-term experiment, Ruff and his colleagues were able to film the eruption of the mud, take samples and examine them closely. "We found significant differences between the communities on-site. In fresh, recently erupted mud there were hardly any organisms. The older the mud, the more life it contained", says Ruff. Within a few years after the eruption, the number of microorganisms as well as their diversity increased tenfold. Also, the metabolic activity of the microbial community increased significantly over time. While there were methane consumers even in the young mud, efficient filtering of the greenhouse gas seems to occur only after decades.

"This study has given us new insights into these unique communities," says Ruff. "But it also shows that these habitats need to be protected. If the methane-munchers are to continue to help remove the methane, then we must not destroy their habitats with trawling and deep-sea mining. These habitats are almost like a rainforest - they take decades to grow back after a disturbance. "

International deep-sea research

Antje Boetius, co-author of the study, director of the Alfred Wegener Institute Helmholtz Center for Polar and Marine Research (AWI) and head of the research group for deep-sea ecology and technology at the Max Planck Institute in Bremen and the AWI, emphasizes the importance of national and international research collaborations in achieving such results: "This study was only possible through the long-term cooperation between the AWI, MARUM - the Center for Marine Environmental Sciences of the University of Bremen - and the Max Planck Institute for Marine Microbiology with international partners in Norway, France and Belgium. Through various EU projects, we have been able to use unique deep-sea technologies to study the Håkon Mosby mud volcano and its inhabitants in great detail", says Boetius.

Credit: 
Max Planck Institute for Marine Microbiology

NASA looks at Tropical Storm Kirk's Caribbean rainfall

image: The GPM core satellite passed over Tropical Storm Kirk at 8:36 a.m. EDT (1236 UTC) on Friday, Sept. 28, 2018. GPM found the heaviest rainfall (pink) was around the center of circulation located west of the Leeward Islands. There, rain was falling at a rate of 100 mm (about 4 inches) per hour. Rain extended east of the center over the island chain where rain was falling (yellow, blue) between 10 and 40 mm (0.4 and 1.5 inches) per hour.

Image: 
NASA/JAXA/NRL

Tropical Storm Kirk just passed through the Leeward Islands and when the GPM satellite passed overhead, it revealed that Kirk continued to bring rain to the chain on Sept. 28.

The Global Precipitation Measurement mission or GPM core satellite passed over Tropical Storm Kirk at 8:36 a.m. EDT (1236 UTC) on Friday, Sept. 28, 2018. GPM found the heaviest rainfall was around the center of circulation located west of the Leeward Islands. There, rain was falling at a rate of 100 mm (about 4 inches) per hour.
Rain extended east of the center over the island chain where rain was falling between 10 and 40 mm (0.4 and 1.5 inches) per hour. GPM is a joint mission between NASA and the Japan Aerospace Exploration Agency, JAXA.

Those rains are expected to continue affecting the islands over the next day. The National Hurricane Center said "Kirk is expected to produce total rainfall of 4 to 6 inches across the northern Windward and southern Leeward Islands with isolated maximum totals up to 10 inches across Martinique and Dominica. These rains may produce life-threatening flash floods and mudslides. Across Saint Croix and eastern Puerto Rico, Kirk is expected to bring 2 to 4 inches with isolated maximum totals of 6 inches today and Saturday, Sept. 29."

Meanwhile, the Meteorological Service of St. Lucia has discontinued the Tropical Storm Warning for St. Lucia, and the Meteorological Service of Barbados has discontinued the Tropical Storm Watch for St. Vincent and the Grenadines.

At 11 a.m. EDT (1500 UTC), the center of Tropical Storm Kirk was located near latitude 13.8 degrees north and longitude 63.6 degrees west. That's about 185 miles (295 km) west-southwest of Martinique.

Kirk is moving toward the west-northwest near 13 mph (20 kph), and this motion is expected to continue through Sunday. On the forecast track, the center of Kirk or its remnants will move across the eastern and central Caribbean Sea over the next day or two. Reports from an Air Force Reserve Hurricane Hunter aircraft indicate that maximum sustained winds have decreased to near 45 mph (75 kph) with higher gusts. Kirk is forecast to weaken to a tropical depression tonight, and then degenerate into a trough of low pressure on Saturday, Sept. 29.

For forecast updates on Kirk, visit: http://www.nhc.noaa.gov.

Credit: 
NASA/Goddard Space Flight Center

'Cellular memory' of DNA damage in oocyte quality control

image: Females are born with a finite number of eggs that are steadily depleted throughout their lifetime. This reserve of eggs is selected from a much larger pool of millions of precursor cells, or oocytes, that form during fetal life. So there is a substantial amount of quality control during the process of forming an egg cell, or ovum, that weeds out all but the highest quality cells. New research from Neil Hunter's laboratory at UC Davis reveals that this quality control works by "remembering" previous DNA damage to the cell. In this image, ovaries of mice lacking the rnf212 gene (lower panels) had more oocytes than wild type mice (upper panels). Rnf212 seems to act as a quality check that tags flawed oocytes for destruction.

Image: 
Huanyu Qiao and Neil Hunter, UC Davis

Females are born with a finite number of eggs that are steadily depleted throughout their lifetime. This reserve of eggs is selected from a much larger pool of millions of precursor cells, or oocytes, that form during fetal life. So there is a substantial amount of quality control during the process of forming an egg cell, or ovum, that weeds out all but the highest quality cells. New research from Neil Hunter's laboratory at UC Davis reveals the surprising way that this critical oocyte quality control process works.

Previous research in the Hunter lab showed that a gene called Rnf212 is required for chromosomes to undergo crossing over during the early stages of oocyte development. The researchers were surprised to find a new, late function for Rnf212 in the oocyte selection process. The results are published Sept. 27 in the journal Molecular Cell.

During oocyte quality control, a decision is made whether each oocyte should continue and join the reserve of eggs, or undergo apoptosis -- cellular death.

"We almost stumbled upon this role in oocyte quality control when Joe (Huanyu Qiao, joint first author) first noticed that the Rnf212 mutants had more oocytes in their ovaries," said Hunter, professor of Microbiology and Molecular Genetics and an Investigator of the Howard Hughes Medical Institute. Hunter is senior author on the paper.

In mice, as in humans, developing females initially form a very large number of oocytes. Around 6 million oocytes enter meiosis in humans, but a stunning 5 million are culled by birth. By puberty, the ovaries contain only around 250,000 oocytes, which are steadily depleted until menopause.

This drastic reduction reflects selection for only the highest quality oocytes. Oocytes that experienced defects in meiosis, including damage to their DNA, are culled. Only those that pass through quality control checkpoints can continue and become established in the ovarian reserve.

In mice that lacked Rnf212, more oocytes were able to pass through quality control checkpoints, increasing the overall size of the ovarian reserve.

A "cellular memory" of DNA damage

During the formation of gametes (sperm and eggs), portions of the parental chromosomes are swapped through the formation and repair of DNA breaks, a process called crossover. If the crossover process is defective, the repair of some DNA breaks is delayed. RNF212 helps tag these lingering breaks so that defective oocytes are sensitive to apoptosis.

The researchers found that RNF212 prevents the repair of lingering breaks to create a "cellular memory" of defects that occurred in earlier stages of development. This allows the cell to assess how bad the defects were. If the number of unrepaired breaks passes a critical threshold of around ten, the cell is deemed to be of poor quality and undergoes apoptosis. If there are only a few lingering breaks, repair is reactivated and the oocyte is allowed to progress and become part of the ovarian reserve.

"It seems counterintuitive that a cell would actively impede DNA repair, but this is how oocytes gauge the success of earlier events. High levels of lingering breaks means there was a problem and the oocyte is likely to form a low quality egg," Hunter said.

Bigger isn't always better

Female fecundity depends primarily on two factors: the size of her ovarian reserve and the quality of eggs in it. Thus, the reproductive system must balance quality and quantity of oocytes for optimal fertility.

In mice without RNF212, lingering breaks are quickly repaired, allowing significantly defective oocytes to sneak through quality control. Thus the overall size of the ovarian reserve is increased at the expense of oocyte quality. Many such oocytes would likely give rise to pregnancies that terminate early in spontaneous miscarriage, causing fertility issues. Fetuses that develop further would have a higher risk of severe developmental defects.

"Basically, anything that goes wrong in the germline is a risk for congenital disease," said Hunter.

Previous research has shown that variants of the Rnf212 gene affect the rate of chromosome crossovers. This new work suggests that Rnf212 variants might also affect the size and quality of ovarian reserves.

Credit: 
University of California - Davis

Tomatoes: Naturally 'mixing chemical cocktails' to promote disease resistance

image: Bacterial wilt devastates tomato crops worldwide. So far, farmers have had to wait for mature plants to observe resistance to the disease. Now research shows a possible way of saving time and reducing risk significantly for farmers and plant breeders. A new approach promises to forecast cultivar resistance at seedling stage, when plants already 'mix cocktails' of metabolomic chemicals to defend themselves.

Image: 
Esiul at Pixabay, CC0 Creative Commons

Bacterial wilt devastates food crops all over the world. It destroys major crop plants such as tomatoes, potatoes, bananas, ginger and pepper. It occurs in many countries and attacks over 200 plant species.

The bacterium which causes the disease lingers in soil, seeds and plant material for years. It can infect water and farming equipment as well.

Plant breeders and farmers would like to know how resistant a cultivar is to the bacterium as early as possible. But so far they have had to plant - and then wait for mature plants to observe resistance in the fields.

Now research shows a possible way of saving time and reducing risk significantly for farmers and plant breeders. A new approach promises to forecast cultivar resistance much earlier than was possible before.

Researchers can now analyse cultivar resistance at seedling stage for a range of threats. They use plant metabolomics and statistical modelling to decode the plants' chemical defences.

Combined with genetic methods, the approach will be useful to identify resistance which depends on several genes, a long-standing challenge in plant breeding.

Farmer's dilemma

"When farmers buy seed, they need to think about the threats the plants will have to survive. A farm may have drought and heat, for example. If the soil is already infected with Ralstonia solanacearum bacteria that cause wilt, the farmer has two choices, says Prof Ian Dubery. He is the Director of the Centre for Plant Metabolomics Research at the University of Johannesburg.

"First, don't plant crops that get attacked by bacterial wilt. Two, choose plant cultivars that are more resistant. If there is drought and heat and bacterial wilt, the farmer wants a cultivar that is resistant enough to all three threats."

This may look simple in the age of full-genome DNA sequencing. Analyse all the genes of the cultivars, and pick the ones with the right genes for the threats on a particular farm. But a plant's resistance may not work like that. For one threat, a single gene may switch resistance on or off. For another, several genes may be involved.

"It is difficult to see which cultivars are resistant to bacterial wilt. Resistance to Ralstonia solanacearum is a multigenic trait - it depends on many genes - and these are not well understood yet. It will take time before science knows how it works," adds Dubery.

Relying on what plants look like can be deceptive as well. When plants are young, it may be possible to tell that a cultivar is unable to defend itself against a threat - that it is susceptible. But eliminating susceptible cultivars doesn't reliably leave you with resistant ones, says Dubery. In plant immunity, susceptibility and resistance can be heavily influenced by environmental factors.

With the current situation, farmers risk seedlings dying in infected soil. Mature plants can also turn out less resistant than expected. Plant breeders risk spending time improving cultivars that turn out to be unsuitable for agriculture.

Chemical tomato defences

For the research article, then-Honours student Dylan Zeiss studied four tomato cultivars. The cultivars show medium to high resistance to Ralstonia solanacearum in commercial agriculture.

He took bits of leaves, stems and roots from healthy plants of each cultivar and mashed them up. Then he analysed these mixes for chemicals that the cultivars make to defend themselves.

"Plants can develop resistance to threats, such as bacteria, viruses or environmental stresses. But unlike animals, they do not have circulating immune cells in support of acquired immunity," says Zeiss.

"Plants use the innate resistance encoded by their genes. They also synthesize a variety of anti-microbial chemicals to counter threats. For each threat, a plant needs to make a different chemical 'cocktail'. The needed cocktail can vary, depending on location, weather and other stresses," he says.

Zeiss analysed 41 of these chemicals, called secondary metabolites, from the tomato cultivars. He used liquid chromatography coupled with high definition mass spectrometry. This showed which cultivars made what metabolites, and how much.

Then the researchers ran the raw data through a statistics engine to do multivariate analysis.
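
The article does not name the exact statistical method, but principal component analysis is one common multivariate technique for this kind of metabolite fingerprinting. The sketch below, which uses random placeholder intensities rather than the study's data, shows the general workflow.

```python
# Sketch of a multivariate analysis of metabolite peak intensities, here using
# principal component analysis (PCA) as one common choice; the study's exact
# method may differ. The data are random placeholders, not the published measurements.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_metabolites = 24, 41        # e.g. 4 cultivars x 6 tissue samples, 41 metabolites
X = rng.lognormal(mean=0.0, sigma=1.0, size=(n_samples, n_metabolites))

# Typical preprocessing: log-transform and mean-center each metabolite.
X = np.log(X)
X -= X.mean(axis=0)

# PCA via singular value decomposition.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * S                            # sample coordinates in component space
explained = S**2 / np.sum(S**2)

print("Variance explained by the first two components:",
      f"{explained[0]:.1%} and {explained[1]:.1%}")
# In the real analysis, samples from more resistant cultivars would be expected to
# cluster apart from susceptible ones along these components, giving a
# 'metabolite fingerprint' of resistance.
```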

Better than genetics alone

Plants 'notice' what is attacking them in their environment. Some cultivars are better at noticing several threats at once, and making all the chemicals needed to defend themselves.

If a cultivar has stronger resistance against a threat, it will make more of the chemicals needed. These chemicals appear as strong peaks on the analyses. If the cultivar doesn't have much resistance against that threat, it either doesn't make the chemical, or in much lower quantities.

The researchers compared the chemical composition of the cultivar 'cocktails', and correlated this with the known resistance of the cultivars to bacterial wilt. In the process, they found a 'metabolite fingerprint' for tomato resistance to wilt.

"In principle, we can use this approach for any plant-pathogen interaction. The likely resistance of a cultivar can be forecast at seedling stage," says Dubery.

"If a cultivar has the genetic ability to develop resistance to a threat, it will synthesize the chemicals to defend itself. In this way, we can 'see' plant resistance much better than just looking at them.

"And we can do this when the seedlings are only a few weeks old, rather than waiting months to see if mature plants are resistant," he says.

In future, plant breeders can select food crop cultivars more resistant to heat, drought, bacteria and viruses, by combining metabolomics with gene-based technology, says Dubery.

Credit: 
University of Johannesburg

NASA satellite analyzes new Southern Pacific Ocean tropical cyclone

image: At 11 a.m. EDT (1500 UTC) on Sept. 27, the MODIS instrument aboard NASA's Aqua satellite looked at Tropical Storm Liua in infrared light. MODIS found two areas of coldest cloud top temperatures west and northeast of Liua's center that were as cold as or colder than minus 80 degrees (yellow) Fahrenheit (minus 62.2 degrees Celsius). Surrounding them were storms with cloud tops as cold as or colder than minus 70 degrees (red) Fahrenheit (minus 56.6 degrees Celsius).

Image: 
NASA/NRL

NASA's Aqua satellite provided an infrared look at a new storm that formed in the southern Pacific Ocean called Liua and saw strongest storms off-center.

At 11 a.m. EDT (1500 UTC) on Sept. 27, the MODIS or Moderate Resolution Imaging Spectroradiometer instrument aboard NASA's Aqua satellite looked at Tropical Storm Liua in infrared light. MODIS found two areas of coldest cloud top temperatures west and northeast of Liua's center that were as cold as or colder than minus 80 degrees Fahrenheit (minus 62.2 degrees Celsius). Those areas represented the strongest storms. Surrounding them were storms with cloud tops as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius).

NASA research has found that cloud top temperatures as cold as or colder than that minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius) threshold have the capability to generate heavy rainfall.

At 11 a.m. EDT (1500 UTC), the center of Tropical Storm Liua was located near latitude 12.2 degrees south and longitude 162.5 degrees east. That's about 471 miles (758 km) northwest of Port Vila, Vanuatu. Liua was moving toward the south-southwest. Liua's maximum sustained winds were near 40 mph (35 knots/62 kph) with higher gusts.

The Joint Typhoon Warning Center noted that the forecast takes Liua west over cooler sea surface temperatures and where outside winds will weaken the storm. Liua is forecast to dissipate by Sept. 29.

Credit: 
NASA/Goddard Space Flight Center

The whole day matters for cognitive development in children

OTTAWA, September 26, 2018 - In a new study, researchers at the CHEO Research Institute have found that children aged nine and 10 who meet recommendations in the Canadian 24-hour Movement Guidelines for physical activity, screen time and sleep time have superior global cognition. The results were published today in The Lancet Child & Adolescent Health journal.

Researchers from the CHEO Research Institute's Healthy Active Living and Obesity (HALO) group used data from the US National Institutes of Health (NIH)-funded Adolescent Brain Cognitive Development (ABCD) study - more than 4,500 US children aged nine and 10. The study team looked at the cognition of children compared to their physical activity levels, recreational screen time use and sleep time. When children met the recommendations for these, they were found to have higher measures of cognition. Cognition was measured six ways: language abilities, episodic memory, executive function, attention, working memory, and processing speed, using the Youth NIH ToolBox.

"When we looked at the ABCD data, we saw clearly that the whole day matters for children's cognitive health," says Dr. Jeremy Walsh, lead author of the study, formerly post-doctoral fellow at the CHEO Research Institute and currently a Michael Smith Foundation for Health Research Post-Doctoral Fellow at University of British Columbia - Okanagan. "The greatest benefits for cognition were when children met the screen time plus sleep time recommendations or the screen time recommendations alone. But, the more recommendations a child met, the more positive their global cognition."

The 24-hour Movement Guidelines were developed by the Canadian Society for Exercise Physiology, with leadership from HALO and are the first guidelines to incorporate movement recommendations for a child's whole day. In children aged five to 13 years, the guidelines recommend at least 60 minutes of moderate-to-vigorous-intensity physical activity, no more than two hours of daily recreational screen time and nine to 11 hours of uninterrupted sleep. The study shows that only half of children were meeting the sleep time recommendation, 36% met the screen time recommendation and only 17% met the physical activity recommendation.
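
A small sketch of how those three thresholds could be applied to a child's reported day is shown below. The cut-offs are the ones quoted above; the ABCD study's actual scoring procedure may differ.

```python
# Classify a child's reported day against the three recommendations quoted above.
# Thresholds are those cited in the article (ages 5-13); the ABCD study's actual
# scoring procedure may differ.

def recommendations_met(mvpa_minutes, screen_hours, sleep_hours):
    """Return which of the three 24-hour Movement Guideline recommendations are met."""
    return {
        "physical_activity": mvpa_minutes >= 60,   # at least 60 min moderate-to-vigorous activity
        "screen_time": screen_hours <= 2,          # no more than 2 h recreational screen time
        "sleep": 9 <= sleep_hours <= 11,           # 9 to 11 h of uninterrupted sleep
    }

met = recommendations_met(mvpa_minutes=45, screen_hours=1.5, sleep_hours=10)
print(met, "-", sum(met.values()), "of 3 recommendations met")
```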

"The shift in the lifestyle behaviours of children towards low physical activity, a reduction in sleep and ubiquitous screen time use may pose a threat to cognitive development," says Dr. Mark Tremblay, Senior Scientist with the CHEO Research Institute, Professor of Pediatrics, Faculty of Medicine, University of Ottawa and senior author of the paper. "We need to be doing more to encourage behaviours that promote healthy activity throughout the whole day. There is a positive relationship among childhood global cognition, future academic success and lower all-cause mortality. Meeting the recommendations in the 24-hour Guidelines sets kids up for a lifetime. If kids aren't meeting the recommendations, their healthy development could be at risk."

The ABCD is a 10-year longitudinal, observational study investigating brain development in US children. Data were collected on more than 4,500 US children aged nine and 10 years old at 20 study sites across the country.

Credit: 
Children's Hospital of Eastern Ontario Research Institute

Big increase in economic costs if cuts in greenhouse gas emissions are delayed

Stronger efforts to cut emissions of greenhouse gases should be undertaken if global warming of more than 1.5 Celsius degrees is to be avoided without relying on potentially more expensive or risky technologies to remove carbon dioxide from the atmosphere or reduce the amount of sunlight reaching the Earth's surface, a comprehensive new analysis has concluded.

A paper on 'The Economics of 1.5°C Climate Change', published in the journal Annual Review of Environment and Resources, warns that the target of limiting global warming to 1.5 Celsius degrees could soon become too economically expensive to justify, despite the benefits it could provide.

The authors from the London School of Economics and Political Science, Imperial College London and the University of East Anglia assessed almost 200 published academic papers on climate change, including recent studies about the economics of limiting global warming to 1.5 Celsius degrees.

They noted that economic analyses produce inconclusive results about the value of limiting global warming to 1.5 Celsius degrees.

The paper states: "Due to large uncertainties about the economic costs and, in particular, the benefits, there can be no clear answer to the question of whether the 1.5°C target passes a cost-benefit test".

Nonetheless, it draws attention to large benefits of limiting global warming to 1.5 Celsius degrees instead of 2 Celsius degrees: "There is evidence to suggest that limiting warming to 1.5°C reduces the risk of crossing climate tipping points, such as melting of the Greenland and Antarctic ice sheets, but the reduction in risk cannot presently be quantified."

Prof Simon Dietz of the ESRC Centre for Climate Change Economics and Policy and the Grantham Research Institute on Climate Change and the Environment at the London School of Economics and Political Science, who is the lead author on the paper, said: "The evidence we have simply does not give us a clear answer on whether the benefits of limiting warming to 1.5 Celsius degrees exceed the costs. But if we want to keep the option open to limit warming to 1.5 Celsius degrees, then unless we discover a much cheaper way to remove carbon dioxide from the air, and if we want to avoid risky methods of blocking out sunlight, we have to pursue the goal of 1.5 Celsius degrees now."

Another co-author of the paper, Prof Rachel Warren of the University of East Anglia, said: "Our review of recent studies shows the significant projected benefits of limiting global warming to 1.5 Celsius degrees rather than 2 Celsius degrees above pre-industrial levels for both human and natural systems. These benefits include preservation of Arctic sea ice, reduced biodiversity loss, and reduced damage to coral reefs."

The paper acknowledges the larger financial investments required to cut emissions sufficiently to limit global warming to 1.5 Celsius degrees instead of 2 Celsius degrees, particularly if policies are not designed well. It states: "The remaining carbon budget consistent with 1.5°C is very small and the global economy would need to be decarbonized at an unprecedented scale to stay within it, likely entailing larger costs.

"Any further delay in pursuing an emissions path consistent with 1.5?C likely renders that target unattainable by conventional means, instead relying on expensive large-scale CDR [carbon dioxide removal], or risky solar radiation management."

Carbon dioxide removal could be achieved, for example, by planting more vegetation, by burning plants and trees to generate energy and capturing and storing the resulting emissions of carbon dioxide, or by directly capturing carbon dioxide from the atmosphere and storing it. These methods of carbon dioxide removal tend to be costly and some present challenges for agriculture, sustainability and biodiversity.

Solar radiation management could involve, for instance, reducing the amount of the Sun's energy that reaches the Earth by injecting aerosol particles into the atmosphere to block out some sunlight, in order to counteract the warming caused by rising levels of greenhouse gases. This is largely untested and has significant associated risks.

Another co-author of the paper, Dr Ajay Gambhir of the Grantham Institute - Climate Change and the Environment at Imperial College London, said: "The comparative benefits of keeping global warming below 1.5 Celsius degrees, compared to 2 Celsius degrees, are striking. So even though it will be significantly more challenging to achieve the lower temperature goal, in terms of the required investments, strength of policy and greater reliance on measures such as carbon dioxide removal, we must not close the door on it. This means we need to step up the immediacy and pace of action."

The paper draws attention to studies showing that the carbon price consistent with limiting global warming to 1.5 Celsius degrees would be more than US$100 per tonne of carbon-dioxide-equivalent by 2020, and about three times higher than the price required to stop warming of more than 2 Celsius degrees.

This study is launched ahead of the planned publication by the Intergovernmental Panel on Climate Change in October 2018 of a special report on global warming of 1.5 Celsius degrees.

National governments are currently carrying out a collective stocktake, named the Talanoa Dialogue, of their contributions to the implementation of the Paris Agreement on climate change, which includes a commitment to "[h]olding the increase in the global average temperature to well below 2°C above preindustrial levels and pursuing efforts to limit the temperature increase to 1.5°C above preindustrial levels".

Credit: 
University of East Anglia

By Jove! Methane's effects on sunlight vary by region

image: This simulation, showing the monthly-mean total solar absorption by methane from 2006 to 2010, indicates large regional variability in the gas' power to absorb incoming energy from the sun. Note the activity over the Sahara Desert, Arabian Peninsula, and portions of Australia - all places where bright, exposed surfaces reflect light upwards to make methane's absorptive properties up to 10 times stronger than elsewhere on Earth.

Image: 
Lawrence Berkeley National Laboratory

Scientists investigating how human-induced increases in atmospheric methane also increase the amount of solar energy absorbed by that gas in our climate system have discovered that this absorption is 10 times stronger over desert regions such as the Sahara Desert and Arabian Peninsula than elsewhere on Earth, and nearly three times more powerful in the presence of clouds.

A research team from the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) came to this conclusion after evaluating observations of Jupiter and Titan (a moon of Saturn), where methane concentrations are more than a thousand times those on Earth, to quantify methane's shortwave radiative effects here on Earth.

These findings were published online today in the journal Science Advances in an article entitled "Large Regional Shortwave Forcing by Anthropogenic Methane Informed by Jovian Observations." The paper indicates large regional variability in the ways methane acts as a solar absorber, finding that methane absorption, or "radiative forcing," is largely dependent on bright surface features and clouds.

"When we measure the impact of methane emissions on the planet, we wrongly assume it is easy to apply calculations of methane taken locally to predict what effect the gas is having globally," said William Collins, the study's lead author and director of the Climate and Ecosystems Sciences Division at Berkeley Lab. "Our work represents the importance of taking into consideration what impact methane and other greenhouse gases are having not just in general, but with regional certainty."

As greenhouse gases, carbon dioxide and methane primarily absorb heat, or longwave radiation, emitted to space by the Earth's atmosphere. However, methane and other gases also absorb incoming solar energy, or shortwave radiation, and convert it to heat, thereby warming the atmosphere by an additional 25 percent while simultaneously cooling Earth's surface.

More is known about shortwave forcing by carbon dioxide than methane, largely because the relatively complex tetrahedral shape of methane makes its physical absorption characteristics extremely difficult to quantify in the laboratory. The Berkeley Lab research team set out to assess whether previous climate assessments had suffered from uncertainties in calculations of anthropogenic shortwave forcing by methane, widely considered to be the second-most important greenhouse gas after the more abundant carbon dioxide due to methane's extreme potency.

The scientists analyzed methane absorption data from previous observations of the planet Jupiter, and Titan, the largest moon of Saturn. Concentrations of methane in the atmospheres of this Jovian planet and moon are at least three orders of magnitude greater than those on Earth, making it easy to detect absorption properties of methane using occultation measurements.

This analysis showed that estimates of the forcing using the incomplete methane absorption data from Earthbound laboratories agree with estimates using the far more comprehensive methane absorption data collected from Jupiter and Titan. Based on this finding, the current spectroscopy is sufficient for calculating methane radiative forcing in historic climate analyses and future projections.

Their work also lays to rest a previously unsettled issue that climate models might be underestimating the shortwave radiative effects of methane due to limitations of existing laboratory measurements of this gas. The measurements from Jupiter and Titan show it is possible to accurately calculate the extent of radiative forcing by methane in climate assessments, and that current climate models have been doing so.

The result then enabled the team to use existing capabilities to undertake the first global spatially-resolved calculations of this forcing with realistic atmospheric and boundary conditions. They advanced beyond the existing global annual-mean estimate of methane forcing by resolving its seasonal and appreciable spatial variability.

Not all methane created equal

Their analysis showed that methane forcing is far from spatially uniform and exhibits remarkable regional patterns. The most striking finding from the first comprehensive calculations of methane forcing is that because desert regions at low latitudes feature bright, exposed surfaces that reflect light upwards to enhance the absorptive properties of methane, there can be a 10-fold increase in localized methane shortwave forcing.

This effect is most pronounced in locations such as the Saharan Desert or Arabian Peninsula. These regions receive the most sunlight due to their proximity to the equator and feature exceptionally low relative humidity, which helps to further enhance the effects of methane.
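
A toy two-pass absorption calculation, not the paper's radiative-transfer model, illustrates why a bright underlying surface matters: a weakly absorbing gas gets a second chance to absorb the sunlight the surface reflects back upward. The optical depth and albedo values below are illustrative assumptions, and this toy model ignores the strong insolation and very dry air that also contribute to the 10-fold regional enhancement.

```python
# Toy two-pass shortwave absorption estimate (not the paper's radiative-transfer
# model): the gas layer absorbs on the way down and again on the way up after the
# surface reflects part of the beam, so brighter surfaces increase total absorption.
# Optical depth and albedos are illustrative assumptions.
import math

tau = 0.002  # assumed shortwave optical depth of the methane column

def absorbed_fraction(surface_albedo, tau=tau):
    down = 1.0 - math.exp(-tau)                  # absorbed on the downward pass
    reflected = math.exp(-tau) * surface_albedo  # fraction reflected back up by the surface
    up = reflected * (1.0 - math.exp(-tau))      # absorbed again on the upward pass
    return down + up

dark_ocean = absorbed_fraction(surface_albedo=0.06)     # typical dark ocean albedo
bright_desert = absorbed_fraction(surface_albedo=0.40)  # typical bright desert albedo
print(f"Absorption over bright desert is {bright_desert / dark_ocean:.2f}x that over dark ocean")
# The surface-reflection effect alone gives a modest boost; the full 10-fold regional
# enhancement reported in the study also reflects strong insolation and very low
# humidity, which this toy model does not include.
```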

Cloud cover was also shown to influence the gas' radiative effects. Increased forcing for methane overlying clouds was found to be up to nearly three times greater than the global annualized forcing, and was associated with the oceanic stratus cloud decks west of southern Africa and North and South America and with the cloud systems in the Intertropical Convergence Zone near the equator. High-altitude clouds can reduce the solar flux incident on methane in the lower troposphere, reducing its forcing relative to clear-sky conditions, but over nearly 90 percent of the Earth's surface, cloud radiative effects enhance methane radiative forcing.

The researchers believe this information about the effect of methane on incoming solar energy is useful to advancing climate change mitigation strategies both for taking into account the relative strength in the greenhouse effect between carbon dioxide and methane and for determining the relative vulnerability of different regions across the world to atmospheric warming.

Credit: 
DOE/Lawrence Berkeley National Laboratory