Immunotherapy is beneficial in gastric and oesophageal cancers, studies show

Lugano, Switzerland, 21 September 2020 - New data presented at ESMO 2020 have shown that immunotherapy is beneficial for patients with gastric and oesophageal cancers who currently have poor survival. (1-3)

Immunotherapy would be a big change in treatment, since immune checkpoint inhibitors are not yet approved for first-line therapy in Western countries. Three studies provide supporting evidence, based on different patient populations and different immune checkpoint inhibitors used as first-line therapy.

CheckMate 649

The CheckMate 649 trial (1) evaluated nivolumab plus chemotherapy versus chemotherapy alone as first-line treatment in patients with non-HER-2-positive advanced gastric cancer, gastro-oesophageal junction cancer, or oesophageal cancer - all with adenocarcinoma histology. The results show that nivolumab and chemotherapy improved overall survival and progression-free survival in patients with PD-L1 combined positive score (CPS) greater than or equal to 5 tumours. Improvements were also observed in patients with PD-L1 CPS greater than or equal to 1 tumours and in the overall patient population.

Additional analyses of subgroups and biomarkers (e.g. MSI-high) are planned to better characterise the efficacy benefit in patients across all CPS cutoffs.

Commenting on the new data, Prof Salah-Eddin Al-Batran, Director, Institute of Clinical Cancer Research and Director of GI Oncology, Krankenhaus Nordwest-University Cancer Centre, Frankfurt, Germany, ESMO 2020 upper GI track chair, said: "The results are clinically very relevant. Based on this trial, for patients with HER2-negative gastric adenocarcinoma, oesophageal adenocarcinoma, or gastro-oesophageal junctional adenocarcinoma with PD-L1 CPS greater than or equal to 5 tumours, the addition of nivolumab to chemotherapy will become the standard of care for first-line treatment. The open question is the effect in patients who have a PD-L1 CPS smaller than 5."

ATTRACTION 4

The ATTRACTION 4 trial (2) was similar to CheckMate 649 except for two important differences: it was performed only in Asian patients and the primary endpoints were designed for all-comers, rather than a specific CPS value. First-line treatment with nivolumab plus chemotherapy improved the co-primary progression-free survival endpoint, but not overall survival.

"The improvement in progression-free survival was clinically relevant and the trial strongly supports the results of CheckMate 649," said Al-Batran. "Overall survival was not improved, possibly because all-comers were treated or because patients in Asia receive more subsequent therapies than Western populations."

KEYNOTE 590

The KEYNOTE 590 trial (3) examined first-line chemotherapy, with or without pembrolizumab, in patients with squamous cell carcinoma of the oesophagus, adenocarcinoma of the oesophagus, or Siewert type 1 gastro-oesophageal junction adenocarcinoma. It demonstrated that pembrolizumab plus chemotherapy improved overall survival in patients with squamous cell carcinoma of the oesophagus with PD-L1 CPS greater than or equal to 10 tumours, all squamous cell carcinomas, all patients with CPS greater than or equal to 10, and the study population as a whole. Progression-free survival was also improved.

Most oesophageal cancer patients in the trial had squamous cell carcinoma (73%), and those with adenocarcinoma were a small subgroup. The results in the adenocarcinoma subgroup were an exploratory analysis: median overall survival (OS) was 11.6 months versus 9.9 months (hazard ratio [HR]=0.74), and median progression-free survival (PFS) was 6.3 months versus 5.7 months (HR=0.63) in the pembrolizumab-plus-chemotherapy and chemotherapy-alone groups, respectively. The OS and PFS benefit observed in the adenocarcinoma subgroup was consistent with the benefit observed in the overall patient population.

Commenting on the findings, Al-Batran said: "I expect that KEYNOTE-590 will change practice for patients with metastatic squamous cell carcinoma or adenocarcinoma of the oesophagus who have PD-L1 CPS greater than or equal to 10 tumours, for whom pembrolizumab added to chemotherapy will become the standard of care in the first-line."

Al-Batran concluded: "The results of these trials offer oncologists new treatment options. In the first-line setting, there is a clear change of our standard of care, in which patients with high PD-L1 expression will be candidates for immune checkpoint inhibitors plus chemotherapy. However, more data are needed on the subgroups who benefit from the treatment (e.g. PD-L1 CPS groups and MSI)."

Credit: 
European Society for Medical Oncology

New discovery to have huge impact on development of future battery cathodes

image: A new paper in Nature Energy reveals how researchers fully identified the nature of oxidised oxygen in the important battery material Li-rich NMC, using RIXS (Resonant Inelastic X-ray Scattering) at Diamond Light Source. The compound is being closely considered for next-generation Li-ion batteries because it delivers higher energy density than current materials, which could translate to longer driving ranges for electric vehicles and help scientists tackle issues like battery longevity and voltage fade

Image: 
Diamond Light Source & University of Oxford

A new paper published today in Nature Energy reveals how a collaborative team of researchers have been able to fully identify the nature of oxidised oxygen in the important battery material - Li-rich NMC - using RIXS (Resonant Inelastic X-ray Scattering) at Diamond Light Source. This compound is being closely considered for implementation in next generation Li-ion batteries because it can deliver a higher energy density than the current state-of-the-art materials, which could translate to longer driving ranges for electric vehicles. They expect that their work will enable scientists to tackle issues like battery longevity and voltage fade with Li-rich materials.

The paper, "First cycle voltage hysteresis in Li-rich 3d cathodes associated with molecular O2 trapped in the bulk" by a joint team from the University of Oxford, the Henry Royce and Faraday Institutions and Diamond Light Source, the UK's national synchrotron examines the results of their investigations to better understand the important compound known in the battery industry as Li-rich NMC (or Li1.2Ni0.13Co0.13Mn0.54O2).

Principal Beamline Scientist on I21 RIXS at Diamond, Kejin Zhou, explains: "Our work is much about understanding the mysterious first cycle voltage hysteresis, in which the O-redox process cannot be fully recovered, resulting in a loss of voltage and hence of energy density."

A previous study (Nature 577, 502-508 (2020)) into this process, conducted by the same research team at the I21 beamline at Diamond, reported that, in Na-ion battery cathodes, the voltage hysteresis is related to the formation of molecular O2 trapped inside the particles due to the migration of transition metal ions during the charging process.

He adds: "Our current work, focuses on the Li-rich material Li1.2Ni0.13Co0.13Mn0.54O2. The key findings as before show the formation of free O2 molecules inside the materials, which has not been appreciated before in the community. This is a very important discovery as the material has higher TM-O covalency which was thought to suppress formation of molecular O2. I believe our work will have huge impact in future battery cathodes designs to minimise the unstable honeycomb structure. Our work also has important consequences for tackling other issues associated with Li-rich NMC, such as voltage fade, which hinder their commercialisation and ultimately discovering new materials which may be able to harness O-redox more reversibly."

Li-rich cathode materials are one of the very few options available to increase the energy density of Li-ion batteries. Almost all of the lithium in these structures can be removed, compensated first by oxidation of the transition metal (TM) ions and subsequently the oxide ions. However, the high voltage associated with this O-redox process on charge is not recovered on discharge leading to so called voltage hysteresis and a substantial loss of energy density. This represents one of the key challenges that has inhibited exploiting the full potential of these materials and the understanding of this phenomenon remains incomplete.

"In our study, we used HR RIXS - High Resolution- Resonant Inelastic X-ray Scattering spectroscopy at beamline I21 at Diamond to investigate the O-redox process. This is how the material stores charge on the oxide ions, which make up part of its structure. However, this process has proved very difficult for researchers to understand fully. The material undergoes complicated structural changes during the first charge resulting in large voltage hysteresis, and the mechanism by which oxide ions store energy was unclear," explains lead author, Dr Rob House, University of Oxford, Department of Materials. He also adds:

"The data we achieved allowed us to assign mysterious spectroscopic features that had previously been detected by the RIXS technique , but could not be fully identified. We were able to resolve fine structure arising from the vibrations of O2 molecules allowing us to assign the RIXS features obtained in this important class of battery material. These O2 molecules are trapped within the bulk of the cathode material and can be reformed back into oxide ions during discharge, but at a lower voltage than on the initial charge. This provides a new mechanism for explaining the O-redox process and represents an important step forward for battery materials."

Credit: 
Diamond Light Source

Astronomers discover an Earth-sized "pi planet" with a 3.14-day orbit

In a delightful alignment of astronomy and mathematics, scientists at MIT and elsewhere have discovered a "pi Earth" -- an Earth-sized planet that zips around its star every 3.14 days, in an orbit reminiscent of the universal mathematics constant.

The researchers discovered signals of the planet in data taken in 2017 by the NASA Kepler Space Telescope's K2 mission. By zeroing in on the system earlier this year with SPECULOOS, a network of ground-based telescopes, the team confirmed that the signals were of a planet orbiting its star. And indeed, the planet appears to still be circling its star today, with a pi-like period, every 3.14 days.

"The planet moves like clockwork," says Prajwal Niraula, a graduate student in MIT's Department of Earth, Atmospheric and Planetary Sciences (EAPS), who is the lead author of a paper published today in the Astronomical Journal, titled: "π Earth: a 3.14-day Earth-sized Planet from K2's Kitchen Served Warm by the SPECULOOS Team."

"Everyone needs a bit of fun these days," says co-author Julien de Wit, of both the paper title and the discovery of the pi planet itself.

Planet extraction

The new planet is labeled K2-315b; it's the 315th planetary system discovered within K2 data -- just one system shy of an even more serendipitous place on the list.

The researchers estimate that K2-315b has a radius 0.95 times that of Earth, making it just about Earth-sized. It orbits a cool, low-mass star that is about one-fifth the size of the sun. The planet circles its star every 3.14 days, at a blistering 81 kilometers per second, or about 181,000 miles per hour.
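As a rough check on these numbers (an illustration, not a calculation from the paper), the quoted period and orbital speed pin down the size of a circular orbit, since the planet covers one circumference per period:

```python
# Back-of-envelope orbit size from the article's period and speed;
# a circular orbit is assumed.
import math

v = 81.0                      # orbital speed, km/s (from the article)
T = 3.14 * 86400              # orbital period in seconds (3.14 days)

circumference = v * T                 # distance covered in one orbit, km
a = circumference / (2 * math.pi)     # orbital radius, km
AU = 1.496e8                          # kilometers per astronomical unit

print(f"orbital radius ~ {a:.2e} km ({a / AU:.4f} AU)")
# ~3.5e6 km, or about 0.023 AU -- a very tight orbit, consistent with a
# 3.14-day period around a star much smaller and cooler than the sun.
```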

While its mass is yet to be determined, scientists suspect that K2-315b is terrestrial, like the Earth. But the pi planet is likely not habitable, as its tight orbit brings the planet close enough to its star to heat its surface up to 450 kelvins, or around 350 degrees Fahrenheit -- perfect, as it turns out, for baking actual pie.

"This would be too hot to be habitable in the common understanding of the phrase," says Niraula, who adds that the excitement around this particular planet, aside from its associations with the mathematical constant pi, is that it may prove a promising candidate for studying the characteristics of its atmosphere.

"We now know we can mine and extract planets from archival data, and hopefully there will be no planets left behind, especially these really important ones that have a high impact," says de Wit, who is an assistant professor in EAPS, and a member of MIT's Kavli Institute for Astrophysics and Space Research.

Niraula and de Wit's MIT co-authors include Benjamin Rackham and Artem Burdanov, along with a team of international collaborators.

Dips in the data

The researchers are members of SPECULOOS, an acronym for The Search for habitable Planets EClipsing ULtra-cOOl Stars, and named for a network of four 1-meter telescopes in Chile's Atacama Desert, which scan the sky across the southern hemisphere. Most recently, the network added a fifth telescope, which is the first to be located in the northern hemisphere, named Artemis -- a project that was spearheaded by researchers at MIT.

The SPECULOOS telescopes are designed to search for Earth-like planets around nearby, ultracool dwarfs -- small, dim stars that offer astronomers a better chance of spotting an orbiting planet and characterizing its atmosphere, as these stars lack the glare of much larger, brighter stars.

"These ultracool dwarfs are scattered all across the sky," Burdanov says. "Targeted ground-based surveys like SPECULOOS are helpful because we can look at these ultracool dwarfs one by one."

In particular, astronomers look at individual stars for signs of transits, or periodic dips in a star's light, that signal a possible planet crossing in front of the star, and briefly blocking its light.

Earlier this year, Niraula came upon a cool dwarf, slightly warmer than the commonly accepted threshold for an ultracool dwarf, in data collected by the K2 campaign -- the Kepler Space Telescope's second observing mission, which monitored slivers of the sky as the spacecraft orbited around the sun.

Over several months in 2017, the Kepler telescope observed a part of the sky that included the cool dwarf, labeled in the K2 data as EPIC 249631677. Niraula combed through this period and found around 20 dips in the light of this star that seemed to repeat every 3.14 days.
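To illustrate how such repeating dips are found, here is a minimal phase-folding search on synthetic data; the photometry, noise level, and period grid below are invented for illustration and are not the actual K2 data for EPIC 249631677.

```python
# Sketch of a phase-folding period search: fold the light curve at trial
# periods and look for the period that concentrates the flux deficit.
import numpy as np

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 80, 4000))          # observation times, days
flux = 1 + 5e-4 * rng.standard_normal(t.size)  # flat light curve + noise

P_true, dur, depth = 3.14, 0.04, 1e-3          # period (d), duration (d), depth
flux[(t % P_true) < dur] -= depth              # inject box-shaped transits

def folded_depth(period):
    """Mean flux deficit inside the first `dur` days of phase."""
    return 1 - flux[(t % period) < dur].mean()

trials = np.arange(2.5, 4.0, 0.001)            # trial periods, days
best = trials[np.argmax([folded_depth(p) for p in trials])]
print(f"best trial period: {best:.3f} days")   # recovers ~3.14
```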

The team analyzed the signals, testing different potential astrophysical scenarios for their origin, and confirmed that the signals were likely of a transiting planet, and not a product of some other phenomena such as a binary system of two spiraling stars.

The researchers then planned to get a closer look at the star and its orbiting planet with SPECULOOS. But first, they had to identify a window of time when they would be sure to catch a transit.

"Nailing down the best night to follow up from the ground is a little bit tricky," says Rackham, who developed a forecasting algorithm to predict when a transit might next occur. "Even when you see this 3.14 day signal in the K2 data, there's an uncertainty to that, which adds up with every orbit."

With Rackham's forecasting algorithm, the group narrowed in on several nights in February 2020 during which they were likely to see the planet crossing in front of its star. They then pointed SPECULOOS' telescopes in the direction of the star and were able to see three clear transits: two with the network's Southern Hemisphere telescopes, and the third from Artemis, in the Northern Hemisphere.

The researchers say the new pi planet may be a promising candidate to follow up with the James Webb Space Telescope (JWST), to see details of the planet's atmosphere. For now, the team is looking through other datasets, such as from NASA's TESS mission, and is also directly observing the skies with Artemis and the rest of the SPECULOOS network, for signs of Earthlike planets.

"There will be more interesting planets in the future, just in time for JWST, a telescope designed to probe the atmosphere of these alien worlds," says Niraula. "With better algorithms, hopefully one day, we can look for smaller planets, even as small as Mars."

Credit: 
Massachusetts Institute of Technology

Mirror-like photovoltaics get more electricity out of heat

New heat-harnessing "solar" cells that reflect 99% of the energy they can't convert to electricity could help bring down the price of storing renewable energy as heat, as well as harvesting waste heat from exhaust pipes and chimneys.

The energy storage application, known informally as a "sun in a box," stores extra wind and solar power generation in a heat bank.

"This approach to grid-scale energy storage is receiving widespread interest because it is estimated to be ten-fold cheaper than using batteries," said Andrej Lenert, an assistant professor of chemical engineering.

The "sun" itself in this approach is already low cost: a tank of molten silicon, for instance. The relatively expensive parts are the photovoltaic panels that turn the stored heat back into electricity.

Compared to ordinary solar panels that turn light, rather than heat, into electricity, thermal photovoltaics need to be able to accept lower energy photons--packets of light or heat--because the heat source is at lower temperature than the sun. To maximize efficiency, engineers have been looking to reflect the photons that are too low-energy back into the heat bank. That way, the energy gets reabsorbed and has another chance to get packaged into an electricity-producing, higher-energy photon.
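Some rough numbers behind those "lower energy photons" (an illustration, not from the study): Wien's displacement law gives the wavelength where a thermal source emits most strongly, and the heat-bank temperature below (about 2400 °C for molten silicon) is an assumed figure for this kind of storage.

```python
# Peak thermal emission shifts to longer wavelengths (lower photon
# energies) as the source gets cooler -- Wien's displacement law.
b = 2.898e-3                     # Wien displacement constant, m*K
h, c, eV = 6.626e-34, 2.998e8, 1.602e-19

for name, T in [("sun's surface", 5800.0), ("molten-silicon heat bank", 2673.0)]:
    lam = b / T                  # wavelength of peak thermal emission, m
    E = h * c / lam / eV         # photon energy at that peak, eV
    print(f"{name}: peak ~{lam * 1e6:.2f} um, ~{E:.2f} eV photons")
# ~0.50 um / ~2.5 eV for sunlight vs ~1.08 um / ~1.1 eV for the heat
# bank: a thermophotovoltaic cell must accept much lower-energy photons.
```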

"It's a recycling job," said Steve Forrest, the Peter A. Franken Distinguished University Professor of Engineering and the Paul G. Goebel Professor of Engineering. "The energy emitted by the heat bank has over 100 chances to be absorbed by the solar cell before it gets lost."

The conventional gold-backed thermophotovoltaic reflects 95% of the light that it can't absorb--not bad, but if 5% of the light is lost with each bounce, that light has on average 20 chances to be re-emitted as a photon with enough energy to be turned into electricity.
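The arithmetic here generalizes simply: if a fraction 1 - R of the light is lost on each bounce, the expected number of chances before loss is 1/(1 - R), which reproduces the figures quoted throughout this article:

```python
# Mean number of re-absorption chances as a function of reflectivity R.
for R in (0.95, 0.99, 0.999):
    chances = 1 / (1 - R)
    print(f"reflectivity {R:.1%}: ~{chances:.0f} chances to be reabsorbed")
# 95%   -> ~20 chances   (conventional gold-backed cell)
# 99%   -> ~100 chances  (the new air-bridge cell)
# 99.9% -> ~1000 chances (the target Lenert mentions below)
```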

Increasing the number of opportunities means one could potentially use cheaper solar cell materials that are choosier about what photon energies they'll accept. This has additional benefits: higher energy photons make higher energy electrons, which means higher voltages and less energy lost while getting the electricity out.

In order to improve the reflectivity, the team added a layer of air between the semiconductor--the material that converts the photons into electricity--and the gold backing. The gold is a better reflector if the light hits it after traveling in air, rather than coming straight from the semiconductor. To minimize the degree to which the light waves cancel each other out, the thickness of the air layer must be similar to the wavelengths of the photons.

Initially, electrical engineering and computer science doctoral student Dejiu Fan balked at the job of making such a cell. Fan explained that the thickness of the air layer had to be very precise--within a few nanometers--to reflect the lower energy photons. What's more, the fragile semiconductor film is only 1.5 micrometers (0.0015 millimeters) thick, yet it needed to span over 70 micrometers of air between the 8-micrometer-wide gold beams.

"It was not clear at the beginning if this 'air bridge' structure, with such a long span and without any mechanical support in the middle, could be built with high precision and survive multiple harsh fabrication processes," Fan said.

But he did it--and remarkably quickly, Forrest said. Fan, working with Tobias Burger, a doctoral student in chemical engineering, and other collaborators, laid the gold beams onto the semiconductor. Then, they coated a silicon back plate with gold to make the mirror and cold-welded the gold beams to the gold backing. This way, the thickness of the gold beams could accurately control the height of the air-bridge, enabling the near-perfect mirroring.

Lenert is already looking ahead to raising the efficiency further, adding extra "nines" to the percentage of photons reflected. For instance, raising the reflectivity to 99.9% would give heat 1,000 chances to turn into electricity.

Credit: 
University of Michigan

Ribeye-eating pigs demonstrate protein quality for humans

URBANA, Ill. - Nearly a decade ago, the UN's Food and Agriculture Organization (FAO) developed a new index to assess protein quality in foods. The goal, writ large, was to address food security for the world's most vulnerable populations, creating more accurate tools for food assistance programs seeking to provide balanced nutrition.

Hans H. Stein at the University of Illinois knew he could help.

The new index, known as the digestible indispensable amino acid score (DIAAS), parses out the digestibility of individual amino acids making up proteins. And it relies on pigs, not rats, as the preferred model for humans.

Stein has been evaluating nutrient digestibility, including amino acids, in pigs for 30 years.

"The FAO determined the pig is the preferred model for humans when you evaluate proteins, moving away from the rat, which had been used for the last hundred years. They also recommended human foods should be evaluated exactly the same way as we evaluate feed ingredients for pigs. So, when I saw that I thought, 'Well, we know how to do this,'" says Stein, professor in the Department of Animal Sciences and the Division of Nutritional Sciences at Illinois. "We started doing some research in this area and published the very first paper on DIAAS values for proteins in 2014."

His team has completed multiple studies since then, including a new one published in the British Journal of Nutrition. In this work, Stein and his co-authors show meat products, including ribeye steak, bologna, beef jerky, and more, score above 100 on the DIAAS chart, meaning their amino acids are highly digestible and complement lower-quality proteins.

"If the protein quality is greater than 100, that means it can compensate for low protein quality in another food. In developing countries where people are eating a lot of maize or rice, they are typically undernourished in terms of amino acids. But if they can combine that with a higher-quality protein such as a small amount of meat, then you have improved quality overall," Stein says.

Other meats, as well as dairy products, have already been shown to have high DIAAS scores, but this is the first study to evaluate cooked and processed meat products. Since cooking and processing can affect proteins, Stein knew it was important to feed the pigs the same form of meats that humans consume.

"We did feed ribeye steaks to the pigs," Stein says. "They loved it."

Nine pigs were fed each of nine meat products for a week: salami, bologna, beef jerky, raw ground beef, cooked ground beef, and ribeye roast cooked medium-rare, medium, and well-done. Researchers collected material from the ileum, part of the small intestine, through a small surgically placed port called a cannula. Amino acid digestibility and DIAAS scores were calculated for various human age groups using this material.
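As a sketch of how such a score is computed (following the general FAO definition; all numbers below are made up, not the study's measurements): DIAAS is 100 times the lowest ratio, across the indispensable amino acids, of digestible amino acid content to the reference requirement for an age group.

```python
# Illustrative DIAAS calculation with hypothetical values.
def diaas(content_mg_per_g, digestibility, reference_mg_per_g):
    """DIAAS = 100 * min over indispensable amino acids of
    (digestible amino acid in the food) / (reference pattern)."""
    ratios = {
        aa: 100 * content_mg_per_g[aa] * digestibility[aa] / reference_mg_per_g[aa]
        for aa in reference_mg_per_g
    }
    limiting = min(ratios, key=ratios.get)
    return ratios[limiting], limiting

# Hypothetical values for three amino acids (mg per g of protein):
content = {"lysine": 80, "leucine": 75, "met+cys": 38}
digest  = {"lysine": 0.95, "leucine": 0.97, "met+cys": 0.92}   # ileal digestibility
reference = {"lysine": 57, "leucine": 61, "met+cys": 27}        # illustrative pattern

score, limiting_aa = diaas(content, digest, reference)
print(f"DIAAS ~ {score:.0f} (limiting amino acid: {limiting_aa})")
# A score above 100, as reported for these meat products, means every
# indispensable amino acid exceeds the reference requirement.
```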

For all the meat products and age groups, DIAAS values were generally greater than 100 regardless of processing, although scores tended to be higher when calculated for older children, adolescents, and adults than for children between 6 months and 3 years of age.

"The reason for that is the amino acid requirement, and the requirement for higher quality protein, is greater for younger children because they're actively growing. Adults don't necessarily need a very high protein quality because their protein needs are not very high, unless they are bodybuilders or nursing women," Stein says.

The results also showed that bologna and medium-cooked ribeye steak offered the highest DIAAS values in the study for the older children, adolescents, and adults age group. The finding that bologna, a highly processed, low-cost meat product, offers high-quality protein could come as welcome news for lower-income families.

Stein points out that meat proteins aren't the only low-cost option. His earlier research shows milk and other dairy products are excellent sources of protein for children. And he plans to evaluate fish, eggs, plant-based meats, and more products in the future.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

A link between sensory neuron activation and the immune system

Pain is a protective mechanism, alerting us to danger by generating an unpleasant sensation. The warning message is carried to the spinal cord by specialized sensory neurons, which are intertwined with other sensory and motor neurons in peripheral nerves. If injury cannot be avoided, inflammation arises and is associated with redness, swelling and pain. "For a long time, pain and inflammation were thought to be distinct processes, generated independently. Whether the sensory fibers that trigger pain can induce inflammation has never been shown before," indicates Frédéric Michoud, a post-doctoral fellow at EPFL. Such a neuroimmune interaction, if present, would have implications for future treatments of inflammation. The study, published in Nature Biotechnology and using the new neurotechnology, shows that this is the case.

Selective stimulation of sensory neurons with light.

Optogenetics is a technique allowing for modulation of genetically-selected neurons by shining light into neural tissues. Although optogenetics has revolutionized neuroscience focusing on the brain, its application to neurons in peripheral nerves has been difficult. "The challenge has been to develop a technological approach that allows optical stimulation/illumination to occur repeatedly over many days without damaging the nerve and impacting the behavior of the animal," says Professor Clifford Woolf of Harvard Medical School/Boston Children's Hospital.

An optoelectronic implant around the sciatic nerve.

Researchers at the Bertarelli Foundation Chair in Neuroprosthetic Technology developed a soft implant that wraps around the sciatic nerve and delivers blue flashes of light on demand. "In the compliant implant, we have integrated several light-emitting diodes. The advantage is that we can control the illumination electrically," explains Stéphanie Lacour, Professor in the Faculty of Engineering Sciences and Techniques. This implant is connected by a subcutaneous cable to an electronic system secured on top of the head of the mice. "Our colleagues at ETHZ have developed a miniaturized chip for controlling the implanted diodes that is energy-efficient and integrated into a wireless communication interface. Thanks to this system, we can control exactly when and how the implant is activated, regardless of what the animal is doing," explains the scientist. Control tests confirmed that the optoelectronic implant did not interfere with the animal's behavior in any way and did not induce side effects.

Activation of pain-triggering sensory fibers initiates and amplifies inflammation.

To the team's surprise, the repeated optical stimulation of specific sensory neurons in the nerve produced mild redness in the animal's hindpaw, a clear sign of inflammation, further confirmed by quantified analyses of immune cells present in skin samples. "Our study has provided an answer to the long-held question of whether those neurons that produce pain also produce immune-mediated inflammation - the answer clearly is, yes!" concludes Clifford Woolf.

This miniaturized implantable neurotechnology paves the way for many other studies that will allow investigators to decipher and unravel peripheral and central neural circuits, and possibly define future approaches to treat syndromes such as chronic pain or persistent inflammation.

Credit: 
Ecole Polytechnique Fédérale de Lausanne

NASA satellite finds Post-Tropical Storm Alpha fizzling over Portugal and Spain

image: On Sept. 19 at 9:35 a.m. EDT (1335 UTC), NASA-NOAA's Suomi NPP satellite found the remnants of former Subtropical storm Alpha spread over Portugal and into northwestern Spain.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

Former Subtropical Storm Alpha was a short-lived storm that formed and fizzled within 24 hours. NASA-NOAA's Suomi NPP satellite found the remnants of former Subtropical storm Alpha spreading over Portugal and northwestern Spain.

Alpha formed off the coast of Portugal by 12:30 p.m. EDT (1630 UTC) on Friday, Sept. 18. Alpha made landfall in Portugal later that day around 5 p.m. EDT (2100 UTC) about 120 miles (195 km) north-northeast of Lisbon, Portugal.

On Friday, Sept. 18 at 11 p.m. EDT (0300 UTC on Sept. 19), the National Hurricane Center noted that the storm had become a post-tropical cyclone. Post-tropical is a generic term that describes a cyclone that no longer possesses sufficient tropical characteristics to be considered a tropical cyclone. Post-tropical cyclones can continue carrying heavy rains and high winds. Former tropical cyclones that have become fully extratropical, and remnant lows, are two classes of post-tropical cyclones.

On Sept. 18 at 11 p.m. EDT (0300 UTC on Sept. 19) METEOSAT satellite imagery, radar data, and surface observations indicated that Alpha had degenerated to a post-tropical remnant low just a few miles to the southeast of Viseu, Portugal.

Less than 12 hours later on Sept. 19 at 9:35 a.m. EDT (1335 UTC), the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard NASA-NOAA's Suomi NPP satellite found the remnants of former Subtropical storm Alpha spread over Portugal and into northwestern Spain.

The National Hurricane Center forecast called for the remnants to move into north-central Spain later on Saturday, Sept. 19, and to dissipate by Sept. 20 at 0000 UTC (Sept. 19 at 8 p.m. EDT).

Additional information on this system can be found in products from the Portuguese Institute for Sea and Atmosphere at http://www.ipma.pt

NASA Researches Tropical Cyclones

Hurricanes/tropical cyclones are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

Credit: 
NASA/Goddard Space Flight Center

Highly efficient perovskite solar cells with enhanced stability and minimised lead leakage

image: A researcher tests the function of the solar cells inside the glove box.

Image: 
City University of Hong Kong

While the power conversion efficiency of perovskite solar cells (PVSCs) - widely seen as the future of solar cells - has already greatly improved in the past decade, the problems of instability and potential environmental impact are yet to be overcome. Recently, scientists from City University of Hong Kong (CityU) have developed a novel method which can simultaneously tackle the leakage of lead from PVSCs and the stability issue without compromising efficiency, paving the way for real-life application of perovskite photovoltaic technology.

The research team is co-led by Professor Alex Jen Kwan-yue, CityU's Provost and Chair Professor of Chemistry and Materials Science, together with Professor Xu Zhengtao and Dr Zhu Zonglong from the Department of Chemistry. Their research findings were recently published in the scientific journal Nature Nanotechnology, titled "2D metal-organic framework for stable perovskite solar cells with minimized lead leakage".

Currently, the highest power conversion efficiency of PVSCs is on par with that of state-of-the-art silicon-based solar cells. However, the perovskites used contain lead, which raises a concern about potential environmental contamination. "As the solar cell ages, the lead species can leak through the devices, e.g. through rainwater into the soil, posing a toxicity threat to the environment," explained Professor Jen, who is an expert in PVSCs. "To put PVSCs into large-scale commercial use requires not only high power conversion efficiency but also long-term device stability and minimised environmental impact."

Collaborating with Professor Xu whose expertise is materials synthesis, Professor Jen and Dr Zhu led the team to overcome the above challenges by applying two-dimensional (2D) metal-organic frameworks (MOFs) to PVSCs. "We are the first team to fabricate PVSC devices with minimised lead leakage, good long-term stability and high power conversion efficiency simultaneously," Professor Jen summarised their research breakthrough.

Multi-functional MOF layer

Metal-organic framework (MOF) materials have been previously applied as scaffolds to template the growth of perovskites. Scientists have also used them as additives or surface modifiers to passivate (to reduce the reactivity of the material's surface) the defects of perovskites for enhancing the device performance and stability.

However, most 3D MOFs are quite electrically insulating, with low charge-carrier mobility, and hence unsuitable for use as charge-transporting materials.

But the MOFs prepared by Professor Xu are different. They have a honeycomb-like 2D structure equipped with numerous thiol groups as a key functionality. They possess suitable energy levels, enabling them to act as an electron-extraction layer (also called an "electron-collection layer") where electrons are finally collected by the electrode of the PVSCs. "Our molecularly engineered MOFs possess the properties of a multi-functional semiconductor and can be used to enhance the charge extraction efficiency," explained Professor Xu.

Trapping the lead ions to prevent contamination

More importantly, the dense arrays of thiol and disulphide groups in the MOFs can "capture" heavy metal ions at the perovskite-electrode interface to mitigate lead leakage.

"Our experiments showed that the MOF used as the outer layer of the PVSC device captured over 80% of the leaked lead ions from the degraded perovskite and formed water-insoluble complexes which would not contaminate the soil," Professor Jen explained. Unlike the physical encapsulation methods used in reducing lead leakage in other studies, this in-situ chemical sorption of lead by the integrated MOF component in the device was found to be more effective and sustainable for long-term practical applications.

Long-term operational stability achieved

Moreover, this MOF material could protect perovskites against moisture and oxygen while maintaining high efficiency.

The power conversion efficiency of their PVSC device modified with the MOF could reach 22.02%, with a fill factor of 81.28% and an open-circuit voltage of 1.20 V. Both the conversion efficiency and the open-circuit voltage recorded are among the highest values reported for planar inverted PVSCs. At the same time, the device exhibited superior stability in an ambient environment at a relative humidity of 75%, maintaining 90% of its initial efficiency after 1,100 hours. In contrast, the power conversion efficiency of the PVSC without the MOF dropped to less than 50% of its original value.
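As a consistency check on these figures (a sketch, assuming the standard 100 mW/cm2 AM1.5G test illumination, which the article does not state), the usual solar-cell relation PCE = Voc × Jsc × FF / Pin implies the device's short-circuit current density:

```python
# Implied short-circuit current density from the reported metrics.
Voc, FF, PCE, P_in = 1.20, 0.8128, 0.2202, 100.0  # V, -, -, mW/cm2 (assumed P_in)
Jsc = PCE * P_in / (Voc * FF)                     # mA/cm2
print(f"implied Jsc ~ {Jsc:.1f} mA/cm2")          # ~22.6 mA/cm2
```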

Also, their device retained 92% of its initial efficiency under continuous light irradiation for 1,000 hours at 85°C. "Such level of stability has already met the standard for commercialisation set by the International Electrotechnical Commission (IEC)," said Dr Zhu.

"This is a very significant result which proved our MOF method is technically feasible and has the potential in commercialising the PVSC technology," added Professor Jen.

Highly efficient PVSCs for clean energy applications

It took the team almost two years to accomplish this promising research. Their next step will be to further enhance the power conversion efficiency and explore the ways to lower the production cost.

"We hope in the future the manufacturing of this type of PVSCs would be like 'printing out' newspapers and easily scaled up in production, facilitating the large-scale deployment of highly efficient PVSCs for clean energy applications," concluded Professor Jen.

Credit: 
City University of Hong Kong

Unexpected wildfire emission impacts air quality worldwide

image: Wildfire smoke plumes over California in November 2018.

Image: 
NASA worldview

In lab studies of wildfire, nitrous acid seems like a minor actor, often underrepresented in atmospheric models. But in the real-world atmosphere, during wildfires, the chemical plays a leading role--spiking to levels significantly higher than scientists expected, driving increased ozone pollution and harming air quality, according to a new study led by the University of Colorado Boulder and the Belgian Institute for Space Aeronomy.

"We found nitrous acid levels in wildfire plumes worldwide are two to four times higher than expected," said Rainer Volkamer, CIRES Fellow, professor of chemistry at CU Boulder and co-lead author on the Nature Geoscience study. "The chemical can ultimately drive the formation of lung- and crop-damaging ozone pollution downwind of fires."

Nitrous acid in wildfire smoke is accelerating the formation of an oxidant, the hydroxyl radical, or OH. Unexpectedly, nitrous acid was responsible for around 60 percent of OH production in the smoke plumes worldwide, the team estimated; it is by far the main precursor of OH in fresh fire plumes. The hydroxyl radical can degrade greenhouse gases, and it can also accelerate the chemical production of ozone pollution, by as much as 7 parts per billion in some places. That's enough to push ozone levels over regulated limits (e.g., 70 ppb in the United States).
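A minimal box-model sketch of the mechanism described here: nitrous acid (HONO) photolyses in sunlight to OH and NO, so a fresh plume's HONO is converted to OH within roughly an hour. The photolysis rate and initial mixing ratio below are assumed illustrative values, not measurements from the study.

```python
# First-order HONO photolysis: HONO + sunlight -> OH + NO.
j_hono = 1.5e-3          # photolysis rate, 1/s (assumed midday value)
dt = 1.0                 # time step, s
hono = 10.0              # initial HONO in a fresh plume, ppb (illustrative)
oh_produced = 0.0

for _ in range(3600):    # integrate over one hour
    loss = j_hono * hono * dt
    hono -= loss
    oh_produced += loss

print(f"after 1 h: {hono:.2f} ppb HONO left, "
      f"{oh_produced:.2f} ppb OH produced in total")
# Nearly all the HONO photolyses within the hour, which is why it can
# dominate OH production (and hence ozone chemistry) in fresh plumes.
```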

"Fire size and burn conditions in the real world show higher nitrous acid than can currently be explained based on laboratory data, and this added nitrous acid drives faster chemistry to form ozone, oxidants and modifies aerosols in wildfire smoke," Volkamer said.

Nitrous acid, while abundant after wildfire, degrades quickly in sunlight and is thus exceedingly difficult to study globally. So the CU Boulder team worked with European colleagues to combine two sets of data: 1) global measurements from the satellite instrument TROPOMI, which observed nitrous acid in wildfire plumes around the world, and 2) measurements from custom instruments flown on aircraft over the Pacific Northwest during the 2018 BB-FLUX wildfire campaign. Remarkably, the team was able to compare near-simultaneous measurements made within minutes by the satellite looking down on a plume and the aircraft-based instrument looking up into the same plume from below.

"Kudos to the pilots and the entire team for dealing actively with this fundamental sampling challenge," Volkamer said. "Simultaneous measurements conducted at different temporal and spatial scales helped us to understand and use what are the first global measurements of nitrous acid by our Belgium colleagues." With the new comparison in hand, Volkamer and his colleagues--including Nicolas Theys, the study's lead author from BIRA--could then scrutinize satellite data from a large number of wildfires in all major ecosystems across the planet to assess nitrous acid emissions.

The chemical is consistently more abundant than expected everywhere, but levels differ depending on the landscape. "Nitrous acid emissions relative to other gases involved in ozone formation varied by ecosystem, with the lowest in savannas and grasslands and the highest in extratropical evergreen forests," said Kyle Zarzana, a chemistry postdoctoral scientist at CU Boulder who led instrument deployment for the aircraft measurements and is a coauthor on the new paper.

"Wildfire smoke contains many trace gases and aerosols that adversely affect visibility and public health over large distances, as we are recently witnessing from fires raging in the Western United States that affect air quality on the East Coast," said Volkamer. "Our findings reveal a chemically very active ingredient of this smoke, and help us to better keep track as photochemistry rapidly modifies emissions downwind."

Credit: 
University of Colorado at Boulder

New technology is a 'science multiplier' for astronomy

image: The first image of a black hole, by the Event Horizon Telescope in 2019, was enabled in part by support from the NSF's Advanced Technologies and Instrumentation program.

Image: 
NASA

Federal funding of new technology is crucial for astronomy, according to results of a study released Sept. 21 in the Journal of Astronomical Telescopes, Instruments and Systems.

The study tracked the long-term impact of early seed funding obtained from the National Science Foundation. Many of the key advances in astronomy over the past three decades benefited directly or indirectly from this early seed funding.

Over the past 30 years, the NSF Advanced Technologies and Instrumentation program has supported astronomers to develop new ways to study the universe. Such devices may include cameras or other instruments as well as innovations in telescope design. The study traced some workhorse technologies in use today back to their humble origins years or even decades ago in early grants from NSF. The study also explored the impact of technologies that are just now advancing the state of the art.

The impact of technology and instrumentation research unfolds over the long term. "New technology is a science multiplier," said study author Peter Kurczynski, who served as a Program Director at the National Science Foundation and is now the Chief Scientist of Cosmic Origins at NASA Goddard Space Flight Center. "It enables new ways of observing the universe that were never before possible." As a result, astronomers are able to make better observations, and gain deeper insights, into the mysteries of the cosmos.

The study also looked at the impact of grant supported research in the peer-reviewed literature. Papers resulting from technology and instrumentation grants are cited with the same frequency as those resulting from pure science grants, according to the study. Instrumentation scientists "write papers to the same degree, and with the same impact as their peers who do not build instruments," said Staša Milojevic, associate professor of informatics and the director of the Center for Complex Network and Systems Research in the Luddy School of Informatics, Computing and Engineering at Indiana University, who is a coauthor of the study.

Also noteworthy is that NSF grant-supported research was cited more frequently overall than the general astronomy literature. NSF is considered to have set the gold standard in the merit review process for selecting promising research for funding.

An anonymous reviewer described the article as a "go-to record for anyone needing to know the basic history of many breakthroughs in astronomical technology." Better observations have always improved our understanding of the universe. From the birth of modern astronomy in the Middle Ages to the present day, astronomers have relied upon new technologies to reveal the subtle details of the night sky with increasing sophistication.

This study comes at a critical time of reflection on the nation's commitment to Science, Technology, Engineering and Math. U.S. preeminence in STEM is increasingly challenged by China and Europe. This study reveals that investments in technology have a tremendous impact for science. Astronomers today are still reaping the benefits of research that was begun decades ago. The future of astronomy depends upon technologies being developed today.

A post on the NSF Science Matters blog provides an in-depth look at one discovery enabled by support of the ATI program: downloading the first image of a black hole by the Event Horizon Telescope in 2019.

Credit: 
Indiana University

New composite material revs up pursuit of advanced electric vehicles

image: ORNL scientists used new techniques to create long lengths of a composite copper-carbon nanotube material with improved properties for use in electric vehicle traction motors.

Image: 
Andrew Sproles, ORNL/U.S. Department of Energy

Scientists at Oak Ridge National Laboratory used new techniques to create a composite that increases the electrical current capacity of copper wires, providing a new material that can be scaled for use in ultra-efficient, power-dense electric vehicle traction motors.

The research is aimed at reducing barriers to wider electric vehicle adoption, including cutting the cost of ownership and improving the performance and life of components such as electric motors and power electronics. The material can be deployed in any component that uses copper, including more efficient bus bars and smaller connectors for electric vehicle traction inverters, as well as for applications such as wireless and wired charging systems.

To produce a lighter weight conductive material with improved performance, ORNL researchers deposited and aligned carbon nanotubes on flat copper substrates, resulting in a metal-matrix composite material with better current handling capacity and mechanical properties than copper alone.

Incorporating carbon nanotubes, or CNTs, into a copper matrix to improve conductivity and mechanical performance is not a new idea. CNTs are an excellent choice due to their lighter weight, extraordinary strength and conductive properties. But past attempts at composites by other researchers have resulted in very short material lengths, only micrometers or millimeters, along with limited scalability, or in longer lengths that performed poorly.

The ORNL team decided to experiment with depositing single-wall CNTs using electrospinning, a commercially viable method that creates fibers as a jet of liquid speeds through an electric field. The technique provides control over the structure and orientation of deposited materials, explained Kai Li, a postdoctoral researcher in ORNL's Chemical Sciences Division. In this case, the process allowed scientists to successfully orient the CNTs in one general direction to facilitate enhanced flow of electricity.

The team then used magnetron sputtering, a vacuum coating technique, to add thin layers of copper film on top of the CNT-coated copper tapes. The coated samples were then annealed in a vacuum furnace to produce a highly conductive Cu-CNT network by forming a dense, uniform copper layer and to allow diffusion of copper into the CNT matrix.

Using this method, ORNL scientists created a copper-carbon nanotube composite 10 centimeters long and 4 centimeters wide, with exceptional properties. The microstructural properties of the material were analyzed using instruments at the Center for Nanophase Materials Sciences at ORNL, a U.S. Department of Energy Office of Science user facility.
Researchers found the composite reached 14% greater current capacity, with up to 20% improved mechanical properties compared with pure copper, as detailed in ACS Applied Nano Materials.

Tolga Aytug, lead investigator for the project, said that "by embedding all the great properties of carbon nanotubes into a copper matrix, we are aiming for better mechanical strength, lighter weight and higher current capacity. Then you get a better conductor with less power loss, which in turn increases the efficiency and performance of the device. Improved performance, for instance, means we can reduce volume and increase the power density in advanced motor systems."

The work builds on a rich history of superconductivity research at ORNL, which has produced superior materials to conduct electricity with low resistance. The lab's superconductive wire technology was licensed to several industry suppliers, enabling such uses as high-capacity electric transmission with minimal power losses.

While the new composite breakthrough has direct implications for electric motors, it also could improve electrification in applications where efficiency, mass and size are a key metric, Aytug said. The improved performance characteristics, accomplished with commercially viable techniques, means new possibilities for designing advanced conductors for a broad range of electrical systems and industrial applications, he said.

The ORNL team also is exploring the use of double-wall CNTs and other deposition techniques, such as ultrasonic spray coating coupled with a roll-to-roll system, to produce samples of about 1 meter in length.

"Electric motors are basically a combination of metals -- steel laminations and copper windings," noted Burak Ozpineci, manager of the ORNL Electric Drive Technologies Program and leader of the Power Electronics and Electric Machinery group. "To meet DOE's Vehicle Technologies Office's 2025 electric vehicle targets and goals, we need to increase power density of the electric drive and reduce the volume of motors by 8 times, and that means improving material properties."

Credit: 
DOE/Oak Ridge National Laboratory

NASA analyzes soaking capabilities of hurricane Teddy on Bermuda approach

image: On Sept. 21 at 6:30 a.m. EDT (1030 UTC), NASA's IMERG estimated Hurricane Teddy was generating as much as 30 mm/1.18 inches of rain (dark pink) around the center of circulation. Rainfall throughout most of the rest of the storm was occurring between 5 and 15 mm (0.2 to 0.6 inches/yellow and green colors) per hour. The rainfall data was overlaid on infrared imagery from NOAA's GOES-16 satellite.

Image: 
NASA/NOAA/NRL

Using a satellite rainfall product that incorporates data from satellites and ground observations, NASA estimated Hurricane Teddy's rainfall rates as it approached Bermuda on Sept. 21. Teddy is a large hurricane and growing. It is also churning up seas all the way to the U.S. and Canadian coastlines.

Watches and Warnings on Sept. 21

NOAA's National Hurricane Center posted a Tropical Storm Warning for Bermuda and a Tropical Storm Watch is in effect from Lower East Pubnico to Main-a-Dieu, Nova Scotia, Canada.

Teddy's Status on Sept. 21

At 11 a.m. EDT (1500 UTC), the center of Hurricane Teddy was located near latitude 31.1 degrees north and longitude 62.7 degrees west. That is about 150 miles (240 km) east-southeast of Bermuda and about 935 miles (1,500 km) south of Halifax, Nova Scotia, Canada.

Teddy was moving toward the north-northeast near 14 mph (22 kph), and this motion is expected to continue today, followed by a turn toward the north overnight and north-northwest on Tuesday.  Maximum sustained winds were near 90 mph (150 kph) with higher gusts. Teddy is a large hurricane. Hurricane-force winds extend outward up to 80 miles (130 km) from the center and tropical-storm-force winds extend outward up to 230 miles (370 km).

An Air Force Reserve Hurricane Hunter aircraft recently reported a minimum central pressure of 960 millibars.

Estimating Teddy's Rainfall Rates from Space

NASA's Integrated Multi-satellitE Retrievals for GPM, or IMERG, a NASA satellite rainfall product, estimated that on Sept. 21 at 6:30 a.m. EDT (1030 UTC), Teddy was generating as much as 30 mm (1.18 inches) of rain per hour around the center of circulation.

Rainfall throughout most of the storm was estimated as falling at a rate between 5 and 15 mm (0.2 to 0.6 inches) per hour. At the U.S. Naval Research Laboratory in Washington, D.C., the IMERG rainfall data was overlaid on infrared imagery from NOAA's GOES-16 satellite to show the full extent of the storm.

As Teddy moves north, that heavy rainfall is expected across Atlantic Canada between Tuesday and Thursday.

What Does IMERG Do?

This near-real-time rainfall estimate comes from NASA's IMERG, which combines observations from a fleet of satellites, in near-real time, to provide near-global estimates of precipitation every 30 minutes. By combining NASA precipitation estimates with other data sources, we can gain a greater understanding of major storms that affect our planet.

What IMERG does is "morph" high-quality satellite observations along the direction of the steering winds to deliver information about rain at times and places where such satellite overflights did not occur. Information morphing is particularly important over the majority of the world's surface that lacks ground-radar coverage. Basically, IMERG fills in the blanks between weather observation stations.
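A conceptual sketch of that morphing idea (not the operational IMERG algorithm): two snapshots of a rain field, taken 30 minutes apart, are each advected along an assumed steering wind and blended by time weight to estimate the field in between.

```python
# Toy "morphing" of a rain field between two satellite snapshots.
import numpy as np

def morph(rain_t0, rain_t1, wind_dxdy, frac):
    """Estimate the rain field a fraction `frac` of the way from t0 to t1:
    shift each snapshot along the wind displacement and time-weight them."""
    dx, dy = wind_dxdy                    # grid cells moved between t0 and t1
    fwd = np.roll(rain_t0, (round(frac * dy), round(frac * dx)), axis=(0, 1))
    bwd = np.roll(rain_t1, (round(-(1 - frac) * dy), round(-(1 - frac) * dx)),
                  axis=(0, 1))
    return (1 - frac) * fwd + frac * bwd

rain_t0 = np.zeros((50, 50)); rain_t0[20:25, 10:15] = 30.0  # 30 mm/h rain cell
rain_t1 = np.roll(rain_t0, 10, axis=1)                      # cell moved east

mid = morph(rain_t0, rain_t1, wind_dxdy=(10, 0), frac=0.5)  # 15 min in between
cols = np.nonzero(mid.sum(axis=0))[0]
print(f"interpolated cell spans columns {cols.min()}-{cols.max()}")  # midway
```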

NHC Key Messages

The National Hurricane Center issued three key messages as Hurricane Teddy approaches Bermuda and grows in size:

The center of Teddy is forecast to move east of Bermuda today. Wind gusts of tropical-storm-force have been reported on the island, and tropical storm conditions could continue today.
Teddy is expected to transition to a powerful post-tropical cyclone as it moves near or over portions of Atlantic Canada late Tuesday through Thursday, where there is an increasing risk of direct impacts from wind, rain, and storm surge. A Tropical Storm Watch is in effect for portions of Nova Scotia, and heavy rainfall across Atlantic Canada is expected with Teddy between Tuesday and Thursday after it becomes a strong post-tropical cyclone.

Large swells produced by Teddy are expected to affect portions of Bermuda, the Leeward Islands, the Greater Antilles, the Bahamas, the east coast of the United States, and Atlantic Canada during the next few days. These swells will likely cause life-threatening surf and rip current conditions.

Teddy's Forecast

"Teddy's size will likely increase substantially during the next couple of days as it moves northward and interacts with a frontal system," said Eric Blake, a senior hurricane specialist at NOAA's National Hurricane Center in Miami, Florida. "Gale force winds are likely along portions of the near shore waters of the northeast U.S."

NHC forecasters said Teddy should turn to the north-northeast as it approaches Nova Scotia on Wednesday. Teddy is expected to gain strength overnight, but weaken steadily by Wednesday and become a strong post-tropical cyclone.

Credit: 
NASA/Goddard Space Flight Center

Southern hemisphere could see up to 30% less rain at end of the century

image: Analysis is based on climate models for the mid-Pliocene period, which occurred 3 million years ago and shared characteristics with present-day warming

Image: 
Gabriel Marques Pontes / USP

Projections based on climate models for the mid-Pliocene Warm Period (about 3 million years ago) suggest that countries in the tropical and subtropical southern hemisphere, including Brazil, may face longer droughts in the future. Annual rainfall may decrease by as much as 30% compared with current levels.

One of the main variables considered in this scenario is a rise of 3 °C in the global average temperature, which may happen between 2050 and the end of the century unless the effects of climate change are mitigated.

The mid-Pliocene, before the emergence of Homo sapiens, shares characteristics with modern warming because temperatures were then between 2 °C and 3 °C higher than in the pre-industrial age (around the 1850s). High-latitude sea surface temperatures rose as much as 9 °C in the northern hemisphere and 4 °C in the southern hemisphere. Atmospheric CO2 levels were similar to today's, at about 400 parts per million (ppm).

These considerations are presented in the article "Drier tropical and subtropical Southern Hemisphere in the mid-Pliocene Warm Period", published in Scientific Reports. The lead author is Gabriel Marques Pontes, a PhD candidate at the University of São Paulo's Oceanographic Institute (IO-USP) in Brazil with a scholarship from the São Paulo Research Foundation (FAPESP).

The second author is Ilana Wainer, a professor at IO-USP and Pontes's thesis adviser. Other co-authors include Andréa Taschetto of the University of New South Wales (UNSW) in Australia, a former FAPESP scholarship awardee.

According to the authors, their simulations showed that one of the most notable changes in southern hemisphere summer rainfall in the mid-Pliocene, compared with pre-industrial conditions, occurs in subtropical regions along the subtropical convergence zones (STCZs). Another change, they add, is associated with a northward shift of the inter-tropical convergence zone (ITCZ) driven by consistently increased rainfall in the northern hemisphere tropics. The total November-to-March mean rainfall along the STCZs decreases in both models, as the sketch below illustrates.
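
As a rough illustration of the kind of seasonal-mean comparison described above, the sketch below computes the November-to-March mean rainfall change between a mid-Pliocene simulation and a pre-industrial control using Python and xarray. It is a minimal sketch: the file names and the variable name "pr" (the usual precipitation variable in CMIP-style model output) are assumptions for demonstration, not the study's actual data or code.

```python
# Minimal sketch of a November-to-March (austral wet season) mean
# rainfall comparison between two climate simulations, in the spirit
# of the analysis described above. File names and the variable name
# "pr" follow common CMIP-style conventions but are assumptions here.
import xarray as xr

def nov_mar_mean(path):
    """Return the November-March mean precipitation field from monthly data."""
    ds = xr.open_dataset(path)
    months = ds["time"].dt.month
    # Keep only the wet-season months spanning the turn of the year.
    wet_season = ds["pr"].where(months.isin([11, 12, 1, 2, 3]), drop=True)
    return wet_season.mean(dim="time")

plio = nov_mar_mean("midpliocene_monthly_pr.nc")    # hypothetical file
pi = nov_mar_mean("preindustrial_monthly_pr.nc")    # hypothetical file

# Percent change relative to the pre-industrial control; negative values
# over the southern subtropics would indicate STCZ drying.
pct_change = 100.0 * (plio - pi) / pi
```

In a comparison of this kind, a negative percentage over the southern subtropics would correspond to the STCZ drying the authors describe.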

"These changes result in drier-than-normal southern hemisphere tropics and subtropics. The evaluation of the mid-Pliocene adds a constraint to possible future warmer scenarios associated with differing rates of warming between hemispheres," the article states.

In an interview, Wainer explained that the mid-Pliocene is the most recent period in Earth's history when global warming was similar to that projected for the rest of this century. "It's possible to put the expected natural variability in this context and distinguish it from the change caused by human activity," she said. "Studying past climate extremes helps elucidate future scenarios and address the associated uncertainties."

For Pontes, this is the first detailed investigation of southern hemisphere rainfall changes in the mid-Pliocene. "Understanding atmospheric circulation and precipitation during past warm climates is useful to add constraints to future change scenarios," he said.

Current impacts

According to a report issued in July by the World Meteorological Organization (WMO), the global average temperature could rise more than 1.5 °C above pre-industrial levels by 2024, much sooner than scientists previously thought. The report warns of a high risk of extreme rainfall variability across the various regions in the next five years, with some facing drought and others flooding.

In March the WMO confirmed that 2019 was the second warmest year on record, with a global average temperature that was 1.1 °C above pre-industrial levels. The warmest ever was 2016, partly owing to a strong El Niño, characterized by unusually warm sea surface temperatures in the Equatorial Pacific.

Since the 1980s, each decade has been warmer than the previous one, the WMO noted, adding that retreating ice, record sea levels, increasing ocean heat and acidification, and extreme weather combine to have major impacts on the health and well-being of humans and the environment alike. These changes affect socio-economic development worldwide, driving migration and food insecurity and damaging terrestrial and marine ecosystems.

In 2015, 195 countries signed up to greenhouse gas emission reduction targets under the Paris Agreement, promising to hold global warming well below 2 °C and to pursue efforts to limit it to 1.5 °C. These promises have not been kept.

"The United Nations has promoted measures to try to limit warming, but 1.5 °C is already having a significant impact," Pontes said. "The projections point to 3 °C by the end of the century when the consequences could look like the mid-Pliocene simulations performed in the study."

There was practically no human impact on vegetation in the mid-Pliocene, when the Amazon rainforest was much larger, generating more moisture and helping to offset the drier climate in the region, he added. Future droughts will be worse if deforestation and burning continue at the present rate.

Data published by the National Institute for Space Research (INPE) in Brazil show a 34% increase in deforestation in the Amazon between August 2019 and July 2020 compared with a year earlier: over 9,200 square kilometers of forest were destroyed in 12 months. After trending down for a period compared with the 1990s, deforestation in the Amazon has rebounded since 2013, reaching high levels in consecutive years.

INPE data also show a 28% increase in forest fires in the Amazon in July 2020 compared with July 2019, which was itself considered the worst since 2010. For Pontes, the combination of drier weather and higher temperatures in South America could cut annual rainfall by as much as 30%, leading to water shortages across the continent.

"The more we can mitigate warming and deforestation, the more we can help reduce the impact of climate change on the population of South America," he said.

The article recommends further research that takes changes in plant cover into account, analyzing the combined effects of deforestation and warming to estimate the possible decrease in rainfall in South America.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

How to get a handle on carbon dioxide uptake by plants

image: A tall tower with instruments to measure carbon dioxide and light at Niwot Ridge, Colorado.

Image: 
Christian Frankenberg

How much carbon dioxide, a pivotal greenhouse gas behind global warming, is absorbed by plants on land? It's a deceptively complicated question, so a Rutgers-led group of scientists recommends combining two cutting-edge tools to help answer it.

"We need to understand how the Earth is breathing now to know how resilient it will be to future change," according to a paper in the journal Bulletin of the American Meteorological Society. The early online version was published in April 2020 and the final online version in September 2020.

Global observations suggest that natural ecosystems take up about as much carbon dioxide as they emit. Measuring how much carbon dioxide is absorbed by plants on land is complicated by the carbon dioxide simultaneously released by plants and soils through respiration, the paper notes.

While plants absorb a portion of the increasing emissions of carbon dioxide from fossil fuel burning, scientists have a difficult time determining how much, said lead author Mary Whelan, an assistant professor in the Department of Environmental Sciences in the School of Environmental and Biological Sciences at Rutgers University-New Brunswick.

"By combining two tools that correspond to potential carbon uptake and light captured by leaves, we'll know how much carbon dioxide could remain in the atmosphere," Whelan said. "The two communities of scientists who use these tools need to come together, with the help of funding."

The tools focus on two indicators of photosynthesis, the process in which plants harness sunlight to turn carbon dioxide and water into carbohydrates, releasing oxygen. One indicator is carbonyl sulfide, a natural trace gas absorbed by plants. The second, called solar-induced fluorescence, is light emitted by leaves during photosynthesis.
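
To illustrate how the carbonyl sulfide tracer relates to carbon uptake, the sketch below applies the leaf relative uptake (LRU) scaling commonly used in the carbonyl sulfide literature to convert a measured COS flux into an estimate of gross primary production (GPP). It is a minimal sketch: the function name, parameter values, and example numbers are illustrative assumptions, not data or methods from the paper.

```python
# Minimal sketch: converting a canopy carbonyl sulfide (COS) uptake flux
# into an estimate of gross primary production (GPP) via the leaf
# relative uptake (LRU) ratio. All values are illustrative assumptions.

def gpp_from_cos_flux(f_cos_pmol, co2_ppm, cos_ppt, lru=1.6):
    """Return estimated GPP in umol CO2 m-2 s-1.

    f_cos_pmol : COS uptake flux (pmol m-2 s-1, positive = uptake)
    co2_ppm    : ambient CO2 mole fraction (parts per million)
    cos_ppt    : ambient COS mole fraction (parts per trillion)
    lru        : leaf relative uptake ratio (dimensionless; ~1.6 is a
                 typical literature value, assumed here)
    """
    # GPP ~= F_COS * ([CO2]/[COS]) / LRU. The factor of 1e6 between ppm
    # and ppt cancels exactly against the pmol -> umol flux conversion,
    # so no explicit unit factors are needed.
    return f_cos_pmol * (co2_ppm / cos_ppt) / lru

# Example with assumed, typical-magnitude midday values:
print(gpp_from_cos_flux(f_cos_pmol=30.0, co2_ppm=410.0, cos_ppt=500.0))
# -> ~15.4 umol CO2 m-2 s-1, a plausible canopy-scale uptake rate
```

The point the sketch encodes is that leaves take up COS somewhat more readily than CO2 relative to their ambient abundances, so a COS flux rescaled by the ambient CO2/COS ratio and the LRU factor approximates photosynthetic CO2 uptake without the confounding respiration signal.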

Combining the two tools will help reveal how much carbon is being absorbed by ecosystems and the consequences for the water cycle. Collecting data via satellites, in the air and on the ground will help improve models to predict changes in the future, according to the paper.

Credit: 
Rutgers University

Evaluating impacts of COVID-19 lockdowns on children and young people

Children, who appear to be at relatively lower risk from COVID-19, are disproportionately harmed by the precautions involved in lockdowns, say Matthew Snape and Russell Viner in a Perspective. They note that while the role of children in transmitting SARS-CoV-2 is still uncertain, existing evidence points to educational settings playing a limited role when mitigation measures are in place. Meanwhile, ongoing school closures and the loss of other systems that help and protect children are revealing indirect but very real harms borne by them. In the UK, for example, it is estimated that the impact on education thus far may leave a quarter of the national workforce with lower skills for a generation after the mid-2020s. What's more, many countries are seeing more evidence of home accidents requiring hospitalization during lockdown periods, and of adversely affected mental health in the young.

The authors address the concern that children in schools without symptoms may be "shedding" the virus and could bring it home. Understanding this is key to resolving what has been an "unprecedented" global disruption to primary and secondary school education, they say. They also cite studies showing minimal transmission from children positive for the virus to their contacts. The coming months, as schools reopen in the Northern Hemisphere, will be an important opportunity to identify which of the measures schools are using to mitigate virus spread are most effective, and to generate a standard "best practice" that balances young people's right to an education with the need to protect the broader community from further transmission.

The authors say that advocates of child health need to ensure that children's rights to health and social care, mental health support and education are protected throughout future pandemic waves. School closure should be undertaken "with trepidation" given the indirect harms it can cause, they write. Pandemic mitigation measures that affect children's wellbeing should only be taken if evidence exists that they help, they say.

Credit: 
American Association for the Advancement of Science (AAAS)