
NIST, collaborators develop new method to better study microscopic plastics in the ocean

image: Diagram depicting the water circulation within an adult tunicate, C. robusta. Red dots signify larger particles, while green dots are smaller ones, which can include nanoplastics and are shown to sometimes be expelled from the tunicate or gathered in the gonads (reproductive gland).

Image: 
A. Valsesia et al. via Creative Commons (https://creativecommons.org/licenses/by/4.0), adapted by N. Hanacek/NIST

If you've been to your local beach, you may have noticed the wind tossing around litter such as an empty potato chip bag or a plastic straw. These plastics often make their way into the ocean, affecting not only marine life and the environment but also threatening food safety and human health.

Eventually, many of these plastics break down into microscopic sizes, making it hard for scientists to quantify and measure them. Researchers call these incredibly small fragments "nanoplastics" and "microplastics" because they are not visible to the naked eye. Now, in a multiorganizational effort led by the National Institute of Standards and Technology (NIST) and the European Commission's Joint Research Centre (JRC), researchers are turning to a lower part of the food chain to solve this problem.

The researchers have developed a novel method that uses a filter-feeding marine species to collect these tiny plastics from ocean water. The team published its findings as a proof-of-principle study in the scientific journal Microplastics and Nanoplastics.

Plastics consist of synthetic materials known as polymers that are usually made from petroleum and other fossil fuels. Each year more than 300 million tons of plastics are produced, and 8 million tons end up in the ocean. The most common kinds of plastics found in marine environments are polyethylene and polypropylene. Low-density polyethylene is commonly used in plastic grocery bags or six-pack rings for soda cans. Polypropylene is commonly used in reusable food containers or bottle caps.

"Sunlight and other chemical and mechanical processes cause these plastic objects to become smaller and smaller," said NIST researcher Vince Hackley. "With time they change their shape and maybe even their chemistry."

While there isn't an official definition for these smaller nanoplastics, researchers generally describe them as being artificial products that the environment breaks down into microscopic pieces. They're typically the size of one millionth of a meter (one micrometer, or a micron) or smaller.

These tiny plastics pose many potential hazards to the environment and food chain. "As plastic materials degrade and become smaller, they are consumed by fish or other marine organisms like mollusks. Through that path they end up in the food system, and then in us. That's the big concern," said Hackley.

For help in measuring nanoplastics, researchers turned to a group of marine species known as tunicates, which process large volumes of water through their bodies to get food and oxygen -- and, unintentionally, nanoplastics. What makes tunicates so useful to this project is that they can ingest nanoplastics without affecting the plastics' shape or size.

For their study, researchers chose a tunicate species known as C. robusta because "they have a good retention efficiency for micro- and nanoparticles," said European Commission researcher Andrea Valsesia. The researchers obtained live specimens of the species as part of a collaboration with the Institute of Biochemistry and Cell Biology and the Stazione Zoologica Anton Dohrn research institute, both in Naples, Italy.

The tunicates were exposed to different concentrations of polystyrene, a versatile plastic, in the form of nanosize particles. The tunicates were then harvested and put through a chemical digestion process, which separated the nanoplastics from the organisms. However, during this stage some residual organic compounds digested by the tunicate remained mixed in with the nanoplastics, possibly interfering with the purification and analysis of the plastics.

So, researchers used an additional isolation technique called asymmetrical-flow field flow fractionation (AF4) to separate the nanoplastics from the unwanted material. The separated or "fractionated" nanoplastics could then be collected for further analysis. "That is one of the biggest issues in this field: the ability to find these nanoplastics and isolate and separate them from the environment they exist in," said Valsesia.

The nanoplastic samples were then placed on a specially engineered chip, designed so that the nanoplastics formed clusters, making it easier to detect and count them in the sample. Lastly, the researchers used Raman spectroscopy, a noninvasive laser-based technique, to characterize and identify the chemical structure of the nanoplastics.

The special chips provide advantages over previous methods. "Normally, using Raman spectroscopy for identifying nanoplastics is challenging, but with the engineered chips researchers can overcome this limitation, which is an important step for potential standardization of this method," said Valsesia. "The method also enables detection of the nanoplastics in the tunicate with high sensitivity because it concentrates the nanoparticles into specific locations on the chip."

The researchers hope this method can lay the foundation for future work. "Almost everything we're doing is at the frontier. There are no widely adopted methods or measurements," said Hackley. "This study on its own is not the end point. It's a model for how to do things going forward."

Among other possibilities, this approach might pave the way for using tunicates to serve as biological indicators of an ecosystem's health. "Scientists might be able to analyze tunicates in a particular spot to look at nanoplastic pollution in that area," said Jérémie Parot, who worked on this study while at NIST and is now at SINTEF Industry, a research institute in Norway.

The NIST and JRC researchers continue to work together through a collaboration agreement and hope it will provide additional foundations for this field, such as a reference material for nanoplastics. For now, the group's multistep methodology provides a model for other scientists and laboratories to build on. "The most important part of this collaboration was the opportunity to exchange ideas for how we can do things going forward together," said Hackley.

Credit: 
National Institute of Standards and Technology (NIST)

From mice to men: Study reveals potential new target for treating acute myeloid leukemia

image: Thomas Schroeder, MD, PhD, corresponding author of the study.

Image: 
AlphaMed Press

Durham, NC -- Bone marrow failure due to acute myeloid leukemia (AML) is a significant factor behind the disease's high rate of morbidity and mortality. Previous studies in mice suggest that AML cells inhibit healthy hematopoietic (blood) stem and progenitor cells (HSPC). A study released in STEM CELLS adds to this body of knowledge by showing how secreted cell factors, in particular a protein called transforming growth factor beta 1 (TGFβ1), lead to a breakdown in the production of healthy blood cells (a process called hematopoiesis) in humans.

The study's findings indicate that blocking TGFβ1 could improve hematopoiesis in AML patients.

Although AML makes up only about 1 percent of all cancers, it is the second most common type of leukemia diagnosed, according to the American Cancer Society. AML affects the blood and bone marrow - the spongy tissue inside bones where blood cells are made. The mortality rate is high - for those aged 20 and older, the five-year survival rate is a dismal 26 percent.

The mechanisms by which AML develops are not completely understood, but it is generally believed to begin in the hematopoietic stem cells or progenitors, which develop into myeloid cells and in turn go on to become red blood cells, white blood cells or platelets. This latest study, by researchers at Heinrich-Heine-University Düsseldorf, was designed to investigate what role fluids secreted by leukemic cells might play in inhibiting the growth of healthy HSPC.

"Experiments using conditioned media (CM) from AML cells to address secretory mechanisms have been performed before, but mainly in mice. In order to gain new insights into how this plays out in humans, we focused on the interaction between leukemic cells and healthy HSPC using an in vitro system modeling the in vivo situation of bone marrow infiltration by AML cells," said the study's corresponding author, Thomas Schroeder, M.D., Ph.D., Department of Hematology, Oncology and Clinical Immunology. This was accomplished by exposing healthy bone marrow-derived CD34+ HSPC to supernatants derived from AML cell lines and newly diagnosed AML patients. (CD34 is a marker of human HSC, while supernatants are the products secreted by cells.)

"Our findings revealed that exposure to AML-derived supernatants significantly inhibited proliferation, cell cycling, colony formation and differentiation of healthy CD34+ HSPC," Dr. Schroeder reported. "Further experiments determined that leukemic cells induce functional inhibition of healthy HSPC, at least in part through TGFβ1.

"Blocking the TGFβ1 pathway is something that could be pharmacologically accomplished with a TGFβ1 inhibitor such as SD208. Our data indicate this could be a promising approach to improve hematopoiesis in AML patients."

Dr. Jan Nolta, Editor-in-Chief of STEM CELLS, said, "Finding factors for possible intervention in the perplexing issue of suppression of normal hematopoiesis by AML cells is highly important. The possibility of targeting TGFβ1 in order to allow the normal stem and progenitor cells to expand is a promising lead toward future treatments."

Credit: 
AlphaMed Press

Across US, COVID-19 death rate higher for those with IDD

Syracuse, N.Y. - The COVID-19 death rate for people with intellectual and developmental disabilities (IDD) is higher than the general population in several states across the U.S., according to a new study published in Disability and Health Journal.

The research team that conducted the study analyzed data from 12 U.S. jurisdictions: Arizona, California, Connecticut, Illinois, Louisiana, Maryland, New Jersey, Oregon, Pennsylvania, Virginia, Washington and Washington, D.C.

The death rates were higher in all jurisdictions for those with IDD who live in congregate settings such as residential group homes. The results for people with IDD who do not live in congregate settings were mixed, with case-fatality rates higher than, lower than, or similar to those of the general population depending on the state.

The study, "COVID-19 Case-Fatality Disparities among People with Intellectual and Developmental Disabilities: Evidence from 12 US Jurisdictions," was published May 12 in Disability and Health Journal.

A previous study from June 2020 by members of this research team found that the death rate for people with IDD who live in congregate settings in New York State was also higher than the general population.

"The results from these studies underscore the increased COVID-19 risk that people with IDD are facing across the U.S.," said researcher Scott Landes. "Though we are at a point where it appears that things are improving, it is important to understand that challenges still exist for some people with IDD. Though the vaccine is now officially open to all adults, there are multiple reports of people with IDD experiencing barriers to vaccination."

The research team for the most recent study includes Landes, an associate professor of sociology at Syracuse University's Maxwell School of Citizenship and Public Affairs and a research affiliate for the Lerner Center for Public Health Promotion; Dr. Margaret Turk, Distinguished Service Professor of Physical Medicine and Rehabilitation at SUNY Upstate Medical Center in Syracuse, N.Y.; and David A. Ervin, CEO of the Jewish Foundation for Group Homes in Rockville, Md.

In the most recent study, the researchers compared COVID-19 case-fatality rates among people with IDD in the 11 states and District of Columbia because those jurisdictions publicly report that data. To determine if those with IDD were dying at a higher rate than the general population, the researchers analyzed cumulative data reported from the start of the pandemic through April 13, 2021, and compared the IDD data from the 12 jurisdictions to the cumulative COVID-19 data compiled by the Center for Systems Science and Engineering at Johns Hopkins University.
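
As a concrete illustration of the comparison being made (the counts below are invented placeholders, not figures from the study or from the Johns Hopkins dataset), a case-fatality rate is simply cumulative deaths divided by cumulative confirmed cases, and the disparity reduces to a ratio of two such rates within a jurisdiction:

    # Minimal sketch with hypothetical counts (not the study's data).

    def case_fatality_rate(deaths: int, cases: int) -> float:
        """Cumulative deaths divided by cumulative confirmed cases."""
        return deaths / cases

    # Placeholder counts for one jurisdiction (illustrative only).
    idd_cfr = case_fatality_rate(deaths=40, cases=1_000)            # people with IDD
    general_cfr = case_fatality_rate(deaths=15_000, cases=900_000)  # general population

    print(f"IDD CFR: {idd_cfr:.1%}, general CFR: {general_cfr:.1%}, "
          f"ratio: {idd_cfr / general_cfr:.1f}x")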

To develop effective short-term and long-term public health interventions that address COVID-19 risks for those with IDD, the researchers say all states will need to report COVID-19 outcomes for this population.

"The findings of this study emphasize the need to ensure, No. 1, immediate and unfettered access to vaccines for all people with IDD, whether living in a congregate setting, family home or their own home; and No. 2, the ongoing need for states to begin or improve their reporting of COVID-19 outcomes for people with IDD," Landes said.

Credit: 
Syracuse University

Electrons riding a double wave

image: 200 MeV accelerator

Image: 
Arie Irman

Since they are far more compact than today's accelerators, which can be kilometers long, plasma accelerators are considered a promising technology for the future. An international research group has now made significant progress in the further development of this approach: With two complementary experiments at the Helmholtz-Zentrum Dresden-Rossendorf (HZDR) and at the Ludwig-Maximilians-Universität Munich (LMU), the team was able to combine two different plasma technologies for the first time and build a novel hybrid accelerator. The concept could advance accelerator development and, in the long term, become the basis of highly brilliant X-ray sources for research and medicine, as the experts describe in the journal Nature Communications (DOI: 10.1038/s41467-021-23000-7).

In conventional particle accelerators, strong radio waves are guided into specially shaped metal tubes called resonators. The particles to be accelerated - which are often electrons - can ride these radio waves like surfers ride an ocean wave. But the potential of the technology is limited: Feeding too much radio wave power into the resonators creates a risk of electrical discharges that can damage the components. This means that in order to bring particles to high energy levels, many resonators have to be connected in series, which makes today's accelerators in many cases kilometers long.

That is why experts are eagerly working on an alternative: plasma acceleration. In this approach, short and extremely powerful laser flashes are fired into a plasma - an ionized state of matter consisting of negatively charged electrons and positively charged atomic cores. In this plasma, the laser pulse generates a strong alternating electric field, similar to the wake of a ship, which can accelerate electrons enormously over a very short distance. In theory, this means facilities can be built far more compactly, shrinking an accelerator that is a hundred meters long today down to just a few meters. "This miniaturization is what makes the concept so attractive," explains Arie Irman, a researcher at the HZDR Institute of Radiation Physics. "And we hope it will allow even small university laboratories to afford a powerful accelerator in the future."

But there is yet another variant of plasma acceleration where the plasma is driven by near-light-speed electron bunches instead of powerful laser flashes. This method offers two advantages over laser-driven plasma acceleration: "In principle, it should be possible to achieve higher particle energies, and the accelerated electron beams should be easier to control," explains HZDR physicist and primary author Thomas Kurz. "The drawback is that at the moment, we rely on large conventional accelerators to produce the electron bunches that are needed to drive the plasma." FLASH at DESY in Hamburg, for instance, where such experiments take place, measures a good one hundred meters.

High-energy combination

This is precisely where the new project comes in. "We asked ourselves whether we could build a far more compact accelerator to drive the plasma wave," says Thomas Heinemann of the University of Strathclyde in Scotland, who is also a primary author of the study. "Our idea was to replace this conventional facility with a laser-driven plasma accelerator." To test the concept, the team designed a sophisticated experimental setup in which strong light flashes from HZDR's laser facility DRACO hit a gas jet of helium and nitrogen, generating a bundled, fast electron beam via a plasma wave. This electron beam passes through a metal foil into the next segment, with the foil reflecting back the laser flashes.

In this next segment, the incoming electron beam encounters another gas, this time a mixture of hydrogen and helium, in which it can generate a new, second plasma wave, setting other electrons into turbo mode over a span of just a few millimeters - out shoots a high-energy particle beam. "In the process, we pre-ionize the plasma with an additional, weaker laser pulse," Heinemann explains. "This makes the plasma acceleration with the driver beam far more effective."

Turbo ignition: Almost to the speed of light within just one millimeter

The result: "Our hybrid accelerator measures less than a centimeter," Kurz explains. "The beam-driven accelerator section uses just one millimeter of it to bring the electrons to nearly the speed of light." Realistic simulations of the process show a remarkable accelerating gradient, more than a thousand times higher than that of a conventional accelerator. To underscore the significance of their findings, the researchers implemented this concept in a similar form at the ATLAS laser at LMU in Munich. However, the experts still have many challenges to overcome before this new technology can be used for applications.
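
For a rough sense of scale, a back-of-the-envelope sketch with assumed round numbers (not figures taken from the paper): an accelerating gradient is just energy gain divided by the length over which it occurs, which is what makes the thousand-fold comparison concrete.

    # Back-of-the-envelope sketch with assumed round numbers (not measured values).

    energy_gain_eV = 100e6    # assume ~100 MeV gained in the beam-driven stage
    length_m = 1e-3           # over roughly one millimeter, as described above

    plasma_gradient = energy_gain_eV / length_m     # volts per meter
    conventional_gradient = 30e6                    # ~30 MV/m, a typical RF cavity

    print(f"plasma stage:        {plasma_gradient / 1e9:.0f} GV/m")
    print(f"conventional cavity: {conventional_gradient / 1e6:.0f} MV/m")
    print(f"ratio: ~{plasma_gradient / conventional_gradient:,.0f}x")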

In any case, the experts already have possible fields of application in mind: "Research groups that currently don't have a suitable particle accelerator might be able to use and further develop this technology," Arie Irman hopes. "And secondly, our hybrid accelerator could be the basis for what is called a free-electron laser." Such FELs are considered extremely high-quality radiation sources, especially X-rays, for ultra-precise analyses of nanomaterials, biomolecules, or geological samples. Until now, these X-ray lasers required long and expensive conventional accelerators. The new plasma technology could make them much more compact and cost-effective - and perhaps also affordable for a regular university laboratory.

Credit: 
Helmholtz-Zentrum Dresden-Rossendorf

New tool factors effects of fossil-fuel emissions on ocean research

A newly developed tool will allow scientists to better gauge how centuries of fossil fuel emissions could be skewing the data they collect from marine environments.

Researchers at the University of Alaska Fairbanks led the effort, which created a way for marine scientists to factor into their results the vast amounts of anthropogenic carbon dioxide that are being absorbed by oceans. Those human-caused carbon sources can muddy research results -- a problem known as the Suess effect -- leading to flawed conclusions about the health and productivity of marine ecosystems.

"The baseline carbon signature looks different than it did before the Industrial Revolution," said Casey Clark, a former UAF postdoctoral researcher who led the effort. "People have known this for a while, but there hasn't been a consistent way to correct for a potential bias in the data."

The formula allows scientists to factor in the Suess effect in regions that include the Bering Sea, Aleutian Islands and Gulf of Alaska. But the publicly available app will also allow researchers to introduce data from other marine ecosystems, making it potentially useful for scientists around the world.
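
To make the idea of such a correction concrete, here is a generic sketch; the functional form, reference year, and coefficients below are invented placeholders, not the formula published by the UAF team or implemented in their app. The principle is simply to subtract a year-dependent offset so that carbon isotope values measured in different years share a common, pre-industrial baseline.

    # Generic illustration only: placeholder exponential offset, not the authors' formula.
    import math

    def suess_offset(year: float, reference_year: float = 1850.0,
                     amplitude: float = -2.0, timescale: float = 60.0) -> float:
        """Hypothetical offset (per mil) of the carbon isotope baseline relative
        to the reference year, modeled here as a simple exponential decline."""
        if year <= reference_year:
            return 0.0
        return amplitude * (1.0 - math.exp(-(year - reference_year) / timescale))

    def correct_d13c(measured_d13c: float, sample_year: float) -> float:
        """Remove the hypothetical Suess offset so old and modern samples compare."""
        return measured_d13c - suess_offset(sample_year)

    print(correct_d13c(-18.5, 2015))   # modern sample, larger correction
    print(correct_d13c(-18.5, 1900))   # older sample, smaller correction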

A paper introducing the new tool was published on May 20 in the journal Methods in Ecology and Evolution.

The new tool will be used for stable isotope analysis, which has become valuable in recent decades for studying plants and animals. The ratio of isotopes in animal tissues gives them a distinct chemical signature, providing a remarkably accurate picture of what those species consumed and where they lived.

But carbon, a primary focus of isotope analysis, has proven more difficult to rely upon as humans introduce more of the element into the atmosphere and oceans.

"Not only are we shifting the global climate, we've changed the baseline of what type of carbon is out there," said Nicole Misarti, the director of UAF's Water and Environmental Research Center and a co-author of the paper.

Researchers who work with ancient specimens, such as archaeologists and paleontologists, can have their results especially skewed by fossil fuel emissions when making comparisons to modern times. However, the new study shows that even researchers who want to compare today's data to information collected as recently as eight years ago need to correct for the Suess effect.

"What we did is develop a tool that will let people adapt for those changes," Misarti said. "It disentangles that human impact from other ecosystem changes."

Credit: 
University of Alaska Fairbanks

Global acceleration in rates of vegetation change

image: Fossil pollen records offer our best insights into past rates of vegetation change. A lake, or other suitable environment, is cored to retrieve the layered sediments which contain pollen grains that accumulated over thousands of years. By identifying and counting the different pollen grains researchers can then reconstruct the local vegetation composition. Finally, the rate of vegetation change is estimated from the changes in pollen abundances through time.

Image: 
Artwork by Milan Teunissen van Manen (@MilanTvM).

Wherever ecologists look, from tropical forests to tundra, ecosystems are being transformed by human land use and climate change. A hallmark of human impacts is that the rates of change in ecosystems are accelerating worldwide.

Surprisingly, a new study, published today in Science, found that these rates of ecological change began to speed up many thousands of years ago. "What we see today is just the tip of the iceberg" noted co-lead author Ondrej Mottl from the University of Bergen (UiB). "The accelerations we see during the industrial revolution and modern periods have a deep-rooted history stretching back in time."

Using a global network of over 1,000 fossil pollen records, the team found - and expected to find - a first peak of high ecosystem changes around 11,000 years ago, when the Earth was coming out of a global ice age. "We expected rates of ecological change to be globally high during this transition because the world was changing fast as glaciers retreated and the world warmed" explains co-lead author Suzette Flantua. However, the timing of peak changes varied among regions, suggesting a fairly complex suite of climate and ecosystem changes during the end of the last ice age.

More surprisingly, the team found a second period of accelerating change that began between 4.6 and 2.8 thousand years ago and has continued to the present. Remarkably, these recent rates of change are now as fast as or faster than the massive ecosystem transformations that accompanied the end of the last ice age.

To achieve this global analysis, the team developed new statistical tools that allowed them to calculate rates of change in stratigraphic sequences. "We found a way to compare the many records across different continents and smaller regions" explains Alistair Seddon at UiB and Bjerknes Centre for Climate Research. "That was a major breakthrough as it allowed us to demonstrate that different regions experienced peak rates of change at different times".
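
As a bare-bones illustration of what a rate-of-change calculation on a pollen record involves (a generic squared-chord dissimilarity per unit time; the study's own estimator is more elaborate, handling binning, age uncertainty and standardization), consider:

    # Generic rate-of-change sketch for a stratigraphic pollen record.
    import numpy as np

    def squared_chord(p: np.ndarray, q: np.ndarray) -> float:
        """Dissimilarity between two pollen assemblages given as proportions."""
        return float(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

    def rates_of_change(ages_yr: np.ndarray, counts: np.ndarray) -> np.ndarray:
        """Dissimilarity between consecutive samples divided by the time elapsed."""
        props = counts / counts.sum(axis=1, keepdims=True)   # counts -> proportions
        rocs = []
        for i in range(len(ages_yr) - 1):
            dt = abs(ages_yr[i + 1] - ages_yr[i])
            rocs.append(squared_chord(props[i], props[i + 1]) / dt)
        return np.array(rocs)

    # Toy record: three samples, three pollen taxa (illustrative counts).
    ages = np.array([11000.0, 10500.0, 10000.0])
    counts = np.array([[120, 60, 20], [90, 80, 30], [40, 110, 50]])
    print(rates_of_change(ages, counts))   # compositional change per year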

The study was made possible by scientists working together to create a global resource of fossil data, called the Neotoma Paleoecology Database (https://www.neotomadb.org), available freely to everyone. "Paleoecology is quickly transitioning from a local science to a global science, powered by the ongoing growth in open-access global databases and by the urgent scientific need to better understand how ecosystems respond to environmental change, at local to global scales," said John (Jack) Williams, co-senior author and a paleoecologist at the University of Wisconsin-Madison (USA).

More work is needed to understand and confirm the causes of these accelerations in rates of vegetation change. "Although some patterns seem more obvious than others, we are actually not sure which changes were caused by humans, climate, or both," clarifies Mottl. The next step is therefore to compare these global networks of fossil data with independent evidence of past climate change and archaeological findings. "Bringing multiple lines of evidence together should help us better understand how climate, humans, and ecosystems interacted," says Flantua, who is also keen to increase the number of pollen records from the Southern Hemisphere, where gaps in the analyses remain. An ongoing ERC project called HOPE (Humans on Planet Earth), led by Prof. J. Birks (UiB), will push this work forward; as more data are integrated, more decisive conclusions can be drawn about the role of humans in shaping modern-day ecosystems.

Sadly, one of the co-founders of Neotoma and study co-author, Dr. Eric C. Grimm, passed away unexpectedly as this work was nearing completion. Eric was a much-loved palynologist who made enormous contributions to the analyses of pollen records from around the world. Thanks to his advocacy of open and shared data and software, he built the foundation for this study, and undoubtedly for many studies ahead. The other authors have dedicated this paper to his memory.

Credit: 
The University of Bergen

A new form of carbon

image: Structure of the new carbon network. The upper part shows schematically the linking of the carbon atoms, forming squares, hexagons, and octagons. The lower part is an image of the network, obtained with high-resolution microscopy.

Image: 
University of Marburg, Aalto University

Carbon exists in various forms. In addition to diamond and graphite, there are recently discovered forms with astonishing properties. For example, graphene, with a thickness of just one atomic layer, is the thinnest known material, and its unusual properties make it an extremely exciting candidate for applications like future electronics and high-tech engineering. In graphene, each carbon atom is linked to three neighbours, forming hexagons arranged in a honeycomb network. Theoretical studies have shown that carbon atoms can also arrange themselves in other flat network patterns, while still binding to three neighbours, but none of these predicted networks had been realized until now.

Researchers at the University of Marburg in Germany and Aalto University in Finland have now discovered a new carbon network, which is atomically thin like graphene, but is made up of squares, hexagons, and octagons forming an ordered lattice. They confirmed the unique structure of the network using high-resolution scanning probe microscopy and interestingly found that its electronic properties are very different from those of graphene.

In contrast to graphene and other forms of carbon, the new material -- named the biphenylene network -- has metallic properties. Narrow stripes of the network, only 21 atoms wide, already behave like a metal, while graphene is a semiconductor at this size. "These stripes could be used as conducting wires in future carbon-based electronic devices," said Professor Michael Gottfried of the University of Marburg, who leads the team that developed the idea. The lead author of the study, Qitang Fan from Marburg, continues, "This novel carbon network may also serve as a superior anode material in lithium-ion batteries, with a larger lithium storage capacity compared to that of the current graphene-based materials."

The team at Aalto University helped image the material and decipher its properties. The group of Professor Peter Liljeroth carried out the high-resolution microscopy that showed the structure of the material, while researchers led by Professor Adam Foster used computer simulations and analysis to understand the exciting electrical properties of the material.

The new material is made by assembling carbon-containing molecules on an extremely smooth gold surface. These molecules first form chains, which consist of linked hexagons, and a subsequent reaction connects these chains together to form the squares and octagons. An important feature of the chains is that they are chiral, which means that they exist in two mirror-image forms, like left and right hands. Only chains of the same type aggregate on the gold surface, forming well-ordered assemblies, before they connect. This is critical for the formation of the new carbon material, because the reaction between two different types of chains leads only to graphene. "The new idea is to use molecular precursors that are tweaked to yield biphenylene instead of graphene," explains Linghao Yan, who carried out the high-resolution microscopy experiments at Aalto University.

For now, the teams are working to produce larger sheets of the material so that its application potential can be further explored. However, "We are confident that this new synthesis method will lead to the discovery of other novel carbon networks," said Professor Liljeroth.

Credit: 
Aalto University

Origins of life researchers develop a new ecological biosignature

When scientists hunt for life, they often look for biosignatures, chemicals or phenomena that indicate the existence of present or past life. Yet it isn't necessarily the case that the signs of life on Earth are signs of life in other planetary environments. How do we find life in systems that do not resemble ours?

In groundbreaking new work, a team led by Santa Fe Institute Professor Chris Kempes has developed a new ecological biosignature that could help scientists detect life in vastly different environments. Their work appears as part of a special issue of the Bulletin of Mathematical Biology collected in honor of renowned mathematical biologist James D. Murray.

The new research takes its starting point from the idea that stoichiometry, or chemical ratios, can serve as biosignatures. Since "living systems display strikingly consistent ratios in their chemical make-up," Kempes explains, "we can use stoichiometry to help us detect life." Yet, as SFI Science Board member and contributor, Simon Levin, explains, "the particular elemental ratios we see on Earth are the result of the particular conditions here, and a particular set of macromolecules like proteins and ribosomes, which have their own stoichiometry." How can these elemental ratios be generalized beyond the life that we observe on our own planet?

The group solved this problem by building on two lawlike patterns, two scaling laws, that are entangled in the elemental ratios we have observed on Earth. The first of these is that, in individual cells, stoichiometry varies with cell size. In bacteria, for example, as cell size increases, protein concentrations decrease and RNA concentrations increase. The second is that the abundance of cells in a given environment follows a power-law distribution. The third, which follows from integrating the first and second into a simple ecological model, is that the ratio of the elemental abundance of particles to the elemental abundance in the environmental fluid is a function of particle size.

While the first of these (that elemental ratios shift with particle size) makes for a chemical biosignature, it is the third finding that makes for the new ecological biosignature. If we think of biosignatures not simply in terms of single chemicals or particles, and instead take account of the fluids in which particles appear, we see that the chemical abundances of living systems manifest themselves in mathematical ratios between the particle and environment. These general mathematical patterns may show up in coupled systems that differ significantly from Earth.
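
A minimal numeric sketch of how these pieces fit together follows; the exponents, constants and element choices are illustrative stand-ins, not the paper's fitted values.

    # Toy sketch of the two scaling laws and the ratio they combine to predict.
    import numpy as np

    sizes = np.logspace(-1, 1, 50)           # particle (cell) sizes, arbitrary units

    # Law 1: per-cell elemental content scales with size (toy exponents).
    nitrogen_per_cell = 0.5 * sizes ** 2.0    # e.g. protein-like component
    phosphorus_per_cell = 0.2 * sizes ** 1.7  # e.g. ribosome/RNA-like component

    # Law 2: abundance of cells follows a power law in size.
    abundance = sizes ** -2.0

    # Combined: size-resolved ratio of elemental abundance bound in particles
    # to the elemental concentration of the surrounding fluid.
    fluid_N, fluid_P = 10.0, 1.0              # assumed background concentrations
    ratio_N = abundance * nitrogen_per_cell / fluid_N
    ratio_P = abundance * phosphorus_per_cell / fluid_P

    # A systematic trend of these ratios with particle size, rather than any
    # single value, is the kind of pattern proposed as an ecological biosignature.
    print(ratio_N[:3], ratio_P[:3])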

Ultimately, the theoretical framework is designed for application in future planetary missions. "If we go to an ocean world and look at particles in context with their fluid, we can start to ask whether these particles are exhibiting a power-law that tells us that there is an intentional process, like life, making them," explains Heather Graham, Deputy Principal Investigator at NASA's Lab for Agnostic Biosignatures, of which she and Kempes are a part. To take this applied step, however, we need technology to size-sort particles, which, at the moment, we don't have for spaceflight. Yet the theory is ready, and when the technology catches up, we can send it to icy ocean worlds beyond Earth with a promising new biosignature in hand.

Credit: 
Santa Fe Institute

Accounting for finance is key for climate mitigation pathways

A new study published in the journal Science highlights the opportunity to complement current climate mitigation scenarios with scenarios that capture the interdependence among investors' perception of future climate risk, the credibility of climate policies, and the allocation of investments across low- and high-carbon assets in the economy.

Climate mitigation scenarios are key to understanding the transition to a low-carbon economy and to informing climate policies. These scenarios are also important for financial investors, who use them to assess the risk of missing out on the transition or of making the transition happen too late and in a disorderly fashion. In this respect, the scenarios developed by the Network for Greening the Financial System (NGFS) - a platform of over 80 financial authorities around the globe that take an active interest in advancing the transition toward a sustainable world economy - have been a major step toward providing investors with forward-looking views on how economic activities, both low- and high-carbon, could evolve in the coming decades. However, these scenarios do not currently account for the role that the financial system (i.e., financial firms, markets, and instruments) could play in such a transition.

"The financial system can play an enabling or hampering role in the transition to a low-carbon economy, depending on expectations, in other words, their perception of risks and returns. If investors delay revising their expectations, but then their expectations change suddenly, this can lead to financial instability, making the transition more costly for society," explains lead author Stefano Battiston from the University of Zurich, Switzerland and Ca' Foscari University of Venice, Italy. "However, if investors adjust expectations in a timely fashion and reallocate capital into low-carbon investments early and gradually, they enable the transition, leading to smoother adjustments of the economy and of prices."

"Current mitigation scenarios implicitly assume that financing is provided by investors without assessment of risk, resulting in high financing costs and possible limits on funding, in particular for low-carbon firms. This is because the Integrated Assessment Models (IAMs) used in studies on the subject do not include actors such as banks that can decide to grant loans to firms, or actors like insurance firms and pension funds that can decide to invest (or not) in stock market shares of firms. As a result, in the NGFS scenarios, the orderly versus disorderly character of scenarios is assumed exogenously, independently of the role of the financial system," adds coauthor and IIASA Sustainable Service Systems Research Group Leader, Bas van Ruijven, who is also co-developer of the NGFS Climate Scenarios.

Why does this matter? Not modeling the feedback loop between the financial system and mitigation pathways limits our understanding of the dynamics and the feasibility of the low-carbon transition, and the ability to inform policy and investment decisions. This could also lead to an underestimation of risk across mitigation scenarios and trajectories of orderly and disorderly transitions.

"While climate mitigation scenarios describe what the world might look like in the next decades, they also have the power to shift markets' expectations today. This is because they are endorsed by many influential central banks and financial authorities in the world, as well as by large investors. It is therefore critical to understand if these scenarios for potential tomorrows could lead, unintentionally, to insufficient investments today. This presents an opportunity to interface IAMs with models that allow investors to carry out climate-financial risk assessments (CFRs)," says coauthor and IIASA Energy, Climate, and Environment Program Director, Keywan Riahi.

To this end, the authors have developed a framework to connect climate mitigation scenarios and financial risk assessment in a circular way, demonstrating the interplay between the role of the financial system and the timing of the climate policy introduction. IAMs generate sets of climate mitigation scenarios, which are then used by the CFR to model how investors assess the financial risk of high- and low-carbon firms along the IAM's trajectories. The resulting trajectories of financing cost across low- and high-carbon firms are fed back to the IAMs to update the respective mitigation scenarios, closing the loop between the IAM and the CFR.
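
Schematically, the loop can be sketched as below; the toy update rules, numbers and function names are invented for illustration and stand in for the actual IAM and CFR models, whose internals are far richer.

    # Self-contained toy of the circular IAM-CFR coupling described above.

    def run_iam(financing_cost_low_carbon: float) -> float:
        """Toy 'IAM': cheaper low-carbon financing leads to a larger low-carbon
        investment share in the resulting mitigation scenario."""
        return max(0.0, min(1.0, 0.9 - 4.0 * financing_cost_low_carbon))

    def run_cfr(low_carbon_share: float) -> float:
        """Toy 'CFR': a larger, more credible low-carbon build-out lowers
        investors' perceived transition risk and hence the financing cost."""
        return 0.10 - 0.06 * low_carbon_share

    cost = 0.08                        # initial low-carbon financing cost (8%)
    for iteration in range(20):
        share = run_iam(cost)          # IAM scenario given current financing costs
        new_cost = run_cfr(share)      # investors reassess risk along that scenario
        if abs(new_cost - cost) < 1e-6:
            break
        cost = new_cost

    print(f"converged after {iteration} iterations: "
          f"low-carbon share {share:.2f}, financing cost {cost:.3f}")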

By conditioning investment decisions on the credibility of climate policy scenarios, the study shows how the enabling or hampering role of the financial system can reverse the ordering of costs and benefits of climate mitigation policies, an ordering that is currently distorted because the financial system is not considered.

With regard to fiscal policies such as carbon pricing, the phasing out of fossil fuel subsidies, or the introduction of funding for renewable energy projects, neglecting the role of finance implies that a projected carbon price schedule could miss the emissions target: the mitigation scenario does not necessarily imply a risk perception by the financial system that leads to the investment reallocation the scenario assumes. The framework could thus help the IPCC community revise the carbon price projections obtained from climate mitigation models to make them more consistent with the role that the financial system plays.

"Our framework could support financial authorities in encouraging investors' assessment of climate-related financial risk. The new IAM-CFR scenarios would limit the underestimation of financial risk in climate stress-test exercises. Accounting for the role of the financial system also has implications for criteria used by central banks to identify eligible assets in their collateral frameworks and purchasing programs," concludes Irene Monasterolo from the Vienna University of Economics and Business and visiting scholar in the IIASA Energy, Climate, and Environment Program. "Furthermore, our results shed light on the importance for financial authorities to monitor and tame the possible moral hazard of the financial system in the dynamics of the low-carbon transition."

The team of authors comprises Stefano Battiston (Univ. of Zurich, Dept. of Banking and Finance, and Univ. Ca' Foscari of Venice), who is also a Lead Author of Chapter 15, "Finance and Investments," of the Assessment Report 6 of the IPCC, to be released in 2022; Keywan Riahi (IIASA), who is a Coordinating Lead Author of Chapter 3, "Mitigation Pathways," of the same IPCC report; Bas van Ruijven (IIASA), a member of the scientific consortium supporting the NGFS with climate scenarios; and Prof. Irene Monasterolo (Vienna Univ. of Economics and Business and Boston Univ.), who has been working on climate scenarios with central banks and development banks.

Credit: 
International Institute for Applied Systems Analysis

Newly identified antibody can be targeted by HIV vaccines

DURHAM, N.C. - A newly identified group of antibodies that binds to a coating of sugars on the outer shell of HIV is effective in neutralizing the virus and points to a novel vaccine approach that could also potentially be used against SARS-CoV-2 and fungal pathogens, researchers at the Duke Human Vaccine Institute report.

In a study appearing online May 20 in the journal Cell, the researchers describe an immune cell found in both monkeys and humans that produces a unique type of anti-glycan antibody. This newly described antibody has the ability to attach to the outer layer of HIV at a patch of glycans -- the chain-like structures of sugars that are on the surfaces of cells, including the outer shells of viruses.

"This represents a new form of host defense," said senior author Barton Haynes, M.D., director of the Duke Human Vaccine Institute (DHVI). "These new antibodies have a special shape and could be effective against a variety of pathogens. It's very exciting."

Haynes and colleagues -- including lead author Wilton Williams, Ph.D., director of the Viral Genetics Analysis Core at DHVI and co-author Priyamvada Acharya, Ph.D., director of the Division of Structural Biology at DHVI -- found the antibody during a series of studies exploring whether there might be an immune response targeted to glycans that cover the outer surface of HIV.

More than 50% of the virus's outer layer consists of glycans. Haynes said it has long been a tempting approach to unleash anti-glycan antibodies to break down these sugar structures, triggering immune B-cell lymphocytes to produce antibodies to neutralize HIV.

"Of course, it's not that simple," Haynes said.

Instead, HIV is cloaked in sugars that look like the host's glycans, creating a shield that makes the virus appear to be part of the host rather than a deadly pathogen.

But the newly identified group of anti-glycan antibodies -- referred to by the Duke team as Fab-dimerized glycan-reactive (FDG) antibodies -- had gone undiscovered as a potential option.

To date, there had been only one report of a similar anti-glycan HIV antibody with an unusual structure, found 24 years ago (termed 2G12). The Duke team has now isolated several FDG antibodies and found that they display a rare, never-before-seen structure that resembles 2G12. This structure enables the antibody to lock tightly onto a specific, dense patch of sugars on HIV, but not on other cellular surfaces swathed in host glycans.

"The structural and functional characteristics of these antibodies can be used to design vaccines that target this glycan patch on HIV, eliciting a B-cell response that neutralizes the virus," Williams said.

"These antibodies are actually much more common in blood cells than other neutralizing antibodies that target specific regions of the HIV outer layer," Williams said. "That's an exciting finding, because it overcomes one of the biggest complexities associated with other types of broadly neutralizing antibodies."

Williams said the FDG antibodies also bind to a pathogenic yeast called Candida albicans, and to viruses, including SARS-CoV-2, which causes COVID-19. Additional studies will explore ways of harnessing the antibody and deploying it against these pathogens.

Credit: 
Duke University Medical Center

Global study of glacier debris shows impact on melt rate

image: Debris covers Kennicott Glacier in Alaska.

Image: 
David Rounce

A large-scale research project at the University of Alaska Fairbanks Geophysical Institute has revealed insight into the relationship between surface debris on glaciers and the rate at which they melt.

The work is the first global assessment of Earth's 92,033 debris-covered glaciers and shows that debris, taken as a whole, substantially reduces glacier mass loss.

The results will affect sea level rise calculations and allow for improved assessment of hazards faced by nearby communities.

"This is the first step to enable us to start projecting how these debris-covered glaciers are going to evolve in the future and how they're going to affect glacial runoff and sea level rise," said glaciologist David Rounce, the lead author of a paper published April 28 in Geophysical Research Letters, a publication of the American Geophysical Union.

Rounce conducted the research while at the Geophysical Institute but now works at Carnegie Mellon University.

The effects of debris on regional and global projections of glacial melt have been unknown due to the lack of accurate debris thickness estimates, according to the study. Boulders, rocks and sand cover about 5% of Earth's ice caps and mountain glaciers. Alaska is one of the regions with the most debris cover.

"Debris-covered glaciers are going to become more prominent, and, yet, there wasn't a single global glacier model to account for debris-covered glaciers because we didn't know what the debris thickness was," Rounce said.

Rounce and Geophysical Institute glaciology professor Regine Hock, a co-author of the study, have been working on the problem since 2017 through their participation in NASA's High Mountain Asia Team and the agency's Sea Level Change Team.

The researchers used a combination of elevation-change data and surface temperatures from satellite imagery to estimate debris thickness across the entire surface of each of the world's debris-covered glaciers.

The research findings will allow for improved risk assessment for communities located near debris-covered glaciers. Rapid melting can lead to formation of glacial lakes, creating potential flood hazards.

The upper reaches of debris-laden glaciers, which tend to have a thin surface layer of rock and other material, melt faster than the more heavily covered lower reaches. That can -- though not always -- cause a glacier to stagnate and leave a stretch of dead ice at the lower end when the upper end melts to the point that it stops growing.

"And that dead ice can be prone to form glacial lakes," Rounce said. "So like on Kennicott Glacier, it has a glacial lake that's been forming for many years, and those glacial lakes can become flood hazards."

The research findings will also affect estimates of how sea level will change in a warming climate.

"We don't know what the impact is going to be on sea level rise estimates, but we know that it's likely going to change those estimates," Rounce said.

Alaska's Kennicott Glacier in Wrangell-St. Elias National Park and Preserve played a key role in the study as a source of validation for the research.

"Kennicott is singled out because it's one of the few debris-covered glaciers in Alaska that has debris thickness measurements and surface melt measurements beneath debris," he said. "There are very few debris-covered glaciers that have measurements that are publicly available."

Credit: 
University of Alaska Fairbanks

When Medicare chips in on hepatitis C treatment for Medicaid patients, everyone wins

Untreated hepatitis C can lead to serious and life-threatening health problems like cirrhosis and liver cancer. Direct-acting antiviral therapies introduced in recent years are highly effective, with cure rates above 95%.

But most Medicaid beneficiaries with hepatitis C don't get these drugs, which cost $20,000-$30,000, due to state budget constraints.

Now, a new USC study finds that a Medicaid-Medicare partnership could cover the lifesaving medications -- and still save $1 billion to $1.1 billion over 25 years. Medicaid is a joint federal and state program that provides health coverage for low-income families and others. Medicare is the federal health insurance program for people 65 and older.

The study was published today in the American Journal of Managed Care.

Researchers with the USC Schaeffer Center for Health Policy and Economics studied the state of Maryland's "total coverage" proposal, where the state receives a credit from Medicare to offset Medicaid investments in hepatitis C treatments that could lead to Medicare savings. Hepatitis C complications tend to snowball as time goes on, leading to costly treatment by the time the patient is eligible for Medicare.

"Our findings show that Medicare has significant financial incentives to partner with Medicaid to treat the majority of hepatitis C cases," said William Padula, a fellow with the USC Schaeffer Center and an assistant professor of pharmaceutical and health economics at the USC School of Pharmacy. "Joint Medicaid-Medicare coverage provides a cost-effective solution to treat all patients now to reduce harm caused by chronic infection in the United States."

Modeling different treatment options for hepatitis C

Padula and colleagues developed a model to simulate three different pathways for patients going through the care continuum for hepatitis C. The first pathway simulated standard coverage with 50% probability of screening for the virus and 20% probability of treatment with direct-acting antiviral therapies -- reflecting current state policies on screening and coverage decisions.

Two additional models simulated Medicare credits to the state Medicaid program. One was a risk-stratified total coverage model, which assumed an 80% probability of screening and a 60% treatment rate; the other simulated total coverage, with an 80% probability of screening and a 100% treatment rate.
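
To make the three pathways concrete, here is a deliberately simplified sketch using only the screening and treatment probabilities quoted above and the greater-than-95% cure rate of direct-acting antivirals; the cohort size is arbitrary, and the actual model also tracks disease progression, costs, and Medicare offsets, which are omitted here.

    # Simplified expected-cures comparison; not the study's full simulation model.

    CURE_RATE = 0.95
    COHORT = 1_000   # arbitrary number of Medicaid beneficiaries with hepatitis C

    pathways = {
        "standard coverage":        {"p_screen": 0.50, "p_treat": 0.20},
        "risk-stratified coverage": {"p_screen": 0.80, "p_treat": 0.60},
        "total coverage":           {"p_screen": 0.80, "p_treat": 1.00},
    }

    for name, p in pathways.items():
        expected_cures = COHORT * p["p_screen"] * p["p_treat"] * CURE_RATE
        print(f"{name:26s} ~{expected_cures:5.0f} expected cures per 1,000")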

The study found that the total coverage model saved $158 per patient and the risk-stratified coverage saved $178 per patient, compared with standard care. Total coverage would save $1 billion and risk-stratified total coverage would save $1.1 billion over the course of 25 years.

The researchers say their model allows states to calculate the costs and savings of a total coverage policy, and that state public health agencies should explore using it to meet their needs.

"Maryland may be one of the first states to pilot the concept of a total coverage solution for hepatitis C treatment through joint Medicare-Medicaid payments, but all 50 states are grappling with this challenge," Padula said. "Cost sharing between the two programs will also improve the lives of patients and minimize the probabilities of future infection."

Credit: 
University of Southern California

Worrying about your heart increases risk for mental health disorders

image: Michael Zvolensky, Hugh Roy and Lillie Cranz Cullen Distinguished University Professor of psychology at the University of Houston, reports that those who worry about heart disease in the Latinx community are at greater risk for depression

Image: 
University of Houston

For coffee drinkers, a common scenario might involve drinking an extra cup only to end up with a racing heart and a subtle reminder to themselves to cut down the caffeine. But for those who have a different thinking pattern, one that includes heart-focused anxiety, the racing heart might conclude with the fear of a heart attack and a trip to the emergency room.

It turns out young Latinx adults who experience heart-focused anxiety could be at greater risk for mental health disorders.

"We have empirical evidence that individual differences in heart-focused anxiety are related to more severe co-occurring anxiety and depressive symptomatology among a particularly at-risk segment of the Latinx population," reports Michael Zvolensky, Hugh Roy and Lillie Cranz Cullen Distinguished University Professor of psychology at the University of Houston, in the Journal of Racial and Ethnic Health Disparities. The population segment to which Zvolensky refers is Latinx young adults with previous trauma who were born in the United States. Their trauma might include racism related and transgenerational stress.

This is only the second study on heart-related anxiety in the Latinx community, both conducted by Zvolensky.

"In our first study, we assessed middle aged adults, presumably more concerned about their health. This study is unique, however, because even among a group generally too young to experience mounting health concerns, we are seeing a similar pattern, which tells us it's probably relevant to the whole Latinx population," said Zvolensky.

According to previous research, the Latinx population can somaticize mental health problems, meaning they don't view them as mental health issues, but rather turn them into physical symptoms and report them as such. As examples, anxiety might be reported as a headache or a problem with breathing.

"This population also struggles with a lot of chronic physical health co-morbidities including heart disease and obesity, so this research is a good fit for a population who tends to blame mental health issues on physical ailments, which generates greater mental health risk," said Zvolensky, who is also director of the Anxiety and Health Research Laboratory/Substance Use Treatment Clinic at UH.

To make matters worse, treatment for mental health conditions among Latinx populations is often limited or nonexistent.

"Latinx persons underutilize mental health services compared to non-Latinx whites and are more likely to use primary care for the delivery of mental health services which are often inadequate for successfully addressing mental health problems," said Zvolensky, who created and assessed reports from 169 college aged, Latinx college students who had been exposed to trauma.

"Results indicated that heart-focused anxiety was a statistically significant predictor for general depression and overall anxiety," said Zvolensky.

Clinically, the results of the study could ultimately guide the development of specialized intervention strategies.

"We can screen for heart-focused anxiety and that's much more efficient and precise than screening for a whole range of mental health problems," said Zvolensky. "If you reduce heart-focused anxiety, you do that person a great service because you're likely decreasing their risk for a whole range of mental health problems. And that's called precision medicine."

Credit: 
University of Houston

New smartphone app predicts vineyard yields earlier, more accurately

ITHACA, N.Y. - Cornell University engineers and plant scientists have teamed up to develop a low-cost system that allows grape growers to predict their yields much earlier in the season and more accurately than costly traditional methods.

The new method allows a grower to use a smartphone to record video of grape vines while driving a tractor or walking through the vineyard at night. Growers may then upload their video to a server to process the data. The system relies on computer vision to improve the reliability of yield estimates.

Traditional methods for estimating grape cluster numbers are often done manually by workers, who count a subset of clusters on vines and then scale their numbers up to account for the entire vineyard. This strategy is laborious, costly and inaccurate, with average cluster count error rates of up to 24% of actual yields. The new method cuts those maximum average error rates by almost half.

"This could be a real game-changer for small and medium-sized farms in the Northeast," said Kirstin Petersen, assistant professor of electrical and computer engineering in the College of Engineering.

Petersen is a co-author of the paper, "Low-Cost, Computer Vision-Based, Prebloom Cluster Count Prediction in Vineyards," which published in the journal Frontiers in Agronomy. Jonathan Jaramillo, a doctoral student in Petersen's lab, is the paper's first author; Justine Vanden Heuvel, professor in the School of Integrative Plant Science Horticulture Section in the College of Agriculture and Life Sciences, is a co-author.

When workers manually count clusters on a vine, accuracy greatly depends on the person counting. In an experiment, the researchers found that for a panel of four vines containing 320 clusters, manual counts ranged from 237 to 309. Workers will count the number of grape clusters in a small portion of the vineyard to get an average number of clusters per row. Farmers will then multiply the average by the total number of rows to predict yields for a vineyard. When cluster numbers are miscounted, multiplying only further amplifies inaccurate yield predictions.
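
A small worked example of that amplification, using the panel counts quoted above (320 actual clusters, manual counts from 237 to 309); the number of panels in the vineyard is a made-up figure purely for illustration:

    # Worked example: how a per-panel miscount scales up across a vineyard.

    actual_panel = 320                 # true clusters on the sampled four-vine panel
    manual_counts = [237, 309]         # low and high manual counts from the study
    panels_in_vineyard = 500           # hypothetical number of comparable panels

    true_total = actual_panel * panels_in_vineyard
    for count in manual_counts:
        estimate = count * panels_in_vineyard
        error_pct = 100 * (estimate - true_total) / true_total
        print(f"manual count {count}: estimated {estimate:,} clusters "
              f"({error_pct:+.0f}% vs. true {true_total:,})")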

"We showed that compared to the technology, a farmer would have to manually count 70% of their vineyard to gain the same level of confidence in their yield prediction," Petersen said, "and no one would do that."

High tech, accurate robot counters do exist, but they cost upwards of $12,000, making them inaccessible to small and medium-sized growers. Another disadvantage: they count grapes when they are closer to ripening, late in the season, in September or October. The new method counts clusters in May to June.

"Having good predictions earlier in the season gives farmers more time to act on information," Jaramillo said. Famers may then secure labor and buyers in advance. Or, if they are making wine, they can acquire the right amount of equipment for producing and packaging it. "Not having these things lined up in advance can cause problems for growers last minute and ultimately reduce profits," Jaramillo said.

Now an unskilled laborer can simply drive a tractor up and down the rows with a smartphone set up on a gimbal. While details of a public release are still being worked out, the researchers will field test an app this summer. The researchers intend for the app to be open source, with the machine learning components set up so that users simply upload their video to a server that processes the data for them.

Credit: 
Cornell University

Scientists reveal structural details of how SARS-CoV-2 variants escape immune response

LA JOLLA, CA--Fast-spreading variants of the COVID-19-causing coronavirus, SARS-CoV-2, carry mutations that enable the virus to escape some of the immune response created naturally or by vaccination. A new study from scientists at Scripps Research, along with collaborators in Germany and the Netherlands, has revealed key details of how these escape mutations work.

The scientists, whose study appears in Science, used structural biology techniques to map at high resolution how important classes of neutralizing antibodies bind to the original pandemic strain of SARS-CoV-2--and how the process is disrupted by mutations found in new variants first detected in Brazil, the United Kingdom, South Africa and India.

The research also highlights that several of these mutations are clustered in one site, known as the "receptor binding site," on the spike protein of the virus. Other sites on the receptor binding domain are unaffected.

"An implication of this study is that, in designing next-generation vaccines and antibody therapies, we should consider increasing the focus on other vulnerable sites on the virus that tend not to be affected by the mutations found in variants of concern," says co-lead author Meng Yuan, PhD.

Yuan is a postdoctoral research associate in the laboratory of senior author Ian Wilson, DPhil, Hansen Professor of Structural Biology and Chair of the Department of Integrative Structural and Computational Biology at Scripps Research.

How 'variants of concern' escape immune response

SARS-CoV-2 "variants of concern" include the UK's B.1.1.7 variants, South Africa's B.1.351 variants, Brazil's P.1 variants and India's B.1.617 variants. Some of these variants appear to be more infectious than the original Wuhan strain. Recent studies have found that antibody responses generated through natural infection to the original strain or via vaccination are less effective in neutralizing these variant strains.

Because of the variants' potential to spread and cause disease--perhaps in some cases, despite vaccination--scientists consider it urgent to discover how the variants manage to escape much of the prior immune response in the body, including the antibody response.

In the study, the researchers focused mainly on three mutations in the SARS-CoV-2 spike protein: K417N, E484K and N501Y. Alone or in combination, these mutations are found in most major SARS-CoV-2 variants. All of the mutations are found in the SARS-CoV-2 receptor binding site, which is where the virus attaches to host cells.

The researchers tested representative antibodies from the major classes that target the general area in and around the receptor binding site. They found that many of these antibodies lose their ability to effectively bind and neutralize the virus when the mutations are present.

Using structural imaging techniques, the team then mapped the relevant portion of the virus at atomic-scale resolution to examine how the mutations affect sites where antibodies otherwise would bind and neutralize the virus.

"This work provides a structural explanation for why antibodies elicited by COVID-19 vaccines or natural infection by the original pandemic strain are often ineffective against these variants of concern," Wilson says.

Zeroing in on points of vulnerability

The findings suggest that while antibody responses to the SARS-CoV-2 receptor binding site can be very potent in neutralizing the original Wuhan strain, certain variants are able to escape--perhaps eventually necessitating updated vaccines.

At the same time, the study underlines the fact that the three key viral mutations, which SARS-CoV-2 seems inherently prone to develop, do not alter other vulnerable sites on the virus outside the receptor binding site. The researchers specifically showed that virus-neutralizing antibodies targeting two other areas outside the receptor binding site were largely unaffected by these three mutations.

This suggests that future vaccines and antibody-based treatments could provide broader protection against SARS-CoV-2 and its variants by eliciting or utilizing antibodies against parts of the virus that lie outside the receptor binding site. The researchers note that broad protection against variants may be necessary if, as seems likely, the virus becomes endemic in the human population.

The Wilson lab and collaborators in this study are continuing to study human antibody responses to variants of concern and hope to identify strategies for broad protection against not only SARS-CoV-2 and its variants but also SARS-CoV-1 and other related, emergent coronaviruses.

Credit: 
Scripps Research Institute