Culture

Science snapshots from Berkeley Lab -- April 1, 2021

image: Staff engineer Bruis van Vlijmen demonstrates how he works in the Battery Informatics Lab at SLAC.

Image: 
Jacqueline Orrell/SLAC National Accelerator Laboratory

X-Ray Experiments, Machine Learning Could Trim Years Off Battery R&D

By Glenn Roberts Jr.

An X-ray instrument at Berkeley Lab contributed to a battery study that used an innovative approach to machine learning to accelerate understanding of a process that shortens the life of fast-charging lithium batteries.

Researchers used Berkeley Lab's Advanced Light Source, a synchrotron that produces light ranging from the infrared to X-rays for dozens of simultaneous experiments, to perform a chemical imaging technique known as scanning transmission X-ray microscopy, or STXM, at a state-of-the-art ALS beamline dubbed COSMIC.

Researchers also employed "in situ" X-ray diffraction at another synchrotron - SLAC's Stanford Synchrotron Radiation Lightsource - to recreate the conditions present in a working battery, and they supplemented the measurements with a many-particle battery model. All three forms of data were combined in a format that helped the machine-learning algorithms learn the physics at work in the battery.

While typical machine-learning algorithms seek out images that either do or don't match a training set of images, in this study the researchers applied a deeper set of data from experiments and other sources to enable more refined results. It represents the first time this brand of "scientific machine learning" was applied to battery cycling, researchers noted. The study was published recently in Nature Materials.

The study benefited from the COSMIC beamline's high-speed, high-resolution imaging capabilities, which made it possible to single out the chemical states of about 100 individual particles. Young-Sang Yu, a research scientist at the ALS who participated in the study, noted that each selected particle was imaged at about 50 different energy steps during the cycling process, for a total of 5,000 images.

The data from ALS experiments and other experiments were combined with data from fast-charging mathematical models, and with information about the chemistry and physics of fast charging, and then incorporated into the machine-learning algorithms.

"Rather than having the computer directly figure out the model by simply feeding it data, as we did in the two previous studies, we taught the computer how to choose or learn the right equations, and thus the right physics," said Stanford postdoctoral researcher Stephen Dongmin Kang, a study co-author.

Patrick Herring, senior research scientist for Toyota Research Institute, which supported the work through its Accelerated Materials Design and Discovery program, said, "By understanding the fundamental reactions that occur within the battery, we can extend its life, enable faster charging, and ultimately design better battery materials."

Read SLAC's release, "In a leap for battery research, machine learning gets scientific smarts."

To Speed Discovery, Infrared Microscopy Goes 'Off the Grid'

By Lori Tamura

Question: What do a roundworm, a Sharpie pen, and high-vacuum grease have in common? Answer: They've all been analyzed in recent proof-of-principle microscopy experiments at Berkeley Lab's Advanced Light Source (ALS).

In the journal Communications Biology, researchers from Caltech, UC Berkeley, and the Berkeley Synchrotron Infrared Structural Biology Imaging Program (BSISB) reported a more efficient way to collect "high-dimensional" infrared images - where each pixel contains rich physical and chemical information. With the new method, scans that would've taken up to 10 hours to complete can now be done in under an hour, potentially broadening the scope of biological spectromicroscopy to time-sensitive experiments.

"We realized that sampling our model organism - the small roundworm C. elegans - as it changes over time was challenging for software rather than hardware reasons," said Elizabeth Holman, a graduate student in chemistry at Caltech and co-first author of the paper. "For example, image sampling was limited to uniform-grid raster scans with rectangular boundaries and fixed distances between sample points."

The new technique, implemented at the ALS with co-first author Yuan-Sheng Fang, a graduate student in physics at UC Berkeley, uses a grid-less, adaptive approach that autonomously increases sampling in areas displaying greater physical or chemical contrast. In the proof-of-concept infrared microscopy experiments, the researchers examined two samples.

The first was a two-component system in which both components (permanent-marker ink and high-vacuum grease) were well characterized. Details of the sample were very difficult to see clearly with the naked eye, so it was a good test of how the software would perform with minimal guidance from a human experimenter. The second sample was a live, larval-stage C. elegans, a biological model system studied by thousands of researchers.

In both cases, autonomous adaptive data acquisition (AADA) methods clearly outperformed nonadaptive methods. In the second example, increased sampling density corresponded with known C. elegans anatomical features, and the head region was mapped in 45 minutes versus about 4.9 hours using commercially available software.
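
The published AADA software is not reproduced here, but the core idea, spending a fixed measurement budget where the signal shows the most contrast, can be sketched in a few lines. In the toy example below, the one-dimensional "sample" and its single sharp feature are hypothetical; the point placement ends up concentrated around the feature without any predefined grid.

```python
# Toy sketch of grid-less adaptive sampling (not the published AADA software):
# start from a coarse set of points, then repeatedly add a point in the gap
# where the measured signal changes the most, so sampling densifies around
# high-contrast features.
import numpy as np

def signal(x):                                   # pretend measurement: sharp feature at x = 0.3
    return np.exp(-((x - 0.3) ** 2) / 0.002)

xs = list(np.linspace(0.0, 1.0, 5))              # coarse seed scan
ys = [float(signal(x)) for x in xs]

for _ in range(40):                              # fixed measurement budget
    order = np.argsort(xs)
    x_sorted, y_sorted = np.array(xs)[order], np.array(ys)[order]
    gap_score = np.abs(np.diff(y_sorted))        # contrast across each gap
    i = int(np.argmax(gap_score))                # most informative gap
    x_new = 0.5 * (x_sorted[i] + x_sorted[i + 1])
    xs.append(float(x_new))
    ys.append(float(signal(x_new)))              # "measure" the new point

in_feature = np.mean((np.array(xs) > 0.25) & (np.array(xs) < 0.35))
print(f"{len(xs)} points measured; fraction inside the feature region: {in_feature:.2f}")
```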

"Outside of our specific published work, the results suggest that integrating AADA into existing scanning-based satellite, drone, and/or microscope techniques can facilitate research in fields ranging from hyperspectral remote sensing to ocean and space exploration," said Holman.

Actor in a Supporting Role: Substrate Effects on 2D Layers

By Lori Tamura

Atomically thin layers are of great technological interest because of potentially useful electronic properties that emerge as the layer thickness approaches the 2D limit. Such materials tend to form weak bonds outside the layer and are thus generally assumed to be unaffected by substrates that provide physical support.

To make further progress, however, scientists must rigorously test this assumption, not only to better understand single-layer physics, but also because the existence of substrate effects raises the possibility of tuning layer properties by tweaking the substrate.

As reported in the journal Physical Review Letters, a team led by Tai-Chang Chiang of the University of Illinois at Urbana-Champaign and his postdoctoral associate, Meng-Kai Lin, used Berkeley Lab's Advanced Light Source (ALS) to probe changes in the electronic properties of a 2D semiconductor, titanium telluride, as the thickness of a substrate, platinum telluride, was increased. Single-layer titanium telluride is highly sensitive to what lies underneath, making it particularly useful as a test case for investigating substrate coupling effects.

The results showed that as the substrate thickness increased, a dramatic and systematic variation occurred in the single-layer titanium telluride. An electronic phenomenon known as a charge density wave -- a coupled charge and lattice distortion characteristic of single-layer titanium telluride -- was suppressed.

"The experimental findings, combined with first-principles theoretical simulations, led to a detailed explanation of the results in terms of the basic quantum mechanical interactions between the single layer and the tunable substrate," said Lin.

Given that the interfacial bonding remained weak, the researchers concluded that the observed changes were correlated with the substrate's transformation from a semiconductor to a semimetal as it increased in thickness.

"This systematic study illustrates the crucial role that substrate interactions play in the physics of ultrathin films," said Lin. "The scientific understanding derived from our work also provides a framework for designing and engineering ultrathin films for useful and enhanced properties."

Credit: 
DOE/Lawrence Berkeley National Laboratory

Distinct Parkinson's disease symptoms tied to different brain pathways

image: A three-dimensional rendering of a mouse hemisphere shows brain-wide projection patterns of GPe neurons labeled by mRuby2 (soma, axonal fibers) and eGFP (pre-synaptic sites).

Image: 
Lim Lab, UC San Diego

Parkinson's disease (PD) is well known as a debilitating disease that gradually worsens over time. Although the disease's progression has been largely tied to the loss of motor functions, non-motor symptoms, including the loss of cognitive abilities, often emerge early in the disease.

Much less understood is the role that specific neural circuits play in these distinct motor and non-motor functions.

A new study led by neurobiologists at the University of California San Diego and their colleagues found that specific, identifiable neural pathways are charged with particular functions during stages of the disease. Their findings, published recently in Nature Neuroscience, can help form the basis for improving therapeutic strategies for precise symptoms of Parkinson's at various levels of disease progression.

The researchers used a mix of approaches to shed more light on the anatomical and functional importance of a center of brain circuitry known as the basal ganglia, located deep in the cranium. Specifically, the researchers, working in mice, investigated circuit pathways tied to specific neurons in the external globus pallidus, or GPe, and their role in different Parkinson's disease-related behaviors. The GPe is known for its strong output and influence on several downstream brain regions.

The investigations included a multi-pronged approach using electrophysiology, viral tracing and behavioral experiments. The researchers identified two populations of GPe neurons and their distinctive pathways tied to different behavioral symptoms.

"Our work demonstrates that the distinct neural circuitries in the basal ganglia are differentially involved in the motor and non-motor symptoms of Parkinsonian-like behaviors that occur at different stages of the disease," said Lim, an associate professor in the Neurobiology Section of the Division of Biological Sciences at UC San Diego. "This suggests that evaluation of the detailed circuit mechanisms is needed to fully understand the changes in brain during the progression of PD, and could provide better therapeutic strategies for the treatment of PD."

Lim said the most surprising finding from the research was the fact that dopaminergic neurons, those that are gradually lost during Parkinson's disease progression, could be linked so specifically to changes in different brain areas.

"Selective manipulation of specific changes can rescue one type of symptom--without affecting other symptoms--of Parkinson's Disease," said Lim.

With the new framework in hand, Lim and his colleagues are now looking deeper at the circuit pathways and how they are tied to different disease symptom stages, in particular with an emphasis on delaying the progression of the disease.

"Our findings provide a novel framework for understanding the circuit basis of varying behavioral symptoms of the Parkinsonian state, which could provide better strategies for the treatment of PD," the researchers write in the paper.

Credit: 
University of California - San Diego

Therapeutic resistance linked to softer tissue environment in breast cancer

image: Compared with an untreated breast tumor (left), the extracellular collagen matrix (blue) is reorganized in a residual tumor after therapy (right), activating a signaling pathway in tumor cells (magenta) that allows them to survive and initiate tumor recurrence.

Image: 
©2021 Drain et al. Originally published in Journal of Experimental Medicine. https://doi.org/10.1084/jem.20191360

Researchers at the University of California, San Francisco, have discovered that aggressive, triple-negative breast cancers (TNBCs) can evade treatment by reorganizing and softening the collagen matrix that surrounds the cancer cells. The study, which will be published April 2 in the Journal of Experimental Medicine (JEM), shows that the softer matrix activates a signaling pathway that promotes the cancer cells’ survival, and suggests that targeting this pathway could enhance the effectiveness of chemo- and radiotherapy in TNBC patients.

TNBC is an aggressive type of breast cancer with worse survival rates than other forms of the disease. Because TNBC cells lack the HER2 signaling receptor and the estrogen and progesterone hormone receptors, they cannot be eliminated by drugs that specifically target these proteins. Instead, TNBC patients are normally treated with a combination of surgery and chemotherapy/radiation, but these treatments are often unable to completely eliminate the tumor, leading to local recurrence and metastasis to other tissues.

“Currently, there are no reliable methods to identify patients that are likely to respond well and those that are likely to be resistant to treatment,” says Valerie M. Weaver, a professor and Director of the Center for Bioengineering and Tissue Regeneration in the Department of Surgery at UCSF. “There is an urgent need for a deeper understanding of the biological basis for therapeutic response in TNBC that will inform treatment decisions and the development of effective strategies to overcome therapy resistance.”

One factor that could influence the response to therapy is the meshwork of collagen and other proteins that surrounds the cancer cells within tumors. This extracellular matrix can activate various signaling pathways within cells that determine whether or not cells die when they are exposed to stress.

Weaver and colleagues, including Catherine C. Park, professor and Chair of the Department of Radiation Oncology at UCSF, and graduate student Allison P. Drain, analyzed the extracellular matrix in tumor samples taken from TNBC patients. In untreated TNBC, collagen proteins were aligned into long, linear fibers, creating a matrix that is much stiffer than normal, healthy breast tissue. But in TNBC treated with chemotherapy, the collagen fibers in the remaining tumor were reorganized to form a much softer matrix.

“This raised the intriguing possibility that the softened, remodeled extracellular matrix might be causally linked to the pathogenesis of the treatment-resistant, residual tumor tissue,” Weaver says.

The researchers found that tumors grown in the lab or in mice were more resistant to both radiation and the chemotherapy agent paclitaxel when they were surrounded by a softer extracellular matrix. Cancer cells grown in a stiff environment activate a signaling protein called JNK that makes the cells more likely to die in response to stress. In contrast, cancer cells surrounded by a softer matrix activate a protein called NF-kB that counteracts JNK and promotes cell survival. Accordingly, treating mice with a drug that inhibits NF-kB improved their therapeutic response, significantly slowing tumor growth when combined with radiotherapy.

Weaver and colleagues found that TNBC patients whose tumors had high levels of NF-kB activity showed higher rates of treatment resistance. NF-kB activity was also elevated in remodeled, softened tumors that remained after treatment.

“Therapies designed to modulate stiffness-dependent NF-κB or JNK activity might therefore improve the treatment response of TNBCs and enhance the long-term survival of these patients,” Weaver says. “However, further efforts will be needed to clarify the nature of the mechanisms that mediate treatment resistance and develop effective, selective, and safe therapeutics.”

Credit: 
Rockefeller University Press

Whole-body screening and education in melanoma-prone families may improve early detection rates

Bottom Line: Among patients at high risk of melanoma, those who received routine skin cancer screening and education about skin self-exams were significantly more likely to be diagnosed with thinner and earlier stage melanomas.

Journal in Which the Study was Published: Cancer Epidemiology, Biomarkers & Prevention, a journal of the American Association for Cancer Research

Author: Michael Sargen, MD, a dermatologist and clinical fellow in the Division of Cancer Epidemiology and Genetics at the National Cancer Institute (NCI), part of the National Institutes of Health (NIH)

Background: "Whole-body screening for melanoma is currently routine for individuals at high risk for melanoma. These individuals include members of melanoma-prone families, categorized as having at least two relatives who have had melanoma, and those with inherited pathogenic gene variants that increase melanoma risk," said Sargen. "However, the benefit of screening in melanoma-prone families has not been previously quantified."

How the Study was Conducted: To better understand if screening and education among melanoma-prone families could result in earlier melanoma detection, Sargen and colleagues evaluated data from the NCI Familial Melanoma Study, which was initiated in 1976 to investigate inherited and environmental risk factors for the disease. "At enrollment and subsequent in-person visits to the NIH, study participants received whole-body screening for melanoma, total body photographs with closeups of potentially problematic moles, education about the appearance of melanoma, and strategies for protecting their skin from ultraviolet (UV) damage," Sargen explained. Participants in the study were also counseled to follow up with their local dermatologist annually for whole-body screening exams.

To evaluate the success of the study, Sargen and colleagues compared differences in melanoma thickness and tumor stage between participants diagnosed with melanoma before and after enrollment (the pre-study cohort and the prospective cohort, respectively). "Tumor thickness, or how deep the tumor grows beneath the surface of the skin, is associated with an increased risk of death from melanoma," said Sargen. The researchers also compared tumor thickness trends between participants in their study and cases in the general population by using data from Surveillance, Epidemiology, and End Results (SEER) registries.

Results: There was a total of 293 melanoma cases in the NCI Familial Melanoma Study, with 246 cases in the pre-study cohort and 47 cases in the prospective cohort. Participants enrolled in the study from 1976 through 2014. A total of 79,530 cases in the SEER registries, diagnosed between 1973 and 2016, were analyzed. Because all of the participants in the NCI Familial Melanoma Study were white, analyses of SEER data were also restricted to white patients. Information on melanoma thickness was missing for 24 percent of melanoma cases in the NCI Familial Melanoma Study and 8.7 percent of melanoma cases found in the SEER registry; the researchers imputed the missing data.

Sargen and colleagues evaluated if the interventions in the NCI Familial Melanoma Study impacted melanoma thickness and tumor stage at diagnosis by comparing these features between the prospective and pre-study cohorts. After adjusting for gender and age, the researchers found that cases in the prospective cohort had significantly thinner melanomas compared with cases in the pre-study cohort (0.6mm versus 1.1mm, respectively). Further, cases in the prospective cohort were significantly more likely to be diagnosed at the early T1 stage compared with cases in the pre-study cohort (83 percent versus 40 percent, respectively).
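
As a rough illustration of what such an age- and sex-adjusted comparison looks like in practice, the sketch below fits a linear model with a cohort indicator on synthetic data. The only numbers taken from the article are the cohort sizes; this is not the study's actual statistical model or data.

```python
# Hypothetical sketch of an age- and sex-adjusted comparison of melanoma
# thickness between two cohorts. The data are synthetic; only the cohort
# sizes (246 pre-study, 47 prospective) come from the article.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n_pre, n_pro = 246, 47
cohort = np.r_[np.zeros(n_pre), np.ones(n_pro)]       # 0 = pre-study, 1 = prospective
age    = rng.normal(45, 15, n_pre + n_pro)
male   = rng.integers(0, 2, n_pre + n_pro)

# Synthetic thickness in mm, built so the prospective cohort is ~0.5 mm thinner.
thickness = (1.1 - 0.5 * cohort + 0.005 * (age - 45) + 0.05 * male
             + rng.normal(0, 0.3, n_pre + n_pro))

X = np.column_stack([cohort, age, male])
fit = LinearRegression().fit(X, thickness)
print("adjusted difference, prospective minus pre-study (mm):", round(fit.coef_[0], 2))
```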

Next, the researchers evaluated whether changes in melanoma thickness over time among cases in the NCI Familial Melanoma Study differed from trends observed in the general population. "Melanoma thickness at diagnosis in the United States has been decreasing since 1973, when we started tracking such data," Sargen explained. When the researchers compared the observed tumor thickness among members of melanoma-prone families with the expected tumor thickness based on the U.S. general population with respect to age, calendar period, and gender, they found that the melanomas diagnosed after enrollment in the NCI Familial Melanoma study were thinner than the predictions based on the U.S. general population.

Author's Comments: "This suggests that the downward trend in melanoma thickness observed in the general population does not fully explain the reductions in thickness seen in melanoma-prone families, and that long-term surveillance may assist in the earlier diagnosis of melanoma in high-risk populations," Sargen said.

"Our results suggest that the screening and education provided in the NCI Familial Melanoma Study may improve early detection of melanoma in melanoma-prone families," he added.

Study Limitations: Limitations of this study include the relatively small sample size of melanoma cases in the NCI Familial Melanoma Study and the imputation of missing melanoma thickness data.

"Additionally, since this was a prospective cohort study, we were not able to distinguish the independent effect of each intervention," Sargen said. "Randomized control studies are needed to understand the impact of each aspect of the intervention, such as whole-body screening, melanoma education, or strategies for skin protection."

Credit: 
American Association for Cancer Research

Consumer resistance to sustainability interventions

Researchers from the University of Queensland, the University of Melbourne, and Universidad Finis Terrae published a new paper in the Journal of Marketing that studies consumer resistance to a nationwide plastic bag ban implemented in Chile in 2019.

The study, published in the Journal of Marketing, is titled "'How Do I Carry All This Now?': Understanding Consumer Resistance to Sustainability Interventions" and is authored by Claudia Gonzalez-Arcos, Alison M. Joubert, Daiane Scaraboto, Rodrigo Guesalaga, and Jörgen Sandberg.

As environmental crises accelerate, governments are searching for solutions to reduce the negative impacts of economic activity. One popular measure has been to ban disposable plastic bags. Such bans may seem like an easy way to reduce plastic pollution, but they are often met with strong pushback from consumers, retailers, and other members of society.

The researchers studied consumer resistance to a nationwide plastic bag ban implemented in Chile in February 2019 with the goal of answering the following questions: What causes consumer resistance to sustainability interventions such as bans on plastic bags? And how can consumer resistance be reduced to make such interventions more effective? They conducted interviews, observed consumers, and collected documents, news articles, and social media posts related to the Chilean ban, starting in 2013 when an initial ban in coastal areas was announced until four months after the implementation of the ban in the entire country in 2019.

Gonzalez-Arcos says, "We discovered that consumers refuse to accept or support a sustainability intervention because the individual behaviors being targeted--in this case using disposable plastic bags for shopping--are not separate from, but embedded in, social practices." Social practices are activities, materials, and meanings that are similarly understood and shared by a group of people. Eating, cooking, shopping, driving, and reading are examples of social practices shared by large groups. These practices determine people's way of life and, to a large extent, who they are. From this perspective, a behavior such as using a plastic bag to carry groceries is simply a performance of the socially shared, habituated practice of shopping.

An intervention like banning plastic bags triggers change in the social practice of shopping because plastic bags are one of the materials that constitute this practice. Plastic bags are used in many shopping activities (bagging groceries, carrying them home) and meanings (convenience, speed). To change the social practice of shopping after a plastic bag ban, consumers need to navigate three processes: (1) sensemaking, which means understanding and developing new meanings for the changing shopping practice; (2) accommodating, which means developing new competencies for using and handling the new materials used to shop; and (3) stabilizing, which means performing the changed practice often and efficiently. Consumers do not find these processes easy: engaging in them disrupts routines, lifestyles, and even consumers' perceptions of themselves.

In addition to changed social practices such as shopping, consumers face other challenges that prompt them to resist plastic bag bans. Bans and other sustainability interventions often fail to acknowledge that individual actions are part of broader social practices, and they aim for individual behavioral change rather than change to the social practice itself. Consequently, consumers face three major challenges: (1) battles about who is responsible for making practices more sustainable; (2) unsettling emotions brought about by the changing practice; and (3) changes to other linked practices that dismantle their ways of life. Once these reasons for consumer resistance are known, it becomes clearer why consumers push back against sustainability interventions.

"Consumer resistance interferes with social practice change, which significantly undermines the effectiveness of the sustainability intervention. Our findings show policy makers and other agents involved in sustainability interventions that changing social practices--not individual behaviors--should be their primary goal," explains Joubert. The study presents a framework for designing and managing practice-based sustainability interventions that considers the role of consumer resistance.

To plan and design a practice-based intervention, the researchers recommend the following steps:

Identify the practice being targeted (e.g., shopping), and how it is likely to be disrupted (e.g., material will be eliminated--plastic bags).

Distribute responsibility for change among those involved in the practice (e.g., consumers, retailers, bag manufacturers, government).

Determine potential emotions that may manifest (positive to leverage and negative to placate).

Identify links between the targeted practice and other social practices (e.g., the plastic bags used in shopping are also used for garbage and waste management).

Next, to monitor and adjust practice-based interventions if consumer resistance emerges, these three main strategies are recommended:

Refocus sensemaking if consumers are experiencing tension and lacking focus.

Encourage accommodation if consumers are avoiding risks and restricting their experimentation during the change process.

Accelerate stabilization if consumers are grappling with discomfort and do not seem to be able to settle with a new version of the social practice.

Credit: 
American Marketing Association

Criteria for selecting COVID-19 patients for lung transplantation

In May 2020, a team led by thoracic surgeon Konrad Hoetzenecker of the Department of Surgery of MedUni Vienna and Vienna General Hospital performed a lung transplant on a 44-year-old patient who had been seriously ill with Covid-19, making her the first patient in Europe to receive a lung transplant for this indication. The Vienna lung transplantation programme now plays a leading role in an international consortium comprising experts from the USA, Europe and Asia. Based on the expertise from Vienna, approximately 40 transplants have now been carried out on Covid-19 patients throughout the world. In a study published in the leading journal "The Lancet Respiratory Medicine", the consortium has now proposed the first general selection criteria for lung transplantation in Covid-19 patients.

"We have collated the first experiences in the world of performing lung transplants on Covid-19 patients. It is clear that such a complex intervention should only be considered for patients who, by virtue of their age and good general health, have a good chance of recovery with new lungs," explains Konrad Hoetzenecker, Head of the lung transplantation programme at MedUni Vienna and Vienna General Hospital. The Vienna team performs around 100 lung transplants a year, making it one of the largest programmes in the world, alongside Toronto, Cleveland and Hanover.

Medical guidelines for the world

The following factors were established as criteria for potential transplantation: exhaustion of all conservative treatment options, no recovery of the Covid-19-damaged lungs despite at least four weeks of ventilation/ECMO, evidence of advanced and irreversible lung damage in several consecutive CT scans, age below 65 and no relevant comorbidities. In addition to this, candidates for a lung transplant must be in good physical condition and have a good chance of complete physical rehabilitation following the transplant. "These guidelines can be applied worldwide for making a sound selection of patients who are suitable for a lung transplant following a Covid-19 infection."

The surgical team at MedUni Vienna and Vienna General Hospital has meanwhile carried out twelve lung transplantations on Covid-19 patients, demonstrating that even the most seriously ill patients, who would otherwise die, can survive with a lung transplant.

Patient No. 1

In March 2020, patient number one suffered total pulmonary failure as a result of Covid-19, so that artificial ventilation was no longer possible. She could only be kept alive by a circulation pump. At the time of the transplant, the PCR test showed that virus particles were still present but were no longer infectious. The MedUni Vienna/Vienna General Hospital thoracic surgeons and surgical team managed to replace the patient's completely destroyed lungs with new donor lungs.

Credit: 
Medical University of Vienna

In-situ nanoscale insights into the evolution of solid electrolyte interphase shells

image: The SEI shells evolution processes and degradation mechanism at the electrode/electrolyte interface.

Image: 
©Science China Press

The interfacial decomposition products that form the so-called solid-electrolyte interphase (SEI) during the first charge/discharge cycle largely determine the electrochemical performance of lithium (Li) batteries. To date, the dynamic evolution, chemical composition, stability, and influencing factors of SEI films have attracted tremendous attention.

In contrast to the SEI films that form on electrode surfaces, SEI shells usually form conformally on the outermost layer of freshly deposited Li as soon as it contacts the electrolyte, and these shells can directly influence Li nucleation, growth behavior, and electrochemical properties at the electrode/electrolyte interface.

Furthermore, the chemical and morphological instability of these SEI shells makes in-situ characterization challenging. Directly capturing the dynamic evolution of the SEI shells is crucial for interpreting their impact on the anode/electrolyte interface and on battery performance.

Electrochemical atomic force microscopy (EC-AFM) enables real-time characterization of morphology changes, mechanical modulus, and potential/current distribution at the electrode/electrolyte interface under working conditions, providing a high-spatial-resolution in-situ method for exploring the dynamic evolution of SEI shells on deposited Li.

Recently, a team led by Prof. Li-Jun Wan and Prof. Rui Wen used in-situ EC-AFM to provide direct, visualized evidence of SEI shell evolution during Li deposition and stripping, revealing how the anode degrades.

During Li deposition, quasi-spherical Li particles nucleate and grow on a Cu electrode. As Li is continuously stripped, the collapse of the SEI shells is distinctly captured. As cycling progresses, new Li deposits tend to renucleate on deposit-free sites with higher electrochemical activity: fresh SEI shells form on the newly deposited Li, while the original SEI shells retain their collapsed morphology in place. Repeated SEI regeneration and collapse, together with electrolyte depletion and increasing interfacial impedance, is partly responsible for the degradation of the anode.

This work reveals the interfacial evolution at the nanoscale, deepens fundamental understanding of SEI properties, and offers guidance for improving interface design in Li batteries.

Credit: 
Science China Press

Mapping policy for how the EU can reduce its impact on tropical deforestation

EU imports of certain products contribute significantly to deforestation in other parts of the world.

In a new study, researchers from Chalmers University of Technology, Sweden, and University of Louvain, Belgium, evaluated thousands of policy proposals for how the EU could reduce this impact, to assess which would have the largest potential to reduce deforestation - while also being politically feasible.

"Unsurprisingly, there is weaker support for tougher regulations, such as import restrictions on certain goods. But our study shows that there is broad support in general, including for certain policies that have real potential to reduce imported deforestation," says Martin Persson, Associate Professor of Physical Resource Theory at Chalmers University of Technology.

Previous research has already shown the EU's great impact in this area. More than half of tropical deforestation is linked to production of food and animal feed, such as palm oil, soybeans, wood products, cocoa and coffee - goods which the EU imports in vast quantities. The question is, what can the EU do to reduce its contribution to deforestation?

"This issue is particularly interesting now, as this year the EU is planning to present legislative proposals for reducing deforestation caused by European consumption. The question has been discussed by the EU since 2008, but now something political is actually happening," says Simon Bager, a doctoral student at the University of Louvain (UCLouvain), and lead author of the study.

The authors of the article mapped 1,141 different proposals, originating from open consultations and workshops, where the EU has collected ideas from companies, interest groups and think tanks. The researchers also compiled proposals from a large number of research reports, policy briefs and other publications, where different stakeholders have put forward various policy proposals. After grouping together similar proposals, they arrived at 86 unique suggestions.

Two suggestions stand out from the crowd

Finding proposals for measures that would have the desired effect, are possible to implement in practice, and enjoy the necessary political support is no easy task. But after their extensive survey, the researchers identified two policy options that show particular promise:

- The first is to make importers of produce responsible for any deforestation in their supply chains, by requiring them to carry out the requisite due diligence. "If the importing companies' suppliers have products that contribute to deforestation, the company may be held responsible for this. We consider such a system to be credible and possible to implement both politically and practically - there are already examples from France and England where similar systems have been implemented or are in the process thereof," says Simon Bager. "Due diligence is also the measure which is most common in our survey, put forward by many different types of actors, and there is broad support for this proposal. However, it is important to emphasise that for such a system to have an impact on deforestation, it must be carefully designed, including which companies are affected by the requirements, and which sanctions and liability options exist."

- The other possibility is to support multi-stakeholder forums, where companies, civil society organisations, and politicians come together to agree on possible measures for ridding a supply-chain, commodity, or area, of deforestation. There are positive examples here too, the most notable being the Amazon Soy Moratorium from 2006, when actors including Greenpeace and the World Wide Fund for Nature gathered with soy producers and exporters and agreed to end soy exports from deforested areas in the Amazon rainforest. "Examples such as these demonstrate the effect that multi-stakeholder forums can have. And in our opinion, it is a measure that is easier to get acceptance for, because it is an opportunity for the affected parties to be directly involved in helping design the measures themselves," says Martin.

A delicate balance

The researchers also investigated how to deal with the trade-off between policy impacts and feasibility. An important part of this is combining different complementary measures. Trade regulations on their own, for example, risk hitting poorer producing countries harder, and should therefore be combined with targeted aid to help introduce more sustainable production methods, increasing yields without having to resort to deforestation. This would also reduce the risk of goods that are produced on deforested land simply being sold in markets other than the EU.

"If the EU now focuses on its contribution to deforestation, the effect may be that what is produced on newly deforested land is sold to other countries, while the EU gets the 'good' products. Therefore, our assessment is that the EU should ensure that the measures introduced are combined with those which contribute to an overall transition to sustainable land use in producing countries," says Simon Bager.

In conclusion, the researchers summarise three essential principles needed for new measures, if the EU is serious about reducing its impact on tropical deforestation. "First, enact measures that actually are able to bring about change. Second, use a range of measures, combining different tools and instruments to contribute to reduced deforestation. Finally, ensure the direct involvement of supply chain actors within particularly important regions, expanding and broadening the measures over time," concludes Simon Bager.

The authors hope that the research and identified policy options can serve as inspiration for policy makers, NGOs, industries, and other stakeholders working to address the EU's deforestation footprint. With at least 86 unique alternatives, there is a wide range of opportunities to focus on the problem - very few of these are political 'non-starters' or proposals which would have no effect on the issue.

Credit: 
Université catholique de Louvain

Scientists turn to deep learning to improve air quality forecasts

Air pollution from the burning of fossil fuels impacts human health, but predicting pollution levels at a given time and place remains challenging, according to a team of scientists who are turning to deep learning to improve air quality estimates. Results of the team's study could be helpful for modelers examining how economic factors like industrial productivity and health factors like hospitalizations change with pollution levels.

"Air quality is one of the major issues within an urban area that affects people's lives," said Manzhu Yu, assistant professor of geography at Penn State. "Yet existing observations are not adequate to provide comprehensive information that may help vulnerable populations to plan ahead."

Satellite and ground-based observations each measure air pollution, but they are limited, the scientists said. Satellites, for instance, may pass a given location at the same time each day and miss how emissions vary at different hours. Ground-based weather stations continuously collect data but only in a limited number of locations.

To address this, the scientists used deep learning, a type of machine learning, to analyze the relationship between satellite and ground-based observations of nitrogen dioxide in the greater Los Angeles area. Nitrogen dioxide is largely associated with emissions from traffic and power plants, the scientists said.

"The problem right now is nitrogen dioxide varies a lot during the day," Yu said. "But we haven't had an hourly, sub-urban scale product available to track air pollution. By comparing surface level and satellite observations, we can actually produce estimates with higher spatial and temporal resolution."

The learned relationship allowed the researchers to take daily satellite observations and create hourly estimates of atmospheric nitrogen dioxide in roughly 3-mile grids, the scientists said. They recently reported their findings in the journal Science of the Total Environment.

"The challenge here is whether we can find a linkage between measurements from earth's surface and satellite observations of the troposphere, which are actually far away from each other. That's where deep learning comes in."

Deep learning algorithms operate much like the human brain and feature multiple layers of artificial neurons for processing data and creating patterns. The system learns and trains itself based on connections it finds within large amounts of data, the scientists said.

The scientists tested two deep-learning algorithms and found the one that compared the ground-based observations directly to the satellite observations more accurately predicted nitrogen dioxide levels. Adding information like meteorological data, elevation and the locations of the ground-based stations and major roads and power plants improved the prediction accuracy further.
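
The paper's network architectures are not detailed here, so the following sketch only illustrates the general setup under stated assumptions: regress ground-station nitrogen dioxide on satellite column readings plus auxiliary predictors (meteorology, road density, hour of day) with a small neural network. The data and feature names are synthetic and hypothetical.

```python
# Sketch only, with synthetic data and hypothetical feature names: learn a
# mapping from satellite column NO2 plus auxiliary predictors to surface NO2,
# then score it on held-out samples. Not the paper's model.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
n = 5000
satellite_no2 = rng.uniform(1, 10, n)       # tropospheric column NO2 (arbitrary units)
wind_speed    = rng.uniform(0, 12, n)       # m/s
road_density  = rng.uniform(0, 1, n)        # normalized proxy for traffic
hour          = rng.integers(0, 24, n)      # local hour of day

# Synthetic "ground truth": surface NO2 rises with column NO2 and with traffic
# at rush hours, and falls with stronger wind.
rush = np.exp(-(hour - 8) ** 2 / 8.0) + np.exp(-(hour - 18) ** 2 / 8.0)
ground_no2 = (4 * satellite_no2 + 15 * road_density * rush
              - 1.5 * wind_speed + rng.normal(0, 1, n))

X = np.column_stack([satellite_no2, wind_speed, road_density, hour])
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(X[:4000], ground_no2[:4000])
print("held-out R^2:", round(model.score(X[4000:], ground_no2[4000:]), 3))
```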

Yu said the study could be repeated for other greenhouse gases and applied to different cities or on regional and continental scales. In addition, the model could be updated when new, higher-resolution satellites are launched.

"With a high spatiotemporal resolution, our results will facilitate the study between air quality and health issues and improve the understanding of the dynamic evolution of airborne pollutants," Yu said.

Credit: 
Penn State

Real-time imaging of female gamete formation in plants

image: Development of the female gamete was observed over 20 hours, clearly showing the division of the nuclei and formation of the egg, central and synergid cells.

Image: 
Issey Takahashi

Scientists from Nagoya University, Yokohama City University and Chubu University have developed a system which enables the live imaging of the formation of the female gamete in plants.

In flowering plants, the sperm cell and egg cell meet and fertilization takes place in the flower. While sperm cells are made in the pollen, egg cells are made in the ovule, the structure that becomes the seed. However, as the ovule is buried deep within the pistil, it has thus far been impossible to observe the formation of the egg cell in living plants.

The team, led by Dr Daisuke Kurihara and Dr Tetsuya Higashiyama of the Nagoya University Institute of Transformative Bio-Molecules (WPI-ITbM), Dr Daichi Susaki of the Yokohama City University Kihara Institute for Biological Research, and Dr Takamasa Suzuki of the Chubu University College of Bioscience and Biotechnology, used the ovule culturing technology they had developed previously to capture images of the egg cell being formed inside the ovule. In addition, they were able to isolate the egg cell and its neighboring cells and, by analyzing the genes expressed in these few cells, identify how the cells adjoining the egg cell determine its fate.

Living things which carry out sexual reproduction produce offspring via a fertilization process involving male and female gametes. In animals, the female gamete (the egg) is produced by meiosis, a type of cell division that halves the number of chromosomes present in the cell. However, the process in flowering plants is rather more lengthy. Following meiosis, karyomitosis (nuclear division) takes place three times within the cell, resulting in the production of a single cell with eight nuclei. This cell then divides, producing cells with a variety of different roles including two gametes, the egg cell and central cell, and the synergid cells. However, it was not yet understood precisely how the two female gametes were produced among the seven new cells that result from this process of division.

Using an ovule culturing method that they had developed previously, the research team observed the formation of the female gamete in Arabidopsis thaliana in real time. They saw that when the first nuclear division takes place, the resulting two nuclei move to opposite ends of the cell. Dividing again into four, the nuclei then line up along the edge of the cell. Finally, after the nuclei divide again into eight, plasma membranes are constructed around them, forming the cells that are attached to the two gametes (the egg cell and central cell). Having observed 157 cases of this division, the team found that the nuclei close to where the pollen tube penetrates become the nuclei of the synergid, egg and central cells, demonstrating that the position of the nuclei within the cell has a strong correlation with cell fate.

Next, to find out when the various cells' fates are determined, the researchers analyzed the time at which expression of the transcription factor MYB98, which is important for the differentiation and function of the synergid cells, begins. They found that MYB98 begins to be expressed very shortly after the nuclei divide into eight and are enclosed by the plasma membranes. Given that the specific transcription factor for the egg cell can also be found in the egg cell at the same early stage, cell fate appears to be determined immediately after the formation of the plasma membranes, or possibly even earlier.

The time at which cell fate is determined is significant because it gives us an insight into how plants remain adaptable to environmental conditions by flexibly changing cell fate and thus ensuring the survival of crucial cells such as gametes.

Looking to the future, the research team's focus will be on discovering how the cell fate change is accomplished, and explaining its molecular mechanism. Once the molecular mechanism has been analyzed, it is expected that this field of research will contribute to the development of methods to increase plant fertilization rates and environmental resistance, offering the prospect of solving key issues in food supply that affect millions of people around the world.

Credit: 
Institute of Transformative Bio-Molecules (ITbM), Nagoya University

Massive X-ray screening identifies promising candidates for COVID drugs

video: At DESY's high-brilliance X-ray lightsource PETRA III, researchers screened almost 6000 known active substances against the main protease (Mpro) of SARS-CoV-2, identifying 37 that bind to the protein; seven inhibit its activity and two have entered preclinical studies.

Image: 
DESY, Liza Arbeiter

A team of researchers has identified several candidates for drugs against the coronavirus SARS-CoV-2 at DESY's high-brilliance X-ray lightsource PETRA III. They bind to an important protein of the virus and could thus be the basis for a drug against Covid-19. In a so-called X-ray screening, the researchers, under the leadership of DESY, tested almost 6000 known active substances that already exist for the treatment of other diseases in a short amount of time. After measuring about 7000 samples, the team was able to identify a total of 37 substances that bind to the main protease (Mpro) of the SARS-CoV-2 virus, as the scientists report online today in the journal Science. Seven of these substances inhibit the activity of the protein and thus slow down the multiplication of the virus. Two of them do this so promisingly that they are currently under further investigation in preclinical studies. This drug screening - probably the largest of its kind - also revealed a new binding site on the main protease of the virus to which drugs can bind.

In contrast to vaccines, which help healthy people to defend themselves against the virus, drug research looks for compounds that slow down or stop the reproduction of the virus in the body of people who are already infected. Viruses cannot reproduce on their own. Instead, they introduce their own genetic material into the cells of their host and make them produce new viruses. Proteins such as the main protease of the virus play an important role in this process. The protease cuts the protein chains that the host cell produces from the blueprint in the virus's genetic material into smaller parts that are necessary for the reproduction of the virus. If the main protease can be blocked, this cycle can possibly be interrupted; the virus can no longer reproduce and the infection is defeated.

Beamline P11 of DESY's PETRA III research lightsource specialises in structural biology studies. Here, the three-dimensional structure of proteins can be imaged with atomic precision. The research team led by DESY physicist Alke Meents used this special capability to examine several thousand active substances from a library of the Fraunhofer Institute for Translational Medicine and Pharmacology and another library from the Italian company Dompé Farmaceutici SpA to see whether and how they "dock" to the main protease - the first important step in blocking it. Like a key in a lock, the drug molecule fits into a binding centre of the protease. The advantage of the drug library is that it contains active substances that have already been approved for the treatment of humans or those that are currently in various testing phases. Suitable candidates to combat SARS-CoV-2 could therefore be used in clinical trials considerably faster, saving months or years of drug development.

The special technical equipment at the PETRA III station P11 includes fully automated sample exchange with a robotic arm, so that each of the more than 7000 measurements took only about three minutes. With the help of automated data analysis, the team was able to quickly separate the wheat from the chaff. "Using a high-throughput method, we were able to find a total of 37 active substances that bind with the main protease," says Meents, who initiated the experiments.

In a next step, the researchers at the Bernhard Nocht Institute for Tropical Medicine investigated whether these active substances inhibit or even prevent virus replication in cell cultures and how compatible they are for the host cells. This reduced the number of suitable active substances to seven, two of which stood out in particular. "The active substances Calpeptin and Pelitinib clearly showed the highest antivirality with good cell compatibility. Our cooperation partners have therefore already started preclinical investigations with these two substances," explains DESY researcher Sebastian Günther, first author of the Science publication.

In their drug screening using protein crystallography, the researchers did not examine fragments of potential drugs as is usually the case, but complete molecules of the drug. In the process, however, the team of more than 100 scientists also discovered something completely unexpected: they found a binding site on the main protease that had been completely unknown until then. "It was not only a nice surprise that we were able to discover a new drug binding site on the main protease - a result that can really only be achieved at a synchrotron light source like PETRA III - but that even one of the two promising drug candidates binds precisely to this site," says Christian Betzel from the excellence cluster CUI of the University of Hamburg, co-initiator of the study.

"A particular strength of our method of X-ray screening compared to other screening methods is that we obtain the three-dimensional structure of the protein-drug complexes as a result and can thus identify the binding of the drugs to the protein at the atomic level. Even if the two most promising candidates do not make it into clinical trials, the 37 substances that bind to the main protease form a valuable database for drug developments based on them," explains Patrick Reinke, DESY researcher and co-author of the publication.

"The investigations at PETRA III impressively show how relevant high-brilliance synchrotron lightsources are for the development of future medicines and for health research as a whole," stresses Helmut Dosch, Chairman of the DESY Directorate. "We must and want to expand our infrastructures even more in the future to cope with health crises like the current one."

Credit: 
Deutsches Elektronen-Synchrotron DESY

Cannabis use disorder linked to increased complications after spinal surgery

April 2, 2021 - For patients undergoing spinal surgery, the diagnosis of cannabis use disorder is associated with higher complication rates, including substantially increased risks of stroke and respiratory complications, reports a study in Spine. The journal is published in the Lippincott portfolio by Wolters Kluwer.

"Chronic cannabis use among patients undergoing spine surgery is associated with higher rates of inpatient neurovascular, thromboembolic, and pulmonary complications, and less favorable overall discharge disposition," according to the new research by Ankit Indravadan Mehta, MD, FAANS and colleagues of the University of Illinois at Chicago. "The treatment of these patients is also associated with increased length of stay and cost of hospitalization."

For chronic cannabis users, surgical care may be 'extremely complex and difficult'

Using a national hospital database (Nationwide Inpatient Sample), the researchers identified nearly 433,000 patients who underwent common elective spinal surgery procedures between 2012 and 2015. About 2,400 patients had a diagnosis of cannabis use disorder, defined as continued use of cannabis despite significant distress or impairment.

On initial analysis, there were some important differences between patients with and without cannabis use disorder. Patients diagnosed with problematic cannabis use were younger, more likely to be male, and had lower rates of accompanying medical disorders (comorbidity). They were also much more likely to use tobacco: about 71 versus 31 percent.

Using a technique called propensity score matching, Dr. Mehta and colleagues created matched groups of 2,184 chronic cannabis users versus non-users with similar characteristics and comorbidities. A wide range of complications and other hospital outcomes of spinal surgery were compared between groups.
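
For readers unfamiliar with the technique, here is a minimal sketch of 1:1 propensity score matching on synthetic data. The covariates, coefficients, and greedy matching rule are illustrative assumptions, not the authors' protocol.

```python
# Minimal 1:1 propensity score matching on synthetic data. The covariates,
# effect sizes, and matching rule are hypothetical, not the study's protocol.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 20000
age     = rng.normal(55, 12, n)
male    = rng.integers(0, 2, n)
tobacco = rng.integers(0, 2, n)

# Exposure (here, a cannabis use disorder diagnosis) depends on the covariates,
# which is exactly the confounding that matching is meant to balance out.
logit = -4.5 - 0.03 * (age - 55) + 0.5 * male + 1.2 * tobacco
exposed = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([age, male, tobacco])
ps = LogisticRegression(max_iter=1000).fit(X, exposed).predict_proba(X)[:, 1]

# Greedy nearest-neighbor matching without replacement on the propensity score.
exposed_idx = np.where(exposed)[0]
pool_idx = list(np.where(~exposed)[0])
pool_ps = ps[pool_idx].copy()
matched_controls = []
for i in exposed_idx:
    j = int(np.argmin(np.abs(pool_ps - ps[i])))
    matched_controls.append(pool_idx[j])
    pool_ps = np.delete(pool_ps, j)
    del pool_idx[j]

print("tobacco rate, exposed vs matched controls:",
      round(float(tobacco[exposed_idx].mean()), 2),
      round(float(tobacco[matched_controls].mean()), 2))
```

After matching, the exposed and matched control groups should show similar covariate distributions, which is what allows outcomes to be compared between otherwise comparable patients.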

Patients with cannabis use disorder were at increased risk for several types of complications after spinal surgery. The cannabis users were about twice as likely to develop respiratory and blood clot-related (thromboembolism) complications. They also had nearly a threefold increase in the risk of stroke and other neurologic complications. Risk of bloodstream infection (septicemia or sepsis) was increased by 50 percent.

There were also more myocardial infarctions (heart attacks) among patients with cannabis use disorder. However, on further analysis, this was related to their much higher rate of tobacco use.

Patients with cannabis use disorder also spent nearly two more days in the hospital (about seven versus five days) and had hospital costs nearly $15,000 higher. They were also more likely to be discharged to a nursing or rehabilitation facility and less likely to receive home health care.

The results are of special concern as cannabis use becomes increasingly legalized and accepted in the United States. Estimates suggest that more than 15 percent of Americans use cannabis. Rates of cannabis use and cannabis use disorder have "increased drastically" in recent years, according to the authors.

The findings have important implications for anesthesia and surgical management in patients with cannabis use disorder, Dr. Mehta and colleagues believe. For example, such patients may need higher doses of opioid medications to achieve adequate pain management after spinal surgery. That may lead to slower recovery, requiring longer hospital stays and more intensive rehabilitation that cannot be done at home. The researchers also note previous studies outlining the mechanisms by which cannabis use may lead to increased risk of respiratory, thromboembolic, and neurologic complications.

"This study reveals multiple aspects of perioperative care that are extremely complex and difficult in patients with cannabis use disorder," Dr. Mehta and coauthors write. "Their high-risk cardiorespiratory profiles combined with specialized anesthetic considerations [before, during, and after surgery] necessitate very unique perioperative management. We hope this research will lead to future prospective trials for specific interventions that could improve outcomes in this specific population."

Credit: 
Wolters Kluwer Health

Weight loss changes people's responsiveness to food marketing: study

Obesity rates have increased dramatically in developed countries over the past 40 years -- and many people have assumed that food marketing is at least in part to blame. But are people with obesity really more susceptible to food marketing? And if they are, is that a permanent predisposition, or can it change over time?

According to a new study by UBC Sauder School of Business Assistant Professor Dr. Yann Cornil (he/him/his) and French researchers, people with obesity do tend to be more responsive to food marketing -- but when their weight drops significantly, so does their responsiveness to marketing.

For the study, which was published in the Journal of Consumer Psychology, the researchers followed three groups: patients with severe obesity before they had gastric bypass or other weight-loss surgeries (collectively known as bariatric surgery), as well as three and 12 months after; people with obesity who were not undergoing bariatric surgery; and people who were not obese.

To measure responsiveness to food marketing, the researchers evaluated what are called framing effects -- that is, how branding, advertising, and labeling "frame," and thus influence, food evaluations and choices. In one study, participants were asked to estimate the calorie content of well-known snacks and drinks, including some that marketers typically frame as healthy (e.g., apple juice, granola bars) and others that are not framed as healthy (e.g., soft drinks, chocolate bars).

The researchers found that everyone underestimated the calorie content of snacks framed as healthy, but the effect was more pronounced in people with obesity.
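
As a rough illustration of how such a framing effect can be quantified, the sketch below (in Python, with hypothetical record fields) computes the average calorie underestimation for each combination of participant group and framing; it is not the authors' analysis code.

```python
from collections import defaultdict
from statistics import mean

def underestimation_by_cell(records):
    """records: iterable of (group, framing, estimated_kcal, actual_kcal) tuples,
    e.g. group in {"obesity", "control"}, framing in {"healthy", "indulgent"}.
    Returns the mean underestimation (actual minus estimate) per (group, framing) cell."""
    cells = defaultdict(list)
    for group, framing, estimated_kcal, actual_kcal in records:
        cells[(group, framing)].append(actual_kcal - estimated_kcal)
    return {cell: mean(errors) for cell, errors in cells.items()}
```

A larger gap between the "healthy" and "indulgent" cells within a group would indicate a stronger framing effect for that group.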

To further test the framing effect, the researchers had participants hypothetically choose a portion of french fries from a fast food restaurant, and gave them the nutritional information they would need to make an informed decision. The three options were always the same in quantity -- 71g, 117g, and 154g -- but in one instance they were labeled small, medium and large, and in another instance the same portions were labeled mini, small and medium: a marketing tactic aimed at making larger portions seem more reasonable.

"We measured how likely people were sensitive to that framing, and whether it would change their choice of fries quantity depending on how the portions are labeled," explains Dr. Cornil, who says the people with obesity were more likely to follow the labeling and not the actual information about quantity -- so they would choose the portion labeled "medium" even though that's quite large.

Overall, the researchers, who worked closely with the Pitié-Salpêtrière Hospital in Paris, found that people with obesity tended to be more responsive to food marketing -- but when they lost a significant amount of weight because of bariatric surgery, their responsiveness to food marketing dropped substantially.

"People with obesity going through bariatric surgery will become less responsive to marketing over time," says Dr. Cornil. "And after 12 months, their responsiveness to marketing reaches the level of people with more medically-recommended weight."

Dr. Cornil says it's not clear whether people with obesity become less responsive to marketing because of physiological changes following the surgery -- hormonal or neurological shifts, or changes to the gut microbiota -- or because of their desire to change their lifestyles and habits. Another possible reason, he adds, is that people's tastes tend to shift following bariatric surgery.

"The results clearly suggest a bidirectional influence between people's weight status, psychology and responsiveness to the environment -- including marketing," says Dr. Cornil. "So, it's a complex relationship."

However, had the researchers found that responsiveness to marketing remained high even after weight loss, it would have pointed to a deeper-rooted predisposition.

"That would mean people are endowed with unchangeable psychological characteristics that would always make them more responsive to marketing -- which would make it very difficult to sustain a medically-recommended weight," he explains. "But one of the positive things is that after significant weight loss, people become less responsive to marketing, such that it is more sustainable to remain at a lower body mass index."

Dr. Cornil says the findings are especially important because for years, researchers have assumed that marketing messages -- especially for foods that are high-calorie and low in nutrition -- are at least partly responsible for the obesity epidemic, but there wasn't clear empirical evidence.

"Our results provide important insights for policy-makers in charge of regulating food marketing in order to curb obesity," says Dr. Cornil.

Credit: 
University of British Columbia

SLAS Technology April issue dives into reactive oxygen species

Oak Brook, IL - The April edition of SLAS Technology features the cover article "Therapeutic Potential of Reactive Oxygen Species: State of the Art and Recent Advances" by Valeria Graceffa, Ph.D. (Institute of Technology Sligo, Sligo, Ireland).

The cover article explores the therapeutic potential of reactive oxygen species (ROS), with applications ranging from wound healing and hair growth enhancement to cancer treatment, stem cell differentiation, and tissue engineering. At low concentrations, ROS can serve as inexpensive and convenient inducers of tissue regeneration, triggering stem cell differentiation and enhancing collagen synthesis. Recent cancer studies have cast ROS as the "Achilles' heel" of tumors: because cancer cells already have high basal ROS levels, they are less able to cope with an additional source of oxidative stress, so higher doses of ROS can be used to kill tumor cells selectively. However, an incomplete understanding of the boundary between these anabolic and cytotoxic effects has so far limited clinical translation, and new strategies to control the temporal pattern of ROS release have yet to be developed.

The April issue of SLAS Technology includes six articles of original research including:

Highly Versatile Cloud-Based Automation Solution for the Remote Design and Execution of Experiment Protocols during the COVID-19 Pandemic

Metabolomic Signatures in Pediatric Crohn's Disease Patients with Mild or Quiescent Disease Treated with Partial Enteral Nutrition: A Feasibility Study

Acoustic Ejection/Full-Scan Mass Spectrometry Analysis for High-Throughput Compound Plate Quality Control

An Emerging Fluorescence-Based Technique for Quantification and Protein Profiling of Extracellular Vesicles

Development of an Enhanced Throughput Radial Cell Migration Device

A Variable Scheduling Maintenance Culture Platform for Mammalian Cells

Other articles include:

Application of Artificial Intelligence to Address Issues Related to the COVID-19 Virus

Therapeutic Potential of Reactive Oxygen Species: State of the Art and Recent Advances

Recent Developments in Bacterial Chemotaxis Analysis Based on the Microfluidic System

Development of a Gold Nanoparticle-Based Colorimetric Sensor for Water for Injection At-Line Impurity Testing

Integrating Mobile Robots into Automated Laboratory Processes: A Suitable Workflow Management System

Credit: 
SLAS (Society for Laboratory Automation and Screening)

Genome sequencing shows coronavirus variation drives pandemic surges

Genome sequencing of thousands of SARS-CoV-2 samples shows that surges of COVID-19 cases are driven by the appearance of new coronavirus variants, according to new research from the School of Veterinary Medicine at the University of California, Davis, published April 1 in Scientific Reports.

"As variants emerge, you're going to get new outbreaks," said Bart Weimer, professor of population health and reproduction at the UC Davis School of Veterinary Medicine. The merger of classical epidemiology with genomics provides a tool public health authorities could use to predict the course of pandemics, whether of coronavirus, influenza or some new pathogen.

Although it has just 15 genes, SARS-CoV-2 is constantly mutating. Most of these changes make very little difference, but occasionally a mutation makes the virus more or less transmissible.

Weimer and graduate student DJ Darwin R. Bandoy initially analyzed the genomes of 150 SARS-CoV-2 strains, mostly from outbreaks in Asia prior to March 1, 2020, along with epidemiology and transmission information for those outbreaks. They classified outbreaks by stage: index (no outbreak), takeoff, exponential growth, and decline. The ease with which a virus spreads is described by its reproductive number, R: the average number of new infections caused by each infected person.

They combined all this information into a metric called GENI, for pathogen genome identity. Comparing GENI scores with the phase of an epidemic showed that an increase in genetic variation immediately preceded exponential growth in cases, for example in South Korea in late February. In Singapore, however, bursts of variation were associated with smaller outbreaks that public health authorities were able to quickly bring under control.
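
The article does not spell out how GENI is computed, but the general idea of tracking genome variation over time can be sketched as follows (Python, with hypothetical inputs): count how much each aligned sample differs from a reference genome and average that per collection week. This is an illustrative stand-in, not the study's actual metric.

```python
from collections import defaultdict

def snp_count(sample_seq: str, reference_seq: str) -> int:
    """Count positions where an aligned sample differs from the reference genome
    (ambiguous 'N' bases are ignored)."""
    return sum(1 for s, r in zip(sample_seq, reference_seq) if s != r and s != "N")

def mean_variation_by_week(samples, reference_seq):
    """samples: iterable of (week, aligned_sequence) pairs.
    Returns a dict mapping each week to the mean per-sample SNP count."""
    totals, counts = defaultdict(int), defaultdict(int)
    for week, seq in samples:
        totals[week] += snp_count(seq, reference_seq)
        counts[week] += 1
    return {week: totals[week] / counts[week] for week in totals}
```

Plotting such a weekly score alongside new-case counts would mirror, in spirit, the comparison between genetic variation and epidemic phase described above.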

20,000 virus samples

Weimer and Bandoy then looked at 20,000 sequences of SARS-CoV-2 viruses collected in the United Kingdom from February to April 2020 and compared them with data on cases.

They found that the GENI variation score rose steadily with the number of cases. When the British government imposed a national lockdown in late March, the number of new cases stabilized but the GENI score continued to rise. This shows that measures such as banning gatherings, mask mandates, and social distancing are effective in controlling the spread of disease even in the face of rapid virus evolution.

It could also help explain "superspreader" events, in which large numbers of people become infected in a single incident where precautions are relaxed.

Weimer said he hopes that public health authorities will take up the approach of measuring virus variation and linking it to the local transmission rate, R.

"In this way you can get a very early warning of when a new outbreak is coming," he said. "Here's a recipe for how to go about it."

Credit: 
University of California - Davis