
IU researchers find disease-related gene changes in kidney tissue

INDIANAPOLIS--Researchers from Indiana University have identified key genetic changes in the interstitial kidney tissue of people with diabetes, a discovery that points toward new genetic approaches to treating kidney disease. They will contribute their findings to the Kidney Precision Medicine Project's (KPMP) "cell atlas," a set of maps used to classify and locate different cell types and structures within the kidney.

They shared their groundbreaking findings in a study published on February 10, 2021, in Science Advances.

In the study, researchers investigated the kidney tissue of healthy people and people with diabetes using a technique called "regional transcriptomics." The technique involves rapidly staining kidney tissue and then using a laser to cut out microscopic regions of interest.

They found that important genes change when a scar forms on the interstitium, said Daria Barwinska, PhD, the lead author of the study and an Assistant Scientist in the Department of Medicine at Indiana University School of Medicine.

"The interstitium is the 'glue' that holds the kidney together. It is one of the least characterized parts of the kidney, but scars in the interstitium caused by diseases such as diabetes can contribute to kidney disease," said Barwinska.

Acute kidney injury (AKI) and chronic kidney disease (CKD) affect millions of people in the United States and globally. However, no effective therapies exist for AKI, and only a few are available for CKD. The KPMP, a multi-site project focused on understanding and finding new treatments for AKI and CKD, is seeking to bring treatment for these conditions "into the molecular era," according to Michael Eadon, MD.

IU is one of KPMP's many "tissue interrogation sites" across the country. Collectively, these sites are working together to bring cutting-edge technologies to the interrogation of human kidney biopsies.

"Many diseases can look the same under the microscope, but they have very different causes," said Eadon, who is the study's corresponding author and an Assistant Professor of Medicine in the Department of Medicine at IU School of Medicine. "We're seeking to understand how different genes contribute to very common kidney diseases."

This study could usher in an era of new and better treatments for millions of people with AKI and CKD.

"A personalized medicine approach that understands how different diseases affect a patient's genes will aid in finding potential treatments for kidney disease," said Barwinska. "This approach can meet any single patient's needs."

Credit: 
Indiana University School of Medicine

Biodegradable microcapsules deliver nerve growth factor to guide neuronal development

Researchers from Skoltech and their colleagues have demonstrated that nanoengineered biodegradable microcapsules can guide the development of hippocampal neurons in an in vitro experiment. The microcapsules deliver nerve growth factor, a peptide necessary for neuron growth. The paper describing this work was published in the journal Pharmaceutics.

Many neurodegenerative conditions that can lead to severe disorders are associated with depleted levels of growth factors in the brain - neuropeptides that help neurons grow, proliferate and survive. Some clinical studies of Alzheimer's and Parkinson's diseases have shown that delivering these growth factors to specific degenerating neurons can have a therapeutic effect. However, this is hard to do in practice: when taken systemically, like a typical drug, growth factors have difficulty penetrating the blood-brain barrier and can cause severe side effects. And using viruses to deliver them as a form of gene therapy means the treatment cannot easily be stopped once it is initiated, which also raises safety concerns.

Olga Sindeeva and Gleb Sukhorukov of Skoltech and Queen Mary University of London, together with their colleagues, decided to try targeted delivery in the form of microcapsules with an average size of 2 to 3 micrometers, bringing one of these neuropeptides, nerve growth factor (NGF), directly to the needed location. To produce the capsules, the team used poly-L-arginine and dextran and the layer-by-layer technique, in which very thin films are deposited, as the name suggests, layer by layer to form a capsule. Earlier studies showed that these capsules are biocompatible and can function properly both in tissue slices and in rodents, with no detectable adverse effects.

"Encapsulation via Layer-by-Layer (LbL) technology has an advantage, first of all, in its versatility, meaning the possibility to tailor different functions to the shell of the capsules and incorporate various cargo. Also, unlike many other encapsulation methods, LbL encapsulation is processed in aqueous conditions, which is absolutely compatible with such fragile molecules as some peptides, proteins and in particular growth factors," Sukhorukov said.

For their new experiment, researchers used cultures of rat hippocampal neurons, which were studied after adding the microcapsules. It turned out that NGF that was delivered in them boosted neuronal growth quite significantly, and this new growth was not arbitrary but rather directed towards the clusters of microcapsules. NGF also enhanced neurite branching, a process that leads to the formation of axons and dendrites, the main functional elements of a neuron. Finally, neurons treated with NGF in microcapsules were able to form functional synapses. If this approach works in clinical applications, it may mean a reversal of neurodegeneration observed in many diseases.

"The properties of LbL microcapsules can be tuned according to the application. So, biocompatibility, degradation and controlled release are defined by the choice of polymers used in capsule shell composition. Controlled release could also be achieved by external stimuli such as ultrasound, laser, or a magnetic field," Gleb Sukhorukov noted.

The team plans to test their technology of controlled release of NGF for a speedy recovery after neural injury. Earlier studies also showed that a similar approach can work with other growth factors such as basic fibroblast growth factor, a protein involved in a host of biological processes including embryonic development and tissue repair.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

NREL heats up thermal energy storage with new solution meant to ease grid stress

Scientists from the National Renewable Energy Laboratory (NREL) have developed a simple way to better evaluate the potential of novel materials to store or release heat on demand in your home, office, or other building in a way that more efficiently manages the building's energy use.

Their work, featured in Nature Energy, proposes a new design method that could make the process of heating and cooling buildings more manageable, less expensive, more efficient, and better prepared to flexibly manage power from renewable energy sources that do not always deliver energy when it is most needed.

The paper, "Rate Capability and Ragone Plots for Phase Change Thermal Energy Storage," was authored by NREL's Jason Woods, along with co-authors Allison Mahvi, Anurag Goyal, Eric Kozubal, Wale Odukomaiya, and Roderick Jackson. The paper describes a new way of optimizing thermal storage devices that mirrors an idea used for batteries, helping to inform what new thermal storage materials are needed for buildings and how the devices should be designed with these materials.

Thermal energy storage allows buildings to function like a huge battery by storing thermal energy in novel materials until it can be used later. One example is a heat pump. While electricity is needed initially to create and store the heat, the heat is used later without using additional electricity.

In another example, some materials have the ability to change phase, like ice transitioning from a solid to a liquid. As the ice melts, it absorbs energy from, and cools, a working fluid, which can then be used to cool a building space. Because the phase change occurs at a nearly constant temperature, useful energy can be provided or stored for a longer period at a steady temperature. Thermal energy storage typically has very high "round trip" energy efficiency.

The authors discovered that a Ragone plot, often used to characterize batteries, also works well to describe the potential effectiveness of various thermal storage device candidates. A Ragone plot shows the tradeoff between how much energy a device can store and its discharge power, or how quickly the device can release energy. This foundational approach makes comparisons between different thermal storage materials or device improvements easier to evaluate. It serves as a starting point for defining targets and is a useful design tool to develop new thermal storage materials and devices that can serve as novel, alternative energy storage options.
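To make the energy-power tradeoff behind a Ragone plot concrete, here is a minimal sketch in Python. All numbers and the linear penalty model are invented for illustration; they are not from the NREL study or its prototype device.

```python
# Illustrative Ragone-style tradeoff for a hypothetical phase-change
# thermal storage device. All values are invented for illustration.

E_MAX = 10.0   # total stored thermal energy at very low power, kWh
P_MAX = 5.0    # discharge power at which usable energy drops to zero, kW

def usable_energy(power_kw: float) -> float:
    """Usable energy (kWh) at a given discharge power (kW).

    Simple linear penalty: the faster the discharge, the less of the
    stored capacity can be accessed before the device falls below a
    useful temperature difference.
    """
    if not 0 < power_kw <= P_MAX:
        raise ValueError("power must be in (0, P_MAX]")
    return E_MAX * (1 - power_kw / P_MAX)

# Tabulate a few (power, energy) points -- the data behind a Ragone plot.
for power in (0.5, 1.0, 2.5, 4.0):
    print(f"{power:.1f} kW -> {usable_energy(power):.1f} kWh usable")
```

Plotting these points (power on one axis, usable energy on the other) for several candidate materials gives the kind of side-by-side comparison the authors describe.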

"This Ragone framework ensures cost-effective design of thermal storage materials and devices depending on the power and energy requirements of a particular application," said Jason Woods, a senior research engineer at NREL and lead author of the newly published paper.

Mahvi, a postdoctoral researcher at NREL, said another advantage is enabling technologies that can mitigate blackouts on the grid. "Most of peak electricity demand--especially in the summer when you might see blackouts--is driven by air conditioning. If you can move that demand to some other time of the day, you can help alleviate the strain on the grid, keeping the grid operational, but also keeping people indoors comfortable."

"Thermal energy storage systems will need to become more flexible and adaptable with the addition of onsite power generation, electric vehicle charging, and the combination of thermal storage with batteries," Woods said. "Part of this flexibility requires higher power--but this higher power comes at a cost of available energy, as this publication highlights."

The way in which the thermal energy storage is used will impact its performance. Scientists need to consider questions about how stored energy can best be used to keep building occupants comfortable, or for different applications like maintaining electronic equipment at a safe temperature.

"Which one works best for me and my application will depend on what the requirements are. How much do I need to store, and how fast do I need to discharge it?" Mahvi said. "This framework will allow us to optimize thermal storage systems from the material to the component scale to increase the power density while still accessing as much of the available capacity as possible. This will result in more efficient devices that can be used for a wide range of applications."

The researchers developed a computer model to understand the various design tradeoffs with these thermal storage devices, including ones that require high power (release the energy quickly) and low power (release the energy slowly). They also built a prototype phase change thermal storage device, illustrating this power-energy tradeoff in practice.

The Building Technologies Office in the U.S. Department of Energy's Office of Energy Efficiency and Renewable Energy funded this research.

NREL is the U.S. Department of Energy's primary national laboratory for renewable energy and energy efficiency research and development. NREL is operated for the Energy Department by the Alliance for Sustainable Energy, LLC.

Credit: 
DOE/National Renewable Energy Laboratory

Researchers solve riddle of plant immune system

image: Lennart Mohnike collecting leaf material from bacteria-infected plants

Image: 
Philipp William Niemeyer

How do plants build resilience? An international research team led by the University of Göttingen studied the molecular mechanisms of the plant immune system. They were able to show a connection between a relatively unknown gene and resistance to pathogens. The results of the study were published in the journal The Plant Cell.

Scientists from "PRoTECT" - Plant Responses To Eliminate Critical Threats - investigated the molecular mechanisms of the immune system of a small flowering plant known as thale cress (Arabidopsis thaliana). PRoTECT is an International Research Training Group (IRTG) founded in 2016 by the University of Göttingen and the University of British Columbia in Vancouver. The aim of the study was to identify and describe a specific gene of a particularly disease-resistant plant. The team observed that plants lacking this previously little-known gene strongly accumulate active acids. In addition, these plants show significantly increased resistance to pathogens. However, this resistance is accompanied by severely reduced growth.

"We have succeeded in deciphering the molecular connection between the gene product and the inactivation of the acids during normal plant growth," says Professor Ivo Feußner from the Göttingen Centre for Molecular Biosciences (GZMB). Understanding this interaction provides scientists with a promising approach to improving the natural resistance of crops. "The basic results can be used to help breeders isolate less susceptible plants," says Lennart Mohnike, first author of the study. "This offers scientists an important way to increase food security and could lead to reduced pesticide use."

Credit: 
University of Göttingen

Nursing home staff responses to pandemic reveal resilience, shortcomings: Concordia study

image: Daniel Dickson: "Frontline workers' professional knowledge, experience and abilities make them really valuable assets."

Image: 
Concordia University

The ongoing health disaster of the past 12 months has exposed the crises facing nursing homes in Canada and the United States and the struggles of the staff working in them.

Writing in the Journal of Comparative Policy Analysis: Research and Practice, PhD student Daniel Dickson, his supervisor Patrik Marier, professor of political science, and co-author Robert Henry Cox of the University of South Carolina perform a comparative analysis of those workers’ experiences. They look at workers in Quebec (including those at government-run CHSLDs), British Columbia, Washington State and Ohio, reviewing 336 articles published in six newspapers between late February and mid-June 2020.

“We wanted to see how the pandemic affected the discretion of frontline workers,” Dickson says. “Their professional knowledge, their experience and their abilities make them really valuable assets.”

Even so, the researchers note, they are hardly treated as such. For years, care workers (85 per cent of whom are women and 50 per cent of whom were born outside the country, according to a recent Canadian study) have coped with low pay, low status and little likelihood of advancement. Now, they write, care workers also face “an existential threat to themselves or their immediate families” in the form of the novel coronavirus.

To analyze how care workers responded to this added job stressor, the researchers adapted a trusted organizational studies model first developed by Albert Hirschman in 1970. Under this framework, workers could take up a pattern of resistance, meaning increased absenteeism or refusing to work without added compensatory pay; they could try innovation, in which they communicate their concerns to management and create new protocols better suited to a new, more dangerous reality; or they could apply improvisation, where frustrated employees stay out of a sense of duty and try to make do with what they have.

“We expected a lot of resistance, that frontline workers would be looking for ways to diverge from policy intent or just quit their jobs,” Dickson shares. “The ideal case would be more innovation, where the experiences of frontline workers would be privileged and a lot of weight would be given to their knowledge. But we know that they are not afforded that position in the policy process; they are not given this kind of esteem or support.”

Coping in the epicentre

Resistance was widespread in Quebec. The research sample of 336 newspaper articles contained 45 that explicitly mentioned resistance, with 78 per cent of those originating in Quebec. Absenteeism was especially high in Montreal, where nursing and long-term care homes were hit early and hard.

The province was also the leader when it came to mentions of improvisation. A total of 77 articles discussed some form of improvisation, with 79 per cent originating in Quebec. This included articles on the government recruiting health professionals from all fields and calling in the army as well as care workers using coffee filters beneath their handsewn face coverings due to a shortage of N95 masks.

Innovation was noted in British Columbia, where staff were able to adhere to protocols by adopting Zoom meetings for virtual patient visits and “doorway bingo,” and in the US, where staff stepped in for family unable to see their dying loved ones. These responses, the authors note, stemmed most from workers’ experience and expertise.

The analysis exposed some serious fault lines in long-term care regimes north and south of the border, Marier adds. “We talk about how great Canada’s health-care system is vis-à-vis that of the United States, especially in terms of universal accessibility, but we are not that different when it comes to long-term care.”

 

Read the cited paper: “Resistance, Innovation, and Improvisation: Comparing the Responses of Nursing Home Workers to the COVID-19 Pandemic in Canada and the United States.”

DOI

10.1080/13876988.2020.1846994

Credit: 
Concordia University

RUDN University biologists studied the effect of jungles on global warming

image: Biologists from RUDN University described the role of tropical rainforests in the production of methane, the second most harmful greenhouse gas after CO2. It turned out that some areas of rainforests not only consumed methane but also emitted it.

Image: 
RUDN University

Biologists from RUDN University described the role of tropical rainforests in the production of methane, the second most harmful greenhouse gas after CO2. It turned out that some areas of rainforests not only consumed methane but also emitted it. The results of the study were published in the journal Forests.

Although the share of methane in the atmosphere is relatively small (less than 1%), its contribution to the greenhouse effect is 20 to 30 times greater than that of the same amount of carbon dioxide. The tropics are one of the main sources of methane. Previously, soil scientists focused only on swampy tropical areas, because methane-producing microorganisms live and multiply in the anaerobic conditions of swamps. As for other regions of tropical forests, scientists believed they stored methane rather than emitting it. Still, regular forest soils can also be sources of methane emissions because of small anaerobic pockets in them. Although the intensity of methane production there is lower than in swamp forests, on a global scale their existence changes our understanding of the climatic impact of rainforests. A soil scientist from RUDN University, together with his colleagues, traveled to the equatorial rainforests of the Ancasa Conservation Area in Ghana to study the role of tropical ecosystems in methane circulation between the soil and the atmosphere.

"We studied two land plots, one on top of a hill and one at its foot. The plots had opposite conditions and completely different methane production and absorption processes. Our analysis confirmed that equatorial forest soils remain a continuous source of methane all year round," said Riccardo Valentini, PhD, head of the Science and Research Laboratory "Smart Technologies for Sustainable Development of the Urban Environment in the Global Change" at RUDN University.

The team spent two years monitoring methane flows to and from the soil. The average annual temperature in the area was around 25 °C, and annual precipitation amounted to 1,500-2,000 mm, with late March to mid-July and September to November being the wettest periods of the year. The measurements were taken in stationary closed chambers using gas chromatography.

According to the team, regions with high levels of methane emissions (both on top of the hill and at its foot) released more methane throughout the year than was absorbed by the soil in drier areas or during the dry season. The plot on top of the hill stored the gas quite slowly and its daily methane flow per square meter varied from absorbing up to 1.29 mg to releasing 0.44 mg of the gas. The lowland plot turned out to be the source of methane emissions. Although in the dry season it stored 0.67 mg per square meter every day, on other days of the year it released up to 188.09 mg.

"The annual methane budget of the soils in Ancasa Conservation Area (adjusted for landscape features) amounts to about 3.3 kg of methane per ha. Our results show that rainforests could be considered sources of methane emissions rather than its storages," added Riccardo Valentini from RUDN University.
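The unit conversion behind a figure like "3.3 kg of methane per ha" can be sketched in a few lines of Python. The mean daily flux used below is a hypothetical value chosen for illustration, not a figure reported by the study; only the conversion factors (square metres per hectare, days per year) are fixed.

```python
# Unit-conversion sketch: scaling a mean daily methane flux per square
# metre up to an annual per-hectare budget.

M2_PER_HA = 10_000      # square metres in one hectare
DAYS_PER_YEAR = 365

def annual_budget_kg_per_ha(mean_flux_mg_m2_day: float) -> float:
    """Convert a mean flux in mg CH4 m^-2 day^-1 to kg CH4 ha^-1 yr^-1."""
    mg_per_ha_year = mean_flux_mg_m2_day * M2_PER_HA * DAYS_PER_YEAR
    return mg_per_ha_year / 1e6  # mg -> kg

# A hypothetical landscape-mean net flux of 0.9 mg m^-2 day^-1 works out
# to roughly 3.3 kg per hectare per year, the order of magnitude the
# study reports for the Ancasa soils.
print(f"{annual_budget_kg_per_ha(0.9):.3f}")  # -> 3.285
```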

Credit: 
RUDN University

Combination treatment for common glioma type shows promise in mice

WHAT

Gliomas are common brain tumors that comprise about one third of all cancers of the nervous system. In a study funded by the National Institutes of Health (NIH), researchers tested a novel combination treatment approach on mice with tumors with characteristics similar to human astrocytomas--a type of slow-growing glioma--and found tumor regression in 60 percent of the mice treated. These encouraging results, published in the Journal of Clinical Investigation, could be the first step toward developing a treatment for this type of brain cancer.

Senior authors Maria Castro, Ph.D., and Pedro Lowenstein, M.D., Ph.D., along with a team of researchers at the University of Michigan Rogel Cancer Center in Ann Arbor, tested inhibitors of the compound D-2-hydroxyglutarate (D-2-HG), which is produced by cancer cells, on a mouse version of astrocytoma carrying mutations in the genes IDH1 and ATRX, along with an inactivated form of the tumor suppressor gene TP53 (tumor protein 53).

When the implanted mice were treated with a drug to block the production of D-2-HG along with standard-of-care radiation and temozolomide (chemotherapy) treatments, their survival significantly improved. Looking more closely at tumor cells grown in dishes, the researchers saw that blocking D-2-HG made the cells more susceptible to radiation treatment. However, the treatment also increased the amount of an "immune checkpoint" protein, which tumors use to turn off T cells and evade the immune system.

Inhibiting this immune checkpoint protein with an additional drug resulted in an even greater improvement in survival, because the mouse's own immune system was able to attack the tumor.

Importantly, this combination therapy also led to immunological memory against the glioma, meaning that the mouse now had T cells tailored to the specific tumor. Because gliomas almost always grow back after treatment, these T cells make the animal better prepared to fend off regrowth.

It must be emphasized that these experiments were performed in mice. Nonetheless, the preclinical results produced by this combination therapy could represent a key advance toward an improved treatment regimen, combining D-2-HG inhibition, immune checkpoint inhibition, radiation, and temozolomide, for patients with astrocytomas.

Credit: 
NIH/National Institute of Neurological Disorders and Stroke

Out of this world: U of I researchers measure photosynthesis from space

image: CABBI's Kaiyu Guan (left) and Chongya Jiang hope to leverage their cutting-edge SLOPE GPP product not only toward the advancement of agricultural science, but for the well-being of humankind. By using accurate, timely satellite data to measure crops' CO2 intake, the research team can gauge the overall health and productivity of bioenergy ecosystems.

Image: 
Center for Advanced Bioenergy and Bioproducts Innovation (CABBI)

As most of us learned in school, plants use sunlight to convert carbon dioxide (CO2) and water into carbohydrates in a process called photosynthesis. But nature's "factories" don't just provide us with food -- they also generate insights into how ecosystems will react to a changing climate and a carbon-filled atmosphere.

Because of their ability to make valuable organic products from inorganic compounds like CO2, plants are known as "primary producers." Gross primary production (GPP), which quantifies the rate of CO2 fixation in plants through photosynthesis, is a key metric to track the health and performance of any plant-based ecosystem.

A research team with the U.S. Department of Energy's Center for Advanced Bioenergy and Bioproducts Innovation (CABBI) at the University of Illinois Urbana-Champaign developed a product to accurately measure GPP at a daily time step and field-scale spatial resolution: the SatelLite Only Photosynthesis Estimation Gross Primary Production (SLOPE GPP) product.

The team leveraged the Blue Waters supercomputer, housed at the U of I National Center for Supercomputing Applications (NCSA), in their research. Their paper was published in Earth System Science Data in February 2021.

"Quantifying the rate at which plants in a given area process CO2 is critical to a global understanding of carbon cycling, terrestrial land management, and water and soil health -- especially given the erratic conditions of a warming planet," said Kaiyu Guan, project leader and NCSA Blue Waters Professor.

"Measuring photosynthesis is especially pertinent to agricultural ecosystems, where plant productivity and biomass levels are directly tied to crop yield and therefore food security. Our research directly applies to not only ecosystem service, but also societal well-being," said Chongya Jiang, a research scientist on the project.

Of particular intrigue is the relevance of GPP monitoring to bioenergy agricultural ecosystems, where the crops' "factories" are specially designed to produce renewable biofuels. Quantifying CO2 fixation in these environments is instrumental to optimizing field performance and contributing to the global bioeconomy. CABBI scientists, such as Sustainability Theme researcher Andy VanLoocke, suggest that this critical new data can be used to constrain model simulations for bioenergy crop yield potentials.

The technology used in this experiment is cutting-edge. As its name suggests, it is purely derived from satellite data, and therefore completely observation-based as opposed to relying on complex, uncertain modeling methods.

One example of an observation-based technology is solar-induced chlorophyll fluorescence (SIF), a weak light signal emitted by plants that has been used as a novel proxy for GPP. Inspired by their years-long ground observations of SIF, Guan's group developed an even more advanced method to improve GPP estimation: integrating a new vegetation index called "soil-adjusted near-infrared reflectance of vegetation" (SANIRv) with photosynthetically active radiation (PAR).

SLOPE is built on this novel integration. SANIRv represents the efficiency of solar radiation used by vegetation, and PAR represents the solar radiation that plants can actually use for photosynthesis. Both metrics are derived from satellite observations.

Through an analysis of 49 AmeriFlux sites, researchers found that PAR and SANIRv can be leveraged to accurately estimate GPP. In fact, the SLOPE GPP product can explain 85% of the spatial and temporal variations in GPP measured at the analyzed sites -- the best performance yet achieved against this gold-standard benchmark. As both SANIRv and PAR are "satellite only," this is an achievement researchers have long sought that is only now being realized in an operational GPP product.
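The evaluation described above boils down to regressing a satellite-only estimate (proportional to SANIRv × PAR) against flux-tower GPP and reporting the coefficient of determination (R²). Here is a toy sketch of that calculation on synthetic data; the proportionality constant, noise level, and value ranges are all invented, and this is not the SLOPE algorithm itself.

```python
# Toy illustration: estimate GPP from SANIRv * PAR and score it with R^2,
# the metric behind the reported 85%. All data below are synthetic.
import random

random.seed(42)

# Synthetic daily observations: SANIRv (unitless), PAR (MJ m^-2 day^-1).
days = [(random.uniform(0.1, 0.6), random.uniform(5, 12)) for _ in range(100)]

# Pretend "true" (tower-measured) GPP is proportional to SANIRv * PAR
# plus measurement noise; the constant 2.0 is arbitrary.
true_gpp = [2.0 * s * p + random.gauss(0, 0.5) for s, p in days]
est_gpp = [2.0 * s * p for s, p in days]  # satellite-only estimate

def r_squared(obs, pred):
    """Coefficient of determination: fraction of variance explained."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

print(f"R^2 = {r_squared(true_gpp, est_gpp):.2f}")
```

With the small noise level chosen here the estimate explains nearly all of the variance; the study's 85% figure reflects the much messier conditions of real tower and satellite data.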

Existing processes to quantify GPP fall short in three key areas: spatial (image-based) resolution, temporal (time-based) resolution, and latency (delay in data availability). The SLOPE GPP product created by Guan's team uses satellite images twice as sharp as most large-scale studies (250-meter versus the typical >500-meter resolution) and retrieves data on a daily cycle, eight times finer than the norm. More importantly, the new product has a latency of one to three days, whereas existing datasets lag behind by months or even years. Finally, the majority of GPP products employed today are analysis- rather than observation-based: the metrics they use to calculate GPP (e.g., soil moisture, temperature) are derived from algorithms rather than from real-world conditions gleaned from satellite observations.

"Photosynthesis, or GPP, is the foundation for quantifying the field-level carbon budget. Without accurate GPP information, quantifying other carbon-related variables, such as annual soil carbon change, is much less reliable," Guan said. "The Blue Waters supercomputer made our petabyte-scale computing possible. We will use this novel GPP data to significantly advance our ability to quantify agricultural carbon budget accounting, and it will serve as a primary input to constrain the modeling of soil organic carbon change for every field that requires soil carbon quantification. In addition to the SLOPE GPP data, similar methods allow us to generate GPP data at 10-meter and daily resolution to enable even sub-field precision agricultural management."

Credit: 
University of Illinois at Urbana-Champaign Institute for Sustainability, Energy, and Environment

New study reports activated B. infantis EVC001 improves health outcomes in preterm infants

image: The new study, Impact of probiotic B. infantis EVC001 on the gut microbiome, nosocomially acquired antibiotic resistance, and enteric inflammation in preterm infants reports probiotic supplementation with EVC001 substantially reduces inflammation, diaper rash and antibiotic use in preterm infants. The paper was published in Frontiers in Pediatrics.

Image: 
Evolve BioSystems, Inc.

DAVIS, Calif., Feb 16, 2021 - Researchers publishing in the peer-reviewed journal Frontiers in Pediatrics report that pre-term infants fed Bifidobacterium longum subsp. infantis (activated B. infantis EVC001) experienced significantly lower levels of intestinal inflammation, 62% less diaper rash, and required 62% fewer antibiotics - all of which are critical health indicators in neonatal care.

The study, Impact of probiotic B. infantis EVC001 feeding in premature infants on the gut microbiome, nosocomially acquired antibiotic resistance, and enteric inflammation, is the first to quantify the impact of feeding B. infantis EVC001 on key health indicators specifically in pre-term infants. The work was conducted at two neonatal intensive care units (NICU) in Southern California.

"Neonatologists, neonatal nurses and other hospital-based clinical staff are acutely focused on affecting the factors that positively impact quality of care and length of stay among pre-term infants," said Dr. Karl Sylvester, Pediatric Surgeon, Stanford, California. "This study provides compelling evidence that feeding B. infantis EVC001 to preterm infants, along with human milk, yields meaningful reduction in gut dysbiosis, antibiotic resistant gene abundance and enteric inflammation - all leading health indicators that are linked to key health outcomes."

The study included 77 infants born before 39 weeks gestational age. One group of infants, born at less than 32 weeks, was fed B. infantis EVC001. The other group, born after 32 weeks, did not receive the probiotic. Researchers gathered fecal samples from each infant over the duration of stay in the NICU to assess gut microbial composition and development, as well as enteric inflammation.

The infants who received B. infantis EVC001 experienced far lower levels of pathogenic bacteria, including Escherichia and Staphylococcus epidermidis, known to cause enteric inflammation. The B. infantis EVC001 group also experienced substantially lower levels of IL-8 and TNF-α, both of which are proinflammatory cytokines; as well as a 1.5-fold lower level of fecal calprotectin.

The study group who received B. infantis EVC001 also experienced 62% less diaper rash and 62% less antibiotic exposure than those who did not, and they experienced an 81% reduction in antibiotic resistance genes, the majority of which are known to be present in Enterobacteria such as Klebsiella and E. coli. These species of bacteria are associated with necrotizing enterocolitis (NEC), the leading cause of death from gastrointestinal disease in preterm infants.
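As an aside, the reported reductions are relative rates between the two feeding groups. A minimal sketch of the arithmetic, using hypothetical counts (the press release does not give the underlying raw numbers):

```python
def percent_reduction(control_rate, treated_rate):
    """Relative reduction of the treated group versus the control group."""
    return 100.0 * (control_rate - treated_rate) / control_rate

# Hypothetical illustration: if control infants averaged 1.0 antibiotic
# courses per NICU stay and EVC001-fed infants averaged 0.38, the
# relative reduction matches the 62% figure quoted in the study:
print(round(percent_reduction(1.0, 0.38)))  # -> 62
```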

"EVC001 and breastmilk combine to restore a healthy microbiome, create a protective environment in the preterm infant gut and improve clinical outcomes," said David Kyle, PhD, Chairman and Chief Scientific Officer, Evolve BioSystems, Inc. "We've observed significant benefits with EVC001 in term infants both in clinical studies and at home use. These new data in preterm infants are unequivocal, and we believe they should be considered for widespread acceptance as Standard of Care in the NICU."

Credit: 
Coyne PR

Electricity source determines benefits of electrifying China's vehicles

Each year an estimated 1.2 million Chinese citizens die prematurely due to poor air quality. And public health consequences are particularly dire during extreme air quality events, such as infamous "Airpocalypse" winter haze episodes.

A team of Northwestern University researchers wondered if widespread adoption of electric vehicles (EVs) could help China avoid these deadly events. The answer? It depends.

In a new study, the researchers concluded air quality and public health benefits of EVs -- as well as their ability to reduce carbon emissions -- in China are dependent on the type of transport electrified and the composition of the electric grid.

The study was published today (Feb. 16) in the February 2021 issue of the journal Earth's Future.

"A significant fraction of China's electricity is currently sourced from the burning of coal, which is a very carbon intensive power source," said Jordan Schnell, the study's lead author. "When the coal-heavy power is used to charge light-duty vehicles, carbon emissions are reduced because of the efficiency of light-duty EVs. Heavy-duty electric vehicles require significantly more energy, so we see a net increase in carbon dioxide emissions. However, even when heavy-duty vehicle power is sourced entirely by coal-fired electricity, we still see air quality improvements because the on-road emission reductions outweigh what the power plants add. Fine particulate matter emissions, which are a main ingredient in haze, are reduced."

"We find that to achieve net co-benefits from heavy-duty EV adoption, the real sticking point in China's infrastructure lies in its power-generation sector. Greater adoption of emission-free renewable power generation is needed," said Northwestern's Daniel Horton, the study's senior author. "For light-duty vehicles the co-benefits are already there under the current infrastructure."

Horton is an assistant professor of Earth and planetary sciences in Northwestern's Weinberg College of Arts and Sciences. At the time of the research, Schnell was a postdoctoral fellow in Horton's laboratory. He is now a research scientist at the Cooperative Institute for Research in Environmental Sciences at the University of Colorado at Boulder.

Pollutants come from a variety of sources, both natural and human-caused, and include emissions from transportation and power-generation facilities. Adoption of EVs reduces the emission of air pollutants and greenhouse gases from vehicle tailpipes, but the overall emissions budget must account for the shifting of emissions to the power plants used to charge EV batteries. Country-wide vehicle distribution and energy consumption also play roles in overall emissions profiles. Heavy-duty versus light-duty vehicles, for example, differ substantially, complicating the net outcome.

To reconcile these complicating factors, the researchers combined chemistry-climate modeling with emissions, weather and public health data. The researchers examined the air quality and climate benefits and tradeoffs of heavy-duty versus light-duty vehicle adoption using meteorological conditions from a notorious Airpocalypse event in January 2013. Unlike previous EV-air quality studies that focused on chronic exposure, the researchers focused on the acute public health impacts of exposure to this short, but extremely hazardous, haze event.

The researchers discovered that EV adoption could have a modest role in reducing the public health burden of individual Airpocalypse events, with the extent depending on the type of vehicle electrified. They also found the realization of public health and climate co-benefits depended on the addition of emission-free renewable energy to the electric grid.

During the January 2013 extreme pollution episode, poisonous haze hung over much of China, including the major population centers of Beijing, Tianjin and Hebei. Acute exposure to the record-high levels of fine particulate matter and nitrogen dioxide increased pollution-related respiratory diseases, heart disease and stroke, which the researchers estimate led to approximately 32,000 premature deaths and $14.7 billion in health care costs.

To assess the consequences of EV adoption, in one model simulation scenario the researchers replaced roughly 40% of China's heavy-duty vehicles (such as construction equipment, long-haul trucks and buses) with electrified versions. An alternative scenario simulated the replacement of roughly 40% of China's light-duty vehicles with electric alternatives. The energy needed to charge the EV batteries is equivalent in both scenarios and is sourced from power-generation facilities on the grid. Emissions of greenhouse gases and air pollutants are determined according to the battery-charging load and power plant profile.

The research team found that electrifying 40% of heavy-duty vehicles consistently improved air quality -- avoiding up to 562 premature deaths. It did not, however, reduce greenhouse gas emissions. Light-duty EV adoption, on the other hand, reduced carbon dioxide emissions by 2 megatons but had more modest air quality benefits.

To drive home this point, the researchers provided an additional scenario comparison. When all traffic emissions are removed from the 2013 event, air quality improvements lead to a 6% decrease in acute premature mortality. When all power-sector emissions are removed, however, acute premature mortality declines 24%.
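Using the figures quoted above (an estimated 32,000 premature deaths attributed to the January 2013 episode), the two bounding scenarios can be compared with back-of-envelope arithmetic. This sketch is purely illustrative and is not the study's actual health-impact model:

```python
baseline_deaths = 32_000  # estimated premature deaths, Jan 2013 episode

# Fractional reductions in acute premature mortality for the two
# bounding scenarios described in the study:
no_traffic_emissions = 0.06  # all traffic emissions removed
no_power_emissions = 0.24    # all power-sector emissions removed

avoided_traffic = baseline_deaths * no_traffic_emissions
avoided_power = baseline_deaths * no_power_emissions

# The power sector dominates: removing its emissions avoids roughly
# four times as many acute deaths as removing all traffic emissions.
print(int(avoided_traffic), int(avoided_power))  # -> 1920 7680
```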

"Overall, we found that EV-induced pollution changes and avoided premature deaths were modest for the extreme event," Schnell said. "But air quality will improve more drastically as the power-generation sector moves away from fossil fuel-fired electricity."

Credit: 
Northwestern University

Water is a probable vector for mammalian virus transmission

image: A dazzle of zebras at a waterhole in East Africa. Seasonal waterholes can operate as key locations for pathogen transmission within and between species.

Image: 
Peter Seeber / Leibniz-IZW

Water is a necessity for all life but its availability can be limited. In geographical areas experiencing dry seasons, animals congregate near the few freshwater sources, often reaching large densities. At these sites many animals from different species come to the same spots to drink, potentially operating as key locations for pathogen transmission within and between species. An international team of scientists led by the German Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW) suggests that viruses can use restricted freshwater sources as a vector to spread among animals. The key prediction of this idea is that animal viruses remain stable and infectious in water. The team tested this idea by sampling water holes in ecosystems of Africa and Mongolia with pronounced dry seasons and growing viruses in such water. The results, published in Science of the Total Environment, demonstrate that this is indeed possible.

The distribution of freshwater varies geographically and seasonally, with places such as East Africa and Central Asia experiencing pronounced seasonal shortages. Water scarcity results in frequent, unstable congregations of numerous wildlife species in the vicinity of freshwater sources. The objective of the scientific research work was to determine whether viral stability in restricted water sources is sufficient for some mammalian viruses, thereby enabling their spread through the medium of water.

Equine herpesviruses (EHV) were selected as a model as they are known to remain stable and infectious in water for weeks under laboratory conditions and circulate in wildlife in both Africa and Mongolia. "We knew from our previous work, particularly with zebras in Africa, that equids become stressed when they are forced to aggregate in the dry season. When we looked at the effects of stress in captive zebras, we could see that it was associated with shedding of EHVs into the environment. This suggested that just at the time when animals are forced to congregate, they are most likely to be stressed and shed viruses. The stress is acting as a sort of signal to the virus to get into the water to infect more individuals," says Prof Alex Greenwood, the leader of this scientific work.

"Congregations also help explain some odd results from both captive and free-living wildlife, such as the infection of non-equids with EHVs, for example, rhinos," explains Dr Peter Seeber from the Leibniz-IZW who collected water and zebra samples in East Africa. "If rhinos share the water with equids, they are likely exposed to the virus," adds Dr Sanatana Soilemetzidou from the Leibniz-IZW, who collected the water and animal samples in Mongolia.

To be validated, the entire concept of water as a viral vector hinged on showing that EHV remain stable in environmental water. "We were unsure what to expect, given that culturing viruses from the environment is challenging because of all the other microbes that can grow when you try to isolate a virus," comments Dr Anisha Dayaram, the lead author from the Leibniz-IZW. Dayaram and her colleagues succeeded in doing exactly that, using the water samples collected from both Africa and Mongolia. Under cell culture conditions they demonstrated that EHVs could indeed replicate and remained infectious.

This seems to explain the rather odd result that EHVs seem to show limited viral evolution. Viruses tend to evolve rapidly but EHVs change little over time and are in this sense surprisingly stable. The EHVs found in Mongolia and Africa are nearly identical to those in domestic horses. This may represent a constraint or equilibrium state for the EHV found in water holes. "Our results suggest that the stability of EHV in water may be the result of a long evolutionary process, which has led such viruses to be truly adapted to using water as a vector," explains Dr Alexandre Courtiol, another Leibniz-IZW scientist involved in this scientific work.

EHVs are not the only viruses that can be shed into and spread through water. Further research should uncover whether other viruses may use water in a similar way as a vector to spread among animals. The work also suggests that understanding viral dynamics will require looking beyond virus-host interactions in some cases and should include adaptations to transmission via the environment.

Credit: 
Leibniz Institute for Zoo and Wildlife Research (IZW)

Method for temporal monitoring of microplastic sedimentation

image: Microplastics have been found in nearly all organisms and habitats everywhere in the world.

Image: 
UEF/ Raija Törrönen

The effects of microplastics on our health and the environment are being rigorously studied all across the world. Researchers are identifying microplastic sources and their potential routes to the environment by examining rainwater, wastewater, and soil.

Microplastics have been found in nearly all organisms and habitats everywhere in the world. However, factors contributing to the influx and accumulation of microplastics in water ecosystems aren't fully understood yet. The focus of microplastics research has, for a long time, been on the age of microplastics found in sediments, and on the time it takes them to accumulate there. So far, however, temporal changes in sedimentation haven't really been considered in microplastics research.

Researchers in Finland (University of Eastern Finland, University of Turku, University of Helsinki) have tested the sediment trap method to analyse annual accumulation rates and possible seasonal variation. The sediment trap was installed on the lake bottom in the deepest part of the basin. Microplastics sink to the bottom of the basin together with other solid material, and they accumulate in sediment traps that are emptied on a regular basis. The sediment trap itself is a well-established method for sedimentological studies. The main focus in the development of the method was on the isolation of microplastic samples from sediment, and on testing the method's feasibility in a body of water.

The researchers tested the method in the basin of Huruslahti Bay, Finland, in 2017-2018. The annual amount of microplastics accumulating in the sediment of Huruslahti Bay was 32,400 pieces per square metre in a year. The most important advantage of the method is that it can be used to determine the time it takes for microplastics to enter and accumulate in bodies of water. Time plays an important role when modelling the influx and accumulation of microplastics, as well as in predicting any environmental risks they might cause in the future.
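The accumulation rate reported above is a flux: pieces collected per trap, normalized by the trap's opening area and the deployment time. A minimal sketch of that normalization, with hypothetical trap dimensions (the press release does not state the actual trap geometry or counts):

```python
def annual_flux(pieces_counted, trap_area_m2, deployment_years):
    """Microplastic sedimentation flux in pieces per square metre per year."""
    return pieces_counted / trap_area_m2 / deployment_years

# Hypothetical example: 162 pieces collected over one year in a trap
# with a 0.005 m^2 opening yields the flux reported for Huruslahti Bay:
print(round(annual_flux(162, 0.005, 1.0)))  # -> 32400
```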

Credit: 
University of Eastern Finland

The water surface is a fantastic place for chemical reactions

Using an advanced technique, scientists from the RIKEN Cluster for Pioneering Research have demonstrated that a chemical reaction powered by light takes place ten thousand times faster at the air-water interface--what we usually call the water surface--than in the bulk of the water, even when the light has equivalent energy. This finding could help our understanding of the many important chemical and biological processes that take place at the water surface.

Water is the most important liquid in nature, and research has shown that there is in fact something special about the interface. For reasons that were not well understood, it appears that some chemical reactions take place readily when the molecules are partly in the water, but not when they are fully dissolved.

One issue hampering understanding is that how chemical reactions actually proceed at the interface is not well understood. To investigate this, the RIKEN group used an advanced technique called ultra-fast phase-sensitive interface-selective vibrational spectroscopy. It's a mouthful, but essentially it means that you can get a high-speed movie of the intermediate molecules created as a chemical reaction takes place at an interface. In this case, "high-speed" means about one hundred femtoseconds, or less than a trillionth of a second.

Using the method, they analyzed the photoionization of phenol, a reaction that has been well studied in bulk water, using equivalent high-speed pulses of ultraviolet light. The experiments showed that the same reaction took place at the interface but that due to differences in the conditions there, the reaction took place roughly ten thousand times faster.

According to Satoshi Nihonyanagi, one of the authors of the study, published in Nature Chemistry, "It was exciting to find that the reaction speed for phenol is so phenomenally different, but in addition, our method for directly observing chemical reactions at the water surface in real time could also be applied to other reactions, and could help us get a better understanding of how reactions proceed in this special environment."

According to Tahei Tahara, the leader of the research group, "The fact that there is a 10,000-fold difference in the reaction rate of a basic organic molecule such as phenol between the bulk water and the water surface is also very important for catalytic chemistry, the field of study that aims to promote and control chemical reactions. In addition, water in nature exists as seawater, which has bubbles and aerosols, thus having a vast surface area. Our work could help us to understand how molecules are adsorbed on the surface of water, leading to chemical reactions that have an enormous impact on the global environment."

Credit: 
RIKEN

Dual character of excitons in the ultrafast regime: atomic-like or solid-like?

image: Attosecond measurement of an exciton in an MgF2 crystal.

Image: 
Polimi

Milan, 15 February 2021 - Excitons are quasiparticles which can transport energy through solid substances. This makes them important for the development of future materials and devices - but more research is needed to understand their fundamental behaviour and how to manipulate it. Researchers at Politecnico di Milano in collaboration with the Institute of Photonics and Nanotechnologies IFN-CNR and a theory group from the University of Tsukuba (Japan) and the Max Planck Institute for the Structure and Dynamics of Matter (Hamburg, Germany), have discovered that an exciton can simultaneously adopt two radically different characters when it is stimulated by light. Their work, now published in Nature Communications, yields crucial new insights for current and future excitonics research.

Excitons consist of a negatively charged electron and a positively charged hole in solids. They are a so-called many-body-effect, produced by the interaction of many particles, especially when a strong light pulse hits the solid material. In the past decade, researchers have observed many-body-effects down to the unimaginably short attosecond time scale, in other words billionths of a billionth of a second.

However, scientists have still not reached a fundamental understanding of excitons and other many-body effects due to the complexity of the ultrafast electron dynamics when many particles interact. The research team from Politecnico di Milano, the University of Tsukuba and the Max Planck Institute for the Structure and Dynamics of Matter (MPSD) wanted to explore the light-induced ultrafast exciton dynamics in MgF2 single crystals by employing state-of-the-art attosecond transient reflection spectroscopy and microscopic theoretical simulations.

By combining these methods, the team discovered an entirely new property of excitons: The fact that they can simultaneously show atomic-like and solid-like characteristics. In excitons displaying an atomic character, the electrons and holes are tightly bound together by their Coulomb attraction - just like the electrons in atoms are bound by the nucleus. In excitons with a solid-like character, on the other hand, the electrons move more freely in solids, not unlike waves in the ocean.

"These are significant findings - says lead author Matteo Lucchini from the Politecnico di Milano - because understanding how excitons interact with light on these extreme time scales allows us to envision how to exploit their unique characteristics, fostering the establishment of a new class of electro-optical devices."

During their attosecond experiment performed at the Attosecond Research Center (ARC, http://www.attosecond.fisi.polimi.it) within the ERC project AuDACE ("Attosecond Dynamics in AdvanCed matErials", http://www.audaceproject.it), the researchers managed to observe the sub-femtosecond dynamics of excitons for the first time, with signals consisting of slow and fast components. This phenomenon was explained with advanced theoretical simulations, adds co-author Shunsuke Sato from the MPSD and the University of Tsukuba: "Our calculations clarified that the slower component of the signal originates from the atomic-like character of the exciton while the faster component originates from the solid-like character - a ground-breaking discovery, which demonstrates the co-existence of the dual characters of excitons!"

This work opens up an important new avenue for the manipulation of excitonic as well as materials' properties by light. It represents a major step towards the deep understanding of non-equilibrium electron dynamics in matter and provides the fundamental knowledge for the development of future ultrafast optoelectronic devices, electronics, optics, spintronics, and excitonics.

Credit: 
Politecnico di Milano

The vertical evolution of volatile organic compounds varies between winter and summer

image: Scientists prepare to release the tethered balloon to sample the lowest layers of the atmosphere.

Image: 
Guiqian Tang

Scientists have discovered that pollution concentration varies between seasons. A new study, conducted in the North China Plain, determined where volatile organic compounds (VOCs) are distributed within the vertical layers of the atmosphere, and found notable changes from winter to summer.

"The concentration of VOCs in the vertical direction was much higher in winter than that in summer and their emission sources showed different contributions in both seasons," said Guiqian Tang, associate professor in the Institute of Atmospheric Physics, Chinese Academy of Sciences, and the corresponding author of a study just published in Advances in Atmospheric Sciences.

The researchers conducted a field campaign from June 8 to July 3, 2019. They focused on VOCs, which react with nitrogen oxides and carbon monoxide to produce excess ozone (O3), a notable greenhouse gas. The most serious photochemical and VOC pollution in the North China Plain is concentrated near the city of Shijiazhuang, where the team chose to conduct their research.

"Photochemical pollution in summer over the North China Plain is a serious problem, but the mechanisms are not fully understood," said Dr. Tang. He also cited the importance of analyzing the vertical layers of the atmosphere to understand how VOCs evolve vertically and contribute to O3 formation.

Researchers used a tethered balloon, which was better suited than towers, aircraft, and unmanned aerial vehicles for low-altitude observations below 1000 m. The balloon explored the vertical evolution of VOCs in the atmospheric boundary layer near the surface.

"Our findings in this campaign were compared with the results of our winter observations from January 2019. It showed that the VOC concentrations and proportions in winter and summer exhibited significant differences vertically," Dr. Tang said.

The total VOC concentration was relatively uniform throughout the year below 600 m, increasing slightly from 600 m to 1000 m in summer. Alkanes were the most abundant chemical species in both winter and summer, according to the researchers. These VOCs result from gasoline combustion in vehicles and from industrial sources, and they accounted for the largest proportion of VOCs, gradually increasing at each sampling height.

"We should take measures to improve oil quality to limit exhaust emissions of motor vehicles to reduce VOC pollution," said Dr. Tang. He and his collaborators showed that gasoline emissions from vehicles should be controlled at the surface, while reducing VOC-based industrial pollution is most impactful at high altitudes.

Soon, researchers also plan to observe the vertical evolution of nitrogen oxides to further explore the formation mechanisms of O3. Nitrogen oxides and carbon monoxide present in the atmosphere are critical components in the reaction that produces ozone in the lower atmosphere.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences