Tech

Beneficial bacteria help wheat stand the heat

video: Researchers at KAUST have identified bacteria that can boost heat tolerance in wheat. The team's bacteria-bolstered wheat had up to 50% higher yields than normal.

Image: 
© 2021 KAUST; Anastasia Serin.

Bacteria plucked from a desert plant could help crops survive heatwaves and protect the future of food.

Global warming has increased the number of severe heatwaves that wreak havoc on agriculture, reduce crop yields and threaten food supplies. However, not all plants perish in extreme heat. Some have natural heat tolerance, while others acquire heat tolerance after previous exposure to higher temperatures than normal, similar to how vaccines trigger the immune system with a tiny dose of virus.

But breeding heat-tolerant crops is laborious and expensive, and slightly warming entire fields is even trickier.

There is growing interest in harnessing microbes to protect plants, and biologists have shown that root-dwelling bacteria can help their herbaceous hosts survive extreme conditions, such as drought, excessive salt or heat.

"Beneficial bacteria could become one of the quickest, cheapest and greenest ways to help achieve sustainable agriculture," says postdoc Kirti Shekhawat. "However, no long-term studies have proven they work in the real world, and we haven't yet uncovered what's happening on a molecular level," she adds.

To fill this knowledge gap, Shekhawat, along with a team led by Heribert Hirt, selected SA187, a beneficial bacterium that lives in the roots of a robust desert shrub, Indigofera argentea. They coated wheat seeds with the bacterium and then planted them in the lab along with some untreated seeds. After six days, they exposed the plants to 44 degrees Celsius for two hours. "Any longer would kill them all," says Shekhawat.

The untreated wheat suffered leaf damage and ceased to grow, while the treated wheat emerged unscathed and flourished, suggesting that the bacteria had triggered heat tolerance. "The bacteria enter the plant as soon as the seeds germinate, and they live happily in symbiosis for the plant's entire life," explains Shekhawat.

The researchers then grew their wheat for several years in natural fields in Dubai, where temperatures can reach 45 degrees Celsius. Here, wheat is usually grown only in winter, but the bacteria-bolstered crops consistently had yields between 20 and 50 percent higher than normal. "We were incredibly happy to see that a single bacterial species could protect crops like this," says Shekhawat.

The team then used the model plant Arabidopsis to screen all the plant genes expressed under heat stress, both with and without the bacteria. They found that the bacteria produce metabolites that are converted into the plant hormone ethylene, which primes the plant's heat-resistance genes for action. "Essentially, the bacteria teach the plant how to use its own defense system," says Shekhawat.

Thousands of other bacteria have the power to protect plants against diverse threats, from droughts to fungi, and the team is already testing some on other crops, including vegetables. "We have just scratched the surface of this hidden world of soil that we once dismissed as dead matter," says Hirt. "Beneficial bacteria could help transform an unsustainable agricultural system into a truly ecological one."

Credit: 
King Abdullah University of Science & Technology (KAUST)

Keep it moving: How biomaterial mobility may revolutionize immunomodulation

image: LPS-stimulated mouse Kupffer cells on high-mobility PRX surfaces had smaller areas of spread than those on low-mobility PRX surfaces, while showing higher expression levels of Toll-like receptor 4 (Tlr4), M1 marker and pro-inflammatory cytokine genes, and frequently forming multiple vacuoles in their cytoplasm. In contrast, LPS-stimulated Kupffer cells on low-mobility PRX surfaces had wider areas of spread, with enhanced expression of M2 marker and Yap genes and decreased expression of the pro-inflammatory genes. These results suggest that highly mobile surfaces favor M1 polarization of Kupffer cells, whereas less mobile surfaces favor M2 polarization.

Image: 
Department of Organic Biomaterials, TMDU

Researchers from Tokyo Medical and Dental University (TMDU) identify biomaterials that can be used to modulate liver immune cell behavior

Tokyo, Japan - Biomaterials are substances, natural or manmade, that are used in medicine to interact with the human body for various purposes, such as wound healing and tissue regeneration. Previous work on biomaterials has shown that they can affect cells in many ways, including how they grow, move, and the type of cell they develop into. Scientists have recently begun investigating biomaterials with properties that can be fine-tuned to optimize their use in regenerative medicine. Now, researchers at Tokyo Medical and Dental University (TMDU) have identified a polymer with tunable mobility properties that can alter the immune activity of specific liver cells.

In an article published in Biomaterials Science, the TMDU researchers report how they varied the mobility of specific biomaterials and observed significant effects on mouse Kupffer cells. These are liver cells that form part of the innate immune system, the body's first line of defense against infection in this organ.

The group previously worked with polymer-based biomaterials called polyrotaxanes. Other molecules can be threaded within the polymer structure, and their ability to move freely throughout this structure is what the researchers refer to as "molecular mobility." Polyrotaxane mobility can be adjusted by adding more molecules within the polymer, and this can affect the fate and maintenance of cells interacting with the biomaterials. Because of this, the TMDU group became interested in whether these biomaterials could be used to manipulate the immune system.

"We hypothesized that the polyrotaxane molecular mobility could serve as a sort of mechanical cue to the cells in the surrounding environment," says lead author of the study Yoshinori Arisaka. "Using this property to possibly modulate immune cell activity could revolutionize immunomodulation."

The researchers cultured Kupffer cells on a surface coated with polyrotaxane. They treated the cells with lipopolysaccharide, which is a molecule used as an immune activator. They adjusted the molecular mobility of the surface and then examined cell movement and shape, as well as the expression levels of certain inflammation-related genes. Interestingly, they found that the surface mobility significantly affected the movement and gene expression profile of the Kupffer cells.

"The surfaces with higher mobility increased expression of pro-inflammatory genes in the cells," explains Nobuhiko Yui, senior author. "This means that the cells were behaving as if they were part of an active immune response."

The authors believe that this system may be the groundwork for using biomaterials in humans to balance immune activity.

"Our data demonstrate that mechanical cues may play a role in regulating cell behavior," says Arisaka.

This work is a critical step forward in biomaterials research. Mechanically regulating immune system activity with novel biomaterials may transform regenerative medicine.

Credit: 
Tokyo Medical and Dental University

During the first wave of the coronavirus pandemic, older adults left home mainly for physical activity

image: Physical exercise was the most common reason to leave home during the first wave of the pandemic in 2020.

Image: 
Photo: University of Jyväskylä

In spring 2020, when the first wave of the coronavirus pandemic hit Finland, older adults drastically reduced their out-of-home activities. During the period of government restrictions, physical exercise was the most common reason to leave home, a recent study at the University of Jyväskylä Faculty of Sport and Health Sciences finds.

"In spring 2020, it was feared that the closure of many activity destinations and the government's recommendations to avoid close contact with persons from other households would decrease physical activity levels and thus negatively affect older adults' physical functional capacity," Senior Researcher Erja Portegijs explains. "According to our research results, this was, however, not the case."

Throughout the restriction period, physical exercise and walking outdoors, for example in nature, remained possible and were even encouraged by the government later in the spring.

"This study shows that physical exercise was the most common reason to go out," Portegijs adds. "Otherwise, older participants had few reasons to go out beyond grocery shopping during the first spring of the pandemic."

Previous research shows that all activities outside of one's home are beneficial for physical activity. As the reasons to leave home were markedly limited during the first spring of the pandemic, more research is needed to determine the long-term effects on mobility and maintaining functional capacity.

"This research is unique, even though it was based on the data of only 44 participants," Portegijs says. "Previously, we did not know where older adults moved and for what reason. Studying where people go is possible using a map-based questionnaire. This is one of the first studies utilizing such a questionnaire among older adults."

As coronavirus-related measures have varied significantly between countries, it is not certain whether these results are generalizable to other countries. In Finland, curfews were not implemented and governmental restrictions were mostly based on recommendations rather than enforced regulations.

In 2017 and 2018, a map-based questionnaire was used to collect data on frequently visited activity destinations as part of the larger AGNES study among 75-, 80-, and 85-year-old adults living in Jyväskylä city in Central Finland. In May and June 2020, participants were invited to complete the map-based questionnaire following a postal questionnaire. Only a small portion of participants was able to use digital devices independently and thus to participate. These participants had somewhat better health and function than the others.

"As abilities to use digital devices improve among the aging population, the relevance of map-based research methods will further increase," Portegijs reflects.

Credit: 
University of Jyväskylä - Jyväskylän yliopisto

Children exposed to tobacco smoke use more emergent health services

image: A health services researcher, Ashley Merianos has extensive training and experience in the epidemiology and prevention of substance use with an emphasis on tobacco, quantitative statistical methods, and clinical and translational research in the pediatric healthcare setting.

Image: 
University of Cincinnati

Tobacco smoke-exposed children utilize emergency and urgent care services more often than unexposed children, which contributes to a large toll on the nation's health care system, says research led by the University of Cincinnati.

The study, recently published in the journal PLOS ONE, concluded:

· Children who are exposed to tobacco smoke have higher pediatric emergency department visit costs compared to unexposed children.

· A higher number of tobacco smoke-exposed children had an urgent care visit over a one-year period compared to unexposed children.

· Tobacco smoke-exposed children had nearly twice the risk of being admitted to the hospital over a one-year period compared to unexposed children.

"Despite major progress in tobacco control, about 4-in-10 children remain exposed to tobacco smoke. This exposure places developing children at higher risk for many health problems, including respiratory illnesses such as asthma, bronchiolitis and pneumonia," says health services researcher and lead author Ashley Merianos, an associate professor of health promotion and education in UC's School of Human Services.

Merianos is also a research affiliate member of Cincinnati Children's Hospital Medical Center, the Thirdhand Smoke Research Consortium and the American Academy of Pediatrics Tobacco Consortium.

The study, Merianos says, also lends insight into prevention measures, such as standardizing and initiating tobacco smoke exposure reduction interventions in urgent care, emergency and inpatient settings, and promoting voluntary smoke-free home and car policies to help reduce children's tobacco smoke exposure and its related consequences.

"If every health care provider were to use each pediatric visit as an opportunity to screen parents who smoke or vape and counsel them about the dangers of secondhand and thirdhand smoke exposure to their children, rates of pediatric tobacco smoke exposure would decline," says pediatric emergency physician and senior author Melinda Mahabee-Gittens, a professor of pediatrics at Cincinnati Children's.

Credit: 
University of Cincinnati

Effective Field Theories and the nature of the universe

What is the world made of? This question, which goes back millennia, was revisited by theoretical physicist Steven Weinberg from the University of Texas in Austin, TX, USA in the first of an international seminar series, 'All Things EFT'. Weinberg's seminar has now been published as an article in the journal EPJ H.

And Weinberg is well placed to discuss both Effective Field Theories (EFTs) and the nature of the Universe, as he shared the 1979 Nobel Prize for Physics for developing a theory to unify the weak and electromagnetic interactions between elementary particles. This fed into the development of the widely used Standard Model of particle physics that unifies these two forces with the strong interaction.

The introduction to the article describes Weinberg as the 'pioneer' of EFTs. In his wide-ranging talk, Weinberg sets out the early history of EFTs from a personal perspective and describes some implications for future research.

Briefly, an EFT is a type of theory or approximation that describes a physical phenomenon at given length or energy scales, while averaging over shorter length or higher energy scales. Weinberg describes how the unifying Standard Model came to be seen as a valid approximation to a more fundamental theory that will likely take over at the highest energies, such as string theory.
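A standard textbook illustration of this idea (not drawn from the seminar itself) is Fermi's theory of the weak interaction: at energies far below the W-boson mass, the exchange of a heavy W collapses into a point-like four-fermion contact interaction,

```latex
\mathcal{L}_{\mathrm{eff}}
  = -\frac{G_F}{\sqrt{2}}
    \left[\bar{\psi}\gamma^{\mu}(1-\gamma^{5})\psi\right]
    \left[\bar{\psi}\gamma_{\mu}(1-\gamma^{5})\psi\right],
\qquad
\frac{G_F}{\sqrt{2}} = \frac{g^{2}}{8M_W^{2}},
```

so all of the short-distance physics of the W boson survives at low energies only through the single constant G_F, precisely the kind of averaging over high energy scales that defines an effective theory.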

He remembers how physicists of the 1950s and 1960s had difficulty linking quantum field theory to the strong interaction. Eventually, he and others produced a standardised methodology that could fit observed data at least as well as the rather cumbersome mathematics that was being used. These ideas can be generalised; eventually, he states, 'all [physicists'] theories will survive as approximations to a future theory'.

As Weinberg explains, the techniques of EFTs apply to diverse areas including hadronic physics and superconductivity. Weinberg clearly enjoys and values teaching, and his introduction to this key concept of particle physics in this first lecture is both engaging and enlightening.

Credit: 
Springer

Clarity needed in classification systems for processed foods

During this unique study researchers from the University of Surrey and European Food Information Council (EUFIC) reviewed over 100 scientific papers to examine if different criteria exist in developing classification systems for processed foods and, if so, what distinguishes them.

Classification systems that categorise foods according to their "level of processing" have been used to predict diet quality and health outcomes, inform guidelines and in product development.

Researchers found that most classification systems' criteria are not aligned with existing scientific evidence on nutrition and food processing. It is thought that this may stem from different perspectives and intentions behind the development of some classification systems. Researchers also noted a failure to include measurements of nutritional content within some systems, which may be confusing to consumers. The authors contrast this with nutrient profiling schemes such as Nutri-score, which converts the nutritional value of products into a simple code consisting of five letters.

Only a few of the classification systems examined in the analysis also acknowledge food processing done at home, and instead focus more on industrially processed foods. Researchers believe that this omission is misguided as food that is homemade is not automatically a healthier choice.

Categorisation of foods deemed 'ultra-processed', and what is meant by the term, was also examined by researchers. While there is a lot of confusion and disagreement about the term, the available evidence suggests that these foods could be linked to obesity through their energy density and properties such as texture. However, this will need confirmation through further research.

Christina Sadler, a postgraduate researcher and PhD candidate at the University of Surrey and a Senior Manager at EUFIC who led on the research, said: "We found that food processing and the degree of processing used are interpreted in different ways by different classification systems. It is concerning that there are no clear agreements on what features make food more or less processed, and how this relates to healthy eating advice, which may make it more difficult for consumers to make informed choices consistently."

"What is needed is clarification of the underlying methods, meanings and rationales of food classification systems so that foodstuffs can be classified consistently. This will help inform public health and ensure we eat a more balanced diet."

Credit: 
University of Surrey

Corona waste kills animals throughout the entire world

image: The scientific publication is the first mention of animals integrating corona litter into their nests; in this case a coot's nest.

Image: 
Alexander Schippers

It all started when litter researchers found a perch in the canals of Leiden that had become caught up in a latex glove. As far as we know, this was the first Dutch victim of corona waste. Since then, they have been trying to obtain an overall picture of the consequences of the corona waste mountain on animals.

Biologists Auke-Florian Hiemstra from Naturalis Biodiversity Center and Liselotte Rambonnet from Leiden University started a quest to determine how often and where interactions between corona waste and animals occur. They collected observations from Brazil to Malaysia and from social media to local newspapers and international news websites. A fox in the United Kingdom, birds in Canada, hedgehogs, seagulls, crabs, and bats - it transpired that all sorts of animals, everywhere, become entangled in face masks.

They found reports about apes chewing on face masks, and about a penguin with a face mask in its stomach. Pets too, especially dogs, were found to swallow face masks. "Animals become weakened due to becoming entangled or starve due to the plastic in their stomach," Rambonnet emphasizes. The diversity of animals influenced by corona waste is considerable. "Vertebrates and invertebrates on land, in freshwater, and in seawater become entangled or trapped in corona waste," says Hiemstra. In their overview article in the journal Animal Biology, they also write that some animals use the waste as nest material. For example, coots in Dutch canals use face masks and gloves as nest material. "And the packaging from paper handkerchiefs is found in nests too. As such, we even see the symptoms of COVID-19 in animal structures," says Hiemstra.

Citizen Science

The scientists from Leiden were able to create their overview thanks to the observations of photographers, litter collectors, birdwatchers, wildlife rescue centers, and veterinarians who shared the observations via social and traditional media. Rambonnet: "As a result of this, we can learn more about the impact of this category of disposable products on wildlife. We therefore ask people to keep sharing their observations so that we can maintain an up-to-date overview." To facilitate this, the duo has set up the website http://www.covidlitter.com. Rambonnet and Hiemstra hope that this overview will increase people's awareness of the danger of face masks and gloves for wildlife. Furthermore, they call upon everybody to use reusable face masks.

Credit: 
Naturalis Biodiversity Center

Waste from making purple corn chips yields a natural dye, supplements, kitty litter

image: Researchers extracted pigment from purple corn cobs (left) for supplements and dyeing fabrics (bottom right), and tested the remaining grounds (top right) for animal litter.

Image: 
Patrizia De Nisi

The more colorful a food, the more nutritious it probably is. For example, purple corn contains compounds associated with a reduced risk of developing diabetes and heart disease. The cobs contain the same compounds but are typically thrown out. Now, researchers report a step-wise biorefinery approach in ACS Sustainable Chemistry & Engineering that uses the whole cob, producing a dye and a possible nutraceutical with the pigments, and an animal litter with the left-overs.

Eating a rainbow of fruits and vegetables provides a variety of health benefits, with vitamins and nutrients stored within the plant's color-producing compounds. One group of compounds contributing distinct hues to food are anthocyanins: vibrant pigments desired as natural dyes that also have antioxidant and anti-inflammatory properties. Anthocyanins are found in purple corn's kernels and in the corncobs, which are typically discarded. Past attempts at repurposing cobs have involved harmful and expensive solvents to extract compounds. Water could be used as an eco-friendly and cost-effective agent for this process, but it is not very efficient. And then the insoluble cob material is still left over as waste. So, Fabrizio Adani, Roberto Pilu, Patrizia De Nisi and colleagues wanted to extract beneficial pigments from purple corncobs with a multi-step approach to make many value-added products, while also closing the loop with zero waste at the end.

The researchers devised a biorefinery approach to extract anthocyanins from a new variety of purple corn they developed. First, ground-up corncobs and water were mixed and heated, removing 36% of the pigments compared to methods with acetone and ethanol solvents. The pigments from this step were used to dye cotton and wool fabrics. In the next step of the biorefinery method, the researchers removed an additional 33% of the anthocyanin content from the water-treated cobs with an ethanol mixture. These extracts showed antioxidant activity and anti-inflammatory properties in cells in petri dishes and could be used in the future to develop nutraceutical supplements, the researchers say. Finally, the team found that the remaining insoluble purple grounds were similar to commercial corncob animal litter. In tests, the residual cob material was even more absorbent than the commercial product. And because the material still contains anthocyanins, which have antimicrobial activity, the purple litter could fight bacteria and reduce odors, the researchers say. Used purple corn cob litter could also be composted along with other organic matter, resulting in no waste, they explain.

Credit: 
American Chemical Society

Dow-like index for energy prices might help smooth transition to clean power

image: The energy price index reflects the changes in energy prices resulting from the type of energy sources available and their supply chains.

Image: 
Rachel Barton/Texas A&M Engineering

Since the early industrial revolution in the mid-1700s, fossil fuels have acquired an ever-growing footprint in energy production. However, the environmental concerns around fossil fuel use and the inevitable depletion of these resources have led to a global shift toward renewable energy sources. This transition raises questions about the best choice of renewables and the impact of investing in these resources on consumer cost.

In a recent study published in the journal Nature Communications, researchers at Texas A&M University have devised a metric that reflects the average price of energy in the United States. Much like how the Dow index indicates trends in stock market prices, the researchers' metric reflects the changes in energy prices resulting from the type of energy sources available and their supply chains.

"Energy is affected by all kinds of events, including political developments, technological breakthroughs and other happenings going on at a global scale," said Stefanos Baratsas, a graduate student in the Artie McFerrin Department of Chemical Engineering at Texas A&M and the lead author on the study. "It's crucial to understand the price of energy across the energy landscape along with its supply and demand. We came up with one number that reflects exactly that. In other words, our metric monitors the price of energy as a whole on a monthly basis."

Today, the energy industry is largely dominated by fossil fuels, like coal, natural gas and petroleum. An increase in fossil fuel consumption, particularly in the last few decades, has raised growing concerns about their environmental impact. Most notably, the Intergovernmental Panel on Climate Change has reported an estimated increase of 0.2 degrees Celsius per decade in global temperature, which is directly linked to burning fossil fuels.

But only around an 11% share of the total energy landscape comes from renewable sources. Although many countries, including the United States, have committed to using more renewable energy sources, there isn't a way to quantitatively and accurately measure the price of energy as a whole. For example, an establishment might use a combination of solar and fossil fuels for various purposes, including heating, power and transportation. In this case, it is unclear how the price would change if there is an increased tax on fossil fuels or if subsidies in favor of renewables are introduced.

"Energy transition is a complex process and there is no magic button that one can press and suddenly transition from almost 80% carbon-based energy to 0%," said Dr. Stratos Pistikopoulos, director of the Texas A&M Energy Institute and senior author on the study. "We need to navigate this energy landscape from where we are now, toward the future in steps. For that, we need to know the consolidated price of energy of end users. But we don't have an answer to this fundamental question."

To address this research gap, the researchers first identified different energy feedstocks, such as crude oil, wind, solar and biomass, and their energy products. So, for example, crude oil's energy products include gasoline and diesel. Next, they categorized the energy end users as either residential, commercial, industrial or transportation. Further, they obtained information on which energy product, and how much of it, is consumed by each user from the United States Energy Information Administration. Last, they identified the supply chains that connected the energy products to consumers. All this information was used to calculate the average price of energy, called the energy price index, for a given month and to forecast energy prices and demands for future months.
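As a rough illustration (not the study's actual formulation), the core of such an index is a consumption-weighted average of end-user prices. The product names, prices and consumption figures below are hypothetical:

```python
def energy_price_index(prices, consumption):
    """Average price of energy, weighted by how much of each product is consumed.

    prices: dict mapping energy product -> price per unit of energy ($/MMBtu)
    consumption: dict mapping the same products -> energy consumed (MMBtu)
    """
    total_energy = sum(consumption.values())
    weighted_cost = sum(prices[p] * consumption[p] for p in prices)
    return weighted_cost / total_energy

# Hypothetical monthly figures for three energy products:
prices = {"gasoline": 25.0, "natural_gas": 4.0, "electricity": 30.0}       # $/MMBtu
consumption = {"gasoline": 120.0, "natural_gas": 300.0, "electricity": 80.0}  # MMBtu

index = energy_price_index(prices, consumption)
```

A shift toward cheaper or pricier products changes the weights, which is how supply chains and policy changes would move such an index month to month.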

As a potential real-world use of this metric, the researchers explored two policy case studies. In the first scenario, they studied how the energy price index would change if a tax on crude oil was imposed. One of their main findings upon tracking the energy price index was that around $148 billion could be generated in four years for every $5-per-barrel increase in crude oil tax. Also, this tax would not significantly increase the monthly cost of energy for U.S. households. In the second case study that explored the effect of subsidies in the production of electricity from renewable energy sources, they found that these policies can cause a dip in energy prices even with no tax credit.

Baratsas said their approach offers a way to optimize policies at the state, regional and national level for a smooth and efficient transition to clean energy. Further, he noted that their metric could adapt or self-correct its forecasting of energy demands and prices in the event of sudden, unforeseen situations, like the COVID-19 pandemic that may trigger a drastic decrease in the demand for energy products.

"This metric can help guide lawmakers, government or non-government organizations and policymakers on whether, say, a particular tax policy or the impact of a technological advance is good or bad, and by how much," said Pistikopoulos. "We now have a quantitative and accurate, predictive metric to navigate the evolving energy landscape, and that's the real value of the index."

Credit: 
Texas A&M University

Texas A&M researchers optimize materials design using computational technologies

The process of fabricating materials is complicated, time-consuming and costly. Too much of one material, or too little, can create problems with the product, forcing the design process to begin again. Advancements in the design process are needed to reduce the cost and time it takes to produce materials with targeted properties.

Funded by the National Science Foundation (NSF), researchers at Texas A&M University are using advanced computational and machine-learning techniques to create a framework capable of optimizing the process of developing materials, cutting time and costs.

"Our general focus is working on materials design by considering process-structure-property relationships to produce materials with targeted properties," said Dr. Douglas Allaire, associate professor in the J. Mike Walker '66 Department of Mechanical Engineering. "In our work, we demonstrate a microstructure sensitive design of alloys with a Bayesian optimization framework capable of exploiting multiple information sources."

Bayesian optimization-based frameworks use prior knowledge, in the form of models, to predict outcomes. In the past, researchers have used this framework with a single information source (a simulation or an experiment). If that method failed, the process started again in the hope of making the right adjustments based on the model.

The researchers have rejected this notion and instead believe that many information sources can be pulled using a Bayesian framework to develop a more complete picture of underlying processes. They have combined multiple information sources to create materials with targeted properties more efficiently by looking at data in its entirety rather than its parts.

"What we think, that is very different, is that you can have many different potential models or information sources," said Dr. Raymundo Arróyave, professor in the Department of Materials Science and Engineering. "There are many ways to understand/model the behavior of materials, either through experiments or simulations. Our idea is to combine all of these different models into a single, 'fused' model that combines the strengths of all the other models while reducing their individual weaknesses."
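As a loose sketch of the fusion idea (not necessarily the authors' exact method, which embeds fusion inside a Bayesian optimization loop), one simple way to combine several sources is precision weighting, where predictions with lower uncertainty count for more in the fused estimate:

```python
def fuse(means, variances):
    """Precision-weighted fusion of predictions from independent sources.

    means: predicted values from each model or experiment
    variances: the uncertainty (variance) attached to each prediction
    Returns the fused mean and fused variance; the fused variance is
    smaller than any single source's, reflecting the combined information.
    """
    precisions = [1.0 / v for v in variances]
    total_precision = sum(precisions)
    fused_mean = sum(m * p for m, p in zip(means, precisions)) / total_precision
    fused_variance = 1.0 / total_precision
    return fused_mean, fused_variance

# Two hypothetical sources predicting a material property,
# the first more certain than the second:
fused_m, fused_v = fuse(means=[10.0, 12.0], variances=[1.0, 4.0])
```

The fused estimate sits closer to the more certain source, which captures the intuition of combining each model's strengths while down-weighting its weaknesses.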

Their research, titled "Efficiently exploiting process-structure-property relationships in material design by multi-information source fusion," was recently published in Vol. 26 of the Acta Materialia journal.

"These model chains have historically not considered the breadth of available information sources," said Allaire. "They consider single models along the chain from process, through structure, to property. As a result, they are not as efficient or accurate as they could be."

The researchers are currently testing this framework by developing dual-phase steels typically used on automobile frames. Dual-phase steels are made out of two phases with very different and complementary properties.

"There are two phases; the martensite phase makes this particular steel very strong," said Arróyave. "The ferritic phase is softer and makes the steel more compliant and amenable to deformation. With only martensitic microstructures, these materials are strong, but they break easily. However, if you combine the strength of martensite with the ductility of ferrite, you can make steels that are very strong, can absorb energy during impact and that can be fabricated into complex shapes such as car frames."

Using the method developed in this work, the researchers aim to build a framework that more precisely and effectively predicts the composition and processing (the "recipe") needed for a specific design. This, in turn, decreases the number of simulations and experiments required, drastically reducing costs.

"The knowledge that we gain about the material design process as a whole using our framework is much greater than the sum of all information extracted from individual models or experimental techniques," said Dr. Ankit Srivastava, assistant professor for the materials science and engineering department. "The framework allows researchers to efficiently learn as they go, as it not just collects and fuses information from multiple models/experiments but it also tells them which information source i.e., a particular model or experiment provides them the best value for their money or time, which really enhances the decision-making process."

In the future, they hope their framework is widely used when attempting tasks that involve integrated computational materials design.

"Our hope is that by presenting these model fusion-based Bayesian optimization capabilities, we will make the search process for new materials more efficient and accurate," said Allaire. "We want any researcher to use the models that they have available to them without worrying as much about how to integrate the models into their own modeling chain because our Bayesian optimization framework handles that integration for them."

Credit: 
Texas A&M University

Dangerous landfill pollutants ranked in order of toxicity by MU researchers

image: Agroforestry phytoremediation buffer system in southeastern Wisconsin.

Image: 
Missouri University of Science and Technology

COLUMBIA, Mo. - Nearly 2,000 active landfills are spread across the U.S., with the majority of garbage discarded by homes and businesses finding its way to a landfill. The resulting chemicals and toxins that build up at these sites can then leach into soil and groundwater, and this "leachate" can present serious hazards to the environment and to the people who live nearby.

To help environmental agencies battle the toxic threats posed by landfills, researchers at the University of Missouri -- in partnership with the USDA Forest Service -- have developed a system that ranks the toxins present in a landfill by order of toxicity and quantity, allowing agencies to create more specific and efficient plans to combat leachate.

"Leachate from landfills can cause cancer and other serious harm, and it's a threat whether it's ingested, inhaled or touched," said Chung-Ho Lin, an associate research professor with the MU Center for Agroforestry in the College of Agriculture, Food and Natural Resources. "This is the first time a system has been created that can automatically prioritize the pollutants released from a landfill based on their toxicity and abundance."

The system relies on an algorithm created by Elizabeth Rogers, a doctoral student working under Lin's guidance at the University of Missouri and a USDA Pathways Intern. Rogers drew from a previously existing system designed to prioritize chemicals in "fracking" wastewater and adapted it to apply to landfill pollution.

Combining the algorithm with three "toxicity databases" that are referenced when analyzing a sample from a landfill, the system takes a traditionally time-consuming and expensive process -- identifying a pollutant and determining its abundance and potential harm -- and makes it routine. The result is a prioritization system that can rank pollutants by taking into account both their overall toxicity and prevalence at a given site. In addition, the prioritization of pollutants can be easily customized based on factors and goals that can vary from site to site.
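The core of such a ranking can be sketched in a few lines. This is a hypothetical simplification of the idea described above, not the published algorithm: each detected pollutant receives a priority score combining its measured abundance with a toxicity value looked up from a reference database, and the list is sorted by score. All compound names, toxicity weights and concentrations below are made up for illustration.

```python
# Hypothetical sketch of the prioritization idea (not the published
# algorithm): score = concentration x toxicity, then rank by score.

toxicity_db = {        # hypothetical toxicity weights (higher = more toxic)
    "benzene": 9.0,
    "atrazine": 6.5,
    "ibuprofen": 4.0,
}

sample = {             # hypothetical measured concentrations (ug/L)
    "benzene": 2.0,
    "atrazine": 10.0,
    "ibuprofen": 50.0,
}

def prioritize(sample, toxicity_db):
    """Rank pollutants by toxicity-weighted abundance, highest first."""
    scores = {name: conc * toxicity_db.get(name, 0.0)
              for name, conc in sample.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranking = prioritize(sample, toxicity_db)
```

Note how an abundant but moderately toxic compound can outrank a highly toxic trace compound, which is exactly the kind of site-specific trade-off the customizable weighting is meant to expose.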

Ronald Zalesny Jr., a supervisory research plant geneticist for the USDA Forest Service who is also mentoring Rogers, worked with Lin and Rogers on the study optimizing the prioritization system and exploring its utility. For him, the ability to easily identify, quantify and rank landfill pollutants meets a very real need.

Zalesny Jr. is a principal investigator for a project that harnesses trees to clean up contaminated soils and water at landfills. Through a natural process known as phytoremediation, the poplar and willow trees help degrade, absorb and otherwise inhibit pollutants and the groundwater runoff that carries them.

Knowing which pollutants are the most important targets at a given location is crucial, said Zalesny Jr., because different trees employ different methods of removing pollutants from the soil, and no single method will work on every type of pollutant.

"In the past, we have mostly targeted the most common pollutants, such as herbicides and contaminants from crude oil," Zalesny Jr. said. "Using this prioritization tool, we could now go to basically any contaminated site, identify the top contaminants and match those contaminants with our trees to create a sustainable, long-term solution for cleaning up pollution."

Zalesny Jr.'s project is part of the Great Lakes Restoration Initiative, which seeks to protect the Great Lakes from environmental degradation by providing relevant funding to federal agencies. If contaminated runoff from landfills makes its way into rivers and streams, it could ultimately make its way into the Great Lakes, Zalesny Jr. said.

Rogers, who created the algorithm that can quickly sort pollutants by their relative toxicity, sees another important benefit to the system. While many landfill regulations have not been updated in decades, new classes of contaminants continue to arrive in landfills, posing a problem for those seeking to mitigate their effects. By offering scientists and researchers up to date information about hundreds of possible pollutants, the prioritization system could help environmental agencies tackle more of these dangerous new arrivals.

"Some of the most potentially harmful compounds that we identified using this scheme were from things like antibiotics or prescription medications, which could have serious impacts on the human endocrine system," Rogers said. "There were also compounds from personal care products. And while we know these newer classes of compounds can have negative impacts, there is still a lot we don't know about them, and they're ending up in landfills. Hopefully the use of this system will encourage more research into their impacts."

Credit: 
University of Missouri-Columbia

New research finds seating assignments on airplanes can reduce the spread of COVID-19


CATONSVILLE, MD, March 24, 2021 - COVID-19 has been shown to spread on airplanes by infected passengers, so minimizing the risk of secondary infections aboard aircraft may save lives. New research in the INFORMS journal Service Science uses two models to help solve the airplane seating assignment problem (ASAP). The models can lower the transmission risk of COVID-19 more so than the strategy of blocking the middle seats, given the same number of passengers.

"Blocking the middle seat on an airplane may provide limited benefit in reducing the risk of transmission of COVID-19. Rather, other health protocols are better supported at preventing the transmission of the virus," said Sheldon Jacobson, Founder Professor of computer science at the University of Illinois Urbana-Champaign.

The study, "Airplane Seating Assignment Problem," was conducted by Jacobson, John Pavlik and Ian Ludden, all from the University of Illinois Urbana-Champaign, alongside Edward Sewell from Southern Illinois University Edwardsville.

The researchers introduce two variations of a seating assignment model to optimize how each plane should seat passengers depending on the plane's size and how seats are distributed on the aircraft.

The two models are called vertex packing risk minimization (VPR) and risk-constrained vertex packing (RCV). To solve ASAP, the specific airplane being filled is modeled with the best-known disease-transmission risks, and the resulting instance is then solved to generate an optimal seating assignment.

"Every seat on the airplane corresponds to a single vertex in the graph. Edges between vertices represent the risk of transmitting a disease between each pair of seats. If either seat in a pair is empty, then there is no risk of transmitting a disease, and hence VPR and RCV are closely related to the vertex packing problem," continued Jacobson.

"Given an undirected graph modeling an airplane seating assignment problem instance with appropriate risks on the edges, the solution to VPR is an optimal configuration of seats to minimize the risk of transmission for a fixed number of passengers on the airplane. Similarly, the solution to RCV is the maximum number of passengers and corresponding seat configuration given a maximum acceptable total risk of transmission."

Credit: 
Institute for Operations Research and the Management Sciences

Shining light to make hydrogen

image: D. desulfuricans-CdS hybrids display high H2 production activity, high stability and a remarkable efficiency in the direct use of solar energy

Image: 
Inês Cardoso Pereira; Mónica Martins

Decarbonizing the economy and achieving the transition from fossil fuels to renewable energies is one of the most urgent global challenges of the 21st century. Hydrogen can play a key role in this process as a promising climate-neutral energy carrier. Yet the so-called green hydrogen economy requires that hydrogen production be based exclusively on renewable energy. In addition, it should ideally not use expensive and rare metal catalysts, whose production has severe environmental consequences. To address this challenge, ITQB NOVA researchers Inês Cardoso Pereira and Mónica Martins are working on an innovative technology to produce hydrogen from light using non-photosynthetic microorganisms.

Hydrogen offers exciting new possibilities as an energy carrier, but today's hydrogen is still mostly produced from fossil fuels. Of the various renewable options, solar energy is the most abundant and the ideal ultimate source. Sustainable strategies that directly convert solar energy into valuable fuels such as hydrogen are therefore urgently needed.

In a study now published in Angewandte Chemie International Edition, the scientists describe a new approach based on biohybrid systems. These combine high hydrogen producing non-photosynthetic bacteria with self-produced cadmium sulfide (CdS) semiconductor nanoparticles that are very efficient in capturing light. "The development of biohybrids is a very exciting new area of research, where we can combine the high catalytic efficiency and specificity of biological systems with synthetic materials that have outstanding performances in capturing solar or electrical energies" highlights Inês Cardoso Pereira, head of the Bacterial Energy Metabolism Lab. "This field is growing rapidly and the most promising approach is to combine intact microorganisms with nanoparticles produced at their surface, which allows direct energy transfer between them".

The researchers investigated light-driven hydrogen production by biohybrids based on several bacteria. All of the biohybrids produced H2 from light, but the one using Desulfovibrio desulfuricans, a bacterium found in soils, showed outstanding activity. This bacterium contains high levels of hydrogenases, the enzymes involved in hydrogen production, and is efficient at producing extracellular sulfide nanoparticles. These self-produced nanoparticles capture light, which the bacterium can then use to produce H2. The results reveal that the D. desulfuricans-CdS hybrids display high H2 production activity, high stability and remarkable efficiency in the direct use of solar energy, even in the absence of expensive and toxic mediators.

The use of microorganisms and self-produced light-harvesting materials is a low-cost and sustainable approach to generating fuels. "This new biohybrid system is a strong candidate for the development of a bioreactor prototype for greener H2 production," explains Mónica Martins.

Credit: 
Instituto de Tecnologia Química e Biológica António Xavier da Universidade NOVA de Lisboa ITQB NOVA

Fatty liver hepatitis is caused by auto-aggressive immune cells

Non-alcoholic steatohepatitis (NASH), often called 'fatty liver hepatitis', can lead to serious liver damage and liver cancer. A team of researchers at the Technical University of Munich (TUM) has discovered that this condition is caused by cells that attack healthy tissue - a phenomenon known as auto-aggression. Their results may help in the development of new therapies to avoid the consequences of NASH.

Fatty liver hepatitis (NASH) is often associated with obesity. However, our understanding of its causes has been very limited. A team working with the immunologist Prof. Percy Knolle of TUM has now explored this process step by step in model systems based on mice - and gained promising insights into the mechanisms causing NASH in humans. "We have seen all of the steps observed in the model systems in human patients," says Prof. Knolle. The team's results will be published in Nature.

Auto-aggressive immune cells destroy liver tissue

The immune system protects us against bacteria and viruses and the development of cancerous tumors. The so-called CD8 killer T cells play an important role here: they specifically recognize infected body cells and eliminate them. In fatty liver hepatitis, the CD8 T cells have lost this targeting ability. "We have discovered that, in NASH, the immune cells are not activated by certain pathogens, but rather by metabolic stimuli," says Michael Dudek, the first author of the study. "The T cells activated in this way then kill liver cells of all types."

Sequential activation of T cells

Before reaching that point, the immune cells undergo a unique, step-by-step - and previously unknown - activation process. The T cells develop their auto-aggressive properties only when exposed to inflammation signals and products of fat metabolism in the right order. "Like when we use the combination to unlock a safe, the T cells are switched to 'deadly mode' only through the defined sequence of activation stimuli," says Prof. Knolle, a professor of molecular immunology at TUM. As the trigger for the killing of tissue cells, the international team of researchers identified a basically harmless metabolite: the presence of the energy-carrying molecule ATP outside cells. When auto-aggressive CD8 T cells in the liver reacted with ATP, they destroyed nearby cells, thus causing NASH.

Auto-aggression, but not an auto-immune disorder

The destruction of tissue through auto-aggressive immune cells, as discovered by the researchers, differs from familiar auto-immune disorders, in which immune system cells specifically attack certain cells in the body. The authors note, however, that the tissue-destroying auto-aggressive T cells may also play a role in auto-immune pathologies that has yet to be discovered.

New therapies for fatty liver hepatitis

Until now, the only way of reversing the effects of fatty liver hepatitis was to eliminate the underlying factors - namely obesity and a high-calorie diet. In other words, patients had to change their lifestyles. The realization that the disease is caused by activated immune cells now suggests possibilities for the development of new therapies. "The destructive auto-aggressive form of the immune response is fundamentally different from the protective T cell immune response to viruses and bacteria," says Prof. Knolle. He is confident that further research can identify targeted immunotherapies that simply prevent the destruction of tissue.

Credit: 
Technical University of Munich (TUM)

Photosynthesis could be as old as life itself

Researchers find that the earliest bacteria had the tools to perform a crucial step in photosynthesis, changing how we think life evolved on Earth.

The finding also challenges expectations for how life might have evolved on other planets. Oxygen-producing photosynthesis is thought to be the key factor in the eventual emergence of complex life, and it was assumed to have taken several billion years to evolve. If the earliest life could in fact perform it, then other planets may have evolved complex life much earlier than previously thought.

The research team, led by scientists from Imperial College London, traced the evolution of key proteins needed for photosynthesis back to possibly the origin of bacterial life on Earth. Their results are published and freely accessible in BBA - Bioenergetics.

Lead researcher Dr Tanai Cardona, from the Department of Life Sciences at Imperial, said: "We had previously shown that the biological system for performing oxygen-production, known as Photosystem II, was extremely old, but until now we hadn't been able to place it on the timeline of life's history. Now, we know that Photosystem II shows patterns of evolution that are usually only attributed to the oldest known enzymes, which were crucial for life itself to evolve."

Photosynthesis, which converts sunlight into energy, can come in two forms: one that produces oxygen, and one that doesn't. The oxygen-producing form is usually assumed to have evolved later, particularly with the emergence of cyanobacteria, or blue-green algae, around 2.5 billion years ago.

While some research has suggested pockets of oxygen-producing (oxygenic) photosynthesis may have been around before this, it was still considered to be an innovation that took at least a couple of billion years to evolve on Earth.

The new research finds that enzymes capable of performing the key process in oxygenic photosynthesis - splitting water into hydrogen and oxygen - could actually have been present in some of the earliest bacteria. The earliest evidence for life on Earth is over 3.4 billion years old and some studies have suggested that the earliest life could well be older than 4.0 billion years old.

Like the evolution of the eye, the first version of oxygenic photosynthesis may have been very simple: just as the earliest eyes sensed only light, the earliest photosynthesis may have been very inefficient and slow.

On Earth, it took more than a billion years for bacteria to perfect the process leading to the evolution of cyanobacteria, and two billion years more for animals and plants to conquer the land. However, the fact that oxygen production was present at all so early on means that in other environments, such as on other planets, the transition to complex life could have taken much less time.

The team made their discovery by tracing the 'molecular clock' of key photosynthesis proteins responsible for splitting water. This method estimates the rate of evolution of proteins by looking at the time between known evolutionary moments, such as the emergence of different groups of cyanobacteria or land plants, which carry a version of these proteins today. The calculated rate of evolution is then extended back in time, to see when the proteins first evolved.
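The molecular-clock logic described above can be sketched numerically. This is a schematic illustration with made-up calibration points, not data from the study: fit a rate of protein change from divergence events of known age, then divide the total observed divergence by that rate to estimate when the protein first arose.

```python
# Schematic molecular-clock sketch (hypothetical numbers, not study data):
# calibrate a rate from known divergence times, then extrapolate back.

def fit_rate(calibrations):
    """Least-squares slope through the origin: change = rate * time."""
    num = sum(t * d for t, d in calibrations)
    den = sum(t * t for t, _ in calibrations)
    return num / den

# (time before present in Gyr, substitutions per site) -- made-up values
calibrations = [(0.5, 0.10), (1.0, 0.21), (2.1, 0.40)]
rate = fit_rate(calibrations)

# Total divergence accumulated by the protein family since its origin:
observed_divergence = 0.68
estimated_origin = observed_divergence / rate   # Gyr before present
```

Real analyses use relaxed-clock Bayesian models rather than a single fitted slope, but the principle is the same: the further back the extrapolation agrees with the oldest enzymes, the older the protein family must be.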

They compared the evolution rate of these photosynthesis proteins to that of other key proteins in the evolution of life, including those that form energy storage molecules in the body and those that translate DNA sequences into RNA, which is thought to have originated before the ancestor of all cellular life on Earth. They also compared the rate to events known to have occurred more recently, when life was already varied and cyanobacteria had appeared.

The photosynthesis proteins showed nearly identical patterns of evolution to the oldest enzymes, stretching far back in time, suggesting they evolved in a similar way.

First author of the study Thomas Oliver, from the Department of Life Sciences at Imperial, said: "We have used a technique called Ancestral Sequence Reconstruction to predict the protein sequences of ancestral photosynthetic proteins. These sequences give us information on how the ancestral Photosystem II would have worked and we were able to show that many of the key components required for oxygen evolution in Photosystem II can be traced to the earliest stages in the evolution of the enzyme."

Knowing how these key photosynthesis proteins evolve is not only relevant for the search for life on other planets, but could also help researchers find strategies to use photosynthesis in new ways through synthetic biology.

Dr Cardona, who is leading such a project as part of his UKRI Future Leaders Fellowship, said: "Now we have a good sense of how photosynthesis proteins evolve, adapting to a changing world, we can use 'directed evolution' to learn how to change them to produce new kinds of chemistry. We could develop photosystems that could carry out complex new green and sustainable chemical reactions entirely powered by light."

Credit: 
Imperial College London