Tech

Lactobacillus balances gut microbiome and improves chronic-alcohol-induced liver injury

image: Journal of Medicinal Food is the only peer-reviewed journal focusing exclusively on the medicinal value and biomedical effects of food materials.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, December 9, 2019--Researchers demonstrated that Lactobacillus rhamnosus can dose-dependently reestablish a balanced intestinal microbiome and reverse chronic alcohol-induced liver injury in mice. The design, results, and implications of this new study are published in Journal of Medicinal Food, a peer-reviewed journal from Mary Ann Liebert, Inc., publishers. The full-text article is available free on the Journal of Medicinal Food website through January 9, 2020.

The article entitled "Lactobacillus rhamnosus Granules Dose-Dependently Balance Intestinal Microbiome Disorders and Ameliorate Chronic Alcohol-Induced Liver Injury" was coauthored by Yuhua Wang and colleagues from Jilin Agricultural University, the National Processing Laboratory for Soybean Industry and Technology, and the National Engineering Laboratory for Wheat and Corn Deep Processing, Changchun, China.

In this study, mice consumed alcohol for 8 weeks and, for the last 2 weeks, were fed Lactobacillus rhamnosus granules (LGG) at low, medium, or high doses together with a high-fat diet. The researchers showed that LGG administration dose-dependently ameliorated alcohol-induced liver injury, reducing fat accumulation and the inflammatory response in the liver. The probiotic also restored a healthy balance to the gut microbiome, which had been disrupted by alcohol consumption: LGG reduced the number of gram-negative bacteria and increased gram-positive bacteria in the ileum and cecum.

"This demonstration of the impact of a probiotic intervention correcting alcohol-induced dysbiosis and reducing liver inflammation and fat accumulation has exciting potential in the prevention and treatment of alcohol-induced liver disease as well as non-alcoholic fatty liver disease," states Journal of Medicinal Food Editor-in-Chief Michael Zemel, PhD, Professor Emeritus, The University of Tennessee and Chief Scientific Officer, NuSirt Biopharma.

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Consider soil in fall-applied ammonia rates, Illinois study says

URBANA, Ill. - Fall-applied anhydrous ammonia may not fulfill as much of corn's nitrogen needs as previously assumed. According to a new study from the University of Illinois, the effectiveness of the practice depends on the soil.

The study used a "tagged" form of ammonia to determine how much of the nitrogen in corn grain and plant material comes from fertilizer, versus nitrogen supplied naturally by the soil.

"There have been a number of studies to compare yields with fall- versus spring-applied ammonia or other treatments. But our study is different because we are tracing the nitrogen from the fertilizer ammonia into either the grain or the whole corn plant above ground. That's what makes this unique," says Richard Mulvaney, professor in the Department of Natural Resources and Environmental Sciences at Illinois.

Mulvaney and his graduate student, Kelsey Griesheim, used a stable isotope of nitrogen, 15N, in formulating the tagged ammonia. They applied it at a typical rate of 200 pounds per acre in mid- to late November in six Illinois fields in 2016 and 2017.

The fields differed in soil type and crop rotation. Four were Mollisols, which Mulvaney describes as prairie soils: rich, black, and productive. The two others were Alfisols, or timber soils, which are typically poorer in comparison to Mollisols. Two of the Mollisol fields were cropped to continuous corn, while the rest were under a corn-soybean rotation.

After the fall ammonia application, the researchers looked for the 15N isotope in corn plant and grain material at harvest in the following growing season. Any nitrogen not tagged with the isotope was assumed to have come from natural nitrogen stores in the soil, rather than from the fertilizer.
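
The logic behind this tracing is the standard 15N isotope-dilution calculation. The short Python sketch below illustrates the arithmetic; the enrichment values, grain uptake, and application rate are hypothetical numbers chosen for the example, not the study's data.

```python
# Standard 15N isotope-dilution arithmetic (illustrative numbers only,
# not the study's data). The fraction of plant N derived from fertilizer
# is the plant's 15N enrichment above natural abundance divided by the
# fertilizer's enrichment above natural abundance.

NATURAL_ABUNDANCE = 0.3663  # atom % 15N in unlabeled nitrogen

def fertilizer_n_recovery(plant_atom_pct, fert_atom_pct,
                          plant_n_uptake, n_applied):
    """Return (fraction of plant N from fertilizer, % of applied N recovered)."""
    frac_from_fert = ((plant_atom_pct - NATURAL_ABUNDANCE)
                      / (fert_atom_pct - NATURAL_ABUNDANCE))
    recovery_pct = 100 * frac_from_fert * plant_n_uptake / n_applied
    return frac_from_fert, recovery_pct

# Hypothetical field: grain removes 150 lb N/acre at 1.2 atom % 15N
# after applying 200 lb N/acre of fertilizer labeled at 5 atom % 15N.
frac, rec = fertilizer_n_recovery(1.2, 5.0, 150, 200)
print(f"{frac:.1%} of grain N from fertilizer; "
      f"{rec:.0f}% of applied N recovered in grain")
```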

"Just as we expected, the poorest soil showed the highest uptake efficiency while the richer soils were much lower," Griesheim says. "On average, only 21% of the nitrogen applied was recovered in the grain, ranging from 34% at the poorest Alfisol to 12% for the richest Mollisol.

"Farmers apply ammonia in the fall thinking they've supplied nitrogen to their corn crop for the coming year. But based on our results, most of the fertilizer nitrogen will not be taken up by the crop."

If it were up to Mulvaney, he would recommend applying nitrogen as a side dressing, which delivers the fertilizer when the plant is actively growing. "You're fighting time with fall applications," he says. "You're counting on keeping the nitrogen in the soil for six months before the next crop needs to take it up."

The study also evaluated the effectiveness of N-serve (nitrapyrin), a nitrification inhibitor commonly applied in the fall with anhydrous ammonia. This product is meant to slow down the microbial conversion from ammonium, which is immobile in the soil, to nitrate, which, during a wet spring, can leach away or be lost as a gas.

"Nitrapyrin has long been used in conjunction with fall-applied ammonia with the motive of increasing uptake efficiency. We didn't find that to be the case at all. And in fact, in our study, the only significant effect of the product was a yield decrease on continuous corn," Mulvaney says.

Should farmers apply anhydrous ammonia in the fall?

Griesheim says, "Considering the low uptake efficiencies observed in our study, farmers should think twice before putting their nitrogen on in the fall. Low fertilizer recoveries mean less return from the farmer's fertilizer investment and a higher risk of environmental pollution."

Adds Mulvaney, "To make matters worse, farmers have been taught for decades that they should follow yield-based recommendations, such that the soils that generate the highest yields need the most fertilizer. But that inverts reality. The highest yielding soils need the least fertilizer. Our research supports that."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Research shows ramping up carbon capture could be key to mitigating climate change

image: New research shows that capturing carbon dioxide under the ocean floor, as depicted in this graphic, could help meet emission reduction goals and fight climate change.

Image: 
UK Department of Energy & Climate Change

As the world gathers in Madrid to discuss how to reduce greenhouse gas emissions to fight climate change, a newly released study makes the case that trapping emissions underground could go a long way toward solving the problem.

The study - from The University of Texas at Austin, the Norwegian University of Science and Technology and the Equinor Research Centre - looks at the technology of carbon capture and storage (CCS), which is a method of capturing carbon dioxide from industrial and power plants and storing it more than a mile underground within tiny spaces in the rock.

The United Nations' Intergovernmental Panel on Climate Change (IPCC) has stated that CCS needs to achieve 13% of the world's necessary emission reductions by 2050. Some policymakers, industry representatives and nongovernmental organizations are dubious that CCS can meet its portion of the goal, but the new study published in Nature Scientific Reports shows that CCS could achieve its targets.

"With this paper, we provide an actionable, detailed pathway for CCS to meet the goals," said coauthor Tip Meckel of UT's Bureau of Economic Geology. "This is a really big hammer that we can deploy right now to put a dent in our emissions profile."

The paper looks at the amount of space available in geological formations likely suitable for holding greenhouse gas emissions, keeping them out of the atmosphere. It also calculates the number of wells needed worldwide to reach the IPCC's 2050 goal.

It concludes there is easily enough space in the world's nearshore continental margins to meet the IPCC's goal of storing 6 to 7 gigatons of carbon dioxide a year by 2050, and that the goal could be achieved by installing 10,000 to 14,000 injection wells worldwide in the next 30 years.
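
As a back-of-envelope check on those figures (our arithmetic, not a calculation from the paper), they imply a per-well injection rate of roughly half a megaton of CO2 per year:

```python
# Back-of-envelope check on the study's figures (our arithmetic, not a
# calculation from the paper): what per-well injection rate do the
# reported well counts imply?

storage_goals_gt_per_yr = (6, 7)  # target, gigatons of CO2 per year by 2050
well_counts = (10_000, 14_000)    # injection wells cited in the study

for goal in storage_goals_gt_per_yr:
    for wells in well_counts:
        mt_per_well = goal * 1_000 / wells  # Gt -> Mt, spread across wells
        print(f"{goal} Gt/yr over {wells:,} wells -> "
              f"{mt_per_well:.2f} Mt CO2/yr per well")
```

The implied rate, roughly 0.4 to 0.7 megatons per well per year, makes the scale of the proposed build-out concrete.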

That may sound like a lot of wells, but the researchers point out that the oil and gas industry has already shown that speedy build-up of infrastructure is possible. The worldwide CCS deployment required over the next three decades, they note, is roughly equivalent to the development of oil and gas infrastructure in the Gulf of Mexico over the past 70 years, or five times the development of Norwegian oil and gas infrastructure in the North Sea.

"The great thing about this study is that we have inverted the decarbonization challenge by working out how many wells are needed to achieve emissions cuts under the 2-degree (Celsius) scenario," said lead author Philip Ringrose, an adjunct professor at the Norwegian University of Science and Technology. "It turns out to be only a fraction of the historical petroleum industry -- or around 12,000 wells globally. Shared among 5-7 continental CCS hubs -- that is only about 2,000 wells per region. Very doable! But we need to get cracking as soon as possible."

Meckel also points out that there are significant tax credits in the United States to help make carbon capture projects possible and says this could act as a model for other countries, particularly those with industries near the coast where CO2 could be more easily transported by pipelines into underground geological formations offshore.

Computer models used by the IPCC and the International Energy Agency call for reductions in carbon dioxide emissions to come from a blend of growth in renewables, energy efficiency, and decarbonizing energy production and consumption through technologies like CCS. Other strategies include replacing coal power generation with natural gas; replacing wood, biomass and other carbon-based cooking fuels; using natural gas and fuel cell vehicles; and using pipelines instead of diesel barges, trains and trucks to transport natural gas.

Representatives from 200 countries are currently in Madrid to hammer out details for how to meet the emission reductions goals at the UN Climate Change Conference COP 25. Meckel said this is the perfect time for leaders to take a hard look at how carbon storage could play a significant role.

"We've shown that the geology and the speed of the development could meet the goal," said Meckel. "This is a very pragmatic way of going after this."

Credit: 
University of Texas at Austin

You create your own false information, study finds

COLUMBUS, Ohio - Along with partisan news outlets and political blogs, there's another surprising source of misinformation on controversial topics - it's you.

A new study found that people given accurate statistics on a controversial issue tended to misremember those numbers to fit commonly held beliefs.

For example, when people are shown that the number of Mexican immigrants in the United States declined recently - which is true but goes against many people's beliefs - they tend to remember the opposite.

And when people pass along this misinformation they created, the numbers can get further and further from the truth.

"People can self-generate their own misinformation. It doesn't all come from external sources," said Jason Coronel, lead author of the study and assistant professor of communication at The Ohio State University.

"They may not be doing it purposely, but their own biases can lead them astray. And the problem becomes larger when they share their self-generated misinformation with others."

Coronel conducted the study with Shannon Poulsen and Matthew Sweitzer, both doctoral students in communication at Ohio State. The study was published online in the journal Human Communication Research and will appear in a future print edition.

The researchers conducted two studies.

In the first study, the researchers presented 110 participants with short written descriptions of four societal issues that involved numerical information.

On two of those societal issues, the researchers did pre-tests and found that the factually accurate numerical relationship fit with many people's understanding of the issue. For example, many people generally expect more Americans to support same-sex marriage than oppose it, which coincides with public opinion polls.

But the researchers also presented participants with two issues for which the numbers didn't fit with how most people viewed the topics.

For example, most people believe that the number of Mexican immigrants in the United States grew between 2007 and 2014. But in fact, the number declined from 12.8 million in 2007 to 11.7 million in 2014.

After reading all the descriptions of the issues, the participants got a surprise. They were asked to write down the numbers that were in the descriptions of the four issues. They were not told in advance they would have to memorize the numbers.

The researchers found that people usually got the numerical relationship right on the issues for which the stats were consistent with how many people viewed the world. For example, participants typically wrote down a larger number for the percentage of people who supported same-sex marriage than for those who opposed it - which is the true relationship.

But when it came to the issues where the numbers went against many people's beliefs - such as whether the number of Mexican immigrants had gone up or down - participants were much more likely to remember the numbers in a way that agreed with their probable biases rather than the truth.

"We had instances where participants got the numbers exactly correct - 11.7 and 12.8 - but they would flip them around," Coronel said.

"They weren't guessing - they got the numbers right. But their biases were leading them to misremember the direction they were going."

By using eye-tracking technology on participants while they read the descriptions of the issues, the researchers had additional evidence that people really were paying attention when they viewed the statistics.

"We could tell when participants got to numbers that didn't fit their expectations. Their eyes went back and forth between the numbers, as if they were asking 'what's going on.' They generally didn't do that when the numbers confirmed their expectations," Coronel said.

"You would think that if they were paying more attention to the numbers that went against their expectations, they would have a better memory for them. But that's not what we found."

In the second study, the researchers investigated how these memory distortions could spread and grow more distorted in everyday life. They designed a study similar to the childhood game of "telephone."

For example, the first person in the "telephone chain" in this study saw the accurate statistics about the trend in Mexican immigrants living in the United States (that it went down from 12.8 million to 11.7 million). That person had to write the numbers down from memory; those written numbers were then passed along to the second person in the chain, who had to remember them and write them down. The second person's estimates were then sent to a third participant.

Results showed that, on average, the first person flipped the numbers, saying that the number of Mexican immigrants increased by 900,000 from 2007 to 2014 instead of the truth, which was that it decreased by about 1.1 million.

By the end of the chain, the average participant had said the number of Mexican immigrants had increased in those 7 years by about 4.6 million.

"These memory errors tended to get bigger and bigger as they were transmitted between people," Sweitzer said.

Coronel said the study did have limitations. For example, it is possible that the participants would have been less likely to misremember if they were given explanations as to why the numbers didn't fit expectations. And the researchers didn't measure each person's biases going in - they used the biases that had been identified by pre-tests they conducted.

Finally, the telephone game study did not capture important features of real-life conversations that may have limited the spread of misinformation.

But the results did suggest that we shouldn't worry only about the misinformation that we run into in the outside world, Poulsen said.

"We need to realize that internal sources of misinformation can possibly be as significant as or more significant than external sources," she said.

"We live with our biases all day, but we only come into contact with false information occasionally."

Credit: 
Ohio State University

Has physics ever been deterministic?

Researchers from the Austrian Academy of Sciences, the University of Vienna and the University of Geneva have proposed a new interpretation of classical physics that does without real numbers. This new study challenges the traditional view of classical physics as deterministic.

In classical physics it is usually assumed that if we know where an object is and its velocity, we can exactly predict where it will go. A supposed superior intelligence with knowledge of all existing objects at present would be able to know the future as well as the past of the universe with infinite precision. Pierre-Simon Laplace presented this argument, later called Laplace's demon, in the early 1800s to illustrate the concept of determinism in classical physics. It is generally believed that determinism was challenged only with the advent of quantum physics, when scientists found out that not everything can be said with certainty and we can only calculate the probability that something will behave in a certain way.

But is classical physics really completely deterministic? Flavio Del Santo, researcher at the Vienna Institute for Quantum Optics and Quantum Information of the Austrian Academy of Sciences and the University of Vienna, and Nicolas Gisin, from the University of Geneva, address this question in their new article "Physics without Determinism: Alternative Interpretations of Classical Physics", published in the journal Physical Review A. Building on previous works of the latter author, they show that the usual interpretation of classical physics rests on tacit additional assumptions. When we measure something, say the length of a table with a ruler, we find a value with a finite precision, meaning with a finite number of digits. Even if we use a more accurate measurement instrument, we will just find more digits, but still a finite number of them. However, classical physics assumes that, even if we may not be able to measure them, there exists an infinite number of predetermined digits. This means that the length of the table is always perfectly determined.

Imagine now playing a variant of the bagatelle, or pin-board, game, in which a board is symmetrically filled with pins. When a little ball rolls down the board, it will hit the pins and move either to the right or to the left of each of them. In a deterministic world, perfect knowledge of the initial conditions under which the ball enters the board (its velocity and position) determines unambiguously the path that the ball will follow between the pins. Classical physics assumes that if we cannot obtain the same path in different runs, it is only because in practice we were not able to set precisely the same initial conditions -- for instance, because we do not have an infinitely precise measurement instrument with which to set the initial position of the ball when it enters the board.

The authors of this new study propose an alternative view: after a certain number of pins, the future of the ball is genuinely random, even in principle, and not merely because of the limitations of our measurement instruments. At each hit, the ball has a certain propensity or tendency to bounce to the right or to the left, and this choice is not determined a priori. For the first few hits, the path can be determined with certainty; that is, the propensity is 100% for one side and 0% for the other. After a certain number of pins, however, the choice is not predetermined, and the propensity gradually approaches 50% for the right and 50% for the left at the distant pins. In this way, one can think of each digit of the length of our table as becoming determined by a process similar to the ball's choice of going left or right at each hit. Therefore, after a certain number of digits, the length is no longer determined.
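
The picture is easy to simulate. In the Python sketch below (our illustration of the propensity idea, not the authors' model; the pin counts and the interpolation rule are invented for the example), the first few bounces are fully determined and later ones drift toward a fair coin flip:

```python
import random

# Toy simulation of the propensity picture (our illustration, not the
# authors' model; pin counts and the interpolation are invented). The
# first few bounces are fully determined; later ones drift toward a
# genuinely random 50/50 choice.

random.seed(0)

N_PINS = 20
DETERMINED = 5  # hypothetical number of pins with a fully determined bounce

def propensity_right(pin):
    """Probability of bouncing right at a given pin."""
    if pin < DETERMINED:
        return 1.0  # fully determined (always right, say)
    # interpolate from deterministic toward maximally random (50/50)
    t = (pin - DETERMINED) / (N_PINS - DETERMINED)
    return 1.0 - 0.5 * t

path = "".join("R" if random.random() < propensity_right(i) else "L"
               for i in range(N_PINS))
print(path)  # the first five bounces are identical on every run; the tail varies
```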

The new model introduced by the researchers hence rejects the usual attribution of a physical meaning to mathematical real numbers (numbers with infinitely many predetermined digits). It states instead that, after a certain number of digits, their values become truly random, and only the propensity of taking a specific value is well defined. This leads to new insights on the relationship between classical and quantum physics. In fact, when, how and under what circumstances an indeterminate quantity takes a definite value is a notorious question in the foundations of quantum physics, known as the quantum measurement problem. It is related to the fact that in the quantum world it is impossible to observe reality without changing it: the value of a measurement on a quantum object is not established until an observer actually measures it. This new study, on the other hand, points out that the same issue could have always been hidden behind the reassuring rules of classical physics as well.

Credit: 
University of Vienna

Virtual reality illuminates the power of opioid-associated memories

The number of drug overdose deaths in the US has never been higher. Most of those deaths--68 percent in 2017--have involved opioids. An estimated 10 million Americans aged 12 and older misused opioids in 2018.

Insights from virtual reality experiments could help break the cycle of addiction, suggests new research from Washington University School of Medicine, St. Louis. The findings were presented at the 58th Annual Meeting of The American College of Neuropsychopharmacology (ACNP) in Orlando, FL, December 8-11, 2019.

Memories associated with drug use, such as where it occurred and who was present, can trigger craving and ultimately relapse. Researchers led by Dr. Sidney Williams, in the laboratories of Dr. Edward Han and Dr. Jose Moron-Concepcion, used virtual reality to study how these memories are made.

Mice playing an immersive virtual reality video game were taught to associate a specific room with morphine. They formed contextual memories of that environment--related to the emotional, social, spatial, or temporal circumstances of an event--that later triggered morphine-seeking behavior.

Certain brain cells called place cells, which are used to make a mental map of the environment, were active at important times during the experiments, the researchers found--and the pattern was different in the morphine-associated room than in a room associated with water rewards. Place cells are in the hippocampus, the part of the brain necessary for contextual memory formation. In these experiments, the investigators saw a surprising decrease in the number of active place cells when mice were seeking morphine in the morphine-paired room, but the activity of the remaining place cells became more selective. The brain formed memories of the drug-paired room differently than it did of other rooms, the researchers conclude.

This suggests that drug use engages special neural mechanisms, which could explain the strength and influence of these memories. The findings also raise the possibility that drug-associated memories could be specifically targeted for disruption to break the cycle of craving and relapse.

Credit: 
American College of Neuropsychopharmacology

Genomic features of AML in patients over age 60 can predict success of stem cell transplant

image: This is R. Coleman Lindsley, MD, PhD.

Image: 
Dana-Farber Cancer Institute

For older patients with acute myeloid leukemia (AML), the prospects for success of a stem cell transplant can often be predicted based on the particular set of genetic mutations within the tumor cells, investigators at Dana-Farber Cancer Institute and other research centers will report today at the 61st American Society of Hematology (ASH) Annual Meeting.

By analyzing AML tissue samples from 300 patients age 60 or older, the investigators found that mutations present in the samples at the time of diagnosis indicated whether patients were likely to receive a long-term benefit from a transplant or were likely to relapse. The results enabled researchers to establish criteria for low, high, and intermediate-risk groups of patients by the presence or absence of certain mutations in their leukemia cells.

The risk classification system stands to have an impact on how this group of patients is treated, according to the researchers. Patients at low genetic risk may be candidates for treatments that produce less toxicity. Patients at high risk might find it advantageous to join clinical trials of new approaches that can reduce the chance of relapse.

"A bone marrow stem cell transplant is curative in some older patients with AML, but toxicities and risk of relapse can be high," said study first author H. Moses Murdock, a Research Fellow at Dana-Farber and student at the Perelman School of Medicine at the University of Pennsylvania. Murdock is a student in the lab of R. Coleman Lindsley, MD, PhD, physician in the Adult Leukemia Program at Dana-Farber and senior author of the study. "We wanted to explore whether genetics could be used to identify patients who are likely to do well with existing approaches and those who may benefit from alternative transplant strategies."

In the study, researchers sequenced 112 genes in diagnostic AML samples from 300 patients age 60 or older who had undergone an allogeneic stem cell transplant at six U.S. transplant centers. All the transplants took place after patients had achieved their first remission with chemotherapy treatment.

The researchers then examined whether mutations - or lack of mutations - in any of these genes correlated with how patients fared after the transplant. They found that patients with a mutation in TP53 or JAK2, or with a FLT3-ITD (internal tandem duplication) without an accompanying mutation in NPM1, had the highest risk of death or relapse. Patients without those features but with mutations in DNMT3A or DDX41, or in NPM1 without an accompanying FLT3-ITD, were at low risk of relapse. Patients with neither high- nor low-risk mutations were categorized as being at intermediate risk. After adjusting for clinical variables, only 5% of patients in the highest-risk group were alive and free of leukemia after 3 years, compared to 70% of those in the lowest-risk group.
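
That grouping amounts to an ordered decision rule, which the short Python sketch below distills for illustration (our restatement of the description above, not the study's published algorithm):

```python
# Ordered decision rule distilled from the description above (a sketch
# for illustration, not the study's published algorithm). Mutations are
# given as a set of gene names; FLT3-ITD is represented as "FLT3-ITD".

def aml_transplant_risk(mutations: set) -> str:
    # High-risk features take precedence over low-risk ones.
    if ("TP53" in mutations or "JAK2" in mutations
            or ("FLT3-ITD" in mutations and "NPM1" not in mutations)):
        return "high"
    if ("DNMT3A" in mutations or "DDX41" in mutations
            or ("NPM1" in mutations and "FLT3-ITD" not in mutations)):
        return "low"
    return "intermediate"

print(aml_transplant_risk({"TP53"}))              # high
print(aml_transplant_risk({"NPM1"}))              # low
print(aml_transplant_risk({"FLT3-ITD", "NPM1"}))  # intermediate
```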

Of the patients studied, about 25% qualified as low risk, while about 16% were in the highest risk group.

"We've shown that genetic features of AML in older patients can predict outcomes after transplantation," Murdock remarked. "Our findings will be useful in taking patients who are in remission and offering them the best path to a cure."

Credit: 
Dana-Farber Cancer Institute

Current treatment for fungal meningitis is fueling drug resistance

A common first-line treatment approach for cryptococcal meningitis in low-income countries is being compromised by the emergence of drug resistance, new University of Liverpool research warns.

Published in the journal mBio, the findings highlight the need to develop new drugs and treatment regimens for the lethal brain infection, which kills around 180,000 people each year.

Cryptococcal meningitis is a leading cause of death among adults with HIV/AIDS in sub-Saharan Africa. In many parts of the world, the antifungal drug fluconazole is the only agent that is available for the initial treatment of the infection, despite considerable evidence that long-term outcomes are poor.

Drug resistance is thought to play a role in these poor outcomes, but solid data is currently lacking and a better understanding of the relationship between fluconazole exposure and the emergence of resistance has been needed.

In an experimental study using a novel hollow-fibre model and mice, the researchers quantified the relationship between drug exposure and both antifungal killing and the emergence of resistance to fluconazole. These results were then bridged to the clinic in a study of patients with cryptococcal meningitis in Tanzania who were receiving fluconazole.

The findings showed that fluconazole resistance is caused by duplication of the fungus' chromosomes and occurs while patients are on therapy. Simulations from mathematical models fitted to the patients' data suggested that only 12.8% of patients receiving fluconazole at the recommended 1,200 mg/day were completely free of the fungus after two weeks. Furthermore, 83.4% had a persistent subpopulation that was resistant to fluconazole. Preventing this would require significant dosage escalation of fluconazole beyond what is currently recommended.

William Hope, Professor of Therapeutics and Infectious Diseases at the University of Liverpool, said: "This study is unique in that it combines information from experimental models of cryptococcal meningitis with data from patients receiving fluconazole as monotherapy, which remains the norm in much of Africa, despite not being consistent with current treatment recommendations from the WHO.

"We've shown that the emergence of resistance is related to drug exposure and occurs with the use of clinically-relevant monotherapy regimens. Hence the only drug available to many patients in Africa is compromised by the ability of the fungus to develop resistance while the patient is taking fluconazole.

"Our findings underscore the urgent need for the development of new agents and combinations to reduce the global toll of cryptococcal meningitis."

Antimicrobial resistance is a growing threat, and there has been a global push to develop new drugs to address it. Led by Professor Hope, the University of Liverpool's Centre for Antimicrobial Pharmacodynamics provides state-of-the-art research facilities for PK/PD studies and offers preclinical and early-phase clinical support to ensure new drugs are developed in a streamlined manner.

Credit: 
University of Liverpool

Novel way to ID disease-resistance genes in chocolate-producing trees found

image: Photographs of inoculated leaves from the four most susceptible and four most tolerant cacao genotypes. The right sides of the leaves were inoculated with Phytophthora palmivora mycelia; the left sides were mock-inoculated with a sterile pathogen-growing medium. The top row includes photographs of the four most susceptible varieties, and the bottom row shows the four most tolerant.

Image: 
Penn State

Chocolate-producing cacao trees that are resistant to a major pathogen were identified by an international team of plant geneticists. The findings point the way for plant breeders to develop trees that are tolerant of the disease.

The method researchers used to rapidly identify resistance genes could be used for any trait that has a genetic link in any plant, according to the team's leader, Mark Guiltinan, professor of plant molecular biology, Penn State College of Agricultural Sciences. He contends that the strategy represents a major step forward in the quest to develop disease resistance in plants with long generation times, such as trees.

The focus of the study was the tropical tree Theobroma cacao, the source of chocolate. Its seeds are a major export from many producing countries in Central and South America, Africa, and Asia -- but every year, 30-40% of preharvest yield is lost to diseases.

For a long time, scientists have been trying to devise ways to reduce those losses, Guiltinan pointed out, adding that bolstering cacao's resistance to disease is the most efficient and environmentally friendly approach for disease management.

To determine how best to boost cacao resistance, Guiltinan assembled what he called "an amazing transdisciplinary team."

The researchers set out to measure the susceptibility of 60 genetically diverse genotypes of cacao to Phytophthora palmivora -- a major cacao pathogen with global importance -- by first collecting leaf samples from cacao trees at the International Cocoa Collection (CATIE), in Turrialba, Costa Rica.

They focused on trees from four genetic groups -- Guiana, Iquitos, Marañon and Nanay -- isolated from remote regions of the Amazon that give the groups their names.

Researchers also collected leaves from trees representing four genotypes of interest outside of these genetic groups: Scavina 6, a known source of broad-spectrum disease tolerance; Imperial College Selection 1, a Trinitario hybrid known to be highly susceptible to disease; and CATIE R4 and R6, two recently developed hybrid genotypes with strong tolerance to the fungal disease Moniliophthora roreri, but which are susceptible and moderately tolerant, respectively, to Phytophthora pod rot.

Over the nine-month sampling period, researchers inoculated 1,250 cacao leaves with the pathogen Phytophthora palmivora and then analyzed and measured resulting lesions to assess disease resistance. They identified 16 genotypes showing the strongest disease tolerance and another 16 that they judged to be susceptible. Then they deep-sequenced the genomes of most of those trees to see how they differed.

"We sequenced the genomes of 31 of those plants to enable a powerful population-genetics approach, combined with insights from evolutionary and ecological theories to discover high-priority candidate disease-resistance genes," Guiltinan said. "Once we identify the resistance genes and their locations, we'll give that knowledge, along with molecular markers, to plant breeders."

That information will enable breeders to improve cacao genetics more precisely and rapidly, with the ultimate goal of developing more sustainable cacao farming systems and improving the well-being of cacao farmers worldwide, Guiltinan said.

This new approach to discovering the genetic basis for disease resistance is possible only because of advances in DNA sequencing in recent years that have dramatically quickened the process and lowered costs, Guiltinan noted.

Much of the research, published today (Dec. 6) in Tree Genetics & Genomes, was conducted at the Genomic Facility in Penn State's Huck Institutes of the Life Sciences and by the genome sequencer at Penn State Hershey Medical Center.

The leap in technology has ushered in a new era of disease-resistance research for trees, which have long generational times and previously took decades to show results of breeding and genetic manipulation, Guiltinan said.

"Now, instead of studying one gene at a time, we are studying one genome at a time. We are studying thousands of genes and the genetic diversity of these in different genotypes of cacao -- we are at a whole new level."

The implications of this research extend far beyond cacao, Guiltinan suggested, explaining that this finding could ultimately contribute to enhanced global food and economic security.

"A major constraint to increased food production is crop losses due to microbial plant diseases, which destroy about 15 percent of the world's total crop production every year," he said. "Advances in the science of plant disease control are needed to reduce these disease-related losses, and discovery of resistance genes will unlock the power of genetic diversity for breeding of resistant plants of the future."

Credit: 
Penn State

New report shows dramatic health benefits following air pollution reduction

Dec. 6, 2019 - Reductions in air pollution yielded fast and dramatic improvements in health outcomes, as well as decreases in all-cause morbidity, according to findings in "Health Benefits of Air Pollution Reduction," new research published in the American Thoracic Society's journal, Annals of the American Thoracic Society.

The study by the Environmental Committee of the Forum of International Respiratory Societies (FIRS) reviewed interventions that have reduced air pollution at its source. It looked for outcomes and time to achieve those outcomes in several settings, finding that the improvements in health were striking. Starting at week one of a ban on smoking in Ireland, for example, there was a 13 percent drop in all-cause mortality, a 26 percent reduction in ischemic heart disease, a 32 percent reduction in stroke, and a 38 percent reduction in chronic obstructive pulmonary disease (COPD). Interestingly, the greatest benefits in that case occurred among non-smokers.

"We knew there were benefits from pollution control, but the magnitude and relatively short time duration to accomplish them were impressive," said lead author of the report, Dean Schraufnagel, MD, ATSF. "Our findings indicate almost immediate and substantial effects on health outcomes followed reduced exposure to air pollution. It's critical that governments adopt and enforce WHO guidelines for air pollution immediately."

In the United States, a 13-month closure of a steel mill in Utah cut hospitalizations for pneumonia, pleurisy, bronchitis and asthma in half. School absenteeism decreased by 40 percent, and daily mortality fell by 16 percent for every 100 μg/m3 decrease in PM10 (a particulate pollutant). Women who were pregnant during the mill closure were less likely to have premature births.
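
Read proportionally (a simplification on our part; epidemiological dose-response relationships are not always linear), the stated mortality rate scales as in this sketch:

```python
# Simple proportional reading of the dose-response figure quoted above
# (our arithmetic; the 50 ug/m3 exposure change is an invented example).

MORTALITY_DROP_PER_100 = 0.16  # 16% drop per 100 ug/m3 decrease in PM10

def mortality_reduction(pm10_decrease_ugm3):
    """Approximate fractional drop in daily mortality for a PM10 decrease."""
    return MORTALITY_DROP_PER_100 * pm10_decrease_ugm3 / 100

print(f"{mortality_reduction(50):.0%} drop for a 50 ug/m3 PM10 decrease")  # 8%
```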

A 17-day "transportation strategy" in Atlanta, Georgia during the 1996 Olympic Games involved closing parts of the city to help athletes make it to their events on time, but it also greatly decreased air pollution. In the following four weeks, children's clinic visits for asthma dropped by more than 40 percent and trips to emergency departments by 11 percent. Hospitalizations for asthma decreased by 19 percent. Similarly, when China imposed factory and travel restrictions for the Beijing Olympics, lung function improved within two months, with fewer asthma-related physician visits and less cardiovascular mortality.

In addition to city-wide polices, reducing air pollution within the home also led to health benefits. In Nigeria, families who had clean cook stoves that reduced indoor air pollution during a nine-month pregnancy term saw higher birthweights, greater gestational age at delivery, and less perinatal mortality.

The report also examines the economic impact of environmental policies. It highlights that, 25 years after enactment of the Clean Air Act, the U.S. EPA estimated that the health benefits exceeded the cost by 32:1, saving 2 trillion dollars, and the Act has been heralded as one of the most effective public health policies of all time in the United States. Emissions of the major pollutants (particulate matter [PM], sulfur oxides, nitrogen oxides, carbon monoxide, volatile organic compounds, and lead) were reduced by 73 percent between 1990 and 2015, while the U.S. gross domestic product grew by more than 250 percent.

Given these findings, Dr. Schraufnagel has hope. "Air pollution is largely an avoidable health risk that affects everyone. Urban growth, expanding industrialization, global warming, and new knowledge of the harm of air pollution raise the degree of urgency for pollution control and stress the consequences of inaction," he says. "Fortunately, reducing air pollution can result in prompt and substantial health gains. Sweeping policies affecting a whole country can reduce all-cause mortality within weeks. Local programs, such as reducing traffic, have also promptly improved many health measures."

Credit: 
American Thoracic Society

A new view for glasses

Tokyo, Japan - Researchers at The University of Tokyo introduced a new physical model that predicts the dynamics of glassy materials based solely on their local degree of atomic structural order. Using computer simulations, they showed how this theory greatly improves our understanding of how glassy liquids become more viscous on cooling. This work has many potential applications in manufacturing, especially for special glass production in labware and electronic touchscreen devices.

Glass has been produced by humanity since antiquity. However, the physics that controls the motion of the atoms in glassy materials is incredibly complex and still not completely understood. In contrast with most crystalline solids, in which atoms arrange themselves roughly into large repeating lattices, glasses are made of configurations of atoms that show no long-range ordering. As anyone who has watched a glassblower knows, at high temperatures, glass flows like a liquid. This means that the atoms inside have enough mobility to slide past each other. However, as the material cools, it experiences a "glass transition" in which the atoms move slower and slower until they become locked into a disordered "frozen liquid" state. That is, they would be more stable in a crystalline configuration, but they cannot overcome the barrier to get there. Experts often describe the dynamics of a glass using its "structural relaxation time," which represents how quickly the atoms approach the stable state.

Now, scientists at The University of Tokyo have used computer simulations to define a "structural order parameter" that depends only on the local configuration of an atom and its immediate neighbors. This value provides a measure of the deviation from the most efficient packing of the surrounding atoms. Based only on the structural order parameter, the researchers were able to predict the structural relaxation time. "Since relaxation is apparently affected by many physical factors, we were pleasantly surprised that we were able to describe it based solely on the structural order," says first author Hua Tong.

By performing extensive computer simulations, the researchers were able to confirm the relationship between local ordering and overall dynamics. This is a unique feature of glass that is not usually seen in crystalline solids. "Our research provides a physical framework to understand how the correlation between locations grows in size, such that the dynamics at the atomic level begin to have cooperativity over an extended region," explains senior author Hajime Tanaka. The findings of this project may help design new processes for manufacturing stronger and more durable glass.

Credit: 
Institute of Industrial Science, The University of Tokyo

How do you cultivate a healthy plant microbiome?

image: Plant leaves harbor a diverse microbial community, as seen in this plate of agar stamped with the leaf of a greenhouse-grown tomato plant. After incubating for two days, the bacteria from the leaf grew into visible colonies.

Image: 
Photo courtesy of Britt Koskella lab

Scientists are homing in on what a healthy human microbiome looks like, mapping the normal bacteria that live in and on the healthy human body. But what about a healthy plant microbiome?

Is there even such a thing as a healthy plant microbiome in today's agricultural fields, with acres of identical plants assaulted by pesticides and herbicides and hyped up on fertilizer?

A new study by University of California, Berkeley, microbial ecologists used experimental evolution to help identify the core microbiome of commercial tomatoes. They selected for those microbial taxa that best survived on the plants and then showed that these "domesticated" microbial communities are able to effectively fend off random microbes that land on the plants. In other words, these selected communities look like a stable, healthy plant microbiome, akin to what a robust tomato plant might pass to its offspring.

The results are good news for growers who hope that manipulating the plant microbiome, perhaps with probiotics, will make for healthier fields that need less fertilizer and less or no pesticides to produce good yields.

"I see the implications of this work not just being about probiotics, but also about guiding agricultural practice," said study leader Britt Koskella, a UC Berkeley assistant professor of integrative biology. "When planting fields, we should be thinking about how what we do -- whether it is age structuring of crops or monocropping versus crop rotations, what is in the soil or what is living nearby -- can impact the acquisition and health of the plant microbiome. We should be manipulating the growing conditions in a way that microbial transmission is more akin to what would happen naturally."

Koskella, lead author Norma Morella, who is now a postdoctoral fellow at the Fred Hutchinson Cancer Research Center in Seattle, and their colleagues reported their findings online this week in the journal Proceedings of the National Academy of Sciences (PNAS).

How do seedlings get microbiomes from their mothers?

Koskella studies the microbial ecology of plants and how it affects plant health, much like biologists study the human microbiome's role in health. Focusing on agricultural crops, she has some of the same concerns as biologists who worry about the transmission of a healthy human microbiome -- skin, gut and more -- from mother to baby.

When seedlings are first put into fields, for example, there are often no nearby adult plants from which they can acquire leaf and stem microbes. In the absence of maternal transmission, Koskella wondered, how do these plants acquire their microbiomes, and are these microbiomes ideal for the growing plants?

And, if the microbiomes are not well adapted -- for example, not resistant to disease-carrying microbes -- can they be improved?

These questions are becoming increasingly important as growers and industry alike try to improve crop yield and sustainability by surrounding seeds with desirable microbes, engineering soil microbial communities or spraying desired microbes on growing plants.

Increasing evidence also shows that microbiomes can affect yield, tolerance to drought and even the flowering time of plants. Can microbiomes be enhanced to achieve this, and will enhanced microbiomes survive long enough to help the plants?

The new study is encouraging.

"We already know that, in theory, you can select for microbes that perform particular functions: increased yield, drought tolerance or disease resistance, for example," Koskella said. "We are showing here that you can, in principle, create a microbial community that has the function you are interested in, but also is uninvadable, because it is really well-adapted to that plant."

Cultivating a core microbiome

The researchers' experiments, conducted in greenhouses on UC Berkeley's Oxford Tract, involved taking five types of tomatoes and spraying four successive generations of plants with the microbiomes of the previous generation. The first generation was sprayed with a broad mix of microbes found on a variety of tomatoes in an outdoor field at UC Davis.

Nurturing the microbial community of each type of tomato through successive generations allowed it to adapt to each strain, ideally weeding out the maladapted microbes and allowing the well-adapted ones to flourish.

By sequencing the 16S ribosomal RNA genes of the tomatoes' microbial communities after each generation -- a technique that allows identification of different bacterial taxa -- they were able to show that, by the fourth generation, only 25% of the original microbial taxa remained.

"So, 75 percent of the original bacteria that we spray on go virtually extinct during the experiment," Koskella said. "That is really interesting in itself, because it suggests that a lot of the microbes out there aren't well adapted, they are kind of there by chance. The wind blew them there, rain splashed them there, but they are not thriving, they are likely not adapted to that particular environment."

The remaining 25%, which were very similar across all independent selection lines and across the five tomato strains, looked very much like a "core" microbiome: the key microbes necessary for a healthy plant.

When Morella sprayed tomato plants with a microbial mixture -- half from the partially adapted microbiome of the first generation, half from the more mature fourth generation microbiome -- the fourth generation microbes took over, suggesting that they were much better adapted to the tomato.

"I think this work on the tomato supports the idea that leaf bacteria are probably very distinctive and have traits that are required for them to grow well on those plants, and that just the fact that you can find things there may mean that they are there only transiently and probably in the process of dying," said co-author Steven Lindow, a UC Berkeley professor of plant and microbial biology who has been investigating plant-pathogen interactions for nearly 50 years. "This is very consistent with what we had found before, that good plant colonists can grow on many plants and, in so doing, usurp the ability of anybody else to also grow there. The prophylactic effect is definitely very strong and real and very important in keeping other plant colonists away."

"What you want to ask, really, is, 'Who wins when you put them head to head? The selected microbiome or the unselected microbiome?'" Koskella said. "That, to me, is my favorite part of the whole experiment and was the 'aha! moment': Selection works, you really can select for a microbiome that it is well adapted and not invadable, at least under the conditions we used for selection."

Koskella's group is now running further experiments to determine whether the selected microbiome actually improves plant health, resilience and productivity, and whether probiotic microbes can be integrated successfully into the core microbiome for lasting crop benefits.

Credit: 
University of California - Berkeley

Siberian researchers contribute to global monitoring of the Earth's Green Lungs

Researchers at Siberian Federal University took part in a global project to collect, systematize and universalize data on the composition of forests in all climatic zones and on all continents of the planet.

Researchers from more than 80 scientific institutions provided detailed insights into the quantitative and species composition of the world's forests and the distribution and correlation of their various components -- trees, shrubs, and ground cover -- using a unified methodological system and data collection protocol. This "bringing to a common denominator" is expected to greatly facilitate the further study of changes in forests due to global warming.

"Great work has been done. One hundred forty-three authors from leading scientific schools studying the forest as the most important biological and ecological system, on which life on Earth depends, armed themselves with a unified methodology and general principles of data collection to compile a map of the distribution of phytomass over all continents where forests grow. Phytomass is everything that can be classified as plants, including even leaf and branch litter. Knowing the volumes and approximate composition of the world's phytomass, we can draw fairly accurate conclusions about carbon stocks, as well as the fire hazard of any particular part of the forest, whether it is prone to certain diseases, and what natural and anthropogenic factors may threaten its well-being in the future," said Dr Sergey Verkhovets, one of the co-authors and leading research fellow of the laboratory of biogeochemistry of ecosystems, Siberian Federal University.

The researcher pointed out that the first data on the Siberian taiga included in this global study had been obtained more than ten years ago within the project "Assessment of Ecosystems of Central Siberia", which involved the work of SibFU students and scientists. The information, and the entire developed system, is planned to be updated at least once every five to ten years. Thus, the researchers will be able to understand whether their predictions have been correct, or whether forest development follows a different scenario that experts have not predicted.

"We can already indicate some changes that Siberian forests will go through, according to the data of our system. Judging mainly by the recorded climatic changes, we can already observe moist warming (that does not mean that droughts will end, though), a lengthening of the growing season of trees, and an increase in the carbon absorption of forests. To put it more simply, trees are getting taller and thicker, and the forest is thickening too. Some types of woody plants do not like such climate changes: larch forests spread across the northern taiga zone are gradually being replaced by pine-spruce forests, so dark coniferous Scandinavian-type forests are appearing in our country. Before long, for example, linden may return to Siberia -- it used to grow here until the last mass glaciations, which reduced it to insignificant relic groves in the Novosibirsk and Kemerovo regions," continued Dr Verkhovets.

There are less pleasant consequences of climate change, and increased air dryness in the mountain forests of the Western Sayan is among them. Scientists point to the direct consequences of the lack of moisture: the forest is weakening and, most importantly, becoming easy prey for all kinds of pests, from phytopathogenic fungi to insects such as the Ussuri polygraph and the Siberian lasiocampid, which have destroyed hectares of taiga in the area of the village of Nazimovo in Yenisei district.

"As for the Ussuri polygraph, these insects used to pass through our taiga only in transit, finding it a cold and uncomfortable area. Now that favourable conditions have formed, we are seeing an outbreak of mass reproduction of the polygraph in Siberia, marked by rusty dead trees along the roads. There is another abiotic factor that bothers scientists -- soil erosion. Permafrost thawing is dangerous, as it causes unpredictable flooding and stagnation of moisture, and some trees (for example, larch and pine) may not survive overmoistening of the soil," the expert specified.

To the research team, fire forecasting was another significant reason to develop the system. It is no secret that in recent years this disaster has particularly troubled the northern and central regions of Krasnoyarsk Territory. According to the authors of the article, Siberia experienced terrible fires tens of thousands of years ago, when those linden forests burnt -- the very ones that may eventually return to the once-abandoned regions.

"That is not to say that the fires of 2019 are unique in the scale of the affected territories, but they are alarming indeed. The anticyclone that comes here annually from the territory of Yakutia this time provoked the strongest smoke, which reached Omsk, Krasnoyarsk, Novosibirsk, and even the capital region. It spread a little wider and lasted a little longer than usual -- and cool moist air from the Atlantic could not turn the tide and cool burning Siberia. The most unpleasant thing is that such fires will happen again and again. Global warming, as already mentioned, brings droughts to some areas, and the drying forest burns up like a match, both from natural factors and from a careless human hand. A forest that lacks moisture resembles a person weakened by a chronic disease: it finds it more difficult to repel an attack of parasitic species, and it recovers much more slowly after fires," Dr Verkhovets concluded.

Credit: 
Siberian Federal University

Discovery of genes involved in the biosynthesis of antidepressant

image: A and C: longitudinal and cross sections of a pistil bearing dark glands; B and D: corresponding sections of a glandless pistil

Image: 
P. Rizzo / IPK

The herb St John's Wort, Hypericum perforatum, has a long history as a medicinal plant. Ancient physicians such as Hippocrates and Pliny recommended it as a diuretic and as a cure for snakebites, amongst other uses. In the Middle Ages, it was used in magic potions to ward off evil spirits, whilst Paracelsus in the 16th century found it useful for the alleviation of pain and the treatment of melancholy. In more recent times, St John's Wort has been recognised for its applications in cancer therapy, due to the plant's bioactive compound hypericin. Despite its high demand in medical applications, the biosynthetic pathways of hypericin have yet to be fully unveiled. Recent research focusing on the hypericin-producing dark glands in the flowers of St John's Wort has now uncovered key regulatory candidate genes for hypericin biosynthesis.

Hypericum perforatum, or St. John's Wort, is an herbaceous plant with bright yellow flowers and translucent oil glands that give its leaves a perforated appearance. Originally native to Europe and western Asia, the plant is nowadays found in temperate regions worldwide. Whilst considered a weed that is poisonous to livestock, St. John's Wort is better known for its medicinal properties. The herb produces several secondary metabolites, of which hypericin counts as one of the most studied. This compound gained an important place in the scientific spotlight due to its potential use in cancer photodynamic therapy and the treatment of Alzheimer's disease. By focussing on dark gland development in the flowers of St. John's Wort, in particular in the placental tissue, a collaboration of scientists from German and Canadian research institutes recently discovered some of the key genes potentially involved in the development of dark glands, as well as the biosynthesis of hypericin.

Previous research had often focussed on the leaves of St. John's Wort as the model organ for the study of hypericin. However, by phenotyping 93 Hypericum accessions, the researchers, led by Dr. Paride Rizzo from the Leibniz Institute of Plant Genetics and Crop Plant Research (IPK) in Gatersleben, came across a polymorphism that showcased glanded and glandless phenotypes in the placental tissue. By shifting the focus onto the placental tissue of the flowers, the researchers identified two transcription factors involved in the differentiation of dark glands. They further characterised the development of the dark glands within the placental tissue by utilising different microscopy techniques. These results were achieved through a combination of transcriptomics and metabolomics that provides a view of unprecedented quality of gene expression and metabolite levels during dark gland development in H. perforatum.

First published online in April by the Plant Biotechnology Journal, the findings will now appear in the journal's December print edition. Understanding which genes are involved in hypericin biosynthesis is a vital step in the development of new Hypericum extracts for medicinal application. Having now demonstrated the potential of placental tissue as a new model organ for the study of dark glands and the associated biosynthetic pathways, further research will be able to build upon this knowledge. In the near future, these candidate genes could be used to either hyperactivate or inhibit the formation of dark glands, which could lead to new applications of H. perforatum in the pharma industry.

Credit: 
Leibniz Institute of Plant Genetics and Crop Plant Research

Breakthrough in battle against invasive plants

Plants that can "bounce back" after disturbances like ploughing, flooding or drought are the most likely to be "invasive" if they're moved to new parts of the world, scientists say.

Invasive plants cause harm to people, industry, livestock, wildlife and natural ecosystems worldwide - but predicting which plants could become invasive is very difficult.

A team of scientists from across Europe, led by the University of Exeter, developed and analysed a global database of plant life cycles to tackle this puzzle.

"What we found was a real surprise," said senior author Professor Dave Hodgson, from the University of Exeter.

"Invasive plant populations grow fast in their invaded range, but not in their native range. So you can't use population growth to predict invasiveness.

"However, invasive plant species have an amazing ability to bounce back from disturbances, and we can see this in both their native range and their invaded range.

"Based on this finding, we should avoid the export of plant species that grow well in disturbed environments."

PhD student and ecological consultant Kim Jelbert, lead author of the paper, said: "The kinds of species that bounce back from disturbance tend to be species that produce lots of seeds from large flowers.

"This is a real problem, because large flowers are popular with gardeners all over the world. These species should not be traded internationally.

"We also discovered patterns of ancestry. Close relatives of invasive plants are also likely to be invasive if they escape their native range."

Plant species are transported all over the world by the gardening, food and flower industries - with many exported from their native (home) range.

Professor Hodgson added: "The global battle against invasive species is made difficult because we have failed until now to predict which species will be a problem, and which will be benign, when they colonise new regions.

"For the first time, we have a strong clue to the identity of future invasive plants.

"Weedy plants of disturbed environments - and their close relatives - should not be exported."

Plants that have been moved outside their native range and become invasive include Japanese knotweed, blackberry and Himalayan balsam.

Credit: 
University of Exeter