
Impact of climate change on global banana yields revealed

video: Climate change could negatively impact banana cultivation in some of the world's most important producing and exporting countries, a study has revealed.

Video: University of Exeter
Images: David Bebber
Music: Life Blossom by Keys of Moon

Climate change could negatively impact banana cultivation in some of the world's most important producing and exporting countries, a study has revealed.

Bananas are recognised as the most important fruit crop - providing food, nutrition and income for millions in both rural and urban areas across the globe.

While many reports have looked at the impact of climate change on agricultural production, the effects that rising temperatures and changing rainfall have on crucial tropical crops such as the banana are less well understood.

In a new study, led by Dr Dan Bebber from the University of Exeter, scientists have studied both the recent and future impact of climate change on the world's leading banana producers and exporters.

It shows that 27 countries - accounting for 86 per cent of the world's dessert banana production - have on average seen increased crop yields since 1961, because the changing climate has created more favourable growing conditions.

Crucially, however, the report also suggests that these gains could be significantly reduced, or disappear completely, by 2050 if climate change continues at its expected rate.

Ten countries - including India, the world's largest producer and consumer of bananas, and Brazil, the fourth largest producer - are predicted to see a significant decline in crop yields.

The study does highlight that some countries - including Ecuador (the largest exporter) and Honduras, as well as a number of African countries - may see an overall benefit in crop yields.

Dr Bebber, a Senior Lecturer in Biosciences at the University of Exeter said: "We're very concerned about the impact of diseases like Fusarium Wilt on bananas, but the impacts of climate change have been largely ignored. There will be winners and losers in coming years, and our study may stimulate vulnerable countries to prepare through investment in technologies like irrigation".

Grown throughout the tropics and subtropics, bananas are a key crop for millions of people across the world. In Britain, for example, more than five billion bananas are purchased each year, and the UK accounts for seven per cent of the global export market.

Such international trade can play a pivotal role in local and national economies in producing countries. For example, bananas and their derived products constitute the second largest agricultural export commodity of Ecuador and Costa Rica.

Given this importance, predicting the potential impacts of climate change on banana production systems is crucial to securing the crop's long-term future.

In this new study, the team assessed the climate sensitivity of global dessert banana productivity or yield using sophisticated modelling techniques.
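
The release does not detail the models, but the core idea of "climate sensitivity of yield" can be illustrated with a toy example: fit a yield response curve to growing-season temperature, then evaluate it under warmer conditions. Below is a minimal Python sketch with entirely made-up numbers and an assumed quadratic response; the study's actual models are far more sophisticated, incorporating rainfall, country-level effects and future climate scenarios.

```python
import numpy as np

# Synthetic data: yields peak near an assumed optimal temperature (27 C)
# and fall off on either side. Illustrative only, not the study's data.
rng = np.random.default_rng(0)
temp = rng.uniform(20, 32, 200)                    # growing-season temps (C)
yield_t = 40 - 0.5 * (temp - 27.0) ** 2 + rng.normal(0, 1.5, 200)

coeffs = np.polyfit(temp, yield_t, 2)              # fit a quadratic response
response = np.poly1d(coeffs)

# Warming past the optimum reduces predicted yield; warming toward it helps.
for t in (25.0, 27.0, 30.0):
    print(f"{t:.0f} C -> predicted yield {response(t):.1f} t/ha")
```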

It showed that by 2050, any positive effects of climate change on average global banana yields, though likely to continue, will be significantly lessened.

Ten countries are predicted to show at least a negative trend, if not strong declines, in yields. These include some of the largest producers, such as India and Brazil, as well as Colombia, Costa Rica, Guatemala, Panama and the Philippines, all of which are major exporters.

Dr Bebber added: "It is imperative that we invest in preparing tropical agriculture for future climate change".

Dr Varun Varma, Research Fellow at the University of Exeter and an author of the study said: "An open exchange of ideas is going to be critical going forward. We believe practical solutions already exist, but these are scattered across banana producing countries. This knowledge exchange needs to start now to counteract predicted yield losses due to climate change."

Credit: 
University of Exeter

Novel molecules designed by artificial intelligence in 21 days are validated in mice

image: Deep learning enables rapid identification of potent DDR1 kinase inhibitors.

Image: 
Insilico Medicine

Highlights:

Traditional drug discovery starts with the testing of thousands of small molecules in order to arrive at just a few lead-like molecules, and only about one in ten of these molecules passes clinical trials in human patients.

Generative Adversarial Networks (GANs) are a form of AI imagination and are commonly used to generate images with specific properties.

Since the seminal publication by the Insilico Medicine team in 2016, GANs have been explored for the generation of novel molecular structures with specified properties.

For over three years, scientists worldwide have been developing the theoretical basis for GANs and other machine learning techniques to substantially accelerate and improve the drug discovery process.

In the Nature Biotechnology paper titled "Deep learning enables rapid identification of potent DDR1 kinase inhibitors," generative reinforcement learning technology was used for the first time to generate novel small molecules for a protein target, which were validated in vitro and in vivo in just 46 days.

September 2nd, 2019, 4 PM, London--Insilico Medicine, a global leader in artificial intelligence for drug discovery, today announced the publication of a paper titled "Deep learning enables rapid identification of potent DDR1 kinase inhibitors" in Nature Biotechnology. The paper describes a timed challenge in which the new artificial intelligence system, called Generative Tensorial Reinforcement Learning (GENTRL), designed six novel inhibitors of DDR1, a kinase target implicated in fibrosis and other diseases, in 21 days. Four compounds were active in biochemical assays, and two were validated in cell-based assays. One lead candidate was tested and demonstrated favorable pharmacokinetics in mice.

Traditional drug discovery starts with the testing of thousands of small molecules in order to arrive at just a few lead-like molecules, and only about one in ten of these molecules passes clinical trials in human patients. Even a slight improvement in the time it takes to discover new drugs, or in the probability of success, results in significant savings and public benefit.

The authors of the paper pioneered the field of generative chemistry with seminal publications in 2016, and the experimental validation of the molecules generated by GENTRL represents a valuable milestone on the path to more efficient drug discovery powered by artificial intelligence.

Insilico Medicine is developing a comprehensive drug discovery pipeline that utilizes artificial intelligence to generate novel molecules with specified properties for a variety of target classes and challenging targets, with and without crystal structures, rapidly producing lead-like hits. This pipeline was specifically developed to rapidly validate prospective targets with small-molecule chemistry and to allow for rapid pharmaceutical drug discovery.

Comments from the Key Opinion Leaders:

"This paper is certainly a really impressive advance and likely to be applicable to many other problems in drug-design. Based on state-of-the-art reinforcement learning, I am also very impressed by the breadth of this study involving as it does molecular modeling, affinity measurements, and animal studies," said Dr. Michael Levitt, professor of structural biology, Stanford University. Dr. Levitt received the Nobel Prize in Chemistry in 2013.

"I interacted with many AI startups in the past and Insilico was the only deep learning company with impressive, demonstrated capabilities integrating target identification and small molecule discovery. They did a lot of theoretical work in GANs from the very beginning and this experimental validation is a significant demonstration that this technology may improve and accelerate drug discovery," said Dr. John Baldoni, CTO of a stealth AI-powered drug development startup and former SVP of Platform Technology and Science at GSK.

"The generative tensorial reinforcement learning in this paper substantially advances the efficiency of biochemistry implementation in drug discovery. Yet to be further experimented at scale, this method signals a breakthrough of pharmaceutical artificial intelligence at industrial level, and may bring significant social and economic impact to our society," said Dr. Kai-Fu Lee, founder of Sinovation Ventures, former executive of Microsoft and Google, and the original inventor of multiple AI technologies.

"I met Alex when working at OpenAI and have been excited to see him pioneer the use of GANs/RL for the pharmaceutical industry since 2016. One major criticism of GANs is that their usefulness has been limited to image editing applications, so I'm glad that Alex and his team are finding ways to use them for molecular generation," said Dr. Ian Goodfellow, the original inventor of Generative Adversarial Networks (GANs)

"This technology builds on our early work on adversarial and generative neural networks since 1990. Insilico has been working on generative models for drug discovery since 2015, and I am happy to see that their GENTRL system produced molecules that were experimentally validated in cells and in mice. AI will have a transformative effect on the pharmaceutical industry, and we need more experimental validation results to accelerate progress," said Dr. Jürgen Schmidhuber, a professor at IDSIA, co-founder of NNAISENSE, and the original inventor of many core techniques and initial concepts in the field of artificial intelligence.

"Reduction of cycle time and overall cost of goods is critical to the future success of Pharma drug discovery activities. In this paper, Insilico highlight a novel AI based technology (GAN-RL) which allowed them to identify lead molecules with efficacy in animal models in notably short timeframes. If this technology proves broadly useful it may well have transformational potential for future lead generation efforts," said Dr. Stevan Djuric, Adjunct Professor, School of Pharmacy, High Point University and former Vice President, Discovery Chemistry and Technology, Abbvie.

"Much hyperbole exists about the promise of artificial intelligence (AI) in improving medical care and in the development of new medical tools. Here however is a paper "Deep learning enables rapid identification of potent DDR1 kinase inhibitors" recently published in Nature Biotechnology that describes an application of AI in drug discovery that is indeed important. A new drug candidate was proposed and tested preclinically in a remarkably short period of time. The results are significant for two reasons. The AI procedures replaced the role normally played by medicinal chemists, and these individuals are in limited supply. The acceleration in rate translates into longer patent coverage that improves the economics of drug development. If this approach can be generalized it could become a widely adopted method in the pharmaceutical industry," said Dr. Charles Cantor, a professor at Boston University, co-founder of Retrotope, Inc, and former Chief Scientist of the Human Genome Project with the US Department of Energy.

"This paper is a significant milestone in our journey towards AI-driven drug discovery. We work in generative chemistry since 2015 and when Insilico's and Alán's theoretical papers were published in 2016 everyone was very skeptical. Now, this technology is going mainstream and we are happy to see the models developed a few years ago and producing molecules against simpler targets being validated experimentally in animals. When integrated into comprehensive drug discovery pipelines, these models work for many target classes and we work with the leading biotechnology companies to push the limits of generative chemistry and generative biology even further," said Alex Zhavoronkov, PhD, the founder and CEO of Insilico Medicine, the lead author of the study.

Credit: 
Insilico Medicine

New feedback phenomenon found to drive increasing drought and aridity

image: New study shows an increasingly high probability of more frequent, more extreme concurrent soil drought and atmospheric aridity, intensified by both climate change and land-atmosphere processes, posing large risks to the planet and human life.

Image: 
Sha Zhou/Columbia Engineering

New York, NY--September 2, 2019--A new Columbia Engineering study indicates that, over the coming century, the world will experience more frequent and more extreme drought and aridity than it does today, exacerbated by both climate change and land-atmosphere processes. The researchers demonstrate that concurrent soil drought and atmospheric aridity are largely driven by a series of land-atmosphere processes and feedback loops. They also found that land-atmosphere feedbacks would further intensify concurrent soil drought and atmospheric aridity in a warmer climate. The study was published today in Proceedings of the National Academy of Sciences (PNAS).

While earlier studies have looked at how atmospheric and oceanic processes drive climate extremes, the Columbia Engineering team focused on examining and modeling land-atmosphere processes, especially in setting up concurrent extremes that can be very destructive. Soil drought, represented by very low soil moisture, and atmospheric aridity, represented by very high vapor pressure deficit (a combination of high temperature and low atmospheric humidity), are the two main stressors that drive widespread vegetation mortality and reduced terrestrial carbon uptake. Concurrent soil drought and atmospheric aridity refers to periods when soil moisture is extremely low and vapor pressure deficit is extremely high.
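
Vapor pressure deficit (VPD) is simply the gap between how much moisture the air could hold at its current temperature and how much it actually holds. As a rough illustration (not the study's code), here is a minimal Python sketch using the common Tetens approximation for saturation vapor pressure; the formula choice is an assumption for illustration.

```python
import math

def saturation_vapor_pressure(temp_c: float) -> float:
    """Saturation vapor pressure (kPa) via the Tetens approximation."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def vapor_pressure_deficit(temp_c: float, rel_humidity_pct: float) -> float:
    """VPD (kPa) = saturation vapor pressure minus actual vapor pressure."""
    es = saturation_vapor_pressure(temp_c)
    ea = es * rel_humidity_pct / 100.0
    return es - ea

# A hot, dry afternoon yields a far higher VPD than a mild, humid one.
print(vapor_pressure_deficit(38.0, 20.0))  # ~5.3 kPa (extreme aridity)
print(vapor_pressure_deficit(22.0, 70.0))  # ~0.8 kPa
```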

"Concurrent soil drought and atmospheric aridity have dramatic impacts on natural vegetation, agriculture, industry, and public health," says Pierre Gentine, associate professor of earth and environmental engineering and affiliated with the Earth Institute. "Future intensification of concurrent soil drought and atmospheric aridity would be disastrous for ecosystems and greatly impact all aspects of our lives."

The researchers combined reanalysis datasets and model experiments to identify the main land-atmosphere processes leading to concurrent soil drought and atmospheric aridity, and used climate models and statistical methods to assess how land-atmosphere processes would impact the frequency and intensity of concurrent soil drought and atmospheric aridity in a future climate. The challenge they faced was how to isolate the impact of land-atmosphere feedbacks on concurrent drought and aridity. After trying many different methods, they worked with the GLACE-CMIP5 (Global Land Atmosphere Coupling Experiment--Coupled Model Intercomparison Project) scientists at ETH Zurich's Institute for Atmospheric and Climate Science and used their model experiments.

Gentine's group is the first to isolate this phenomenon, and the researchers were surprised that their work produced such dramatic findings.

"Most groups have been focused on assessing concurrent drought and heatwaves, but we are finding stronger coupling between drought and aridity than between drought and heatwaves," says Sha Zhou, the study's lead author and a postdoc working with Gentine. "Concurrent drought and aridity also have a stronger impact on the carbon cycle and so we felt this was a critical point to study."

The team discovered that the feedback of soil drought on the atmosphere is largely responsible for increasing the frequency and intensity of atmospheric aridity. In addition, the soil moisture-precipitation feedback contributes to more frequent extreme low precipitation and soil moisture conditions in most regions. These feedback loops lead to a high probability of concurrent soil drought and extreme aridity. The CMIP5 simulations suggest that land-atmosphere feedbacks will further increase the frequency and intensity of concurrent drought and aridity in the 21st century, with potentially large human and ecological impacts.

The PNAS study highlights the importance of soil moisture variability in enabling a series of processes and feedback loops affecting the Earth's near-surface climate.

Says Gentine, "It's critical that we better quantify and evaluate the representation of these processes in our climate models. Accurate model representation of both soil moisture variability and the associated feedbacks is crucial if we are to provide reliable simulations of the frequency, duration, and intensity of compound drought and aridity events and of their changes in a warmer climate. Ultimately, this will help us mitigate future risks associated with these events."

Credit: 
Columbia University School of Engineering and Applied Science

Functional changes of thermosensory molecules related to environmental adaptation

image: The phylogenetic relationship of four clawed frog species is shown at left, with a scale bar (a horizontal arrow). For example, X. tropicalis and X. laevis are estimated to have diverged approximately 60 million years ago. The thermal properties of TRPV1 and TRPA1 are shown at right. In this study, three ancestral TRPV1 proteins were reconstructed. The thermal property of each ancestral TRPV1 is indicated in the box.

Image: 
Shigeru Saito

Animals have adapted to diverse thermal environments, from cold to hot. Over the course of thermal adaptation, the preferred thermal ranges for survival have shifted among species adapted to different thermal niches. Accordingly, evolutionary changes in thermal perception must have been required during thermal adaptation. To understand the molecular basis for this shift in thermal perception, the researchers compared the functional properties of thermal sensors among clawed frog species adapted to different thermal niches in Africa.

In clawed frogs (genus Xenopus), TRPV1 and TRPA1 serve as heat sensors in thermal perception. In a previous study, the heat responses of TRPV1 were reported to differ between Xenopus laevis and Xenopus tropicalis, which are adapted to cool and warm niches, respectively. Upon heat stimulation, X. laevis TRPV1 showed a maximum response from the first stimulus, while X. tropicalis TRPV1 showed only a small response to the first stimulus, and its responses gradually grew larger upon repeated heat stimulation.

In the present study, the researchers newly analyzed two species adapted to warm (Xenopus muelleri) and cool niches (Xenopus borealis). TRPV1 from these two species exhibited heat responses similar to those of X. laevis TRPV1. To elucidate the functional evolution of TRPV1, ancestral TRPV1 proteins were inferred and artificially reconstructed. The reconstructed ancestral TRPV1 proteins also showed heat responses similar to those of X. laevis, suggesting that TRPV1 heat responses changed specifically in the lineage leading to X. tropicalis. However, a similar functional shift of TRPV1 did not occur from the ancestor to X. muelleri; therefore, changes in TRPV1 heat responses are not always linked with niche selection in the Xenopus evolutionary process.

On the other hand, a comparison of TRPA1 among the four Xenopus species revealed that the heat-evoked activity of TRPA1 from cool-adapted species was considerably higher than that of TRPA1 from warm-adapted species. This finding suggests that species adapted to cool niches increased the activity of a heat sensor (or vice versa) in order to respond sharply to heat exposure. This study therefore highlights the importance of thermal sensors in environmental adaptation.

Credit: 
National Institutes of Natural Sciences

Native birds in south-eastern Australia worst affected by habitat loss

image: More than 60% of the birds of south-east mainland Australia have lost over half of their natural habitat. Beyond high profile endangered species like the regent honeyeater, the extensive loss of Eucalyptus woodland and forest has affected numerous birds including flycatchers, whistlers and robins.

Image: 
Graham Winterflood

New research has found that habitat loss is a major concern for hundreds of Australian bird species, and south-eastern Australia has been the worst affected.

The Threatened Species Recovery Hub study, featuring University of Queensland scientists, found that half of all native bird species have each lost almost two-thirds of their natural habitat across Victoria, parts of South Australia and New South Wales.

Lead researcher, Dr Jeremy Simmonds, said the team looked at both threatened and non-threatened birds, including common species.

"While more attention is usually paid to threatened species, common species, like many of our familiar fairy-wrens, pigeons and honeyeaters, are crucially important," Dr Simmonds said.

"Common species play a vital role in controlling insect pests and pollination and their decline through loss of habitat has implications for the health of ecosystems.

"Along with feral and invasive species, habitat destruction is among the greatest threats facing biodiversity in Australia, so it is important to understand how big the problem of habitat removal is: our research developed a method to do this, called the Loss Index.

"We looked at how the amount of habitat available for each of Australia's 447 different land bird species had changed since 1750.

"In places like Queensland's south-east and the Wet Tropics, each hectare of forest cleared can affect up to 180 different native bird species.

"Habitat loss has been particularly devastating for birds from south-east Australia; more than half of the 262 native birds in this region only have a small fraction of their natural habitat remaining in this part of the country.

"Northern Australia and Australia's arid zone have had the least habitat loss, as there has been much less vegetation clearing across that region.

"We also looked at different bird groups and found that Australia's parrot species are more impacted by habitat loss, compared with birds of prey, like eagles and owls.

Dr Simmonds said the index provided a tool for conservation managers and planners to better understand how habitat loss affects all birds, and not just the endangered ones.

"It helps to show that every hectare of native vegetation that is removed chips away at remaining habitat for dozens and sometimes hundreds of species, including common species which typically do not receive conservation attention," he said.

"The quality of the remaining habitat is often reduced, due to weeds, grazing and changed fire patterns, such as more and hotter fires, and this can further reduce the number and type of birds that an area can support."

The Loss Index can also be applied to other groups of species, such as mammals or plants.
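
As a rough illustration of the idea behind such an index (not the study's actual method or data), the Python sketch below computes, for a handful of hypothetical species, the fraction of pre-1750 habitat lost, and then the share of species that have lost more than half, the kind of summary reported for south-east Australia.

```python
# Hypothetical per-species habitat areas (hectares) for 1750 and today;
# the species names and numbers are illustrative assumptions only.
species_habitat = {
    "regent honeyeater": (600_000, 90_000),
    "superb fairy-wren": (4_000_000, 2_500_000),
    "rose robin": (1_200_000, 500_000),
}

def loss_fraction(before: float, after: float) -> float:
    """Fraction of pre-1750 habitat that has been lost."""
    return 1.0 - after / before

for species, (before, after) in species_habitat.items():
    print(f"{species}: {loss_fraction(before, after):.0%} of habitat lost")

# Share of species that have lost more than half their habitat.
severe = [s for s, (b, a) in species_habitat.items() if loss_fraction(b, a) > 0.5]
print(f"{len(severe)}/{len(species_habitat)} species lost >50% of habitat")
```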

Credit: 
University of Queensland

New artifacts suggest people arrived in North America earlier than previously thought

image: This map depicts a possible Pacific coastal migration route for early Americans.

Image: 
Teresa Hall, Oregon State University

CORVALLIS, Ore. - Stone tools and other artifacts unearthed from an archeological dig at the Cooper's Ferry site in western Idaho suggest that people lived in the area 16,000 years ago, more than a thousand years earlier than scientists previously thought.

The artifacts would be considered among the earliest evidence of people in North America.

The findings, published today in Science, add weight to the hypothesis that initial human migration to the Americas followed a Pacific coastal route rather than through the opening of an inland ice-free corridor, said Loren Davis, a professor of anthropology at Oregon State University and the study's lead author.

"The Cooper's Ferry site is located along the Salmon River, which is a tributary of the larger Columbia River basin. Early peoples moving south along the Pacific coast would have encountered the Columbia River as the first place below the glaciers where they could easily walk and paddle in to North America," Davis said. "Essentially, the Columbia River corridor was the first off-ramp of a Pacific coast migration route.

"The timing and position of the Cooper's Ferry site is consistent with and most easily explained as the result of an early Pacific coastal migration."

Cooper's Ferry, located at the confluence of Rock Creek and the lower Salmon River, is known by the Nez Perce Tribe as an ancient village site named Nipéhe. Today the site is managed by the U.S. Bureau of Land Management.

Davis first began studying Cooper's Ferry as an archaeologist for the BLM in the 1990s. After joining the Oregon State faculty, he partnered with the BLM to establish a summer archaeological field school there, bringing undergraduate and graduate students from Oregon State and elsewhere for eight weeks each summer from 2009 to 2018 to help with the research.

The site includes two dig areas; the published findings are about artifacts found in area A. In the lower part of that area, researchers uncovered several hundred artifacts, including stone tools; charcoal; fire-cracked rock; and bone fragments likely from medium- to large-bodied animals, Davis said. They also found evidence of a fire hearth, a food processing station and other pits created as part of domestic activities at the site.

Over the last two summers, the team of students and researchers reached the lower layers of the site, which, as expected, contained some of the oldest artifacts uncovered, Davis said. He worked with a team of researchers at Oxford University, who were able to successfully radiocarbon date a number of the animal bone fragments.

The results showed many artifacts from the lowest layers are associated with dates in the range of 15,000 to 16,000 years old.

"Prior to getting these radiocarbon ages, the oldest things we'd found dated mostly in the 13,000-year range, and the earliest evidence of people in the Americas had been dated to just before 14,000 years old in a handful of other sites," Davis said. "When I first saw that the lower archaeological layer contained radiocarbon ages older than 14,000 years, I was stunned but skeptical and needed to see those numbers repeated over and over just to be sure they're right. So we ran more radiocarbon dates, and the lower layer consistently dated between 14,000-16,000 years old."

The dates from the oldest artifacts challenge the long-held "Clovis First" theory of early migration to the Americas, which suggested that people crossed from Siberia into North America and traveled down through an opening in the ice sheet near the present-day Dakotas. The ice-free corridor is hypothesized to have opened as early as 14,800 years ago, well after the date of the oldest artifacts found at Cooper's Ferry, Davis said.

"Now we have good evidence that people were in Idaho before that corridor opened," he said. "This evidence leads us to conclude that early peoples moved south of continental ice sheets along the Pacific coast."

Davis's team also found tooth fragments from an extinct form of horse known to have lived in North America at the end of the last glacial period. These tooth fragments, along with the radiocarbon dating, show that Cooper's Ferry is the oldest radiocarbon-dated site in North America that includes artifacts associated with the bones of extinct animals, Davis said.

The oldest artifacts uncovered at Cooper's Ferry also are very similar in form to older artifacts found in northeastern Asia, and particularly, Japan, Davis said. He is now collaborating with Japanese researchers to do further comparisons of artifacts from Japan, Russia and Cooper's Ferry. He is also awaiting carbon-dating information from artifacts from a second dig location at the Cooper's Ferry site.

"We have 10 years' worth of excavated artifacts and samples to analyze," Davis said. "We anticipate we'll make other exciting discoveries as we continue to study the artifacts and samples from our excavations."

Credit: 
Oregon State University

Oxygen depletion in ancient oceans caused major mass extinction

image: For years, scientists struggled to connect a mechanism to this mass extinction, one of the 10 most dramatic ever recorded in Earth's history. Now, researchers from Florida State University have confirmed that this event, referred to by scientists as the Lau/Kozlowskii extinction, was triggered by an all-too-familiar culprit: rapid and widespread depletion of oxygen in the global oceans.

Image: 
Stephen Bilenky

Late in the prehistoric Silurian Period, around 420 million years ago, a devastating mass extinction event wiped 23 percent of all marine animals from the face of the planet.

For years, scientists struggled to connect a mechanism to this mass extinction, one of the 10 most dramatic ever recorded in Earth's history. Now, researchers from Florida State University have confirmed that this event, referred to by scientists as the Lau/Kozlowskii extinction, was triggered by an all-too-familiar culprit: rapid and widespread depletion of oxygen in the global oceans.

Their study, published today in the journal Geology, resolves a longstanding paleoclimate mystery, and raises urgent concerns about the ruinous fate that could befall our modern oceans if well-established trends of deoxygenation persist and accelerate.

Unlike other famous mass extinctions that can be tidily linked to discrete, apocalyptic calamities like meteor impacts or volcanic eruptions, there was no known, spectacularly destructive event responsible for the Lau/Kozlowskii extinction.

"This makes it one of the few extinction events that is comparable to the large-scale declines in biodiversity currently happening today, and a valuable window into future climate scenarios," said study co-author Seth Young, an assistant professor in the Department of Earth, Ocean and Atmospheric Science.

Scientists have long been aware of the Lau/Kozlowskii extinction, as well as a related disruption in Earth's carbon cycle during which the burial of enormous amounts of organic matter caused significant climate and environmental changes. But the link and timing between these two associated events -- the extinction preceded the carbon cycle disruption by more than a hundred thousand years -- remained stubbornly opaque.

"It's never been clearly understood how this timing of events could be linked to a climate perturbation, or whether there was direct evidence linking widespread low-oxygen conditions to the extinction," said FSU doctoral student Chelsie Bowman, who led the study.

To crack this difficult case, the team employed a pioneering research strategy.

Using advanced geochemical methods including thallium isotope, manganese concentration, and sulfur isotope measurements from important sites in Latvia and Sweden, the FSU scientists were able to reconstruct a timeline of ocean deoxygenation with relation to the Lau/Kozlowskii extinction and subsequent changes to the global carbon cycle.

The team's new and surprising findings confirmed their original hypothesis that the extinction record might be driven by a decline of ocean oxygenation. Their multiproxy measurements established a clear connection between the steady creep of deoxygenated waters and the step-wise nature of the extinction event -- its start in communities of deep-water organisms and eventual spread to shallow-water organisms.

Their investigations also revealed that the extinction was likely driven in part by the proliferation of sulfidic ocean conditions.

"For the first time, this research provides a mechanism to drive the observed step-wise extinction event, which first coincided with ocean deoxygenation and was followed by more severe and toxic ocean conditions with sulfide in the water column," Bowman said.

With the oxygen-starved oceans of the Lau/Kozlowskii extinction serving as an unnerving precursor to the increasingly deoxygenated waters observed around the world today, study co-author Jeremy Owens, an assistant professor in the Department of Earth, Ocean and Atmospheric Science, said that there are still important lessons to be learned from ecological crises of the distant past.

"This work provides another line of evidence that initial deoxygenation in ancient oceans coincides with the start of extinction events," he said. "This is important as our observations of the modern ocean suggest there is significant widespread deoxygenation which may cause greater stresses on organisms that require oxygen, and may be the initial steps towards another marine mass extinction."

Credit: 
Florida State University

Researchers determine pollen abundance and diversity in pollinator-dependent crops

image: Ramesh Sagili, Oregon State University associate professor of apiculture and Extension specialist, examines honeybees in Madras, Oregon.

Image: 
Lynn Ketchum, Oregon State University

CORVALLIS, Ore. - A new study provides valuable insights into pollen abundance and diversity available to honeybee colonies employed in five major pollinator-dependent crops in Oregon and California, including California's massive almond industry.

The study, a collaboration between OSU and Texas A&M University, found that almond, cherry and meadowfoam provide ample pollen to honeybees, but highbush blueberry and hybrid carrot seed crops may not. In addition, California almonds don't provide as much pollen diversity as other crops, according to the findings, published in the Journal of Economic Entomology.

The western honeybee is the major pollinator of fruit, nut, vegetable and seed crops that depend on bee pollination for high quality and yield. The findings are important because both pollen abundance and diversity are critical for colony growth and survival of the western honeybee, said study corresponding author Ramesh Sagili, associate professor of apiculture and honeybee Extension specialist in OSU's College of Agricultural Sciences.

"Pollen diversity is important for the growth and development of bees, and low amounts of pollen availability to honeybee colonies can dramatically affect brood rearing," Sagili said. "Beekeepers that employ their colonies for pollination of crops like hybrid carrot seed and highbush blueberry should frequently assess the amount of pollen stores in their colonies and provide protein supplements if pollen stores are low."

Nectar and pollen provide essential nutrients for honeybees. A honeybee colony's protein source is pollen, which has varying amounts of amino acids, lipids, vitamins and minerals. These nutrients obtained from pollen are essential for honeybee larval development. Pollen largely contributes to the growth of fat bodies in larvae and egg development in the queen.

Well-nourished individuals in a honeybee colony are able to withstand the effects of other stressors such as parasites and insecticides, in addition to the long-distance transport of colonies known as "migratory management." Bees are trucked across the country to pollinate various cropping systems - more than 1 million hives are transported to California each year just to pollinate almonds.

A diet low in pollen diversity hurts a colony's defense system, which consequently increases disease susceptibility and pesticide sensitivity. During critical crop bloom periods, growers rent large numbers of honeybee colonies to pollinate their crops. Approximately 2.5 million commercially managed honeybee colonies are used for crop pollination in the United States every year.

Some cropping systems may put bees at risk for temporary nutritional deficiency if the crop plant's pollen is deficient in certain nutrients and bees are unable to find an alternative source of these nutrients, Sagili said.

"It's crucial for beekeepers and crop producers to understand the pollen abundance and diversity that honeybees encounter during crop pollination," he said, adding that blueberry and hybrid carrot seed producers can mitigate nutritional deficiencies by providing supplemental food or forage, including commercially available protein supplements for bees.

Renting colonies to growers for pollination services is a significant source of income for commercial beekeepers, but it also requires them to repeatedly transport the colonies between crops throughout the growing season. In this study, the research team collaborated with 17 migratory commercial beekeepers for pollen collection from honeybee colonies in five different cropping systems from late February to August of 2012.

They installed pollen traps on at least five colonies at each site and collected pollen from the colonies at the height of the blooming season.

They found that California's vast almond footprint - 1 million acres and counting - provides more than enough pollen for the nearly 2 million honeybee colonies employed to pollinate the orchards, but pollen diversity was low when compared with other crops.

"We think the reason for that is almonds bloom early in the year when there are so few plant species in bloom, so bees have few other forage options and primarily rely on almond pollen," Sagili said. "There are parts of the northern and southern ends of California's San Joaquin Valley where there are no other crops in bloom when almond trees bloom, which may further contribute to poor availability of diverse pollen."

Credit: 
Oregon State University

Reconstructing Anak Krakatau flank collapse that caused Dec. 2018 Indonesian tsunami

image: Satellite imagery showing the evolving geomorphology of Anak Krakatau (Indonesia) as a result of the December 2018 through January 2019 eruptive activity and the 22 December 2018 tsunami. (A, B) Island morphology before the flank failure. Image in panel C was captured only 8 hours after the tsunami, and shows the western flank failure and collapse of the summit. (D) The destruction of the summit. (E) The subsequent regrowth of the island. (F) Changes in island surface area through this period. A and F are European Space Agency satellite Sentinel-2A true-color images, and panels B-E are Sentinel-1A and Sentinel-1B SAR backscatter images. Arrows show the radar look direction.

Image: 
Williams et al., <i>Geology</i>

Boulder, Colo., USA: A new study published in Geology presents the detailed observation of a tsunami-generating volcano collapse by remote sensing. The paper by Rebecca Williams of the University of Hull and colleagues analyzes the 2018 collapse of Anak Krakatau, which triggered a tsunami that claimed over 430 lives and devastated coastal communities along the Sunda Strait, Indonesia.

This collapse was captured in unprecedented detail by satellite remote sensing, providing an opportunity to understand the collapse of the volcano in a way that has not previously been possible at any volcanic island in the world. The team's analysis shows that the catastrophic tsunami was actually caused by a relatively small landslide--an observation with important implications for communities affected by volcanoes and those responsible for assessing their hazards.

Using a suite of observational data, Williams and colleagues' paper reconstructs the eruptive activity of Anak Krakatau before, during, and after the flank collapse.

They find that the volcano was in a normal eruptive state before the flank collapse, but that the collapse changed the style of the continuing eruption, resulting in a reconfiguration of the magmatic plumbing system of the volcano, which allowed water to enter the system. This in turn caused the eruption to switch to a much more explosive, phreatomagmatic style. This subsequent activity caused the actual destruction of the volcano's summit and the destructive remodeling of the landscape that can be observed in more recent satellite images.

"It is important to not overestimate the tsunami-generating flank collapse volume by not recognizing the volcanic eruption as the main cause of the dramatic geomorphological changes seen in late December when good true colour satellite images were first available, rather than the synthetic-aperture radar (SAR) image we use immediately after the event," says Williams.

Williams and colleagues conclude that the 2018 Sunda Strait tsunami was generated by an unexceptional eruption, which is an unexpected result.

Small flank failures that cause large tsunamis represent a vastly underappreciated geohazard--current tsunami monitoring systems do not monitor for this kind of volcanic activity, instead focusing on large earthquakes or proxies related to unusual increases in magma intrusion. The paper demonstrates the rapid first analyses that can be carried out with low-latency, freely available satellite observations and no fieldwork, to inform hazard analyses and risk-mitigation strategies in the short term.

Credit: 
Geological Society of America

Understanding probiotic yeast

Researchers led by Prof. Johan Thevelein (VIB-KU Leuven Center for Microbiology) have discovered that Saccharomyces boulardii, a yeast with probiotic properties, produces uniquely excessive amounts of acetic acid, the main component of vinegar. They were also able to find the genetic basis for this trait, which allowed them to modify the acetic acid production of the yeast. If this unique S. boulardii trait can be further validated to have a probiotic effect in animal models, these results could provide the first genetic basis for S. boulardii's unique probiotic potency. The study is published in Genome Research.

A tale of mysterious yeast

In 1923, the French scientist Henri Boulard isolated a mysterious yeast strain from lychees in Southeast Asia. This yeast turned out to have unexpected and potent probiotic properties. Called Saccharomyces boulardii, it has since been commercialized for the treatment of diarrhea and other intestinal diseases, and is now sold in pharmacies all over the world under a wide range of trade names.

Recent whole-genome DNA sequence analysis showed that S. boulardii is closely related to the much better-known Saccharomyces cerevisiae, the yeast species whose different varieties are commonly used in baking, beer brewing, wine making, bioethanol production, etc. The DNA sequences of these two yeasts are actually so similar that S. boulardii is no longer considered a separate species but a variety of S. cerevisiae. Why S. boulardii has been so successful as a probiotic, as opposed to the common S. cerevisiae yeasts, has remained a complete mystery.

The vinegar mutations

The team led by Prof. Johan Thevelein (VIB-KU Leuven) found that the production of acetic acid, the main ingredient of vinegar, is a distinguishing feature of Saccharomyces boulardii. Acetic acid is a well-known preservative and strongly inhibits the growth of microorganisms. But how does S. boulardii produce such large amounts of acetic acid?

Time for a genetic investigation, as Prof. Thevelein explains: "We were able to find two unique mutations in S. boulardii that are responsible for the production of acetic acid. These mutations can act as a genetic 'fingerprint' that allows us to distinguish between these two types of yeast and allows the isolation and identification of new S. boulardii strains from nature."

Based on this knowledge, the researchers were able to use CRISPR/Cas genome editing to abolish acetic acid production completely, as well as to switch high acetic acid producers into very high producers and vice versa. These modified yeast strains can now be used to test the importance of acetic acid production for the probiotic power of S. boulardii in laboratory animals, which, in turn, may pave the way towards improved treatments for intestinal diseases.

Credit: 
VIB (the Flanders Institute for Biotechnology)

Changing treatment practices for alcohol use disorder could save lives

TORONTO, August 29, 2019 - Treatment practices in Canada and abroad need to change in order to help more people with alcohol use disorder, according to a CAMH-led article just published in The Lancet.

More than 1 million Canadians have an alcohol use disorder in any given year, but the vast majority never receive professional help. Despite interventions for alcohol use disorders being effective and--if performed according to current guidelines--cost-saving, they are rare in Canada and elsewhere in the world.

According to senior author Dr. Jürgen Rehm, Senior Scientist at the Institute for Mental Health and Policy Research at the Centre for Addiction and Mental Health (CAMH), improved and routine screening should start in primary care and should be followed by accessible specialized care when required.

"Clinical interventions for alcohol use disorders need to start at the primary care level, if average consumption exceeds more than two drinks a day," said Dr. Rehm, "General practitioners should regularly be asking their patients about alcohol intake, and initiate interventions if required."

Stigma is one of the main reasons for a lack of intervention in primary care. While stigmatization of other mental disorders--for instance for major depression--has markedly improved over the last decades, no such improvement has been seen for alcohol use disorders. Stigma may lead patients to conceal their heavy alcohol consumption, and general practitioners may fail to ask them about it.

If a disorder is detected, safe and effective medications are available for use in primary health care. Co-author Markus Heilig, an international expert on the pharmacology of addictive disorders and director of the Center for Social and Affective Neuroscience at Linköping University, Sweden, adds: "Approved medications for alcohol use disorders are no less effective than other widely used medical treatments. They are also safe, well tolerated. And they are cheap. Yet they are only prescribed to a small minority of patients. This needs to change."

Two additional best practices can help ensure that specialized treatment is effective, according to the authors: wait lists for specialist treatment should be minimal and primary care providers should be involved in the patient's after care.

Credit: 
Centre for Addiction and Mental Health

Gut microbiota linked to organ damage in patients with sepsis

Sepsis is a serious condition that can result in organ failure and even death. A novel human study published in The FASEB Journal demonstrates for the first time that the gut microbiota of patients with sepsis plays a major role in organ damage.

To conduct the experiment, researchers first compared the fecal microbial composition of two human groups: people who had sepsis and those who did not. They observed that the gut microbiota was altered at both functional and compositional levels in the first group, compared with that of the second group.

The researchers then transplanted the feces to recipient mice, induced sepsis in the mice, and checked the mice's organ injuries. The mice transplanted with feces from the first group showed more severe liver damage than the mice transplanted with feces from the second group, even though all the mice had been induced with sepsis. These initial findings suggest that targeting intestinal microbiota may help people recover from sepsis.

"Keeping healthy microbiota in your gut is important to maintaining normal immune status and combatting diseases like sepsis," said Peng Chen, PhD, a professor in the department of pathophysiology at Southern Medical University in Guangzhou, China. "With further study, transplantation of healthy feces may one day serve as a novel approach for treating sepsis in intensive care units."

"The microbiome is one of the most penetrating fields in the modern era of human homeostasis and pathology, and this study is yet one more exciting new dimension," said Thoru Pederson, PhD, Editor-in-Chief of The FASEB Journal.

Credit: 
Federation of American Societies for Experimental Biology

Cooper's Ferry archaeological finds reveal humans arrived more than 16,000 years ago

Archaeological discoveries from the Cooper's Ferry site in western Idaho indicate that humans migrated to and occupied the region by nearly 16,500 years ago. The findings push back the timing of human settlement in the Americas to a period predating the appearance of an ice-free corridor linking Beringia and the rest of North America, and they support the growing notion that the very first Americans likely landed upon the shores of the Pacific coast. How and when human populations first arrived and settled in the Americas remains debated. A longstanding and influential hypothesis proposes that travelers initially entered North America and parts beyond from eastern Beringia, by way of a deglaciated ice-free corridor that separated the Cordilleran and Laurentide ice sheets approximately 14,800 years ago. However, a small but growing body of research has shown that human populations were present and likely well-established south of the Late Pleistocene ice sheets long before such a passage existed; its proponents hypothesize a Pacific coastal migration route. Loren Davis and colleagues present new findings from Cooper's Ferry that provide evidence of repeated occupation beginning between 16,560 and 15,280 years ago. Artifacts recovered from the site's earliest contexts indicate the use of unfluted and stemmed stone projectile point technologies before the use of fluted, broad-based points of the widespread Clovis Paleoindian Tradition. According to Davis et al., the age, design and manufacture of Cooper's Ferry's distinctive stemmed points closely resemble features of artifacts found in Late Pleistocene archaeological sites in northeastern Asia. The results suggest an initial migration along the Pacific coast more than 16,000 years ago.

Credit: 
American Association for the Advancement of Science (AAAS)

Humans were changing the environment much earlier than believed

image: A) Onsets represent the earliest time step assessed at the "common" prevalence level (1-20% land area) for extensive agriculture, intensive agriculture, and pastoralism; the earliest time step assessed as "present" for urbanism. B) Decline represents the latest time step assessed at the "common" prevalence level for foraging.

Image: 
Reprinted with permission from: ArchaeoGLOBE Project, SCIENCE, August 30 2019 (DOI: 10.1126/science.aax1192)

Humans' dramatic transformation of Earth's landscape is no recent phenomenon, according to a new study published in Science.

The research, which assessed global land use from 10,000 to 170 years ago, reveals that hunter-gatherers, farmers and pastoralists had made significant alterations to the planet by 4,000 years ago, much earlier than indicated by Earth scientists' previous land-use reconstructions.

"Understanding how humans interact with the environment over the long-term past is one of the best things we can do to help us understand how people will deal with this in the future," says Michael Barton, co-author and a professor at Arizona State University's School of Human Evolution and Social Change. "We're not starting from zero. We're starting from a long history."

Researchers can look for evidence of whether ancient peoples' actions benefited or harmed biodiversity and allowed them to reside sustainably or not in an area for a long amount of time. Studying their environmental successes and failures can give a better idea of how to create positive change as humans continue to reshape the planet.

The study also has implications for the Earth system models used to predict future human environmental impact. Accurate predictions rely on comparing the present to the past -- and the data currently representing Earth's past that is used for those models underestimates human impact.

Suspecting this, the authors set out to gather richer, globalized data from those who know humanity's past best -- archaeologists. They began a crowdsourcing effort, called the ArchaeoGLOBE Project, by sending a massive survey to scholars whose expertise covered areas all over the world. In all, 255 respondents filled out over 700 regional questionnaires, which provided the information for the study.

"Many people have realized for some time now that the study of long-term human-environment interactions must include archaeological knowledge, but our research and dataset really open the door to this sort of collaboration at global scale for the first time," says Lucas Stephens, lead author of the paper and recent doctoral graduate from the University of Pennsylvania.

"Our aggregate knowledge paints a surprisingly clear, globally coherent picture," says Nicolas Gauthier, co-author and recent doctoral graduate from Arizona State University.

That picture shows that shifting cultivation and pastoralism had affected over 40% of Earth's land area by 4,000 years ago. It also reveals that continuous cultivation was common to widespread over most of the planet by 2,000 years ago, over 1,000 years earlier than indicated by today's most widely referenced land-use study, the History Database of the Global Environment, known as HYDE.

Archaeologists reported on some regions more than others for the ArchaeoGLOBE Project, reflecting the intensity of research in different areas. But by highlighting these gaps, the study helps scientists prioritize their data gathering and provides a starting point for them to continue investigating land use over time.

"Our hope is that it will push the field forward in a way that would not have been possible had everyone worked in isolation," Gauthier says.

Credit: 
Arizona State University

Nanostructured material with potential for use in catalysts

Titanium dioxide (TiO2) nanofibers can have various applications, such as in catalysts and filters. When TiO2 is excited by ultraviolet light, it degrades organic material. Hence, TiO2 can be applied to filter wastewater for reuse, for example.

A new method of fabricating these fibers has been developed in Brazil by Rodrigo Savio Pessoa and Bruno Manzolli Rodrigues, researchers at the Aeronautical Technology Institute's Plasma and Process Laboratory (LPP-ITA) and the Science and Technology Institute of Universidade Brasil (ICT-UB), as part of a project supported by São Paulo Research Foundation - FAPESP. An article on the subject has been published in Materials Today: Proceedings.

"The technique we used is called atomic layer deposition. It promotes growth of the material layer by layer, or even molecule by molecule," Pessoa told.

In the study, TiO2 was deposited on nanofibers of PBAT (poly(butylene adipate-co-terephthalate)), a biopolymer that degrades rapidly in nature, unlike PET (polyethylene terephthalate), which remains intact for decades.

The first step was to produce a membrane of PBAT nanofibers, which was done by electrospinning, a technique similar to that used to make cotton candy, but involving an electrostatic procedure.

"A PBAT solution was electrospun to produce ultrathin nanofibers only a few hundred nanometers thick. These fibers made up the sheet used as a substrate," Pessoa said.

The next step was to coat each fiber with TiO2. "Atomic layer deposition uses precursors of the material of interest, produced from a gas or from a liquid that is rapidly evaporated at low pressure. In this case, we used titanium tetrachloride (TiCl4) and water (H2O) as precursors. This was done in a vacuum chamber heated to between 100 °C and 150 °C," he explained.

The TiCl4 was released in successive pulses of 0.25 seconds. When released in a vacuum, TiCl4 quickly evaporates and reacts with the surface of the fibers, binding to hydroxyl radicals (OH-) and oxygen radicals (O2-) present in the material.

Because TiCl4 does not react with itself, the initial pulse filled only one layer, which was then oxidized with water vapor. Hydrogen bound to the chlorine and oxygen bound to the titanium, forming the first monolayer of TiO2.

This procedure was repeated approximately 1,000 times, building up the TiO2 structure layer by layer. To remove the PBAT substrate and free the TiO2 nanotubes, the material was heated to 900 °C in a controlled manner. The result was a sheet of TiO2 nanotubes with a thickness of approximately 100 nanometers.
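
Because each ALD cycle is self-limiting, film thickness grows essentially linearly with the number of cycles. The Python sketch below illustrates that arithmetic, assuming a typical growth per cycle for TiCl4/H2O ALD of TiO2 of roughly 0.05 nm; this figure is an assumption for illustration, not a value reported in the article.

```python
# Assumed growth per cycle (nm) for TiCl4/H2O ALD of TiO2; typical
# literature values are on the order of 0.05 nm, but this is not
# a number taken from the study itself.
GROWTH_PER_CYCLE_NM = 0.05

def ald_thickness(cycles: int, gpc_nm: float = GROWTH_PER_CYCLE_NM) -> float:
    """Film thickness (nm) after a given number of self-limiting ALD cycles."""
    return cycles * gpc_nm

# One pulse pair (TiCl4, then H2O) deposits one monolayer; repeating the
# pair ~1,000 times, as in the study, builds the coating layer by layer.
for cycles in (100, 500, 1000):
    print(f"{cycles:>5} cycles -> {ald_thickness(cycles):.0f} nm")
```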

"The deposition technique is based on surface reactions and therefore results in an even coating, covering the fibers one by one. It's relatively simple but requires automation so that the amount of material and dispersal time are rigorously controlled," Pessoa said.

As a material for filtration, the sheet of TiO2 nanotubes combines the mechanical virtue of blocking particles larger than a specific size with the photochemical virtue of generating radicals that easily degrade organic matter when irradiated with UV light. Because the sheet is made of nanofibers, it has a large surface area, which considerably increases the reaction rate.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo