Earth

Biologists find invasive snails using new DNA-detection technique

image: Biologists led by the University of Iowa used a special technique called eDNA to discover an invasive species of tiny snails in streams in central Pennsylvania where the snails' presence had been unknown. The invasive New Zealand mud snail has spread to the Eastern Seaboard after arriving in the western United States decades ago.

Image: 
Edward Levri, Pennsylvania State University-Altoona

Invasive species, beware: Your days of hiding may be ending.

Biologists led by the University of Iowa discovered the presence of the invasive New Zealand mud snail by detecting their DNA in waters they were inhabiting incognito. The researchers employed a technique called environmental DNA (eDNA) to reveal the snails' existence, showing the method can be used to detect and control new, unknown incursions by the snail and other invasive species.

"eDNA has been used successfully with other aquatic organisms, but this is the first time it's been applied to detect a new invasive population of these snails, which are a destructive invasive species in fresh waters around the world," says Maurine Neiman, associate professor in the Department of Biology and the study's co-author. "eDNA can be used to find organisms at really early stages of invasion, so it can detect a population even when there are so few of the organisms that traditional methods would never find them."

The biologists traveled to central Pennsylvania seeking evidence of the presence of the mud snail, which for decades has been spreading in fresh waters in the continental United States, beginning in the Northwest, moving to the Great Lakes, and now migrating along the Eastern Seaboard. The tiny aquatic snails' population densities can balloon to more than 500,000 individuals in a square yard, covering the water bottom and crowding out native species.

The researchers collected samples from eight sites spread across six rivers in the Susquehanna River watershed, which feeds into Chesapeake Bay and the Mid-Atlantic watershed. Six of the sites had no reported cases of the mud snail, despite physical surveys, while the other two locations had not been studied.

The researchers used the eDNA technique to look for DNA the snails would leave as tracers in sloughed-off skin cells or bodily waste. They discovered the snails were there, after all: The eDNA results confirmed the mud snails were at one site where none had been detected previously, and were likely at low population levels at other sites as well.

"This study presents an important step forward in demonstrating that eDNA can be successfully applied to detect new P. antipodarum invasions and will allow us to more accurately track and potentially halt ongoing range expansion of this destructive invasive species," wrote James Woodell, a research support technician at University of Hawai?i at Mānoa who performed the research while a master's student in biology at Iowa and is the study's corresponding author.

The eDNA technique was developed less than a decade ago. It has been used to ferret out invasive species, including fish, frogs, and crustaceans, in aquatic ecosystems. For this study, the biologists refined the filtering protocols from an existing eDNA sampling system for mud snail detection and tested it for the first time in the field.

Credit: 
University of Iowa

Unprecedented data sharing driving new rare disease diagnoses in Europe

image: Sergi Beltran and Leslie Matalonga pictured in front of a supercomputer and servers that host the RD-Connect GPAP platform. The platform is located at the CNAG-CRG facilities in the Parc Científic de Barcelona.

Image: 
Centro Nacional de Análisis Genómico (CNAG-CRG)

Rare disease experts detail the first results of an unprecedented collaboration to diagnose people living with unsolved cases of rare diseases across Europe. The findings are published today in a series of six papers in the European Journal of Human Genetics.

In the main publication, an international consortium, known as Solve-RD, explains how the periodic reanalysis of genomic and phenotypic information from people living with a rare disease can boost the chance of diagnosis when combined with data sharing across European borders on a massive scale. Using this new approach, a preliminary reanalysis of data from 8,393 individuals resulted in 255 new diagnoses, some with atypical manifestations of known diseases.

A complementary study describes the method in more detail and four accompanying case studies showcase the advantages of the approach. In one case study, researchers used the method to identify a new genetic form of pontocerebellar hypoplasia type 1 (PCH1), a genetic disease that affects the development of the brain. PCH1 is normally linked to mutations in four known genes. The researchers used the method to identify a new variant in a fifth gene.

In another case study, researchers used the method on an individual with a complex neurodevelopmental disorder and found the disease was caused by a new genetic variant in mitochondrial DNA. This had previously gone undetected because the patient did not present typical symptoms of a mitochondrial disorder. The diagnosis will help tailor treatment for the individual, as well as inform family members about the possibility of passing the variant on to future generations.

Key to the reanalysis of unsolved cases is the RD-Connect Genome-Phenome Analysis Platform, which is developed, hosted and coordinated by the Centro Nacional de Análisis Genómico (CNAG-CRG), part of the Centre for Genomic Regulation (CRG), based in Barcelona.

Recognised officially by the International Rare Diseases Research Consortium and funded by the EU, Spanish and Catalan governments, the RD-Connect GPAP provides authorised clinicians and researchers with secure and controlled access to pseudonymised genomic data and clinical information from patients with rare diseases. The platform enables the secure, fast and cost-effective automated reanalysis of data from the thousands of undiagnosed patients and relatives entering the Solve-RD project.

According to Sergi Beltran, co-leader of Solve-RD data analysis and Head of the Bioinformatics Unit at CNAG-CRG, "Solve-RD has shown that it is possible to securely share large amounts of genomics data internationally for the benefit of the patients. The work we are publishing today is just the tip of the iceberg, since many more patients are being diagnosed thanks to the innovative methods developed and applied within Solve-RD".

An estimated 30 million people in Europe are affected by a rare disease during their lifetime. More than 70% of rare diseases have a genetic cause. However, around 50% of patients with a rare disease remain undiagnosed even in advanced expert clinical settings that use techniques such as genome sequencing.

At the same time, scientists around the world are finding an average of 250 new gene-disease associations and 9,200 variant-disease associations per year. As scientific understanding expands, reanalysing data periodically can help people receive a diagnosis.
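To illustrate the idea of periodic reanalysis in the simplest possible terms, the sketch below re-screens the stored candidate variants of unsolved cases each time the catalogue of known disease genes grows. It is a hypothetical toy example, not the RD-Connect GPAP or Solve-RD pipeline; the gene names, case identifiers and data structures are invented for illustration.

```python
# Hypothetical sketch of periodic reanalysis (not the RD-Connect GPAP or
# Solve-RD software): candidate variants stored for unsolved cases are
# re-screened whenever the catalogue of known disease genes is updated.
from typing import Dict, List, Set

# Genes carrying candidate variants in each unsolved case (invented examples).
unsolved_cases: Dict[str, List[str]] = {
    "case_001": ["GENE_A", "GENE_B"],
    "case_002": ["GENE_C"],
}

def reanalyse(cases: Dict[str, List[str]], disease_genes: Set[str]) -> Dict[str, List[str]]:
    """Return, for each case, the candidate genes that are now known disease genes."""
    return {
        case_id: matches
        for case_id, genes in cases.items()
        if (matches := [g for g in genes if g in disease_genes])
    }

# First pass: none of the stored candidate genes are known disease genes yet.
print(reanalyse(unsolved_cases, disease_genes={"GENE_D"}))                # -> {}
# A later pass: GENE_B has since been linked to a disease; reanalysis flags case_001.
print(reanalyse(unsolved_cases, disease_genes={"GENE_B", "GENE_D"}))      # -> {'case_001': ['GENE_B']}
```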

The consortium, which consists of more than 300 researchers and clinicians in fifteen countries who collectively see more than 270,000 rare disease patients each year, aims to eventually diagnose more than 19,000 unsolved cases of rare diseases with an unknown molecular cause. Their preliminary findings are an important first step towards a European-wide system to facilitate the diagnosis of rare diseases, which can be a long and arduous process.

Credit: 
Center for Genomic Regulation

NUS researchers develop novel technique to automate production of pharmaceutical compounds

image: Dr. Liu Chenguang (left) and Assistant Professor Wu Jie (right) are part of the NUS team that developed the automated technique to produce pharmaceutical compounds.

Image: 
National University of Singapore

Singapore, 1 June 2021 - The discovery and development of new small-molecule compounds for therapeutic use involves a huge investment of time, effort and resources. Giving a new spin to conventional chemical synthesis, a team of researchers from the National University of Singapore (NUS) has developed a way to automate the production of small molecules suitable for pharmaceutical use. The method can potentially be used for molecules that are typically produced via manual processes, thereby reducing the manpower required.

The research team that achieved this technological breakthrough was led by Assistant Professor Wu Jie from the NUS Department of Chemistry as well as Associate Professor Saif A. Khan from the NUS Department of Chemical and Biomolecular Engineering.

Demonstrating the novel technique on prexasertib, a pharmaceutical molecule used in cancer treatment, the NUS team achieved a fully automated six-step synthesis with 65 per cent isolated yield within 32 hours. In addition, their technique also successfully produced 23 prexasertib derivatives in an automated fashion, signifying the method's potential for drug discovery and design.

The findings, which were first published in the journal Nature Chemistry on 19 April 2021, can potentially be applied to the production of a wide range of pharmaceutical molecules.

Simplifying the production of pharmaceutical compounds

Recent advances in end-to-end continuous-flow synthesis are rapidly expanding the capabilities of automated syntheses of small-molecule pharmaceutical compounds in flow reactors. There are well-defined production methods for molecules such as peptides and oligonucleotides which have repeating functional units. However, it is challenging to conduct multi-step continuous-flow synthesis of active pharmaceutical ingredients due to issues such as solvent and reagent incompatibilities.

The new automated technique developed by the NUS research team combines two chemical synthesis techniques. These comprise continuous-flow synthesis, where chemical reactions are carried out in a seamless process, and solid-supported synthesis, in which molecules are chemically bonded and grown onto an insoluble support material.

Their novel technique, called solid phase synthesis-flow, or SPS-flow, enables the target molecule to be developed on a solid supporting material as the reaction reagent flows through a packed-bed reactor. The entire process is controlled by computer automation. Compared to existing automated techniques, the SPS-flow method enables wider reaction patterns and longer linear end-to-end automated synthesis of pharmaceutical compounds.

The researchers tested their technique on the cancer-inhibiting molecule prexasertib because of its suitability for attachment to the solid resin used as the support material. Their experiments showed a yield of 65 per cent after 32 hours of continuous automated execution. This is an improvement over the existing method of producing prexasertib, which is estimated to take around a week and requires an extensive six-step manual process and purification procedure to produce a yield of up to 50 per cent.

The new method also allows for synthetic modifications early in the process, hence enabling greater structural diversification compared to traditional methods which only allow late-stage diversification of a molecule's common core structure. Using a computer-based chemical recipe file, the team successfully produced 23 derivative molecules of prexasertib. The derivatives produced are molecules with parts of the molecular structure differing slightly from the original molecule.

"The capability to easily obtain these derivatives is crucial during the drug discovery and design process as understanding the relationship between molecule structures and their activities play an important role for the selection of promising clinical candidates," explained Assoc Prof Khan.

Creating new possibilities in drug development

The NUS team plans to further showcase their SPS-flow technique's versatility by conducting more research incorporating top-selling pharmaceutical molecules.

"Our new technique presents a simple and compact platform for on-demand automated synthesis of a drug molecule and its derivatives. We estimate that 73 per cent of the top 200 bestselling small-molecule drugs could be produced using this technique," said Asst Prof Wu.

Future studies undertaken by the team will target the development of a fully automated and portable system for active pharmaceutical ingredient production at a larger scale suitable for manufacturing. The system will apply the newly developed technique in lead optimisation to speed up the process of drug discovery.

Credit: 
National University of Singapore

Curtin study finds WA's natural 'museums of biodiversity' at risk

image: A biodiverse BIF in the Mid West

Image: 
Curtin University

Up to three quarters of the biodiversity living on Western Australia's iconic ironstone mountains (known as Banded Iron Formations) in the State's Mid West could be difficult or impossible to return quickly to its previous state after the landscape has been mined, a Curtin University study has found.

The research, published in Ecology and Evolution, found that the plant ecosystems are well adapted to the characteristics of the region's ancient and nutrient-poor soils - and that the very different features of mined landscapes mean many native species are unlikely to be returned by rehabilitation.

Lead researcher Dr Adam Cross from Curtin's School of Molecular and Life Sciences said the elevation and different habitats offered by Banded Iron Formations (BIF) in an otherwise dry, mostly flat landscape make them a sponge for biodiversity - but that their iron-rich rock makes them increasingly attractive to iron-ore miners.

"Unfortunately, the chemical characteristics of some tailings and other by-products produced by mines can be more similar to material on the moon than to the ancient, highly-weathered soils of BIF, and this presents a really challenging, hostile environment for many native plant species," Dr Cross said.

The Mid West region is known for its BIF ranges, which Dr Cross describes as stunning natural 'museums' that host much of the region's floristic biodiversity. He said almost every plant species from the surrounding landscape can be found on them - as well as some unique species found nowhere else.

"These collections of species have accumulated over very long periods of time, and the increased pressure to mine BIF is putting the biodiversity at risk. Once BIF are gone, that's it - we cannot recreate these iconic landforms, and our study suggests that, even if we could, the post-mining environment likely wouldn't support many of the species that used to call them home."

"BIF harbour such biodiversity because in periods where the climate has been hotter and drier, their rocky, complex soils offered a cooler, wetter refuge for many species that were unable to survive in the surrounding landscape.

"With climate change suggesting a hotter, drier outlook for the Mid West region in future decades, it is increasingly important that we preserve and conserve remaining BIF habitats and the species growing on them."

The research team looked at 538 plant species in an 82,000 hectare area in WA's Mid West, assessing their growth on different soil types across the region and examining their potential tolerance to the chemical characteristics of mined materials.

Although many species were adapted to the acidic, nutrient-poor soils of BIF, the team found at least some were tolerant of a wide range of soil types and might be used as 'pioneers' to help kick-start vegetation recovery in rehabilitation.

Dr Cross said more studies were needed to find ways to rapidly change the chemical characteristics of post-mining soils to speed up rehabilitation, and preserve the area's biodiversity.

"The mining industry needs to consider the soil properties of landforms requiring rehabilitation or ecological restoration, and the implications for vegetation establishment and plant community development, at the very earliest stages of planning or environmental impact assessment," Dr Cross said.

"Ecosystems are extremely complex; we need to recognise, appreciate and learn from this complexity when we are attempting to return biodiversity to areas that have been impacted by mining.

"We need to reach a happy medium between development and conservation to effectively continue mining in these areas, while preserving our incredible natural resources."

Credit: 
Curtin University

Protecting the intellectual abilities of people at risk for psychosis

image: Effects of SSRIs on brain and intellectual development. On the left: a brain map showing the regions of the brain protected by the prolonged administration of SSRIs, comprising a network of prefrontal and limbic structures. On the right: a plot displaying an increase in IQ scores in subjects treated with SSRIs, as opposed to the IQ decrease in subjects not treated with SSRIs.

Image: 
Valentina Mancini

One person in 2000 suffers from a microdeletion of chromosome 22 that can lead to the development of psychotic disorders, such as schizophrenia, in adolescence. In addition to symptoms such as hallucinations or delusions, psychotic disorders also come with a progressive decline in intelligence quotient (IQ). While current drug treatments are successful in containing psychotic symptoms, nothing can be done to prevent the deterioration of intellectual skills that leads to loss of autonomy. Researchers at the University of Geneva (UNIGE), Switzerland, have discovered that prescription of selective serotonin reuptake inhibitors (SSRIs) - a class of drugs used to treat anxiety and depression - in late childhood can reduce the deterioration of intellectual abilities and have a neuroprotective effect on some of the brain regions affected by the psychotic illness. This study, published in the journal Translational Psychiatry, opens up a new field of research and new hope for people affected by the microdeletion of chromosome 22.

The average IQ is around 100 points. However, for people who may develop a psychotic illness, such as those with a microdeletion of chromosome 22, the average drops to 70-80 points. "The problem is that when a psychotic disorder occurs, such as schizophrenia, the brain frontal lobe and the hippocampus are particularly affected, which leads to the gradual deterioration of already below-average intellectual capacities", explains Valentina Mancini, a researcher in the Department of Psychiatry at UNIGE Faculty of Medicine and first author of the study. From then on, the average IQ drops to around 65-70 points, leading to a loss of autonomy that requires a protected environment. "At present, drug treatments manage to contain psychotic symptoms, such as hallucinations, anxiety or distortion of reality, but there is no treatment that can reduce the deterioration of affected people's intellectual capacities", notes the Geneva researcher.

200 patients followed over a 20-year period reveal a possible solution

The team of Stéphan Eliez, professor in the Department of Psychiatry at UNIGE Faculty of Medicine, has been following 200 patients affected by the microdeletion of chromosome 22 for the past 20 years. "30 to 40% of them developed a schizophrenic psychotic disorder", he explains. "Thanks to this cohort, we found that people suffering from this syndrome lost 7 to 8 IQ points from childhood to adulthood. This figure rises to 15 IQ points for those who developed psychotic disorders."

Yet the physicians noted that two to three teenagers a year are exceptions, even gaining IQ points. Why? "We made a comprehensive analysis of these patients' medical data to identify any common features in the treatments prescribed to them by their GP", explains Valentina Mancini. Two observations caught their attention.

The first is the prescription of small, regular doses of SSRIs - drugs that increase the levels of serotonin, a neurotransmitter involved in the regulation of behaviour - in late childhood and throughout adolescence. "These drugs increase neurogenesis and act on synaptic plasticity. They are prescribed today to reduce anxiety and depressive symptoms", explains the Geneva researcher. And the earlier patients received this treatment, at around 10-12 years of age, the more the frontal lobe and the hippocampus - and therefore the intellectual capacities - were preserved from the deterioration caused by the psychotic illness. The second observation is that a neuroleptic drug - prescribed in small doses to control psychotic symptoms such as hallucinations or delusions - also seems to have a positive effect if added to SSRIs during adolescence. "These two medications, especially when combined, have thus preserved the anatomical structure of the brain affected by the degradation responsible for the decline in intellectual capacity", remarks Stéphan Eliez.

A promising discovery for the future of people at risk of psychosis

This study provides for the first time an indication of a neuroprotective preventive treatment for the development and preservation of IQ. "It should be stressed that too great a deterioration of intellectual skills progressively leads to a very problematic psychosocial dependence. Here, we could succeed in protecting this population", notes Stéphan Eliez.

Once the results of this study are confirmed, the effect of SSRIs could be tested on other types of patients and possibly prescribed preventively to people at risk of intellectual deterioration, such as individuals with other genetic syndromes like Fragile X or Down's syndrome, or children of schizophrenic parents. "We also want to investigate whether the 3% to 4% of adolescents in the general population who develop psychotic symptoms would see this risk reduced by taking this drug", continues Valentina Mancini.

The Geneva team will now compare the results obtained from their research cohort with international databases in order to confirm the neuroprotective role induced by these treatments prescribed at the end of childhood, adolescence being the critical phase for the onset of psychotic diseases.

Credit: 
Université de Genève

A novel nanometer-scale proximity labeling method targeting histidine residues

image: The research group found a novel protein chemical labeling reaction using singlet oxygen (1O2). Utilizing the short diffusion distance of 1O2 and a technique to localize the 1O2 generator, the research group demonstrated site-selective labeling of an antibody.

Image: 
Shinichi Sato

Researchers have created a new nanometer-scale proximity labeling system that targets histidine residues quickly, providing a new chemical tool in protein chemical modification.

The results of their research were published in the Journal of the American Chemical Society on April 27, 2021.

Protein chemical modification, a technology that introduces functions into the chemical structure of proteins through irreversible strong bonds, is used for the creation of protein-based biomaterials and for drug delivery systems.

In order to carry out modification, protein labeling is necessary. Proximity labeling is one of those techniques. It labels biomolecules located close to a protein of interest which can then also be marked and analyzed.

However, there are only a few chemical reactions that can be applied to protein chemical modification methods. Moreover, there have been very few reports on the selective modification of histidine residues.

In previous electrophilic approaches, the weak nucleophilic nature of histidine residues results in low selectivity for histidine over other, more nucleophilic amino acids.

A gaseous inorganic chemical known as singlet oxygen helped overcome this barrier. Singlet oxygen is a highly reactive chemical species with microsecond-scale lifetimes and nanometer-scale diffusion distances.

The research group employed nucleophiles to capture the electrophilic intermediates produced by the reaction of singlet oxygen with histidine residues. The high reactivity of singlet oxygen led to a rapid and complete reaction.

Conventional histidine labeling methods take several hours to chemically modify the histidine residues of proteins. Yet, this method modified the histidine residues in only a few minutes by visible light irradiation of the photocatalyst under physiological pH conditions.

Corresponding author Dr Shinichi Sato from the Frontier Research Institute for Interdisciplinary Sciences at Tohoku University says that their discovery has opened the door to protein analysis research using singlet oxygen. "Combined with conventional singlet oxygen production methods, this can potentially develop into a technology that clarifies unexplored intracellular signal transduction and protein-protein interactions."

Credit: 
Tohoku University

Study pinpoints key causes of ocean circulation change

image: Variability in ocean currents is influenced by multiple factors.

Image: 
Prof Helen Johnson

Researchers have identified the key factors that influence a vital pattern of ocean currents.

The Atlantic meridional overturning circulation (AMOC) carries warm water from the tropics northward.

Many scientists think that this heat transport makes areas including north-west Europe and the UK warmer than they would otherwise be.

Climate models suggest the AMOC is likely to weaken over the coming decades, with widespread implications for regional and global climate.

The new study - led by the universities of Exeter and Oxford, and published in Nature Geoscience - pinpoints the causes of monthly and annual AMOC variation and finds a differing picture at two key locations.

Observational data came from large arrays of monitoring equipment - off the coasts of Florida and Africa, and in the North Atlantic between Greenland and Scotland - run by the international RAPID and OSNAP projects.

"Understanding AMOC variability is challenging because the circulation is influenced by multiple factors that all vary and whose overlapping impacts persist for years," said lead author Dr Yavor Kostov, of the Department of Geography at the University of Exeter.

"Our findings reveal the vital role of winds in driving changes in this ocean circulation.

"Winds were a key factor both in the sub-tropical and sub-polar locations we examined.

"As the climate continues to change, more efforts should be concentrated on monitoring those winds - especially in key regions on continental boundaries and the eastern coast of Greenland - and understanding what drives changes in them."

While AMOC variability off the southern USA is dominated by the impact of winds, variability in the North Atlantic is generated by the combined effects of winds, heat and freshwater anomalies.

"Our reconstruction suggests that, compared to the subtropics, the overturning circulation in the subpolar North Atlantic is more sensitive to changes in the background ocean state such as shifts in the sites of deep convection," Dr Kostov said.

"This implies that future climate change may alter annual AMOC variability in this region. It emphasises the need for continued observations of the subpolar North Atlantic ocean."

The study also finds that changes in the surface temperature and salinity near Canada and Greenland can trigger a delayed remote impact on the Atlantic circulation as far south as Florida.

Credit: 
University of Exeter

How do plants hedge their bets?

image: Genetically identical Arabidopsis seedlings germinating at variable times.

Image: 
Katie Abley

In some environments there is no way for a seed to know for sure when the best time to germinate is.

In spring, cues like light, temperature and water may suggest to seeds that conditions are optimal for germination, but a week later an unpredictable drought or frost could kill the emerging seedlings.

So how does a plant make sure that all of its offspring are not killed at once by an ill-timed environmental stress following germination?

There is evidence that some plant species produce seeds that germinate at different times to hedge their bets against this risk. Many species produce seeds that can enter a dormant state and exist in the soil for several years and some also produce seeds that germinate at different times within a season.

This means that if lethal environmental fluctuations do occur, a fraction of a plant's offspring will survive as seeds, which can go on to germinate at another time.

This variability in germination time can be seen even with genetically identical seeds grown in an identical environment.

In agriculture, the variability in germination time can be a problem when you want to harvest the whole crop at the same time. Instead, farmers have to monitor their crops' maturity, taking measurements from multiple individual plants to estimate when the best time is to harvest.

Scientists at the Sainsbury Laboratory Cambridge University (SLCU) used the model plant, thale cress (Arabidopsis thaliana), to ask: What makes genetically identical seeds germinate at different times?

"We already know that two plant hormones - abscisic acid (ABA) that inhibits germination and gibberellic acid (GA) that promotes germination - interact with each other to control the decision to germinate, but we wanted to know how this interaction creates variability in germination times between seeds", said Dr Katie Abley, researcher at SLCU and joint first author of the research published in eLife.

"By measuring the levels of variability in germination times for hundreds of genetically distinct strains of Arabidopsis, we were able to identify two regions of DNA (genetic loci) that control how variable germination time is. Both loci contain genes that influence how sensitive seeds are to ABA and testing mutants of these genes provided evidence that they regulate variability in germination timing."

Using this new information, the researchers generated a mathematical model of the ABA-GA network to understand how the interactions between ABA and GA could cause a batch of identical seeds to have a range of germination times. They wanted to understand how the network could give rise to different levels of variability in germination time.

"We found that changing ABA sensitivity in the model replicated the experimental germination time distributions that we observed", explained Dr Pau Formosa-Jordan, joint first author and now research group leader at Max Planck Institute for Plant Breeding Research in Cologne, Germany.

"In the model, groups of seeds with higher sensitivity to ABA germinate in a more spread-out way because, upon sowing, each one of these seeds relies on stochastic fluctuations in the ABA-GA network to switch from a non-germination state to a germination state - which is known as a bistable switch behaviour. Yet, seeds with lower sensitivity to ABA more rapidly and synchronously get to the germination state after being sown, without the need for the stochastic fluctuations. Our stochastic model suggests an ABA-GA bistable switch can generate variability in germination times, with the germination time being influenced by stochastic fluctuations in the levels of hormones."

While the researchers expect there to be other genetic and biophysical effects at play that affect variability in germination time, their findings show that this plant trait is genetically controlled and high or low variability in germination times could, therefore, be specifically selected for in crop breeding programmes or to rehabilitate natural areas with highly variable environments.

Credit: 
University of Cambridge

Researchers discover how cells can survive in high salt concentrations

image: Image of a cell "ruptured" by a fire-polished micropipette

Image: 
UPF - IRB Barcelona

Cells have to constantly adapt to their surroundings in order to survive. A sudden increase in the environmental levels of an osmolyte, such as salt, causes cells to lose water and shrink. In a matter of seconds, they activate a mechanism that allows them to recover their initial water volume and avoid dying.

Finding out which genes are involved in surviving osmotic stress was the subject of a study led by the laboratories of Dr. Posas and Dr. de Nadal at the Institute for Research in Biomedicine (IRB Barcelona) and Dr. Valverde at Pompeu Fabra University (UPF), in collaboration with a group led by Dr. Moffat from the University of Toronto (Canada). Using genome-wide genetic screening, the scientists discovered the central role of a gene known as LRRC8A in cells' ability to survive osmotic shock.

This gene codes for a protein that forms channels in the membrane that allow chloride ions to leave the cell. "Using a human epithelial cell model, as well as other human and mouse cell types, we have been able to demonstrate that this channel opens shortly after the cells are exposed to a high concentration of sodium chloride (NaCl)," explains Dr. de Nadal, who, together with Dr. Francesc Posas, heads the Cell Signalling laboratory at IRB Barcelona. The authors have also identified the molecular mechanism that causes this rapid opening: the chloride channel is phosphorylated, meaning that a phosphate group is added to a specific amino acid in its sequence, thus activating the channel.

"This has been a very complex project, and it has taken us years to see the light," explains Dr. Miguel Ángel Valverde, head of UPF's Laboratory of Molecular Physiology. "We have also shown how vital it is for this channel to become activated and remove chloride in order to start the volume recovery process and for cells to survive over time," he adds.

The use of a violet dye that stains only living cells has allowed the researchers to observe that cell death increases by approximately 50% when the activity of this chloride channel is blocked with a particular compound.

A journey through time to answer old questions

In the '90s, various landmark scientific papers described the process by which cells regulate their volume to survive. It was known that the proteins responsible for volume recovery under salt stress require low intracellular chloride concentrations in order to become activated, but it was not known how this occurred under such adverse conditions. With this discovery, the authors have answered a question posed by researchers years ago: how does chloride exit the cell to start the whole process? In the words of the paper's main co-author, Dr. Selma Serra (UPF): "Now we have the answer to that question. It is the LRRC8A channel that brings down the chloride levels in a cell. Until now we had a good understanding of the role played by this channel in cell adaptation to environments with very low salt concentrations. The big challenge was to find out how the same chloride channel could be crucial in the opposite mechanism. At the beginning of the project, it seemed to go against any kind of scientific logic that a channel used to shrink cells could also swell them."

Using electrophysiological and fluorescence microscopy techniques in living cells to ascertain intracellular chloride levels, the researchers have demonstrated the involvement of the LRRC8A chloride channel in responses to high-salt stimuli.

A major technical and conceptual challenge

Studying this process at the molecular level has posed a considerable challenge for the team, because it is very complicated to conduct in vivo studies of cells while they undergo osmotic shock and shrink. "Imagine you're looking at a juicy grape, and suddenly it looks like a raisin - that makes things very complicated for us," say the authors.

Another important factor is that, under these stress conditions, the mechanism for activating the chloride channel is very different to what has been described so far in the literature. The article's lead co-author, Predrag Stojakovic, says, "It came as a big surprise to find out that the signalling pathways that respond to stress, the MAP kinases, proteins we've been studying in the lab for months, are directly responsible for activating this channel". MAP kinases are a group of signalling proteins that add phosphate groups to other proteins, thus activating or deactivating them. Using molecular techniques, the authors searched the channel's protein for the target sequence of these kinases. "We have been able to identify the specific residue of the chloride channel that leads to activation under the control of the MAP kinases in response to stress," says doctoral student Stojakovic.

Future implications

"This new piece of research opens up new possibilities for studying cell adaptation and survival salt stress. Certain organs of the body, such as the kidneys, are often exposed to high salt concentration, which can threaten their survival. Knowing what molecules control survival under these conditions could be very useful for understanding certain pathologies that entail volume recovery in response to salts," explains Dr. Posas.

In addition, discovering the role of this channel in these cell regulation processes is highly relevant in many pathologies involving proteins regulated by LRRC8A. This may be significant in situations such as certain kinds of arterial hypertension or cerebral ischemia.

Credit: 
Institute for Research in Biomedicine (IRB Barcelona)

How news coverage affects public trust in science

News media reports about scientific failures that do not recognize the self-correcting nature of science can damage public perceptions of trust and confidence in scientific work, according to findings by researchers at the Annenberg Public Policy Center (APPC) of the University of Pennsylvania and the University at Buffalo, the State University of New York.

News stories about science follow several specific narratives, the researchers write in a new study in the journal Public Understanding of Science. One is that science is "in crisis" or "broken," a narrative driven in recent years by reports of unsuccessful efforts to replicate findings in psychology, a rise in retractions, failures of peer review, and the misuse of statistics, among other things.

"Attempts and failures to replicate findings are an essential and healthy part of the scientific process," said co-author Yotam Ophir, an assistant professor of communication at the University of Buffalo and a former postdoctoral fellow in APPC's science of science communication program, where the work was conducted. "Our research shows the need for journalists and scientists to accurately contextualize such failures as part of the self-correcting nature of science."

In an experiment, nearly 4,500 U.S. adults were assigned to read one of four different types of news stories about science or a control story. Among the findings:

Exposure to stories highlighting problems reduced trust in scientists and induced negative beliefs about scientists.

Greater effects were seen among people who read stories saying that science was in crisis or broken.

"We've identified a tendency in news coverage to overgeneralize the prevalence of problems in science and take them as an indicator that the enterprise as a whole is broken," said co-author and APPC Director Kathleen Hall Jamieson. What the experiment found, she added, is that "exposure to news that mistakenly concluded that because something has gone wrong science is in crisis can unjustifiably undercut confidence in science."

The experiment

The study sought to provide experimental evidence about the effects of exposure to different narratives about science. It was conducted online with 4,497 U.S. adults in early 2019 - before, Jamieson noted, the world was in the throes of the Covid-19 pandemic and "science discovered life-saving vaccines with unprecedented speed."

The experiment tested the effects of four narratives:

the "honorable quest" or discovery, in which a scientist discovers knowledge that is reliable and consequential;

the "counterfeit quest," or retraction of published work, in which a scientist engages in dishonorable and guileful conduct;

the science is "in crisis/broken" narrative, which indicts scientists or the institution of science for failing to address a known problem; and

the "problem explored," where scientists explore and potentially fix a problem revealed by the "crisis/broken" narrative.

Participants were randomly assigned a reading based on edited news stories that were consistent with one of the narratives. For example, one "quest" story told of a discovery in immunotherapy to treat leukemia, while a "counterfeit quest" story described retracted scientific claims about eating behavior. A "science is broken" story described an "alarming increase in the number of retractions," and a "problem explored" story looked at psychologists exploring ways to increase the reliability of psychology studies. A fifth group of participants read a control story about an unrelated subject, baseball.

After completing the readings, the participants were asked about their trust in science, beliefs about science, and support for funding of science.

Trust in science is high

The researchers found that:

Trust in science was moderately high;

Beliefs that science is self-correcting and beneficial were moderate to high;

Among people with higher levels of trust in science, the more they perceived the problem-focused stories to be representative of science, the more likely they were to believe that science is self-correcting;

For people with lower levels of trust in science, the effect was reversed: the more they saw the problem-focused stories as representative, the less likely they were to believe that science is self-correcting;

Support for funding science was not affected by the stories.

"This study," the authors concluded, "demonstrates the adverse, if small, effects of problem-focused media narratives on trust in, beliefs about, and support for scientists and points to the importance of perceived representativeness and audience trust in scientists in the audience's response to them."

The experiment follows up on a 2018 study by Jamieson in the Proceedings of the National Academy of Sciences. The earlier study examined three media narratives about science - the honorable quest, counterfeit quest, and crisis/broken. Of the crisis/broken articles examined in that study, just 29% indicated that science is self-correcting and 34% were written by a scientist. That study expressed concern that "defective narratives can enhance the capacity of partisans to discredit areas of science... containing findings that are ideologically uncongenial to them."

How journalists and scientists can bolster trust in science

"By labeling problems in scientific research 'a crisis' and by framing scientific failures as indications that science is unreliable, both scientists and journalists are failing to communicate the true values of science," Ophir said. "Making mistakes is part of science. What the news media and scientists themselves often frame as failure is an indicator of healthy science."

The content analysis found that the honorable quest story was the most prevalent. But when media reports do discuss failures, "they tend to ignore scientific attempts to address the problems," the authors write. "We argue that such narratives about individual or systemic scientific failures fail to communicate scientific norms of continuing exploration, scrutiny, and skepticism and could, particularly if presented regularly and consistently, harm public trust and confidence in scientific work."

Use of the "problem explored" narrative could lessen the detrimental effects and improve attitudes toward science by "better communicating scientific norms of continuing exploration, scrutiny, and skepticism," the authors write. "As scientific communication in news media is the result of a negotiation between scientists and journalists, these results could guide future science communication efforts by both journalists and members of the scientific community.

"Like others before us..." they conclude, "we believe that such a change will require scientific institutions to reconsider the current incentive structure, that prioritizes the promotion of novel, statistically significant discoveries over [rigorous] self-correction efforts."

Credit: 
Annenberg Public Policy Center of the University of Pennsylvania

The effects of protein corona on the interactions of AIE-visualized liposomes with cells

image: The development of TR4@Lipo with self-indicating properties and its cellular uptake behavior when exposed to media containing different concentrations of serum. (A) Molecular structure of TR4. (B) Schematic illustration of TR4@Lipo. TR4 generates blue fluorescence when incorporated into the liposomes. (C) Confocal images of MCF-7 cells after incubation with TR4@Lipo in the presence of different concentrations of FBS from 0% to 10%. The scale bar is 10 μm. Co-localization profiles are shown in the bottom panel.

Image: 
©Science China Press

Since the market introduction in 1995 of Doxil, a pegylated liposomal doxorubicin, liposomes have become one of the most clinically established drug delivery systems in nanomedicine. Among the different liposomal formulations, cationic liposomes have attracted great attention because of their capacity to bind negatively charged nucleic acids, which allows them to perform as non-viral gene delivery tools, and because of their potential to fuse with cell membranes, leading to direct release of cargoes from the liposomes into the cytoplasm and high drug delivery efficiency. However, in most cases, the drug/gene delivery and therapeutic efficacy of cationic liposomes are evaluated in vitro under serum-free conditions. It is known that once introduced into biological fluids, nanoscale objects (such as liposomes) adsorb numerous proteins and biomolecules on their surface, forming a layer called the "protein corona". This layer affects nanoparticle charge, size and surface properties and confers new biological properties on the nanoparticle, which affect its subsequent performance, such as distribution, toxicity, cellular internalization and final fate. Thus, the behaviour of cationic liposomes at the cellular level can be expected to differ between serum-free conditions and a biological environment. Fully understanding the behaviour of cationic liposomes in a biologically relevant environment, especially the effect of the protein corona on liposome interactions with cells, will help to guide the design of efficient liposomal formulations and narrow the gap between in vitro and in vivo studies.

Within this context, Wang et al. have synthesized a cell membrane probe (called TR4) containing four arginine residues, a palmitic acid tail and tetraphenylethylene (TPE), and developed a cationic liposome (named TR4@Lipo) by inserting TR4 into hydrogenated phosphatidylcholine (HSPC)-cholesterol binary liposome membranes (Figure 1A-B). Benefiting from the aggregation-induced emission property of TPE, TR4 showed dramatically enhanced fluorescence intensity when confined in the lipid bilayer, which endowed TR4@Lipo with a self-indicating capacity and made it visible by confocal microscopy. Interestingly, when exposed to serum-free medium, TR4@Lipo interacted with cells through cell membrane fusion, a process that did not consume energy. However, once introduced into a medium supplemented with fetal bovine serum or human serum, with a protein corona forming on the liposome surface, TR4@Lipo was endocytosed by cells through an energy-dependent pathway (Figure 1C). Moreover, when doxorubicin (as a model cargo) was loaded into TR4@Lipo, not only the internalization pathway of the liposomes but also the intracellular distribution of the cargo was altered by protein corona formation.

Overall, by using a self-indicating liposome, this study highlights the effects of the protein corona on the interactions of cationic liposomes with cells: corona formation switches the internalization of cationic liposomes from energy-independent membrane fusion to energy-dependent endocytosis and modulates the intracellular distribution of the encapsulated cargoes (Figure 2). This knowledge furthers our understanding of bio-nano interactions and is important for the efficient design and application of cationic liposomes in future studies.

Credit: 
Science China Press

Diet plays critical role in NASH progressing to liver cancer in mouse model

image: Debanjan Dhar, PhD, is co-senior author of the study and assistant professor in the Department of Medicine, Division of Gastroenterology at UC San Diego School of Medicine.

Image: 
UC San Diego Health Sciences

Non-alcoholic fatty liver disease (NAFLD) is the most common cause of chronic liver disease worldwide. NAFLD patients are at higher risk of developing Non-alcoholic steatohepatitis (NASH), which causes severe and chronic liver inflammation, fibrosis and liver damage. A patient with NASH is believed to be at high risk for developing a form of liver cancer called hepatocellular carcinoma (HCC).

Apart from lifestyle interventions, there are currently no approved treatments for NASH. A liver transplant is sometimes the only remedy.

While risk factors for NASH (obesity, type-2 diabetes and gene mutations like PNPLA3) and HCC (Hepatitis B and C infections, alcohol overconsumption and cirrhosis) are well known, the precise mechanism of how simple fatty liver progresses to chronic inflammation, liver fibrosis, NASH and HCC is not known.

A recent study led by researchers at University of California San Diego School of Medicine found in a mouse model that when fed a Western diet rich in calories, fat and cholesterol, the mice progressively became obese and diabetic and developed NASH, which progressed to HCC, chronic kidney disease and cardiovascular disease.

The findings, published in the May 31, 2021 online edition of Cellular and Molecular Gastroenterology and Hepatology, showed that simply switching the mice from the Western diet to a normal chow diet - in which calories are derived from proteins and carbohydrates rather than fats, with no cholesterol - improved NASH and liver fibrosis and prevented cancer progression and mortality.

"While the mice that continued on a Western diet developed HCC and had an increased risk of death, 100 percent of the mice that stopped the diet survived the length of the study without developing HCC," said Debanjan Dhar, PhD, co-senior author of the study and assistant professor in the Department of Medicine, Division of Gastroenterology at UC San Diego School of Medicine.

"This indicates that NASH and HCC may be a preventable disease and that diet plays a crucial role in the disease outcome."

In mice no longer fed the Western diet, researchers also found a decrease in liver fat and improvement in glucose tolerance -- an indicator of diabetes -- and several genes and cytokines that were affected in NASH returned to normal levels and function. In addition, Dhar and his team found key changes in the gut microbiome that modulate liver disease progression.

"Although NASH is a liver disease, our results show its development and progression is orchestrated by multiple organs."

A surprising finding, said the researchers, was that when they switched the Western diet of the mice with NASH to normal chow, the effect was more pronounced on the liver rather than on whole body weight.

"This could mean that slight changes in the liver might have profound effects on the disease outcome," said David Brenner, MD, co-senior author and vice chancellor of UC San Diego Health Sciences.

Researchers also compared the mouse model findings to human patient datasets and found that gene expression changes in mouse livers were similar to those in their human counterparts.

"Our animal model provides an important pre-clinical testing platform to study the safety and efficacy of drugs that are currently being developed, as well as to test the repurposing of other drugs that are already FDA approved for other diseases," said Dhar.

Credit: 
University of California - San Diego

SWOG researchers advance cancer care at virtual ASCO 2021

"SWOG always brings an impressive portfolio of work to the ASCO annual meeting," said SWOG Chair Charles D. Blanke, MD, "and this year I'm particularly excited about the research our investigators are presenting because it includes results that are likely to be practice-changing."

Investigators will present 12 abstracts from SWOG-led or co-led studies and 11 abstracts from studies led by other groups within the National Clinical Trials Network (NCTN).

Results from S1216 will be presented orally by study chair Neeraj Agarwal, MD, of the Huntsman Cancer Institute at the University of Utah. S1216 compared androgen deprivation therapy (ADT) combined with TAK-700 to the standard treatment of ADT with bicalutamide in patients who had metastatic hormone-sensitive prostate cancer. The study found that adding TAK-700 to ADT lengthened median progression-free survival in these patients and improved prostate-specific antigen response. The combination did not, however, significantly lengthen median overall survival, though it is worth noting that the median overall survival seen in the control arm was higher than has been reported in other recent phase 3 trials in this setting (abstract 5001).

Kenneth Grossmann, MD, PhD, also of the Huntsman Cancer Institute at the University of Utah, will give an oral presentation of S1404 results. S1404 tested pembrolizumab against therapies that were the standard of care at the start of the trial--either high-dose interferon or ipilimumab--in patients with high-risk resected melanoma. The drug significantly lengthened relapse-free survival in these patients, although it did not provide a statistically significant improvement in overall survival. The safety profile of pembrolizumab was more favorable than that of either ipilimumab or high-dose interferon in this patient population. Notably, the overall outcomes of patients on this trial were substantially better than what was predicted when the study was designed, likely due to the widespread availability of better therapies in the metastatic setting. This is good news for patients with melanoma (abstract 9501).

Here are highlights from some of the other SWOG work to be presented at ASCO 2021.

A secondary analysis of data from S0809 will be presented by Sepideh Gholami, MD, of the University of California, Davis. Previously reported results from S0809 showed that adjuvant capecitabine and gemcitabine followed by radiation therapy with capecitabine improved overall survival times in patients with resected extrahepatic cholangiocarcinoma and gallbladder cancers compared to historical controls. This secondary analysis asked whether this adjuvant chemoradiation provided a benefit specifically to those patients whose cancers had spread to their lymph nodes. Researchers conclude that this adjuvant therapy combination after surgery improves patient outcomes regardless of whether lymph nodes are involved and can have additional benefit in those with lymph node involvement, perhaps by preventing local recurrence (abstract 4104).

Long-term results from S1200 will be presented by Dawn L. Hershman, MD, MS, SWOG's vice chair of NCI's Community Oncology Research Program research and a professor of medicine and epidemiology at Columbia University. Many breast cancer patients are treated with drugs called aromatase inhibitors to reduce the chance that their cancer will return. These drugs, however, often cause joint pain, leading many patients to discontinue using them. S1200 tested whether acupuncture could provide pain relief to these patients. The trial compared a true acupuncture procedure to a sham procedure and to wait-list controls. In 2017, early results from S1200 reported better pain outcomes from true acupuncture through 24 weeks after the start of the therapy. Hershman's ASCO 2021 presentation on the 52-week outcomes from the trial echoes those promising early findings. Women with breast cancer and taking aromatase inhibitors who were treated with true acupuncture for 12 weeks for joint symptoms had lower worst pain levels than patients receiving sham acupuncture and wait-list controls. These benefits persisted over one year even though the initial course of acupuncture was only 12 weeks (abstract 12018).

Davendra Sohal, MD, MPH, of the University of Cincinnati Medical Center, will present a secondary analysis of data from S1505. S1505 enrolled patients with operable pancreatic ductal adenocarcinoma. These patients were randomized to get a course of either neoadjuvant FOLFIRINOX or gemcitabine-nab paclitaxel before surgery. Sohal's analysis looked at the relationship between patients' skeletal muscle and adipose tissue measurements and their overall survival times. The analysis showed that higher visceral fat was associated with lower overall survival among these patients (abstract 4131).

Results from S1605 will be presented by Peter Black, MD, of the University of British Columbia. S1605 was a phase II trial that tested the efficacy of the drug atezolizumab in patients with non-muscle-invasive bladder cancer that was unresponsive to treatment with bacillus Calmette-Guérin (BCG). Radical cystectomy is the standard of care for these patients; however, some are not eligible for surgery because of poor general health, and others choose to preserve their bladder. The rate of response to atezolizumab observed in S1605 suggests the drug could be a valuable treatment for these patients (abstract 4541).

An update on the toxicity data for S1800A, one of the sub-studies run as part of the Lung-MAP trial, will be presented by Karen Reckamp, MD, of Cedars-Sinai Medical Center. Lung-MAP is a master protocol for patients with stage IV non-small cell lung cancer; those who were not eligible for a biomarker-matched sub-study were enrolled in S1800A, where they were randomized to either ramucirumab plus pembrolizumab or an investigator-chosen standard-of-care treatment. Researchers found that the rate of Grade 3 and higher toxicities was lower in patients in the ramucirumab plus pembrolizumab arm than in patients who received standard-of-care treatment. Efficacy outcomes are expected in the fall of 2021 (abstract 9075).

Results from another Lung-MAP sub-study, S1900A, will be presented by Jonathan W. Riess, MD, a SWOG investigator at the University of California Davis Comprehensive Cancer Center. S1900A was a phase II study of the PARP inhibitor rucaparib in patients with advanced (stage IV) non-small cell lung cancer (NSCLC) whose tumors had at least one of two specific genetic changes: genomic loss of heterozygosity and/or a deleterious BRCA1/2 mutation. Prior studies have found PARP inhibitors such as rucaparib to be robustly effective against a range of cancers that display these genomic changes, but it was not known whether they would also be effective in advanced-stage NSCLC with these changes. S1900A was designed to answer this question. The study did not show sufficient efficacy of rucaparib in the overall study population, and the researchers conclude that genomic loss of heterozygosity does not predict sufficient rucaparib activity in NSCLC. However, in an unplanned analysis, a signal of preferential clinical activity was observed in patients whose tumors harbored mutations in both alleles of a BRCA gene. Studies following up on this finding are ongoing (abstract 9024).

Credit: 
SWOG Cancer Research Network

Newly discovered African 'climate seesaw' drove human evolution

image: The alkaline Lake Nakuru in Kenya is rich in the cyanobacterium Spirulina platensis, the staple food of the Lesser Flamingo. However, because of increasing rainfall in the region in recent years, the cyanobacterium, and with it the flamingos, are disappearing.

Image: 
Prof. Martin Trauth, University of Potsdam

While it is widely accepted that climate change drove the evolution of our species in Africa, the exact character of that climate change and its impacts are not well understood. Glacial-interglacial cycles strongly impact patterns of climate change in many parts of the world, and were also assumed to regulate environmental changes in Africa during the critical period of human evolution over the last ~1 million years. The ecosystem changes driven by these glacial cycles are thought to have stimulated the evolution and dispersal of early humans.

A paper published in Proceedings of the National Academy of Sciences of the United States of America (PNAS) this week challenges this view. Dr. Kaboth-Bahr and an international group of multidisciplinary collaborators identified ancient El Niño-like weather patterns as the drivers of major climate changes in Africa. This allowed the group to re-evaluate the existing climatic framework of human evolution.

Walking with the rain

Dr. Kaboth-Bahr and her colleagues integrated 11 climate archives from across Africa covering the past 620,000 years to generate a comprehensive spatial picture of when and where wet or dry conditions prevailed over the continent. "We were surprised to find a distinct climatic east-west 'seesaw' very akin to the pattern produced by the El Niño weather phenomenon, which today profoundly influences the distribution of precipitation in Africa," explains Dr. Kaboth-Bahr, who led the study.

The authors infer that the influence of the tropical Pacific Ocean on the so-called "Walker Circulation" - a belt of convection cells along the equator that affects rainfall and aridity across the tropics - was the prime driver of this climate seesaw. The data clearly show that wet and dry regions shifted between the east and west of the African continent on timescales of approximately 100,000 years, with each climatic shift accompanied by major turnovers in flora and mammal fauna.
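To make the "seesaw" idea concrete, here is a minimal, purely illustrative Python sketch; it is not the study's method, and the proxy records, period, and normalization choices are all assumptions. It standardizes two hypothetical wetness proxies for eastern and western Africa and computes an east-minus-west index that flips sign as the wet and dry regions trade places on a ~100,000-year cycle.

```python
import numpy as np

# Hypothetical, evenly spaced wetness proxies for eastern and western Africa
# over the past 620,000 years (one value per 1,000 years). In a real analysis
# these would come from lake sediments, marine cores, etc., interpolated onto
# a common age model.
rng = np.random.default_rng(0)
ages_ka = np.arange(0, 620)  # age in thousands of years before present
east = np.sin(2 * np.pi * ages_ka / 100) + 0.3 * rng.standard_normal(ages_ka.size)
west = -np.sin(2 * np.pi * ages_ka / 100) + 0.3 * rng.standard_normal(ages_ka.size)

def zscore(x):
    """Standardize a proxy record so different archives are comparable."""
    return (x - x.mean()) / x.std()

# East-minus-west "seesaw" index: positive values mean wetter east / drier west,
# negative values mean wetter west / drier east.
seesaw = zscore(east) - zscore(west)

# Rough phase label for each 1,000-year step.
phase = np.where(seesaw > 0, "wet east / dry west", "wet west / dry east")
print(phase[:5], seesaw[:5].round(2))
```

In the published work the spatial picture is built from 11 archives and formal statistical analysis; the sketch only shows how an alternating east-west signal with a roughly 100,000-year period would appear in such an index.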

"This alternation between dry and wet periods appeared to have governed the dispersion and evolution of vegetation as well as mammals in eastern and western Africa," explains Dr. Kaboth-Bahr. "The resultant environmental patchwork was likely to have been a critical component of human evolution and early demography as well."

The scientists are keen to point out that although climate change was certainly not the sole factor driving early human evolution, the new study nevertheless provides a novel perspective on the tight link between environmental fluctuations and the origin of our early ancestors.

"We see many species of pan-African mammals whose distributions match the patterns we identify, and whose evolutionary history seems to articulate with the wet-dry oscillations between eastern and western Africa," adds Dr. Eleanor Scerri, one of the co-authors and an evolutionary archaeologist at the Max Planck Institute for the Science of Human History in Germany. "These animals preserve the signals of the environments that humans evolved in, and it seems likely that our human ancestors may have been similarly subdivided across Africa as they were subject to the same environmental pressures."

Ecotones: the transitional regions between different ecological zones

The scientists' work suggests that a seesaw-like pattern of rainfall alternating between eastern and western Africa probably had the effect of creating critically important ecotonal regions - the buffer zones between different ecological zones, such as grassland and forest.

"Ecotones provided diverse, resource-rich and stable environmental settings thought to have been important to early modern humans," adds Dr. Kaboth-Bahr. "They certainly seem to have been important to other faunal communities."

To the scientists, this suggests that Africa's interior regions may have been critically important for fostering long-term population continuity. "We see the archaeological signatures of early members of our species all across Africa," says Dr. Scerri, "but innovations come and go and are often re-invented, suggesting that our deep population history saw a constant sawtooth-like pattern of local population growth and collapse. Ecotonal regions may have provided areas for longer-term population continuity, ensuring that the larger human population kept going, even if local populations often went extinct."

"Re-evaluating these patterns of stasis, change and extinction through a new climatic framework will yield new insights into the deep human past," says Dr. Kaboth Bahr. "This does not mean that people were helpless in the face of climatic changes, but shifting habitat availability would certainly have impacted patterns of demography, and ultimately the genetic exchanges that underpin human evolution."

Credit: 
Max Planck Institute of Geoanthropology

Beer byproduct mixed with manure proves an excellent pesticide

image: A productive lettuce yield following the researchers' new biodisinfestation method.

Image: 
Maite Gandariasbeitia et al.

Many chemical fumigants used in agriculture have been shown to be harmful to human health and the environment, and have therefore been banned from use.

Now, in an effort to cut waste from the agricultural industry and reduce reliance on harmful chemicals, researchers have investigated whether organic byproducts from beer production and farming can disinfest soils, preserve healthy soil microorganisms, and increase crop yields.

In this study, published in Frontiers in Sustainable Food Systems, researchers from the Neiker Basque Institute for Agricultural Research and Development in Spain investigated two organic biodisinfestation treatments that combine the agricultural byproducts rapeseed cake and beer bagasse (spent beer grains) with fresh cow manure. Lead author Maite Gandariasbeitia explains: "Rapeseed cake and beer bagasse are two potential organic treatments which have shown really positive results in previous studies.

"Their high nitrogen content promotes the activity of beneficial microorganisms in the soil, which helps to break down organic matter like manure and kill off nematodes and other parasites which damage crops."

Gandariasbeitia also highlights how nematodes can negatively impact crop yields: "Root-knot nematodes are a type of common soil parasite which penetrate a plant's root tissue to lay their eggs and this activity causes galls, or knot-like swellings, to form on the root," she says.

"This damage negatively impacts root development and means the crop can't take up nutrients efficiently, slowing plant growth and ultimately, leading to reduced yields for farmers."

To disinfest the soil and reduce these nematode populations, beer bagasse and rapeseed cake were incorporated into the soil with fresh cow manure as a potential organic treatment. After the first crop post-treatment, the researchers found a significant reduction in galling on plant roots.

Next steps for research

Treated plots also showed yields around 15% higher than control plots after one year. Additionally, the organic matter treatment boosted populations of beneficial microorganisms in the soil, as demonstrated by a significantly higher soil respiration rate.

The study demonstrates that these agricultural byproducts are an effective treatment for root-knot nematodes and other soil parasites, boosting crop yields while reducing waste from the agricultural industry and promoting more sustainable food systems. Gandariasbeitia highlights that further research is needed to explore other potential organic treatments that could be used in a similar way: "There are still many questions to answer so that we can gain a better understanding of what happens in the soil during and after these biodisinfestation treatments.

"This can help us to really elucidate what characteristics we should be looking for in other potential organic treatments to be effective in tackling soil parasite populations."

Credit: 
Frontiers