Hopkins news: Climate change could unlock new microbes and increase heat-related deaths

The Journal of Clinical Investigation (JCI) recently published "Viewpoint" articles by Johns Hopkins University School of Medicine professors who warn that global climate change is likely to unlock dangerous new microbes, as well as threaten humans' ability to regulate body temperature.

Johns Hopkins Bloomberg Distinguished Professors Rexford Ahima, M.D., Ph.D., and Arturo Casadevall, M.D., Ph.D., M.S., along with William Dietz, M.D., Ph.D., director of the George Washington University's Sumner M. Redstone Global Center for Prevention and Wellness, and Susan Pacheco, M.D., associate professor in the Department of Pediatrics at the University of Texas Health Science Center at Houston, authored journal articles relevant to their fields that detail how rising temperatures around the world pose dangerous threats to humanity.

Ahima, director of Johns Hopkins' Division of Endocrinology, Diabetes and Metabolism, wrote in the journal that "global warming threatens human thermoregulation and survival."

Ahima explains that people generate body heat and have the capacity to regulate their temperature within a few degrees. But "as heat waves become more common, more severe, and longer, we expect to see more heat-related illnesses and deaths," he writes.

Ahima cites a recent study that examined global heat-related mortality, pointing out that tropical and subtropical countries and regions will experience the sharpest surge in illness and death stemming from higher temperatures, while the United States and Europe can also expect increases.

Casadevall's article explores "the specter of new infectious diseases" as a result of the changing climate.

"Given that microbes can adapt to higher temperatures," writes the professor of molecular microbiology and immunology, and infectious diseases, at Johns Hopkins' schools of medicine and public health, "there is concern that global warming will select for microbes with higher heat tolerance that can defeat our endothermy defenses and bring new infectious diseases."

Endothermy allows humans and other warm-blooded mammals to maintain high temperatures that can protect against infectious diseases by inhibiting many types of microbes.

Casadevall cites a particular climate threat from the fungal kingdom.

"We have proposed that global warming will lead many fungal species to adapt to higher temperatures," he writes, "and some with pathogenic potential for humans will break through the defensive barrier provided by endothermy."

As an example, Casadevall points to the rise of Candida auris, a species of fungus identified in 2009 and called a "catastrophic threat" by the U.S. Centers for Disease Control and Prevention in 2017.

"The nearly simultaneous emergence of Candida auris on three continents, an event proposed to result from global warming, has raised the specter that increased warmth by itself will trigger adaptations on certain microbes to make them pathogenic for humans."

Casadevall says that, while fungi present the most immediate threat, other microbes also adapt to evolving conditions such as temperature. He writes that "the conceptual threat originally identified with fungi, and exemplified by C. auris as the canary in the coal mine, applies across the microbial world."

Dietz's article addresses climate change and malnutrition, calling obesity, undernutrition and climate change a "syndemic," or multiple epidemics that interact and share common underlying social or economic determinants and policy drivers. In her article, Pacheco discusses climate change's adverse consequences regarding pregnancy and maternal, fetal and child health.

In all four JCI "Viewpoint" articles, long-term strategies are urged to reduce greenhouse gas emissions and slow the trend of rising temperatures.

Credit: 
Johns Hopkins Medicine

Genetic identification of human remains from the Spanish Civil War and the dictatorship

image: The BIOMICs team works to optimise DNA extraction systems

Image: 
Miriam Baeta / UPV/EHU

It is estimated that around 114,000 people disappeared throughout Spain during the Spanish Civil War and subsequent dictatorship. Unfortunately, eight decades on, only a small percentage of these people have been found or identified: around 9,000 victims have been recovered in the last fifteen years, from some 700 of the estimated 2,000 mass graves. As time goes by and the samples themselves continue to deteriorate, conventional methods are no longer precise enough to identify the remains of all these unknown people. However, genetic analyses constitute an effective tool for this purpose.

The BIOMICs research team at the UPV/EHU has spent the last ten years working to identify these disappeared persons by genetically analysing bone and teeth samples taken from remains recovered from various mass graves dating from the Spanish Civil War and subsequent dictatorship and comparing the results with DNA taken from family members. 'Once DNA has been obtained from the remains, we analyse a series of specific genetic markers, depending on the type of kinship relationship we wish to study,' explains Doctor Miriam Baeta, a member of the BIOMICs research team working at the University of the Basque Country's Department of Zoology and Cellular Animal Biology.

The aim of the genetic analyses is to determine the profile of the remains or to gather enough information to enable the team to compare them with the profiles of presumed family members or the information contained in the database of the DNA bank of relatives of disappeared persons. Data is also sometimes stored in the database itself in the hope that it will match future family profiles yet to be included. Each case is different because 'for example, if you want to analyse paternal lineage, you study the Y chromosome; but if you want to analyse maternal lineage, then you have to focus on mitochondrial DNA,' explains Dr Baeta.

We are increasingly able to study smaller and smaller markers

The task of identifying human remains is a complex one 'because we are talking about post mortem DNA, which is often extremely degraded, making it hard to obtain a complete genetic profile,' continues Dr Baeta. 'Identification or coincidence is easier to prove when more markers are studied. Moreover, in many cases, the most suitable relatives are not available, meaning that those still alive are not closely enough related, or that the markers obtained cannot be compared with those from the DNA samples of living family members,' she adds. Dr Baeta highlights the importance of the DNA bank of relatives of victims, 'so that more comparisons can be made during future exhumations'.

She claims to be optimistic, since 'thanks to technological advances, we are increasingly able to study smaller and smaller markers, which have a greater chance of success in the analysis, because being smaller, they are better able to survive degradation.' Over the ten years that the team has been working in this field, many advances have been made that have enabled them to optimise the identification system. 'Among other things, we have optimised the DNA extraction systems, as well as various different steps throughout the process with the aim of obtaining more informative profiles. We are constantly striving to improve every part of the process,' she explains. The latest advance proposed by the team 'enables us to study smaller fragments of mitochondrial DNA. Thanks to this technique, we can do an initial screening to dismiss, in a cost-effective manner, possible relationships through the maternal lineage; in other words, it makes it easier to determine maternal kinship: only if there is a coincidence in this first phase is it worth applying the methodologies used to date to analyse mitochondrial DNA.'

Researchers in the team have published a paper in which they present all the knowledge acquired over the past ten years. In specific terms, they explain the techniques and procedures used to identify 525 human remains. To enable this identification, they obtained saliva samples from 879 presumed relatives, enabling them to identify 137 disappeared persons. No informative profile was obtained for 17% of the samples analysed, as a result of limited DNA or degraded samples; however, profiles were obtained for another 297 human remains that nevertheless remain unidentified. 'In general, we obtain profiles for the majority of skeletal remains, but we do not have suitable relatives with which to compare them,' clarifies Dr Baeta.

She then comments that 'when we manage to identify someone it is a very happy moment, because in addition to the joy of the outcome itself, we know there is a lot of complicated work behind the result. At the end of the day, it is a collective effort by our team and the Aranzadi Science Society, the Gogora Institute and the associations of victims and relatives of people who disappeared during the Spanish Civil War and subsequent dictatorship.'

Credit: 
University of the Basque Country

Scientists isolate biomarkers that can identify delirium risk and severity

image: Regenstrief Institute and Indiana University School of Medicine researchers have identified blood-based biomarkers associated with both delirium duration and severity in critically ill patients. An estimated 7 million hospitalized Americans suffer from the acute confusion and disorientation, characteristics of delirium, including a majority of patients in medical or surgical ICUs.

Image: 
Regenstrief Institute

INDIANAPOLIS -- Regenstrief Institute and Indiana University School of Medicine researchers have identified blood-based biomarkers associated with both delirium duration and severity in critically ill patients. This finding opens the door to easy, early identification of individuals at risk for longer delirium duration and higher delirium severity and could potentially lead to new treatments of this brain failure for which drugs have been shown to be largely ineffective.

An estimated 7 million hospitalized Americans suffer from the acute confusion and disorientation characteristic of delirium, including a majority of patients in medical or surgical intensive care units (ICUs). Individuals who experience delirium in the ICU are more likely to have hospital-associated complications, longer stays and a higher risk of readmission. They are also more likely to experience cognitive impairment and have a greater likelihood of dying for up to a year after their hospital stay than ICU patients who did not experience delirium.

"If you can tell which patients will have higher delirium severity and longer duration and therefore greater probability of death, there are important treatment implications," said Regenstrief Institute research scientist and IU School of Medicine faculty member Babar Khan, M.D., who led the research and is the president of the American Delirium Society. "Analyzing biomarkers to stratify risk for delirium is a promising approach with the potential to be applied regularly in ICU patients in the near future."

In a new observational study, Dr. Khan and colleagues report that biomarkers for astrocyte and glial activation as well as for inflammation were associated with increased delirium duration and severity and greater in-hospital mortality.

Biomarkers of the 321 study participants, all of whom experienced delirium in an ICU, were identified from samples obtained via simple blood draws. Delirium severity was determined using a tool developed by a team including Regenstrief, IU School of Medicine and Purdue College of Pharmacy scientists. The CAM-ICU-7 -- short for Confusion Assessment Method for the Intensive Care Unit 7 -- is easy to administer, even to patients on mechanical ventilators. More than half of ICU patients in the U.S. receive mechanical ventilation.

Each day with delirium in the ICU is associated with a 10 percent increased likelihood of death, according to Dr. Khan, so diminishing its duration and ultimately preventing it is critical. Regenstrief, IU School of Medicine and research scientists from other institutions have conclusively shown in several large trials that antipsychotics, such as the widely used haloperidol, are not effective for the management of delirium duration or severity.

Regenstrief and IU School of Medicine researchers are actively exploring other approaches to delirium. Dr. Khan is co-principal investigator of an ongoing study that is the first to test whether listening to music -- a non-pharmacological strategy that has been shown to decrease over-sedation, anxiety and stress in critically ill patients, all factors that predispose to ICU delirium -- lowers the likelihood of developing delirium. In a completed study, Regenstrief researchers determined that waking ICU patients and having them breathe on their own decreased acute brain failure.

The new study, "Biomarkers of Delirium Duration and Delirium Severity in the ICU," has been published online ahead of print in the journal Critical Care Medicine.

Credit: 
Regenstrief Institute

Earthquake catalog shows complex rupturing during 2019 Ridgecrest sequence

The 2019 Ridgecrest earthquake sequence, which startled nearby California residents over the 4 July holiday with magnitude 6.4 and magnitude 7.1 earthquakes, included 34,091 earthquakes overall, detailed in a high-resolution catalog created for the sequence.

The catalog, developed by David Shelly at the U.S. Geological Survey in Golden, Colorado, was published in the Data Mine column in Seismological Research Letters. The paper is part of a larger Data Mine series aimed at rapidly sharing data from the Ridgecrest sequence among researchers.

"Because of the complexity in this sequence, I think there are still a lot of unanswered questions about what the important aspects of the triggering and evolution of this sequence were, so having this catalog can help people make more progress on answering those questions," said Shelly.

Shelly used a technique called template matching, which scanned through seismic signals to find those matching the "fingerprint" of 13,525 known and cataloged earthquakes, as well as precise relative relocation techniques to detect 34,091 earthquakes associated with the event. Most of the earthquakes were magnitude 2.0 or smaller.

The catalog covers the time period spanning the foreshock sequence leading up to the 4 July 2019 magnitude 6.4 earthquake through the first 10 days of aftershocks following the magnitude 7.1 earthquake on 5 July.

By precisely locating the earthquakes, Shelly was able to discern several crosscutting fault structures in the region, with mostly perpendicular southwest and northwest strikes. The foreshocks of the magnitude 6.4 event aligned on a northwest-striking fault that appears to have ruptured further in the aftershocks of that earthquake, along with a southwest-striking fault where a surface rupture was observed by teams who went out to the site.

Shelly said the magnitude 7.1 earthquake appears to have started at the northwestern edge of the magnitude 6.4 rupture, extending to the northwest and southeast and possibly lengthening that earlier rupture. The magnitude 7.1 event was highly complex, with several southwest-striking alignments, multi-fault branching and high rates of aftershocks, especially at the northwestern end of the rupture.

The Ridgecrest earthquakes took place along "a series of immature faults, in the process of developing," Shelly said, noting that this could explain in part why the earthquake sequence was so complex. Compared to the mature San Andreas Fault Zone to the west, which accommodates about half of the relative plate motion as the Pacific and North American tectonic plates slide past each other, the Ridgecrest faults are broadly part of the Eastern California Shear Zone, where multiple faults accommodate up to 25 percent of this tectonic strain.

Shelly noted that the catalog benefitted from the long-established, densely instrumented, real-time seismic network that covers the region. "When there's a big earthquake in an area that's not well-covered, people rush out to try to at least cover the aftershocks with great fidelity," he explained. "Here, having this permanent network makes it so you can evaluate the entire earthquake sequence, starting with the foreshock data, to learn more about the earthquake physics and processes."

Credit: 
Seismological Society of America

Montana State researcher harnesses microorganisms to make living building materials

image: Chelsea Heveran uses a scanning electron microscope to examine Synechococcus cyanobacteria in MSU's Imaging and Chemical Analysis Laboratory on Jan. 14, 2020. MSU Photo by Adrian Sanchez-Gonzalez

Image: 
MSU Photo by Adrian Sanchez-Gonzalez

BOZEMAN -- To make a building material that's alive, Montana State University researcher Chelsea Heveran has a recipe: get some gelatin from the grocery store, make a broth with bacteria called Synechococcus that photosynthesize like plants, add a bit of calcium, then mix with sand and cool until hardened into a concrete-like solid that can be used to replicate itself.

Like many recipes, this one is underlaid by some complex chemistry and is the hard-won result of experimentation, according to Heveran, the lead author of a new paper published in the journal Matter that summarizes the research. The article appeared online Jan. 15 and was featured the same day in a story in the New York Times.

Heveran, assistant professor in the Department of Mechanical and Industrial Engineering in MSU's Norm Asbjornson College of Engineering, said the study marks the first time that microbes have been used as the main catalyst of a building material in a way that preserves them for later use. The team demonstrated that the Synechococcus cyanobacteria remained alive in the sandy bricks for a month or more under favorable conditions of humidity and temperature.

"You can break off a piece and use it to make new bricks," said Heveran, who conducted the study with a team of colleagues at University of Colorado Boulder, where she was a postdoctoral researcher before continuing to contribute to the project as an MSU faculty member.

That's a fundamental improvement over normal concrete, where each batch requires significant amounts of a chemical binder -- cement -- that must be mined, processed and hauled to the mixing site, Heveran explained. By contrast, she imagines a scenario in which a single living brick could be brought to a remote location; simple additives and some basic equipment are all that would then be needed to transform native earth into infrastructure.

The recipe works because the photosynthesizing Synechococcus cause calcium carbonate, the main mineral in limestone, to form in the solution and solidify the sand mixture. The researchers found that they could reliably control the cyanobacteria's behavior by adjusting temperature and humidity. High humidity and cool temperatures let the microbes stay alive for extended time periods, while higher temperatures caused the bricks to re-dissolve and promoted microbial growth and mineralization. In this way, one brick could be divided to generate multiple new living bricks. Maximum strength was achieved by drying the material, which killed the microbes.

According to Heveran, the material is surprisingly tough -- about as strong as cement mortar but weaker than concrete. "We aren't ready to build a skyscraper out of this stuff," she said. Compared to normal concrete, however, the bacterial bricks are relatively easy to recycle by dissolving them and adding new Synechococcus to re-solidify the mixture.

Heveran conducted the research in the Living Materials Laboratory run by Wil Srubar, assistant professor in CU Boulder's Department of Civil, Environmental, and Architectural Engineering, as well as at MSU, where she joined the faculty in August 2018. She said MSU is known for its research in biomineralization -- the process by which living organisms produce minerals that can modify their surroundings. A team of MSU researchers that includes assistant professor of civil engineering Adrienne Phillips has used other biomineralizing bacteria to seal leaks in oil and gas wells. Along with Phillips and Cecily Ryan, assistant professor of mechanical and industrial engineering, Heveran is now researching ways to make an effective concrete filler from discarded plastic that has been shredded and biomineralized to improve its chemical bonding ability.

"We're very excited to have Chelsea here," said Dan Miller, head of the mechanical and industrial engineering department. "Her expertise is a great contribution to MSU's research in materials science and the crossover with biology, which is creating a cutting-edge and growing field in engineering right now."

Heveran said she draws inspiration from bone, a living material she studied while earning her doctorate at CU Boulder. "Bone is amazing because it's made by cells -- it self-repairs and maintains high strength and toughness for decades," she said. The new study published in Matter suggests potential for additional properties engineered into building materials using other microorganisms.

"We're happy with what we engineered -- it has some neat properties," Heveran said. "But we're thinking of this more as a platform to say, 'We really could start to engineer living building materials. We could do so much more.'"

Credit: 
Montana State University

Both simple and advanced imaging can predict best stroke patients for thrombectomy

image: Amrou Sarraj, MD, of McGovern Medical School at UTHealth, is leading research on endovascular thrombectomy for stroke patients

Image: 
Maricruz Kwon, UTHealth

Both simple and advanced computed tomography (CT) were effective in accurately predicting which stroke patients would benefit from endovascular thrombectomy to remove a large cerebral clot, but together they were even better, reported researchers at The University of Texas Health Science Center at Houston (UTHealth).

Results of the multicenter study, Optimizing Patient Selection for Endovascular Treatment in Acute Ischemic Stroke (SELECT), were published in yesterday's Early View edition of the Annals of Neurology.

Stroke is the leading cause of long-term disability and fourth-leading cause of death in the world. An ischemic stroke, caused by a blockage of an artery, is the most common form. Endovascular thrombectomy can be performed to remove a clot lodged in a blood vessel with a mechanical device threaded through an artery. It has been shown to be an effective treatment for improving clinical outcomes in stroke up to 24 hours from onset.

"Endovascular thrombectomy has revolutionized the treatment for acute stroke patients presenting with large vessel occlusion. Different imaging techniques are used to identify patients who may benefit from this treatment. However, how these imaging profiles correlate with each other and with the stroke outcomes is unknown," said Amrou Sarraj, MD, lead author and associate professor of neurology at McGovern Medical School at UTHealth.

Imaging must be done to determine the location of the clot and whether the patient is a good candidate for thrombectomy, meaning they have a smaller area of brain tissue death. Physicians use non-contrast simple CT and/or CT with an injected contrast dye (CT perfusion) to view the clot and surrounding area of cellular death. While simple CT is readily available at most hospitals, CT perfusion tends to be only available at more advanced stroke centers.

Of the 361 patients enrolled, a significant proportion had favorable imaging results on both CT and CT perfusion, meaning they were candidates for endovascular thrombectomy. Those patients also had significantly higher odds of receiving endovascular therapy and higher 90-day functional independence rates after recovery (58%).

Even when the two imaging modalities disagreed, the functional and safety outcomes were reasonable (38% achieved functional independence), which was better than the patients who did not receive thrombectomy. Patients with an unfavorable result on CT perfusion imaging, but favorable on simple CT, had higher rates of symptomatic hemorrhage in the brain tissue and death after stroke. Patients with unfavorable imaging profiles on both modalities had very poor outcomes.

"While best outcomes were observed in patients with a favorable profile on both imaging modalities, patients who had a favorable profile on at least one imaging modality also achieved reasonable outcomes," said Sarraj, who sees patients at UT Physicians, the clinical practice of McGovern Medical School, and is an attending neurologist at Memorial Hermann-Texas Medical Center.

The ongoing international Phase III randomized controlled trial, SELECT2, also led by Sarraj, will assess the efficacy and safety of the thrombectomy procedure in patients with an unfavorable profile on one or both imaging modalities. The SELECT trials are funded by grants from Stryker Neurovascular.

Credit: 
University of Texas Health Science Center at Houston

Scientists identify gene that puts brakes on tissue growth

image: Results from a Northwestern University study of the planarian flatworm could have ramifications for novel tissue engineering methods.

Image: 
Northwestern University

The planarian flatworm is a simple animal with a mighty and highly unusual ability: it can regenerate itself from nearly every imaginable injury, including decapitation. These tiny worms can regrow any missing cell or tissue -- muscle, neurons, epidermis, eyes, even a new brain.

Since the late 1800s, scientists have studied these worms to better understand fundamental principles of natural regeneration and repair, information that could provide insights into tissue healing and cancer. One mechanism that remains unknown is how organisms like these control the proportional scaling of tissue during regeneration.

Now, two Northwestern University molecular biologists have identified the beginnings of a genetic signaling pathway that puts the brakes on the animal's growth. This important process ensures the appropriate amount of tissue growth in these highly regenerative animals.

"These worms have essentially discovered a natural form of regenerative medicine through their evolution," said Christian Petersen, who led the research. "Planarians can regenerate their whole lives, but how do they limit their growth? Our discovery will improve understanding of the molecular components and organizing principles that govern perfect tissue restoration."

The findings ultimately may have important ramifications for novel tissue engineering methods or strategies to promote natural repair mechanisms in humans.

Petersen is an associate professor of molecular biosciences in Northwestern's Weinberg College of Arts and Sciences. He and Erik G. Schad, a graduate student in Petersen's lab, conducted the study.

The results were published in the Jan. 20 issue of the journal Current Biology. Petersen is the corresponding author, and Schad is the paper's first author.

The researchers have identified a control system for limiting regeneration and also a new mechanism to explain how stem cells can influence growth. Specifically, Petersen and Schad discovered that a gene called mob4 suppresses tissue growth in the animals. When the researchers inhibited the gene in experiments, the animal grew to twice its normal size.

The gene, they found, works in a rather surprising way: by preventing the descendants of stem cells from producing a growth factor called Wnt, a protein released from cells to communicate across distances. The Wnt signaling pathway is known to play a role in cancer cell regeneration.

Planarians are 2 to 20 millimeters in size and have a complex anatomy with around a million cells. They live in freshwater ponds and streams around the world. The worm's genome has been sequenced, and its basic biology is well-characterized, making planarians popular with scientists.

Credit: 
Northwestern University

Fungal diversity and its relationship to the future of forests

image: Stanford researchers gathered soil samples from dozens of North American forests, including Pike National Forest in Colorado. They used these samples to better understand the influence of climate change on symbiotic soil microbes that control the health of forests.

Image: 
Kabir Peay

If you indulge in truffles, or porcini and chanterelle mushrooms, you have enjoyed a product of ectomycorrhizal fungi. Forming symbiotic relationships with plants - including pine, birch, oak and willow tree species - these fungi have existed for millions of years, their sprawling filaments supporting ecosystems throughout their reach.

According to research from Stanford University, published Jan. 21 in the Journal of Biogeography, by the year 2070, climate change could cause the local loss of over a quarter of ectomycorrhizal fungal species from 3.5 million square kilometers of North American pine forests - an area twice the size of Alaska.

"These are critical organisms for the functioning and the health of forests," said Kabir Peay, associate professor of biology in Stanford's School of Humanities and Sciences and senior author of the study. "We have evidence to suggest that these fungi are as susceptible to climate change as other kinds of organisms and their response may be even more important."

Previously, the Peay lab had mapped the global distributions of forests where trees associate with different types of symbiotic fungi, finding that over 60 percent of all trees on Earth currently associate with ectomycorrhizal fungi. Now, by learning more about the communities these fungi form in different climates, the researchers projected how climate change might affect them in the future.

Microbial maps

Over several years, the Peay lab has gathered about 1,500 soil samples from 68 pine forests, which represent a swath of North America from Florida to Alaska. In past work, they sequenced DNA in each sample to understand what fungal species live in that soil, and in what abundance. Their results, published previously, suggested that fungi were different in each region, contradicting a common assumption that those communities would look similar in most places in the world. They followed that up by mapping the associations between trees and symbiotic microbes around the world.

For their latest paper, Brian Steidinger, a postdoctoral scholar in the Peay lab, explored the relationship between these geographical fungi patterns and historical climate data.

"We took soil from the cores and climatic data unique to each site," said Steidinger, who was lead author of the study. "We found that climate was by far the most important predictor of contemporary fungal diversity patterns across North America."

Steidinger also found that different regions of North America had unique optimal temperatures for fungal diversity. For example, cold boreal forests had a diversity peak around 5 C mean annual temperature, while Eastern temperate forests peaked in diversity near 20 C.

The researchers then applied these data to predict future diversity, given projections of climate change produced by the Intergovernmental Panel on Climate Change. Because of the regional differences in optimal climate for fungal diversity, some forests, particularly those in the North and Northwest, could experience major decreases in fungal diversity.

"According to our models, climate change over the next 50 years could eliminate more than a quarter of ectomycorrhizal species inside 3.5 million square kilometers of North American pine forests," said Steidinger. "That's an area twice the size of Alaska."

Other regions, such as the Eastern temperate forests, could experience gains of 30 to 50 percent -- assuming new species are gained as readily as others are lost.

"One of the things that's kind of shocking and a little bit scary is that we predict there will be some pretty significant decreases in diversity in western North America, well known culturally for fungal diversity and for people who are interested in collecting edible mushrooms," Peay said.

Buffering against climate change

Ectomycorrhizal fungi form a sheath around their hosts' roots, which can help prevent erosion and protect roots from damage and disease. The fungi seem to boost carbon storage in soil by slowing down decomposition and encouraging the buildup of soil. They also help their host trees grow more quickly - and therefore take in more carbon - by improving their ability to take in nitrogen, which they need in order to grow.

"In terms of ecosystem function, particularly buffering the atmosphere against climate change, ectomycorrhizal fungi are among the last microbes you want to lose," said Steidinger. "We're expecting to lose the species that seem to be the most functionally intense - the ones with the greatest enzyme activity, the ones that forage out the farthest."

Building on this work, the researchers are considering studying forests with low diversity of fungi and conducting experiments to better understand how these altered fungal communities might function in the future.

"For microbiome work, I feel like we're in a new era of discovery," said Peay. "Like Darwin and Wallace getting on ships and going to new places and seeing new things and changing the way they view the world, that is what is happening in this field."

Credit: 
Stanford University

Study provides insight into 'rapport-building' during victim interviews

A University of Liverpool research paper, published in Psychology, Public Policy, and Law, provides details of the approaches needed to help build rapport with victims of crime during interviews.

Interviewing victims is one of the most challenging aspects of sexual offence investigations. Victims can be unwilling to reveal information, particularly in a formal interview setting, yet obtaining that information is crucial because victims are often the only source of evidence.

In the UK, US, Canada and Israel there are a number of models and protocols in place, such as the 'PEACE' model and the National Institute of Child Health and Human Development (NICHD) interview protocol, which help to ensure a non-accusatory, information-gathering approach to interviewing victims.

South Korea

In South Korea, sexual offences are a serious social problem. According to recent official statistics, the total number of sexual crimes occurring per year rose 12% from 2014 to 2018.

The Korean National Police Agency (KNPA) has attempted to improve competence in investigating sex offences, with an emphasis on interviewing, because such cases are often marked by limited physical evidence. In 2004, the KNPA introduced the PEACE model to spread current knowledge of investigative interviewing principles and techniques. In 2007, it implemented a nationwide video-recorded interview system to improve the admissibility of police interviews and to protect the human rights of interviewees. In 2010, the KNPA also disseminated the NICHD protocol to assist officers interviewing child victims.

To enhance and maintain officers' expertise in these guidelines, the Korean Police Investigation Academy (KPIA), the KNPA's professional training institution, provides investigative interviewing courses that combine theory with simulation exercises.

Rapport-based interviewing

Unfortunately, research has found that Korean officers often do not adopt the methods recommended by the NICHD in practice, particularly the establishment and building of rapport when interviewing victims. Rapport-building creates a friendly atmosphere and consequently reduces the uneasiness that can negatively affect the information gathered.

Rapport-based interviewing can mitigate the negative feelings of child victims during police interviews and increase the amount of information generated. Despite these findings, little is known about how to create an environment of rapport and, more specifically, there is very little known about the set of behaviours or approaches that underpin it.

To find out more researchers from the University's Centre for Critical and Major Incident Psychology, led by Centre Director Professor Laurence Alison, analysed over 100 hours of KNPA investigative interviews using a framework called ORBIT.

ORBIT

The observing rapport-based interpersonal techniques (ORBIT) framework analyses rapport-based interviewing skills along two dimensions: motivational interviewing (MI) skills and interpersonal competence (use of adaptive interviewing behaviours and absence of maladaptive interviewing behaviours).

MI is a counselling method that helps people resolve ambivalent feelings and insecurities to find the internal motivation they need to change their behaviour. For example, it can help a victim find the motivation to tell an interviewer what has happened, or details of a perpetrator, despite wanting to forget or not wanting to talk about it.

The researchers coded 103 hours of investigative interviews with sexual offence victims - a sample of 86 single victim cases conducted by 26 police interviewers in South Korea. In all cases, there was a subsequent conviction.

Results

Results showed that humanistic approaches positively influence adaptive interactions between interviewer and victim whilst simultaneously reducing maladaptive ones. This results in an increase in interview yield.

Researchers also found that interviewer adaptive behaviours directly increase victim adaptive behaviour (with the same effect for maladaptive behaviour). Victim adaptive behaviour is positively associated with interview yield, and victim maladaptive behaviour is negatively associated with it.

Professor Alison said: "These results suggest that interviews conducted in a humanistic-consistent fashion strongly positively influence adaptive victim behaviour, which, in turn, increases interview yield."

Credit: 
University of Liverpool

Study results will inform immunization programs globally

video: Professor Helen Marshall (University of Adelaide) and study participant Harry Spurrier discuss the findings of the B Part of It meningococcal B study.

Image: 
University of Adelaide.

The results of the B Part of It study - the largest meningococcal B herd immunity study ever conducted - are published today in the New England Journal of Medicine.

The results have implications for all meningococcal B vaccine programs globally.

Led by Professor Helen Marshall from the University of Adelaide's Robinson Research Institute, the B Part of It study involved almost 35,000 senior school students in South Australia, aged 15 to 18 years, during 2017 and 2018.

"Our study has shown good protection was provided by the meningococcal B vaccine against meningococcal disease in those vaccinated but did not show an overall reduction in the proportion of adolescents carrying the bacteria, including the B strain," Professor Marshall says.

Adolescents can harmlessly carry the meningococcus bacteria in the back of the throat with only a very small proportion developing the disease. Meningococcal B is one of the most common strains that causes meningococcal disease, an acute bacterial infection that kills approximately 10% of those infected, and causes permanent disabilities in about 20% of cases. Those most at risk are babies and children up to the age of five years, and teenagers and young adults from ages 15 to 24 years.

"We are pleased to report not a single case of meningococcal disease among our study participants to date, over the three years since the study began, compared to 12 cases in the same age group in the two years prior to the study.

"Potentially this means a life or lives were saved, as, on average, one in every 10 children with meningococcal disease dies from it," Professor Marshall says.

"These results highlight the importance of individual vaccination for adequate protection, as the vaccine is unlikely to be able to stop spread of the bacteria between individuals," Professor Marshall says. "The study has identified the critical finding that individuals need to be vaccinated to protect themselves against meningococcal B disease, rather than expecting community protection through reduced transmission of the bacteria," she says.

The study also identified a number of high-risk behaviours associated with carriage of meningococcal strains in young people, including: smoking cigarettes, attending bars or clubs, and intimate kissing. Older school students, school boarders, and those who had recently had a cold or sore throat were also more likely to carry the meningococcus in their throat.

The outcomes of the B Part of It study are now being used in Australia and globally, to assess the cost-effectiveness of meningococcal B immunization programs for children and young people.

Gill and Oren Klemich, who lost their 18-year-old son, Jack, to meningococcal B in 2009, have supported the B Part of It program as ambassadors and followed the results closely.

"We are extremely proud of the impact this study has had in informing both local and global policy around meningococcal B, as well as the impact the study and South Australian Government funded program have had in vaccinating and protecting over 60,000 young people against this horrible disease," says Oren.

Credit: 
University of Adelaide

Study highlights effectiveness of behavioral interventions in conflict-affected regions

A new study, published in The Lancet Global Health, highlights the effectiveness of behavioural intervention in reducing psychological distress in conflict-affected regions.

More than 125 million people today are directly affected by armed conflict, the highest number since World War II. Although reported rates of mental disorders vary, previous studies have shown that mood and anxiety disorders are common, with high rates for depression and posttraumatic stress disorder. As such, scalable interventions to address a range of mental health problems are desperately needed.

Self-help Plus (SH+) is a guided self-help intervention developed by the World Health Organization (WHO) based on audio-recorded material and an illustrated workbook that can be facilitated by briefly trained non-specialists. It is delivered to groups of up to 30 people over 5 weekly sessions of 2 hours duration. As such, SH+ has been developed as a way of rapidly supporting large numbers of people experiencing psychological distress in the context of humanitarian crises.

Randomised controlled trial

To ascertain the effectiveness of SH+, a research team led by Dr Wietse Tol (Johns Hopkins University, US) and Dr Mark van Ommeren (WHO) conducted the first-ever randomised controlled trial of the intervention. The research team included the University of Liverpool's Dr Ross White (Reader of Clinical Psychology, Department of Psychological Sciences). Dr White was one of the experts consulted by the WHO in the development of the SH+ intervention, which is based on Acceptance and Commitment Therapy.

The trial was conducted in Uganda, which hosts over 1.2 million refugees fleeing conflicts in countries such as neighbouring South Sudan and the Democratic Republic of Congo. The researchers visited 14 different villages in the area and recruited 20-30 female South Sudanese refugees from each village. A total of 694 female refugees with at least moderate levels of psychological distress were recruited into the study. Villages were randomly assigned to receive either SH+ or enhanced usual care. Participants were assessed 1 week before, 1 week after, and 3 months after the intervention.

Results

The findings of this large randomised controlled trial indicated that SH+, compared to enhanced usual care, was effective at reducing psychological distress (as assessed by the Kessler 6 assessment instrument) and bringing about improvements on a range of other outcomes (including functioning, depression and wellbeing) 3 months after the intervention had stopped. The SH+ intervention was highly acceptable to participants with at least 80% of participants allocated to receive SH+ attending each of the five sessions.

Dr White said: "Our research suggests that SH+ may be well suited as a first-line intervention for large populations exposed to major stressors in low-resource settings. SH+ will complement other intensive forms of intervention for those experiencing more severe difficulties.

"Further research is required to explore how the beneficial impact of SH+ can be maintained over longer periods of time."

Credit: 
University of Liverpool

Life's Frankenstein beginnings

image: Szostak believes the earliest cells developed on land in ponds or pools, potentially in volcanically active regions. Ultraviolet light, lightning strikes, and volcanic eruptions all could have helped spark the chemical reactions necessary for life formation.

Image: 
Don Kawahigashi/Unsplash

When the Earth was born, it was a mess. Meteors and lightning storms likely bombarded the planet's surface where nothing except lifeless chemicals could survive. How life formed in this chemical mayhem is a mystery billions of years old. Now, a new study offers evidence that the first building blocks may have matched their environment, starting out messier than previously thought.

Life is built with three major components: RNA and DNA--the genetic molecules that, like construction managers, program how to run and reproduce cells--and proteins, the workers that carry out those instructions. Most likely, the first cells had all three pieces. Over time, they grew and replicated, competing in Darwin's game to create the diversity of life today: bacteria, fungi, wolves, whales and humans.

But first, RNA, DNA or proteins had to form without their partners. One common theory, known as the "RNA World" hypothesis, proposes that because RNA, unlike DNA, can self-replicate, that molecule may have come first. While recent studies discovered how the molecule's nucleotides--the A, C, G and U that form its backbone--could have formed from chemicals available on early Earth, some scientists believe the process may not have been such a straightforward path.

"Years ago, the naive idea that pools of pure concentrated ribonucleotides might be present on the primitive Earth was mocked by Leslie Orgel as 'the Molecular Biologist's Dream,'" said Jack Szostak, a Nobel Prize Laureate, professor of chemistry and chemical biology and genetics at Harvard University, and an investigator at the Howard Hughes Medical Institute. "But how relatively modern homogeneous RNA could emerge from a heterogeneous mixture of different starting materials was unknown."

In a paper published in the Journal of the American Chemical Society, Szostak and colleagues present a new model for how RNA could have emerged. Instead of a clean path, he and his team propose a Frankenstein-like beginning, with RNA growing out of a mixture of nucleotides with similar chemical structures: arabino-, deoxy-, and ribonucleotides (ANA, DNA, and RNA).

In the Earth's chemical melting pot, it's unlikely that a perfect version of RNA formed automatically. It's far more likely that many versions of nucleotides merged to form patchwork molecules with bits of both modern RNA and DNA, as well as largely defunct genetic molecules, such as ANA. These chimeras, like the monstrous hybrid lion, eagle and serpent creatures of Greek mythology, may have been the first steps toward today's RNA and DNA.

"Modern biology relies on relatively homogeneous building blocks to encode genetic information," said Seohyun Kim, a postdoctoral researcher in chemistry and first author on the paper. So, if Szostak and Kim are right and Frankenstein molecules came first, why did they evolve to homogeneous RNA?

Kim put them to the test: He pitted potential primordial hybrids against modern RNA, manually copying the chimeras to imitate the process of RNA replication. Pure RNA, he found, is just better--more efficient, more precise, and faster--than its heterogeneous counterparts. In another surprising discovery, Kim found that the chimeric oligonucleotides--like ANA and DNA--could have helped RNA evolve the ability to copy itself. "Intriguingly," he said, "some of these variant ribonucleotides have been shown to be compatible with or even beneficial for the copying of RNA templates."

If the more efficient early version of RNA reproduced faster than its hybrid counterparts then, over time, it would out-populate its competitors. That's what the Szostak team theorizes happened in the primordial soup: Hybrids grew into modern RNA and DNA, which then outpaced their ancestors and, eventually, took over.
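This takeover dynamic is a standard consequence of compounding replication-rate differences. The toy simulation below is an illustration, not the paper's analysis: it starts with pure RNA rare in a population dominated by hybrids, and uses hypothetical per-generation copying rates to show that even a modest advantage compounds into dominance.

```python
def rna_share(generations, rna_rate=1.10, hybrid_rate=1.05,
              rna0=1.0, hybrid0=99.0):
    """Toy exponential competition between two replicators.

    Returns the fraction of the population that is pure RNA after the
    given number of generations. All rates and starting counts are
    hypothetical.
    """
    rna = rna0 * rna_rate ** generations
    hybrid = hybrid0 * hybrid_rate ** generations
    return rna / (rna + hybrid)

print(rna_share(0))          # RNA starts rare, at 1% of the population
print(rna_share(200) > 0.9)  # ...but dominates after many generations
```

The exact numbers are unimportant; any persistent per-copy advantage produces the same qualitative outcome, which is the selection argument the Szostak team invokes.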

"No primordial pool of pure building blocks was needed," Szostak said. "The intrinsic chemistry of RNA copying chemistry would result, over time, in the synthesis of increasingly homogeneous bits of RNA. The reason for this, as Seohyun has so clearly shown, is that when different kinds of nucleotides compete for the copying of a template strand, it is the RNA nucleotides that always win, and it is RNA that gets synthesized, not any of the related kinds of nucleic acids."

So far, the team has tested only a fraction of the possible variant nucleotides available on early Earth. So, like those first bits of messy RNA, their work has only just begun.

Credit: 
Harvard University

Surprise discovery shakes up our understanding of gene expression

A group of University of Chicago scientists has uncovered a previously unknown way that our genes are made into reality.

Rather than directions going one way from DNA to RNA to proteins, the latest study shows that RNA itself modulates how DNA is transcribed--using a chemical process that is increasingly understood to be vital to biology. The discovery has significant implications for our understanding of human disease and drug design.

"It appears to be a fundamental pathway we didn't know about. Anytime that happens, it holds promise to open up completely new directions of research and inquiry," said Prof. Chuan He, a world-renowned chemist.

The human body is among the most complex pieces of machinery to exist. Every time you so much as scratch your nose, you're using more intricate engineering than any rocket ship or supercomputer ever designed. It's taken us centuries to deconstruct how this works, and each time someone discovers a new mechanism, a few more mysteries of human health make sense--and new treatments become available.

For example, in 2011, He opened a new avenue of research with his discovery of a particular process called reversible RNA methylation, which plays a critical role in how genes are expressed.

The picture many of us remember learning in school is an orderly progression: DNA is transcribed into RNA, which then makes proteins that carry out the actual work of living cells. But it turns out there are a lot of wrinkles.

He's team found that the molecules called messenger RNA, previously known as simple couriers that carry instructions from DNA to proteins, were actually making their own impacts on protein production. This is done by a reversible chemical reaction called methylation; He's key breakthrough was showing that this methylation was reversible. It wasn't a one-time, one-way transaction; it could be erased and reversed.

"That discovery launched us into a modern era of RNA modification research, which has really exploded in the last few years," said He. "This is how so much of gene expression is critically affected. It impacts a wide range of biological processes--learning and memory, circadian rhythms, even something so fundamental as how a cell differentiates itself into, say, a blood cell versus a neuron."

He's team also identified and characterized a number of "reader" proteins that recognize methylated mRNA and impact target mRNA stability and translation.

But as He's lab worked with mice to understand the mechanisms, they began to see that messenger RNA methylation could not fully explain everything they observed.

This was mirrored in other experiments. "The data coming out of the community was saying there's something else out there, something extremely important that we're missing--that critically impacts many early development events, as well as human diseases such as cancer," he said.

He's team discovered that a group of RNAs called chromosome-associated regulatory RNAs, or carRNAs, was using the same methylation process, but these RNAs do not code proteins and are not directly involved in protein translation. Instead, they controlled how DNA itself was stored and transcribed.

"This has major implications in basic biology," He said. "It directly affects gene transcriptions, and not just a few of them. It could induce global chromatin change and affects transcription of 6,000 genes in the cell line we studied."

He sees major implications in biology, especially in human health--everything from identifying the genetic basis of disease to better treating patients.

"There are several biotech companies actively developing small molecule inhibitors of RNA methylation, but right now, even if we successfully develop therapies, we don't have a full mechanical picture for what's going on," he said. "This provides an enormous opportunity to help guide disease indication for testing inhibitors and suggest new opportunities for pharmaceuticals."

Their breakthrough is only the beginning, He said. "I believe this represents a conceptual change," he said. "Barriers like these are hard to crack, but once you do, everything flows from there."

Credit: 
University of Chicago

FSU Research: Despite less ozone pollution, not all plants benefit

image: From left, Christopher Holmes, the Werner A. and Shirley B. Baum assistant professor of meteorology in the Department of Earth, Ocean, and Atmospheric Science at Florida State University, and Jason Ducker, a postdoctoral researcher. Their research compared levels of atmospheric ozone to the amount of ozone plants took in through pores on their leaves at more than 30 sites over 10 years. They found that environmental factors have more impact on the ozone dose the plants received than the amount of ozone in the atmosphere.

Image: 
Bruce Palmer / FSU Photography Services

TALLAHASSEE, Fla. -- Breathe easy: Concentrations of ozone in the air have decreased over large parts of the country in the past several decades.

But not too easy.

Policies and new technologies have reduced emissions of precursor gases that lead to ozone air pollution, but despite those improvements, the amount of ozone that plants are taking in has not followed the same trend, according to Florida State University researchers. Their findings are published in the journal Elementa: Science of the Anthropocene.

"Past studies of plant damage from ozone have been overly optimistic about what the improving ozone air quality means for vegetation health," said Christopher Holmes, the Werner A. and Shirley B. Baum assistant professor of meteorology in the Department of Earth, Ocean, and Atmospheric Science.

Ozone is a gas made of three oxygen atoms. In the upper levels of the atmosphere, it is helpful for life on Earth because it keeps too much ultraviolet radiation from reaching the planet's surface. But when it's found at ground level, ozone is a pollutant that can damage the lungs. It's also toxic for plants, and present-day levels of the pollutant have cut global grain yields by up to 15 percent, resulting in global losses of soybean, wheat, rice and maize valued at $10 billion to $25 billion annually.

The falling levels of ozone pollution are good news for human health, but FSU researchers wanted to know if plants were seeing benefits too. To answer this question, Allison Ronan, a former graduate student, and Jason Ducker, a postdoctoral researcher at FSU, worked with Holmes and another researcher to track the amount of ozone plants sucked up through pores on their leaves over 10 years at more than 30 test sites. They compared those trends to measurements of atmospheric ozone.

As they expected, the ozone concentrations in the air decreased at most of their study sites, but, surprisingly, the ozone uptake into plants at the sites didn't necessarily go down at the same time. In fact, at many sites, atmospheric ozone concentrations fell while the ozone uptake into plants rose.

The different trends happen because plants can open and close the stomata pores on their leaves in response to weather, especially light, temperature, moisture, drought and other environmental conditions. If the stomata close, the plants cease taking up ozone, regardless of the concentration in the surrounding air. That means the ozone uptake into leaves doesn't exactly track the amount of ozone in the air. The FSU scientists found that these environmental factors have more impact on the ozone dose the plants receive than the amount of ozone in the atmosphere.
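The decoupling described above follows from a simple relationship: to a first approximation, the ozone dose a leaf receives scales with stomatal conductance times the ambient concentration. The sketch below uses hypothetical numbers (not SynFlux values, whose flux calculations are far more involved) to show how a modest rise in conductance can outweigh a larger relative drop in ambient ozone.

```python
def stomatal_ozone_flux(conductance, ozone_ppb):
    """Illustrative leaf-level ozone uptake: conductance x ambient ozone.

    Units are arbitrary; this is only the qualitative first-order
    relationship, not a real deposition model.
    """
    return conductance * ozone_ppb

# Hypothetical decade-apart comparison: ambient ozone falls 15%,
# but weather that opens stomata raises conductance 25%.
early = stomatal_ozone_flux(conductance=0.40, ozone_ppb=50.0)
later = stomatal_ozone_flux(conductance=0.50, ozone_ppb=42.5)

print(later > early)  # plant ozone dose rose despite cleaner air
```

Because conductance responds to light, temperature and moisture, year-to-year weather can dominate the trend in dose even as air quality steadily improves, which is the pattern the FSU team reports.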

"We know that weather and growing conditions vary a lot from year to year, and that variability in weather turns out to be more important for driving the trends and variability in ozone uptake into plants than the concentrations in the surrounding air," Holmes said. "With decreasing ozone concentrations, we're moving in the right direction, but the benefits for crops and vegetation may not be apparent until the air quality improvements have persisted longer."

The FSU team identified the differing trends by using a dataset developed by Holmes' research group. The dataset, called SynFlux, fuses measurements from air quality networks with data from field sites that monitor energy flows between vegetation and the atmosphere. It enabled the team to study ozone uptake trends at many more sites than was previously possible.

Future studies of plant damage and accompanying economic losses need to avoid relying primarily on measures of ozone concentration in the atmosphere and look at ozone uptake instead, researchers said.

"With the SynFlux dataset that we have developed, we've now got the information to do that on a large scale at many sites across multiple continents," Holmes said. "We're just scratching the surface of what we can learn about air pollution impacts on vegetation using this tool."

Credit: 
Florida State University

New study provides insights for detecting the invasive brown treesnake


(Carlisle, Pa.) - Researchers from Dickinson College and the U.S. Geological Survey collaborated on field research to understand the ability of human searchers to detect the invasive brown treesnake (BTS) on the island of Guam. Due to their nocturnal and tree-dwelling habits, these snakes are extremely difficult to detect, especially when they are present at low densities in an area. A new study published in the January issue of Ecosphere helps explain why and provides valuable information on optimizing search methods and search locations that could be valuable if the BTS was accidentally introduced to a snake-free island.

In a study partially funded by the U.S. Navy, a team of researchers led by Scott Boback, associate professor of biology at Dickinson College, determined the precise location of snakes using radio telemetry in a 55-hectare forest on Guam, while a team of USGS scientists led by Robert Reed performed visual surveys at the same site. Boback and Reed say the synchronized combination of those techniques revealed where and how visual surveyors failed to detect some snakes. Because this study was performed on a low-density population after intensive snake control efforts, results provide rapid response teams with tools to improve detection during early invasions on nearby islands.

"By having a team perform independent visual surveys at the same time that another team was locating radio-tagged snakes, we were able to conclude that low detection rates are primarily due to snake behavior rather than searcher ability," Reed said. "Our findings about when and where snakes are detectable can then be used to target them more precisely during survey efforts."

Guam is a U.S. territory in the Pacific Ocean approximately 4,000 miles west of the Hawaiian Islands. Since arriving on the island accidentally around the time of World War II, the BTS has caused ecological devastation: it has contributed to the loss of 12 of 13 native forest bird species as well as several lizard and bat species. The snake has also caused millions of dollars of infrastructure damage due to electrical outages and has resulted in hospitalization of numerous infants due to snakebite.

According to Boback, research that improves detection and rapid response measures is key to preventing "Guam 2.0"--or the arrival of the BTS on nearby, snake-free islands. "What would happen if brown treesnakes got transported to nearby, snake-free islands is the nightmare no one wants to talk about," said Boback. "Prevention is the goal, but the spread of the BTS is an event for which preparedness and rapid response are critical."

Boback said the nearly certain "catastrophic cascading effect" of ecological imbalance caused by BTS arriving on nearby islands would include biodiversity loss, increased insect populations and economic loss from ecotourism.

Journal

Ecosphere

DOI

10.1002/ecs2.3000

Credit: 
Dickinson College