Earthquake catalog shows complex rupturing during 2019 Ridgecrest sequence

The 2019 Ridgecrest earthquake sequence, which startled nearby California residents over the 4 July holiday with magnitude 6.4 and magnitude 7.1 earthquakes, included 34,091 earthquakes overall, detailed in a high-resolution catalog created for the sequence.

The catalog, developed by David Shelly at the U.S. Geological Survey in Golden, Colorado, was published in the Data Mine column in Seismological Research Letters. The paper is part of a larger Data Mine series aimed at rapidly sharing data from the Ridgecrest sequence among researchers.

"Because of the complexity in this sequence, I think there are still a lot of unanswered questions about what the important aspects of the triggering and evolution of this sequence were, so having this catalog can help people make more progress on answering those questions," said Shelly.

Shelly used a technique called template matching, which scans continuous seismic signals for waveforms matching the "fingerprints" of 13,525 known and cataloged earthquakes, along with precise relative relocation techniques, to detect 34,091 earthquakes associated with the sequence. Most of the earthquakes were magnitude 2.0 or smaller.
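At its core, template matching is a sliding normalized cross-correlation: a known event's waveform is slid along the continuous record, and offsets where the correlation coefficient exceeds a threshold flag likely repeats. The sketch below is illustrative Python, not Shelly's code; the synthetic data, the buried "event," and the 0.6 threshold are all invented for the demo.

```python
import numpy as np

def normalized_xcorr(template, signal):
    """Slide a waveform template along a continuous signal and return the
    normalized cross-correlation coefficient (Pearson r) at each offset."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    out = np.empty(len(signal) - n + 1)
    for i in range(len(out)):
        w = signal[i:i + n]
        s = w.std()
        out[i] = 0.0 if s == 0 else np.dot(t, (w - w.mean()) / s)
    return out

# Synthetic demo: bury a noisy copy of a decaying wavelet in background noise.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 8 * np.pi, 200)) * np.exp(-np.linspace(0, 4, 200))
signal = rng.normal(0.0, 0.2, 2000)
signal[1200:1400] += template          # hidden "repeat" of the cataloged event

cc = normalized_xcorr(template, signal)
detections = np.where(cc > 0.6)[0]     # offsets that exceed the demo threshold
print(detections)                      # clusters near the buried event at offset 1200
```

Real catalogs apply this against thousands of templates across many stations at once, which is what makes the approach sensitive to small events hidden in noise.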

The catalog covers the period spanning the foreshock sequence leading up to the 4 July 2019 magnitude 6.4 earthquake through the first 10 days of aftershocks following the magnitude 7.1 earthquake on 5 July.

By precisely locating the earthquakes, Shelly was able to discern several crosscutting fault structures in the region, with mostly perpendicular southwest- and northwest-striking orientations. The foreshocks of the magnitude 6.4 event aligned on a northwest-striking fault that appears to have ruptured further in that earthquake's aftershocks, along with a southwest-striking fault where field teams observed a surface rupture.

Shelly said the magnitude 7.1 earthquake appears to have started at the northwestern edge of the magnitude 6.4 rupture, propagating to the northwest and southeast and possibly extending that earlier rupture in both directions. The magnitude 7.1 event was highly complex, with several southwest-striking alignments, multi-fault branching and high rates of aftershocks, especially at the northwestern end of the rupture.

The Ridgecrest earthquakes took place along "a series of immature faults, in the process of developing," Shelly said, noting that this could explain in part why the earthquake sequence was so complex. Compared to the mature San Andreas Fault Zone to the west, which accommodates about half of the relative plate motion as the Pacific and North American tectonic plates slide past each other, the Ridgecrest faults are broadly part of the Eastern California Shear Zone, where multiple faults accommodate up to 25 percent of this tectonic strain.

Shelly noted that the catalog benefitted from the long-established, densely instrumented, real-time seismic network that covers the region. "When there's a big earthquake in an area that's not well-covered, people rush out to try to at least cover the aftershocks with great fidelity," he explained. "Here, having this permanent network makes it so you can evaluate the entire earthquake sequence, starting with the foreshock data, to learn more about the earthquake physics and processes."

Credit: 
Seismological Society of America

Montana State researcher harnesses microorganisms to make living building materials

image: Chelsea Heveran uses a scanning electron microscope to examine Synechococcus cyanobacteria in MSU's Imaging and Chemical Analysis Laboratory on Jan. 14, 2020. MSU Photo by Adrian Sanchez-Gonzalez

Image: 
MSU Photo by Adrian Sanchez-Gonzalez

BOZEMAN -- To make a building material that's alive, Montana State University researcher Chelsea Heveran has a recipe: get some gelatin from the grocery store, make a broth with bacteria called Synechococcus that photosynthesize like plants, add a bit of calcium, then mix with sand and cool until hardened into a concrete-like solid that can be used to replicate itself.

Like many recipes, this one is underpinned by some complex chemistry and is the hard-won result of experimentation, according to Heveran, the lead author of a new paper published in the journal Matter that summarizes the research. The article appeared online Jan. 15 and was featured the same day in a story in the New York Times.

Heveran, assistant professor in the Department of Mechanical and Industrial Engineering in MSU's Norm Asbjornson College of Engineering, said the study marks the first time that microbes have been used as the main catalyst of a building material in a way that preserves them for later use. The team demonstrated that the Synechococcus cyanobacteria remained alive in the sandy bricks for a month or more under favorable conditions of humidity and temperature.

"You can break off a piece and use it to make new bricks," said Heveran, who conducted the study with a team of colleagues at University of Colorado Boulder, where she was a postdoctoral researcher before continuing to contribute to the project as an MSU faculty member.

That's a fundamental improvement over normal concrete, where each batch requires significant amounts of a chemical binder -- cement -- that must be mined, processed and hauled to the mixing site, Heveran explained. By contrast, she imagines a scenario in which a single living brick could be brought to a remote location; simple additives and some basic equipment are all that would then be needed to transform native earth into infrastructure.

The recipe works because the photosynthesizing Synechococcus cause calcium carbonate, the main mineral in limestone, to form in the solution and solidify the sand mixture. The researchers found that they could reliably control the cyanobacteria's behavior by adjusting temperature and humidity. High humidity and cool temperatures let the microbes stay alive for extended time periods, while higher temperatures caused the bricks to re-dissolve and promoted microbial growth and mineralization. In this way, one brick could be divided to generate multiple new living bricks. Maximum strength was achieved by drying the material, which killed the microbes.
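The hardening step is an instance of microbially induced carbonate precipitation. As a general description of that chemistry (not drawn from the paper itself): photosynthesis consumes dissolved carbon dioxide, shifting the bicarbonate equilibrium toward solid calcium carbonate,

Ca²⁺ + 2 HCO₃⁻ ⇌ CaCO₃ (solid) + CO₂ + H₂O

and as the microbes draw down CO₂ the reaction is pulled to the right, cementing the sand grains together.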

According to Heveran, the material is surprisingly tough -- about as strong as cement mortar but weaker than concrete. "We aren't ready to build a skyscraper out of this stuff," she said. Compared to normal concrete, however, the bacterial bricks are relatively easy to recycle by dissolving them and adding new Synechococcus to re-solidify the mixture.

Heveran conducted the research in the Living Materials Laboratory run by Wil Srubar, assistant professor in CU Boulder's Department of Civil, Environmental, and Architectural Engineering, as well as at MSU, where she joined the faculty in August 2018. She said MSU is known for its research in biomineralization -- the process by which living organisms produce minerals that can modify their surroundings. A team of MSU researchers that includes assistant professor of civil engineering Adrienne Phillips has used other biomineralizing bacteria to seal leaks in oil and gas wells. Along with Phillips and Cecily Ryan, assistant professor of mechanical and industrial engineering, Heveran is now researching ways to make an effective concrete filler from discarded plastic that has been shredded and biomineralized to improve its chemical bonding ability.

"We're very excited to have Chelsea here," said Dan Miller, head of the mechanical and industrial engineering department. "Her expertise is a great contribution to MSU's research in materials science and the crossover with biology, which is creating a cutting-edge and growing field in engineering right now."

Heveran said she draws inspiration from bone, a living material she studied while earning her doctorate at CU Boulder. "Bone is amazing because it's made by cells -- it self-repairs and maintains high strength and toughness for decades," she said. The new study published in Matter suggests potential for additional properties engineered into building materials using other microorganisms.

"We're happy with what we engineered -- it has some neat properties," Heveran said. "But we're thinking of this more as a platform to say, 'We really could start to engineer living building materials. We could do so much more.'"

Credit: 
Montana State University

Both simple and advanced imaging can predict best stroke patients for thrombectomy

image: Amrou Sarraj, MD, of McGovern Medical School at UTHealth, is leading research on endovascular thrombectomy for stroke patients

Image: 
Maricruz Kwon, UTHealth

Both simple and advanced computed tomography (CT) were effective in accurately predicting which stroke patients would benefit from endovascular thrombectomy to remove a large cerebral clot, but together they were even better, reported researchers at The University of Texas Health Science Center at Houston (UTHealth).

Results of the multicenter study, Optimizing Patient Selection for Endovascular Treatment in Acute Ischemic Stroke (SELECT), were published in yesterday's Early View edition of the Annals of Neurology.

Stroke is the leading cause of long-term disability and fourth-leading cause of death in the world. An ischemic stroke, caused by a blockage of an artery, is the most common form. Endovascular thrombectomy can be performed to remove a clot lodged in a blood vessel with a mechanical device threaded through an artery. It has been shown to be an effective treatment for improving clinical outcomes in stroke up to 24 hours from onset.

"Endovascular thrombectomy has revolutionized the treatment for acute stroke patients presenting with large vessel occlusion. Different imaging techniques are used to identify patients who may benefit from this treatment. However, how these imaging profiles correlate with each other and with the stroke outcomes is unknown," said Amrou Sarraj, MD, lead author and associate professor of neurology at McGovern Medical School at UTHealth.

Imaging must be done to determine the location of the clot and whether the patient is a good candidate for thrombectomy, meaning they have a smaller area of brain tissue death. Physicians use non-contrast simple CT and/or CT with an injected contrast dye (CT perfusion) to view the clot and surrounding area of cellular death. While simple CT is readily available at most hospitals, CT perfusion tends to be only available at more advanced stroke centers.

Of the 361 patients enrolled, a significant proportion had favorable imaging results on both CT and CT perfusion, meaning they were candidates for endovascular thrombectomy. Those patients also had significantly higher odds of receiving endovascular therapy and a higher 90-day functional independence rate after recovery (58%).

Even when the two imaging modalities disagreed, the functional and safety outcomes were reasonable (38% achieved functional independence), which was better than the patients who did not receive thrombectomy. Patients with an unfavorable result on CT perfusion imaging, but favorable on simple CT, had higher rates of symptomatic hemorrhage in the brain tissue and death after stroke. Patients with unfavorable imaging profiles on both modalities had very poor outcomes.

"While best outcomes were observed in patients with a favorable profile on both imaging modalities, patients who had a favorable profile on at least one imaging modality also achieved reasonable outcomes," said Sarraj, who sees patients at UT Physicians, the clinical practice of McGovern Medical School, and is an attending neurologist at Memorial Hermann-Texas Medical Center.

The ongoing international Phase III randomized controlled trial, SELECT2, also led by Sarraj, will assess the efficacy and safety of the thrombectomy procedure in patients with an unfavorable profile on one or both imaging modalities. The SELECT trials are funded by grants from Stryker Neurovascular.

Credit: 
University of Texas Health Science Center at Houston

Scientists identify gene that puts brakes on tissue growth

image: Results from a Northwestern University study of the planarian flatworm could have ramifications for novel tissue engineering methods.

Image: 
Northwestern University

The planarian flatworm is a simple animal with a mighty and highly unusual ability: it can regenerate itself from nearly every imaginable injury, including decapitation. These tiny worms can regrow any missing cell or tissue -- muscle, neurons, epidermis, eyes, even a new brain.

Since the late 1800s, scientists have studied these worms to better understand fundamental principles of natural regeneration and repair, information that could provide insights into tissue healing and cancer. One still-unknown mechanism is how organisms like these control the proportional scaling of tissue during regeneration.

Now, two Northwestern University molecular biologists have identified the beginnings of a genetic signaling pathway that puts the brakes on the animal's growth. This important process ensures the appropriate amount of tissue growth in these highly regenerative animals.

"These worms have essentially discovered a natural form of regenerative medicine through their evolution," said Christian Petersen, who led the research. "Planarians can regenerate their whole lives, but how do they limit their growth? Our discovery will improve understanding of the molecular components and organizing principles that govern perfect tissue restoration."

The findings ultimately may have important ramifications for novel tissue engineering methods or strategies to promote natural repair mechanisms in humans.

Petersen is an associate professor of molecular biosciences in Northwestern's Weinberg College of Arts and Sciences. He and Erik G. Schad, a graduate student in Petersen's lab, conducted the study.

The results were published in the Jan. 20 issue of the journal Current Biology. Petersen is the corresponding author, and Schad is the paper's first author.

The researchers have identified a control system for limiting regeneration and also a new mechanism to explain how stem cells can influence growth. Specifically, Petersen and Schad discovered that a gene called mob4 suppresses tissue growth in the animals. When the researchers inhibited the gene in experiments, the animal grew to twice its normal size.

The gene, they found, works in a rather surprising way: by preventing the descendants of stem cells from producing a growth factor called Wnt, a protein released from cells to communicate across distances. The Wnt signaling pathway is known to play roles in both cancer and regeneration.

Planarians are 2 to 20 millimeters in size and have a complex anatomy with around a million cells. They live in freshwater ponds and streams around the world. The worm's genome has been sequenced, and its basic biology is well-characterized, making planarians popular with scientists.

Credit: 
Northwestern University

Fungal diversity and its relationship to the future of forests

image: Stanford researchers gathered soil samples from dozens of North American forests, including Pike National Forest in Colorado. They used these samples to better understand the influence of climate change on symbiotic soil microbes that control the health of forests.

Image: 
Kabir Peay

If you indulge in truffles, porcini or chanterelle mushrooms, you have enjoyed a product of ectomycorrhizal fungi. Forming symbiotic relationships with plants - including pine, birch, oak and willow tree species - these fungi have existed for millions of years, their sprawling filaments supporting ecosystems throughout their reach.

According to research from Stanford University, published Jan. 21 in the Journal of Biogeography, by the year 2070, climate change could cause the local loss of over a quarter of ectomycorrhizal fungal species from 3.5 million square kilometers of North American pine forests - an area twice the size of Alaska.

"These are critical organisms for the functioning and the health of forests," said Kabir Peay, associate professor of biology in Stanford's School of Humanities and Sciences and senior author of the study. "We have evidence to suggest that these fungi are as susceptible to climate change as other kinds of organisms and their response may be even more important."

Previously, the Peay lab had mapped the global distributions of forests where trees associate with different types of symbiotic fungi, finding that over 60 percent of all trees on Earth currently associate with ectomycorrhizal fungi. Now, by learning more about the communities these fungi form in different climates, the researchers projected how climate change might affect them in the future.

Microbial maps

Over several years, the Peay lab has gathered about 1,500 soil samples from 68 pine forests, which represent a swath of North America from Florida to Alaska. In past work, they sequenced DNA in each sample to understand what fungal species live in that soil, and in what abundance. Their results, published previously, suggested that fungi were different in each region, contradicting a common assumption that those communities would look similar in most places in the world. They followed that up by mapping the associations between trees and symbiotic microbes around the world.

For their latest paper, Brian Steidinger, a postdoctoral scholar in the Peay lab, explored the relationship between these geographical fungi patterns and historical climate data.

"We took soil from the cores and climatic data unique to each site," said Steidinger, who was lead author of the study. "We found that climate was by far the most important predictor of contemporary fungal diversity patterns across North America."

Steidinger also found that different regions of North America had unique optimal temperatures for fungal diversity. For example, cold boreal forests had a diversity peak around 5 °C mean annual temperature, while Eastern temperate forests peaked in diversity near 20 °C.

The researchers then applied these data to predict future diversity, given projections of climate change produced by the Intergovernmental Panel on Climate Change. Because of the regional differences in optimal climate for fungal diversity, some forests, particularly those in the North and Northwest, could experience major decreases in fungal diversity.
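The logic of that projection can be illustrated with a toy curve fit: fit a "diversity hump" against mean annual temperature, then ask what happens to a site that warms past its regional optimum. The numbers below are invented for illustration; this is not the study's data or its actual model.

```python
import numpy as np

# Hypothetical site observations for one region: species richness vs.
# mean annual temperature (deg C). Values are illustrative only.
temps = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0])
richness = np.array([55.0, 70.0, 82.0, 85.0, 80.0, 68.0, 50.0])

# Fit a quadratic hump, echoing the regional diversity peaks described above.
a, b, c = np.polyfit(temps, richness, 2)
t_opt = -b / (2 * a)                  # temperature of peak predicted diversity

def predicted(t):
    return a * t**2 + b * t + c

# Project a +2 C warming for a site currently sitting at the regional optimum.
now = predicted(t_opt)
future = predicted(t_opt + 2.0)
pct_change = 100 * (future - now) / now
print(round(t_opt, 1), round(pct_change, 1))  # warming past the optimum lowers predicted diversity
```

The same fitted curve implies the asymmetry in the study's results: forests sitting above their regional optimum lose predicted diversity as they warm, while forests below it can gain.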

"According to our models, climate change over the next 50 years could eliminate more than a quarter of ectomycorrhizal species inside 3.5 million square kilometers of North American pine forests," said Steidinger. "That's an area twice the size of Alaska."

Other regions, such as the Eastern temperate forests, could experience gains of 30 to 50 percent - assuming new species can establish as easily as existing ones are lost.

"One of the things that's kind of shocking and a little bit scary is that we predict there will be some pretty significant decreases in diversity in western North America, well known culturally for fungal diversity and for people who are interested in collecting edible mushrooms," Peay said.

Buffering against climate change

Ectomycorrhizal fungi form a sheath around their hosts' roots, which can help prevent erosion and protect roots from damage and disease. The fungi seem to boost carbon storage in soil by slowing down decomposition and encouraging the buildup of soil. They also help their host trees grow more quickly - and therefore take in more carbon - by improving their uptake of nitrogen, which trees need in order to grow.

"In terms of ecosystem function, particularly buffering the atmosphere against climate change, ectomycorrhizal fungi are among the last microbes you want to lose," said Steidinger. "We're expecting to lose the species that seem to be the most functionally intense - the ones with the greatest enzyme activity, the ones that forage out the farthest."

Building on this work, the researchers are considering studying forests with low diversity of fungi and conducting experiments to better understand how these altered fungal communities might function in the future.

"For microbiome work, I feel like we're in a new era of discovery," said Peay. "Like Darwin and Wallace getting on ships and going to new places and seeing new things and changing the way they view the world, that is what is happening in this field."

Credit: 
Stanford University

Study provides insight into 'rapport-building' during victim interviews

A University of Liverpool research paper, published in Psychology, Public Policy, and Law, provides details of the approaches needed to help build rapport with victims of crime during interviews.

Interviewing victims is one of the most challenging aspects of sexual offence investigations. Victims can be unwilling to reveal information, especially within a formal interview setting, yet obtaining it is crucial since they are often the only source of information.

In the UK, US, Canada and Israel there are a number of models and protocols in place, such as the 'PEACE' model and the National Institute of Child Health and Human Development (NICHD) interview protocol, which help to ensure a non-accusatory, information-gathering approach to interviewing victims.

South Korea

In South Korea, sexual offences are a serious social problem. According to recent official statistics, the total number of sexual crimes occurring per year rose 12% from 2014 to 2018.

The Korean National Police Agency (KNPA) has attempted to improve its competence in investigating sex offences, with an emphasis on interviewing because of the limited physical evidence that is often a hallmark of such cases. In 2004, the KNPA introduced the PEACE model to spread current knowledge of investigative interviewing principles and techniques. In 2007, it implemented a nationwide video-recorded interview system to improve the admissibility of police interviews and to protect the human rights of interviewees. In 2010, it disseminated the NICHD protocol to assist officers interviewing child victims.

To enhance and maintain officers' expertise in these guidelines, the KPIA (Korean Police Investigation Academy), the KNPA's professional training institution, provides investigative interviewing courses that combine theory with simulation exercises.

Rapport-based interviewing

Unfortunately, research has found that Korean officers often do not adopt the methods recommended by the NICHD in practice, especially the establishment and building of rapport when interviewing victims. Rapport-building creates a friendly atmosphere and consequently reduces the uneasiness that may exert a negative impact on the information gathered.

Rapport-based interviewing can mitigate the negative feelings of child victims during police interviews and increase the amount of information generated. Despite these findings, little is known about how to create an environment of rapport and, more specifically, there is very little known about the set of behaviours or approaches that underpin it.

To find out more researchers from the University's Centre for Critical and Major Incident Psychology, led by Centre Director Professor Laurence Alison, analysed over 100 hours of KNPA investigative interviews using a framework called ORBIT.

ORBIT

The observing rapport-based interpersonal techniques (ORBIT) framework analyses rapport-based interviewing skills along two dimensions: motivational interviewing (MI) skills and interpersonal competence (use of adaptive interviewing behaviours and absence of maladaptive interviewing behaviours).

MI is a counselling method that helps people resolve ambivalent feelings and insecurities to find the internal motivation they need to change their behaviour. An example is finding the motivation to tell an interviewer what has happened, or details of a perpetrator, despite wanting to forget or not wanting to talk about it.

The researchers coded 103 hours of investigative interviews with sexual offence victims - a sample of 86 single victim cases conducted by 26 police interviewers in South Korea. In all cases, there was a subsequent conviction.

Results

Results showed that humanistic approaches positively influence adaptive interactions between interviewer and victim whilst simultaneously reducing maladaptive ones. This results in an increase in interview yield, the amount of information obtained.

Researchers also found that interviewer adaptive behaviours directly increase victim adaptive behaviour (with the same effect for maladaptive behaviour). Victim adaptive behaviour is positively associated with interview yield, and victim maladaptive behaviour is negatively associated with it.

Professor Alison said: "These results suggest that interviews conducted in a humanistic-consistent fashion strongly positively influence adaptive victim behaviour, which, in turn, increases interview yield."

Credit: 
University of Liverpool

Study results will inform immunization programs globally

video: Professor Helen Marshall (University of Adelaide) and study participant Harry Spurrier discuss the findings of the B Part of It meningococcal B study.

Image: 
University of Adelaide.

The results of the B Part of It study - the largest meningococcal B herd immunity study ever conducted - are published today in the New England Journal of Medicine.

The results have implications for all meningococcal B vaccine programs globally.

Led by Professor Helen Marshall from the University of Adelaide's Robinson Research Institute, the B Part of It study involved almost 35,000 senior school students in South Australia, aged 15 to 18 years, during 2017 and 2018.

"Our study has shown good protection was provided by the meningococcal B vaccine against meningococcal disease in those vaccinated but did not show an overall reduction in the proportion of adolescents carrying the bacteria, including the B strain," Professor Marshall says.

Adolescents can harmlessly carry the meningococcus bacteria in the back of the throat with only a very small proportion developing the disease. Meningococcal B is one of the most common strains that causes meningococcal disease, an acute bacterial infection that kills approximately 10% of those infected, and causes permanent disabilities in about 20% of cases. Those most at risk are babies and children up to the age of five years, and teenagers and young adults from ages 15 to 24 years.

"We are pleased to report not a single case of meningococcal disease among our study participants to date, over the three years since the study began, compared to 12 cases in the same age group in the two years prior to the study.

"Potentially this means a life or lives were saved, as, on average, one in every 10 children with meningococcal disease dies from it," Professor Marshall says.

"These results highlight the importance of individual vaccination for adequate protection, as the vaccine is unlikely to be able to stop spread of the bacteria between individuals," Professor Marshall says. "The study has identified the critical finding that individuals need to be vaccinated to protect themselves against meningococcal B disease, rather than expecting community protection through reduced transmission of the bacteria," she says.

The study also identified a number of high-risk behaviours associated with carriage of meningococcal strains in young people, including: smoking cigarettes, attending bars or clubs, and intimate kissing. Older school students, school boarders, and those who had recently had a cold or sore throat were also more likely to carry the meningococcus in their throat.

The outcomes of the B Part of It study are now being used in Australia and globally, to assess the cost-effectiveness of meningococcal B immunization programs for children and young people.

Gill and Oren Klemich, who lost their 18-year-old son, Jack, to meningococcal B in 2009, have supported the B Part of It program as ambassadors and followed the results closely.

"We are extremely proud of the impact this study has had in informing both local and global policy around meningococcal B, as well as the impact the study and South Australian Government funded program have had in vaccinating and protecting over 60,000 young people against this horrible disease," says Oren.

Credit: 
University of Adelaide

Study highlights effectiveness of behavioral interventions in conflict-affected regions

A new study, published in The Lancet Global Health, highlights the effectiveness of a behavioural intervention in reducing psychological distress in conflict-affected regions.

More than 125 million people today are directly affected by armed conflict, the highest number since World War II. Although reported rates of mental disorders vary, previous studies have shown that mood and anxiety disorders are common, with high rates for depression and posttraumatic stress disorder. As such, scalable interventions to address a range of mental health problems are desperately needed.

Self-help Plus (SH+) is a guided self-help intervention developed by the World Health Organization (WHO) based on audio-recorded material and an illustrated workbook that can be facilitated by briefly trained non-specialists. It is delivered to groups of up to 30 people over 5 weekly sessions of 2 hours duration. As such, SH+ has been developed as a way of rapidly supporting large numbers of people experiencing psychological distress in the context of humanitarian crises.

Randomised controlled trial

To ascertain the effectiveness of SH+, a research team led by Dr Wietse Tol (Johns Hopkins University, US) and Dr Mark van Ommeren (WHO) conducted the first-ever randomised controlled trial of the intervention. The research team included the University of Liverpool's Dr Ross White (Reader in Clinical Psychology, Dept of Psychological Sciences). Dr White was one of the experts consulted by the WHO in developing the SH+ intervention, which is based on Acceptance and Commitment Therapy.

The trial was conducted in Uganda, which hosts over 1.2 million refugees fleeing conflicts in countries such as neighbouring South Sudan and the Democratic Republic of Congo. The researchers visited 14 different villages in the area and recruited 20-30 female South Sudanese refugees from each village. A total of 694 female refugees with at least moderate levels of psychological distress were recruited into the study. Villages were randomly assigned to receive either SH+ or enhanced usual care. Participants were assessed 1 week before, 1 week after, and 3 months after the intervention.

Results

The findings of this large randomised controlled trial indicated that SH+, compared to enhanced usual care, was effective at reducing psychological distress (as assessed by the Kessler 6 assessment instrument) and bringing about improvements on a range of other outcomes (including functioning, depression and wellbeing) 3 months after the intervention had stopped. The SH+ intervention was highly acceptable to participants with at least 80% of participants allocated to receive SH+ attending each of the five sessions.

Dr White said: "Our research suggests that SH+ may be well suited as a first-line intervention for large populations exposed to major stressors in low-resource settings. SH+ will complement other, more intensive forms of intervention for those experiencing more severe difficulties.

"Further research is required to explore how the beneficial impact of SH+ can be maintained over longer periods of time."

Credit: 
University of Liverpool

Life's Frankenstein beginnings

image: Szostak believes the earliest cells developed on land in ponds or pools, potentially in volcanically active regions. Ultraviolet light, lightning strikes, and volcanic eruptions all could have helped spark the chemical reactions necessary for life formation.

Image: 
Don Kawahigashi/Unsplash

When the Earth was born, it was a mess. Meteors and lightning storms likely bombarded the planet's surface where nothing except lifeless chemicals could survive. How life formed in this chemical mayhem is a mystery billions of years old. Now, a new study offers evidence that the first building blocks may have matched their environment, starting out messier than previously thought.

Life is built with three major components: RNA and DNA--the genetic code that, like a construction manager, programs how to run and reproduce cells--and proteins, the workers that carry out those instructions. Most likely, the first cells had all three pieces. Over time, they grew and replicated, competing in Darwin's game to create the diversity of life today: bacteria, fungi, wolves, whales and humans.

But first, RNA, DNA or proteins had to form without their partners. One common theory, known as the "RNA World" hypothesis, proposes that because RNA, unlike DNA, can self-replicate, that molecule may have come first. While recent studies discovered how the molecule's nucleotides--the A, C, G and U that form its backbone--could have formed from chemicals available on early Earth, some scientists believe the path may not have been so straightforward.

"Years ago, the naive idea that pools of pure concentrated ribonucleotides might be present on the primitive Earth was mocked by Leslie Orgel as 'the Molecular Biologist's Dream,'" said Jack Szostak, a Nobel Prize Laureate, professor of chemistry and chemical biology and genetics at Harvard University, and an investigator at the Howard Hughes Medical Institute. "But how relatively modern homogeneous RNA could emerge from a heterogeneous mixture of different starting materials was unknown."

In a paper published in the Journal of the American Chemical Society, Szostak and colleagues present a new model for how RNA could have emerged. Instead of a clean path, he and his team propose a Frankenstein-like beginning, with RNA growing out of a mixture of nucleotides with similar chemical structures: arabino-, deoxy-, and ribonucleotides (ANA, DNA, and RNA).

In the Earth's chemical melting pot, it's unlikely that a perfect version of RNA formed automatically. It's far more likely that many versions of nucleotides merged to form patchwork molecules with bits of both modern RNA and DNA, as well as largely defunct genetic molecules, such as ANA. These chimeras, like the monstrous hybrid lion, eagle and serpent creatures of Greek mythology, may have been the first steps toward today's RNA and DNA.

"Modern biology relies on relatively homogeneous building blocks to encode genetic information," said Seohyun Kim, a postdoctoral researcher in chemistry and first author on the paper. So, if Szostak and Kim are right and Frankenstein molecules came first, why did they evolve to homogeneous RNA?

Kim put them to the test: He pitted potential primordial hybrids against modern RNA, manually copying the chimeras to imitate the process of RNA replication. Pure RNA, he found, is just better--more efficient, more precise, and faster--than its heterogeneous counterparts. In another surprising discovery, Kim found that the chimeric oligonucleotides--like ANA and DNA--could have helped RNA evolve the ability to copy itself. "Intriguingly," he said, "some of these variant ribonucleotides have been shown to be compatible with or even beneficial for the copying of RNA templates."

If the more efficient early version of RNA reproduced faster than its hybrid counterparts then, over time, it would out-populate its competitors. That's what the Szostak team theorizes happened in the primordial soup: Hybrids grew into modern RNA and DNA, which then outpaced their ancestors and, eventually, took over.
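The out-population argument can be illustrated with a toy exponential-growth model (this is an illustration only, not the paper's analysis; the rates and generation counts are invented): if pure RNA copies even slightly faster per generation than a chimeric competitor, its share of the population approaches 100% within relatively few generations.

```python
# Toy model (illustrative, not from the paper): two replicators with
# different per-generation copying rates competing in the same pool.

def population_share(rate_a, rate_b, generations, a0=1.0, b0=1.0):
    """Fraction of the population that is replicator A after n generations."""
    a = a0 * rate_a ** generations
    b = b0 * rate_b ** generations
    return a / (a + b)

# Suppose pure RNA copies 10% faster per generation than a chimeric hybrid.
share = population_share(rate_a=1.1, rate_b=1.0, generations=100)
print(share)  # the faster replicator now makes up >99.9% of the pool
```

Even a modest per-generation advantage compounds, which is why the team's finding that pure RNA is copied faster and more accurately is enough to explain the takeover.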

"No primordial pool of pure building blocks was needed," Szostak said. "The intrinsic chemistry of RNA copying would result, over time, in the synthesis of increasingly homogeneous bits of RNA. The reason for this, as Seohyun has so clearly shown, is that when different kinds of nucleotides compete for the copying of a template strand, it is the RNA nucleotides that always win, and it is RNA that gets synthesized, not any of the related kinds of nucleic acids."

So far, the team has tested only a fraction of the possible variant nucleotides available on early Earth. So, like those first bits of messy RNA, their work has only just begun.

Credit: 
Harvard University

Surprise discovery shakes up our understanding of gene expression

A group of University of Chicago scientists has uncovered a previously unknown way that our genes are made into reality.

Rather than directions flowing one way from DNA to RNA to proteins, the latest study shows that RNA itself modulates how DNA is transcribed--using a chemical process increasingly recognized as vital to biology. The discovery has significant implications for our understanding of human disease and drug design.

"It appears to be a fundamental pathway we didn't know about. Anytime that happens, it holds promise to open up completely new directions of research and inquiry," said Prof. Chuan He, a world-renowned chemist.

The human body is among the most complex pieces of machinery to exist. Every time you so much as scratch your nose, you're using more intricate engineering than any rocket ship or supercomputer ever designed. It's taken us centuries to deconstruct how this works, and each time someone discovers a new mechanism, a few more mysteries of human health make sense--and new treatments become available.

For example, in 2011, He opened a new avenue of research with his discovery of a particular process called reversible RNA methylation, which plays a critical role in how genes are expressed.

The picture many of us remember learning in school is an orderly progression: DNA is transcribed into RNA, which then makes proteins that carry out the actual work of living cells. But it turns out there are a lot of wrinkles.

He's team found that the molecules called messenger RNA, previously regarded as simple couriers that carry instructions from DNA to proteins, were actually making their own impacts on protein production. This happens through a chemical reaction called methylation; He's key breakthrough was showing that this methylation is reversible. It wasn't a one-time, one-way transaction; it could be erased and reversed.

"That discovery launched us into a modern era of RNA modification research, which has really exploded in the last few years," said He. "This is how so much of gene expression is critically affected. It impacts a wide range of biological processes--learning and memory, circadian rhythms, even something so fundamental as how a cell differentiates itself into, say, a blood cell versus a neuron."

He's team also identified and characterized a number of "reader" proteins that recognize methylated mRNA and impact target mRNA stability and translation.

But as He's lab worked with mice to understand the mechanisms, they began to see that messenger RNA methylation could not fully explain everything they observed.

This was mirrored in other experiments. "The data coming out of the community was saying there's something else out there, something extremely important that we're missing--that critically impacts many early development events, as well as human diseases such as cancer," he said.

He's team discovered that a group of RNAs called chromosome-associated regulatory RNAs, or carRNAs, was using the same methylation process, but these RNAs do not code for proteins and are not directly involved in protein translation. Instead, they controlled how DNA itself was stored and transcribed.

"This has major implications in basic biology," He said. "It directly affects gene transcription, and not just of a few genes. It could induce global chromatin change and affect the transcription of 6,000 genes in the cell line we studied."

He sees major implications in biology, especially in human health--everything from identifying the genetic basis of disease to better treating patients.

"There are several biotech companies actively developing small molecule inhibitors of RNA methylation, but right now, even if we successfully develop therapies, we don't have a full mechanistic picture of what's going on," he said. "This provides an enormous opportunity to help guide disease indication for testing inhibitors and suggest new opportunities for pharmaceuticals."

Their breakthrough is only the beginning, He said. "I believe this represents a conceptual change," he said. "Barriers like these are hard to crack, but once you do, everything flows from there."

Credit: 
University of Chicago

FSU Research: Despite less ozone pollution, not all plants benefit

image: From left, Christopher Holmes, the Werner A. and Shirley B. Baum assistant professor of meteorology in the Department of Earth, Ocean, and Atmospheric Science at Florida State University, and Jason Ducker, a postdoctoral researcher. Their research compared levels of atmospheric ozone to the amount of ozone plants took in through pores on their leaves at more than 30 sites over 10 years. They found that environmental factors have more impact on the ozone dose the plants received than the amount of ozone in the atmosphere.

Image: 
Bruce Palmer / FSU Photography Services

TALLAHASSEE, Fla. -- Breathe easy: Concentrations of ozone in the air have decreased over large parts of the country in the past several decades.

But not too easy.

Policies and new technologies have reduced emissions of precursor gases that lead to ozone air pollution, but despite those improvements, the amount of ozone that plants are taking in has not followed the same trend, according to Florida State University researchers. Their findings are published in the journal Elementa: Science of the Anthropocene.

"Past studies of plant damage from ozone have been overly optimistic about what the improving ozone air quality means for vegetation health," said Christopher Holmes, the Werner A. and Shirley B. Baum assistant professor of meteorology in the Department of Earth, Ocean, and Atmospheric Science.

Ozone is a gas made of three oxygen atoms. In the upper levels of the atmosphere, it is helpful for life on Earth because it keeps too much ultraviolet radiation from reaching the planet's surface. But when it's found at ground level, ozone is a pollutant that can damage the lungs. It's also toxic for plants, and present-day levels of the pollutant have cut global grain yields by up to 15 percent, resulting in global losses of soybean, wheat, rice and maize valued at $10 billion to $25 billion annually.

The falling levels of ozone pollution are good news for human health, but FSU researchers wanted to know if plants were seeing benefits too. To answer this question, Allison Ronan, a former graduate student, and Jason Ducker, a postdoctoral researcher at FSU, worked with Holmes and another researcher to track the amount of ozone plants sucked up through pores on their leaves over 10 years at more than 30 test sites. They compared those trends to measurements of atmospheric ozone.

As they expected, the ozone concentrations in the air decreased at most of their study sites, but, surprisingly, the ozone uptake into plants at the sites didn't necessarily go down at the same time. In fact, at many sites, atmospheric ozone concentrations fell while the ozone uptake into plants rose.

The different trends happen because plants can open and close the stomata pores on their leaves in response to weather, especially light, temperature, moisture, drought and other environmental conditions. If the stomata close, the plants cease taking up ozone, regardless of the concentration in the surrounding air. That means the ozone uptake into leaves doesn't exactly track the amount of ozone in the air. The FSU scientists found that these environmental factors have more impact on the ozone dose the plants receive than the amount of ozone in the atmosphere.
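The distinction between ambient concentration and actual dose can be sketched with a toy calculation (the model form is a common simplification and the numbers are invented, not taken from the study): stomatal ozone uptake is often approximated as stomatal conductance multiplied by ambient concentration, so a drop in concentration can be outweighed by weather that keeps stomata open.

```python
# Toy illustration (invented numbers): stomatal ozone uptake approximated as
# flux = stomatal_conductance * ambient_ozone_concentration.

def ozone_uptake(conductance, concentration):
    """Stomatal ozone flux, in arbitrary units."""
    return conductance * concentration

# Year 1: higher ambient ozone, but hot, dry weather keeps stomata partly closed.
year1 = ozone_uptake(conductance=0.4, concentration=50.0)
# Year 2: ambient ozone falls 20%, but wetter weather keeps stomata open wider.
year2 = ozone_uptake(conductance=0.6, concentration=40.0)

print(year1, year2)  # 20.0 vs 24.0: the dose rises even though concentration fell
```

In this sketch the plant's ozone dose increases between years despite improving air quality, which is the pattern the FSU team observed at many sites.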

"We know that weather and growing conditions vary a lot from year to year, and that variability in weather turns out to be more important for driving the trends and variability in ozone uptake into plants than the concentrations in the surrounding air," Holmes said. "With decreasing ozone concentrations, we're moving in the right direction, but the benefits for crops and vegetation may not be apparent until the air quality improvements have persisted longer."

The FSU team identified the differing trends by using a dataset developed by Holmes' research group. The dataset, called SynFlux, fuses measurements from air quality networks with data from field sites that monitor energy flows between vegetation and the atmosphere. It enabled the team to study ozone uptake trends at many more sites than was previously possible.

Future studies of plant damage and accompanying economic losses need to avoid relying primarily on measures of ozone concentration in the atmosphere and look at ozone uptake instead, researchers said.

"With the SynFlux dataset that we have developed, we've now got the information to do that on a large scale at many sites across multiple continents," Holmes said. "We're just scratching the surface of what we can learn about air pollution impacts on vegetation using this tool."

Credit: 
Florida State University

New study provides insights for detecting the invasive brown treesnake

video: This one-minute informational video can be embedded in news stories.

Image: 
Dickinson College

(Carlisle, Pa.) - Researchers from Dickinson College and the U.S. Geological Survey collaborated on field research to understand the ability of human searchers to detect the invasive brown treesnake (BTS) on the island of Guam. Due to their nocturnal and tree-dwelling habits, these snakes are extremely difficult to detect, especially when they are present at low densities in an area. A new study published in the January issue of Ecosphere helps explain why and provides information on optimizing search methods and search locations that could prove valuable if the BTS were accidentally introduced to a snake-free island.

In a study partially funded by the U.S. Navy, a team of researchers led by Scott Boback, associate professor of biology at Dickinson College, determined the precise location of snakes using radio telemetry in a 55-hectare forest on Guam, while a team of USGS scientists led by Robert Reed performed visual surveys at the same site. Boback and Reed say the synchronized combination of those techniques revealed where and how visual surveyors failed to detect some snakes. Because this study was performed on a low-density population after intensive snake control efforts, results provide rapid response teams with tools to improve detection during early invasions on nearby islands.

"By having a team perform independent visual surveys at the same time that another team was locating radio-tagged snakes, we were able to conclude that low detection rates are primarily due to snake behavior rather than searcher ability," Reed said. "Our findings about when and where snakes are detectable can then be used to target them more precisely during survey efforts."

Guam is a U.S. territory in the Pacific Ocean approximately 4,000 miles west of the Hawaiian Islands. Since arriving on the island accidentally around the time of World War II, the BTS has caused ecological devastation: it has contributed to the loss of 12 of 13 native forest bird species as well as several lizard and bat species. The snake has also caused millions of dollars of infrastructure damage due to electrical outages and has resulted in the hospitalization of numerous infants due to snakebites.

According to Boback, research that improves detection and rapid response measures is key to preventing "Guam 2.0"--or the arrival of the BTS on nearby, snake-free islands. "What would happen if brown treesnakes got transported to nearby, snake-free islands is the nightmare no one wants to talk about," said Boback. "Prevention is the goal, but the spread of the BTS is an event for which preparedness and rapid response are critical."

Boback said the nearly certain "catastrophic cascading effect" of ecological imbalance caused by BTS arriving on nearby islands would include biodiversity loss, increased insect populations and economic loss from ecotourism.

Journal

Ecosphere

DOI

10.1002/ecs2.3000

Credit: 
Dickinson College

Status report: OSIRIS-REx completes closest flyover of sample site Nightingale

image: During the OSIRIS-REx Reconnaissance B flyover of primary sample collection site Nightingale, the spacecraft left its safe-home orbit to pass over the sample site at an altitude of 0.4 miles (620 m). The pass, which took 11 hours, gave the spacecraft's onboard instruments the opportunity to take the closest-ever science observations of the sample site.

Image: 
NASA/Goddard/University of Arizona

Preliminary results indicate that NASA's OSIRIS-REx spacecraft successfully executed a 0.4-mile (620-m) flyover of site Nightingale yesterday as part of the mission's Reconnaissance B phase activities. Nightingale, OSIRIS-REx's primary sample collection site, is located within a crater high in asteroid Bennu's northern hemisphere.

To perform the pass, the spacecraft left its 0.75-mile (1.2-km) safe-home orbit and flew an almost 11-hour transit over the asteroid, aiming its science instruments toward the 52-ft (16-m) wide sample site before returning to orbit. Science observations from this flyover are the closest taken of a sample site to date.

The primary goal of the Nightingale flyover was to collect the high-resolution imagery required to complete the spacecraft's Natural Feature Tracking image catalog, which will document the sample collection site's surface features - such as boulders and craters. During the sampling event, which is scheduled for late August, the spacecraft will use this catalog to navigate with respect to Bennu's surface features, allowing it to autonomously predict where on the sample site it will make contact. Several of the spacecraft's other instruments also took observations of the Nightingale site during the flyover event, including the OSIRIS-REx Thermal Emissions Spectrometer (OTES), the OSIRIS-REx Visual and InfraRed Spectrometer (OVIRS), the OSIRIS-REx Laser Altimeter (OLA), and the MapCam color imager.

A similar flyover of the backup sample collection site, Osprey, is scheduled for Feb. 11. Even lower flybys will be performed later this spring - Mar. 3 for Nightingale and May 26 for Osprey - as part of the mission's Reconnaissance C phase activities. The spacecraft will perform these two flyovers at an altitude of 820 feet (250 m), which will be the closest it has ever flown over asteroid Bennu's surface.

Credit: 
NASA/Goddard Space Flight Center

Here, there and everywhere: Large and giant viruses abound globally

image: Art illustration capturing giant virus genomic diversity. This image complements a January 22, 2020, Nature paper led by researchers at the Department of Energy Joint Genome Institute uncovering a broad diversity of large and giant viruses that belong to the nucleocytoplasmic large DNA viruses (NCLDV) supergroup. The team reconstructed 2,074 genomes of large and giant viruses from more than 8,500 publicly available metagenome datasets generated from sampling sites around the world, and virus diversity in this group expanded 10-fold from just 205 genomes, redefining the phylogenetic tree of giant viruses.

Image: 
Zosia Rostomian/Berkeley Lab

While the microbes in a single drop of water could outnumber a small city's population, the number of viruses in the same drop--the vast majority not harmful to humans--could be even larger. Viruses infect bacteria, archaea and eukaryotes, and they range in particle and genome size from small to large and even giant. The genomes of giant viruses are on the order of 100 times the size of those typically associated with viruses, while the genomes of large viruses may be only about 10 times that size. And yet, while they are found everywhere, comparatively little is known about viruses, much less those considered large and giant.

In a recent study published in the journal Nature, a team led by researchers at the U.S. Department of Energy (DOE) Joint Genome Institute (JGI), a DOE Office of Science User Facility located at Lawrence Berkeley National Laboratory (Berkeley Lab), uncovered a broad diversity of large and giant viruses that belong to the nucleocytoplasmic large DNA viruses (NCLDV) supergroup. The expansion of the diversity for large and giant viruses offered the researchers insights into how they might interact with their hosts, and how those interactions may in turn impact the host communities and their roles in carbon and other nutrient cycles.

"This is the first study to take a more global look at giant viruses by capturing genomes of uncultivated giant viruses from environmental sequences across the globe, then using these sequences to make inferences about the biogeographic distribution of these viruses in the various ecosystems, their diversity, their predicted metabolic features and putative hosts," noted study senior author Tanja Woyke, who heads JGI's Microbial Program.

The team mined more than 8,500 publicly available metagenome datasets generated from sampling sites around the world, including data from several DOE-mission relevant proposals through JGI's Community Science Program. Of particular interest were proposals from researchers at Concordia University (Canada), the University of Michigan, the University of Wisconsin-Madison, and the Georgia Institute of Technology, which focused on microbial communities from freshwater ecosystems including, respectively, the northern lakes of Canada, the Laurentian Great Lakes, Lake Mendota and Lake Lanier.

Sifting Out and Reconstructing Virus Genomes

Much of what is known about the NCLDV group has come from viruses that have been co-cultivated with amoeba or with their hosts, though metagenomics is now making it possible to seek out and characterize uncultivated viruses. For instance, a 2018 study from a JGI-led team uncovered giant viruses in the soil for the first time. The current study applied a multi-step approach to mine, bin and then filter the data for the major capsid protein (MCP) to identify NCLDV viruses. JGI researchers previously applied this approach to uncover a novel group of giant viruses dubbed "Klosneuviruses."

Previously known members of the viral lineages in the NCLDV group infect mainly protists and algae, and some of them have genomes in the megabase range. The study's lead and co-corresponding author Frederik Schulz, a research scientist in Woyke's group, used the MCP as a barcode to sift out virus fragments, reconstructing 2,074 genomes of large and giant viruses. More than 50,000 copies of the MCP were identified in the metagenomic data, two-thirds of which could be assigned to viral lineages, and predominantly in samples from marine (55%) and freshwater (40%) environments. As a result, the giant virus protein space grew from 123,000 to over 900,000 proteins, and virus diversity in this group expanded 10-fold from just 205 genomes, redefining the phylogenetic tree of giant viruses.

Metabolic Reprogramming a Common Strategy for Large and Giant Viruses

Another significant finding from the study was a common strategy employed by both large and giant viruses. Metabolic reprogramming, Schulz explained, makes the host function better under certain conditions, which then helps the virus to replicate faster and produce more progeny. This can provide short- and long-term impact on host metabolism in general, or on host populations impacted by adverse environmental conditions. Function prediction on the 2,000 new giant virus genomes led the team to uncover a prevalence of encoded functions that could boost host metabolism, such as genes that play roles in the uptake and transport of diverse substrates, and also photosynthesis genes including potential light-driven proton pumps. "We're seeing that this is likely a common strategy among the large and giant viruses based on the predicted metabolism that's encoded in the viral genomes," he said. "It seems to be way more common than had been previously thought."

Woyke noted that despite the number of metagenome-assembled genomes (MAGs) reconstructed from this effort, the team was still unable to link 20,000 major capsid proteins of large and giant viruses to any known virus lineage. "Getting complete, near complete, or partial giant virus genomes reconstructed from environmental sequences is still challenging and even with this study we are likely to just scratch the surface of what's out there. Beyond these 2,000 MAGs extracted from 8,000 metagenomes, there is still a lot of giant virus diversity that we're missing in the various ecosystems. We can detect a lot more MCPs than we can extract MAGs, and they don't fit in the genome tree of viral diversity - yet."

"We expect this to change with not only new metagenome datasets becoming available but also complementary single-cell sorting and sequencing of viruses together with their unicellular hosts," Schulz added.

Credit: 
DOE/Lawrence Berkeley National Laboratory

3,000-year-old teeth solve Pacific banana mystery

image: The findings were made from 3,000-year-old skeletons at Teouma, the oldest archaeological cemetery in Remote Oceania, a region that includes Vanuatu and all of the Pacific islands east and south, including Hawaii, Rapa Nui and Aotearoa.

Image: 
University of Otago

Humans began transporting and growing banana in Vanuatu 3000 years ago, a University of Otago scientist has discovered.

The discovery is the earliest evidence of humans taking and cultivating banana into what was the last area of the planet to be colonised.

In an article published this week in Nature Human Behaviour, Dr Monica Tromp, Senior Laboratory Analyst at the University of Otago's Southern Pacific Archaeological Research (SPAR), found microscopic particles of banana and other plants trapped in calcified dental plaque of the first settlers of Vanuatu.

The finds came from 3000-year-old skeletons at the Teouma site on Vanuatu's Efate Island.

Dr Tromp used microscopy to look for 'microparticles' in the plaque, also known as dental calculus, scraped from the teeth of the skeletons. That allowed her to discover some of the plants people were eating and using to make materials like fabric and rope in the area when it was first colonised.

Teouma is the oldest archaeological cemetery in Remote Oceania, a region that includes Vanuatu and all of the Pacific islands east and south, including Hawaii, Rapa Nui and Aotearoa. The Teouma cemetery is unique because it is uncommon to find such well-preserved archaeological burials in the Pacific. Bone generally does not preserve in hot and humid climates, and the same is true for things made of plant materials, as well as food.

The first inhabitants of Vanuatu were people associated with the Lapita cultural complex who originated in Island South East Asia and sailed into the Pacific on canoes, reaching the previously uninhabited islands of Vanuatu around 3000 years ago.

There has been debate about how the earliest Lapita people survived when they first arrived to settle Vanuatu and other previously untouched islands in the Pacific. It is thought Lapita people brought domesticated plants and animals with them on canoes - a transported landscape. But direct evidence for these plants had not been found at Teouma until Dr Tromp's study.

"One of the big advantages of studying calcified plaque or dental calculus is that you can find out a lot about otherwise invisible parts of people's lives," Dr Tromp says. "Plaque calcifies very quickly and can trap just about anything you put inside of your mouth - much like the infamous Jurassic Park mosquito in amber - but these are incredibly small things that you can only see with a microscope."

The study began as part of Dr Tromp's PhD research in the Department of Anatomy and involved collaboration with the Vanuatu Cultural Centre, Vanuatu National Herbarium and the community of Eratap village - the traditional landowners of the Teouma site.

Dr Tromp spent hundreds of hours in front of a microscope finding and identifying microparticles extracted from thirty-two of the Teouma individuals. The positive identification of banana (Musa sp.) is direct proof it was brought with the earliest Lapita populations to Vanuatu.

Palm species (Arecaceae) and non-diagnostic tree and shrub microparticles were also identified, indicating these plants were also important to the lives of this early population, possibly for use as food or food wrapping, fabric and rope making, or for medicinal purposes, Dr Tromp says.

"The wide, and often unexpected range of things you can find in calcified plaque makes what I do both incredibly exciting and frustrating at the same time."

The article was co-authored by Elizabeth Matisoo-Smith, Rebecca Kinaston and Hallie Buckley of the University of Otago, and Stuart Bedford and Matthew Spriggs of the Australian National University. It can be found here: https://www.nature.com/articles/s41562-019-0808-y

Credit: 
University of Otago