Bacteria in iron-deficient environments process carbon sources selectively

When humans have low iron levels, they tend to feel weak, fatigued and dizzy. This fatigue keeps patients with iron-deficiency anemia from exercising or exerting themselves, conserving what little energy they have.

Similarly, in low-iron environments, microbes survive by slowing down carbon processing and extracting iron from minerals. However, this strategy requires microbes to invest precious food sources into producing mineral-dissolving compounds. Given this paradox, researchers wanted to understand how microbes sustain survival strategies in environments with too little iron to thrive.

Iron is critical to carbon metabolism because it's required by the proteins involved in processing carbon. But because oxygen makes soluble iron less abundant in the environment, bacteria often operate under iron limitation and need to shut down or dramatically decrease carbon intake.

Looking at a group of bacteria from soil, researchers at Northwestern University discovered that these organisms overcome the limitation in their carbon-processing machinery by rerouting their metabolic pathways to favor producing iron-scavenging compounds. The study is the first to use metabolomics, a high-resolution technique for monitoring carbon flow within cells, to examine how iron affects carbon cycling in bacterial cells.

The study was published today (Nov. 30) in the journal Proceedings of the National Academy of Sciences.

Ludmilla Aristilde, an associate professor of civil and environmental engineering at the McCormick School of Engineering, led the research. Her research group focuses on understanding environmental processes that involve organics, with implications for ecosystem health, agricultural productivity and environmental biotechnology.

Within the metabolic network of bacteria, the citric acid cycle provides the carbon skeletons needed to make iron-scavenging compounds. Metabolizing certain carbon sources yields more of these carbon skeletons and more energy from the citric acid cycle. Iron-starved bacteria favor carbon processing through the citric acid cycle in order to produce more iron-scavenging compounds. Aristilde said this finding is significant because the research reveals that inorganic nutrients can have a direct impact on organic processes.

"The hierarchy in carbon metabolism highlights that selectivity in specific carbon usage is strongly linked to something that is inorganic," Aristilde said. "To put this in the context of climate change, we need to understand what conditions control soil carbon cycling and its contribution to carbon dioxide."

By focusing on the Pseudomonas species in soils, the research group was able to make inferences about other species. The Pseudomonas bacteria also exist as human and plant pathogens, in our gut and elsewhere in the environment. Aristilde hopes that because the bacteria she and her researchers chose to study are so ubiquitous, future research will be able to use her group's findings as a roadmap.

Past research studied organism behavior with a lower resolution of information. While scientists have used genomics to predict what may happen in metabolism of species based on identifying and measuring genes, the Aristilde lab uses metabolomics of the species to capture what is actually happening in metabolism. Their research provides clues that imply many other organisms and systems might also employ similar metabolic strategies.

As an environmental engineer, Aristilde said that her area of study is all about understanding mechanisms and making predictions about how environmental processes like the carbon cycle behave. Beyond carbon cycling and climate change, the study also has implications for plant and human health. Understanding how bacteria that cause infections change carbon metabolism to compete for iron in their plant or human hosts can enable researchers to better design targeted treatments.

Credit: 
Northwestern University

Area burned by severe fire increased 8-fold in western US over past four decades

WASHINGTON--The number of wildfires and the amount of land they consume in the western U.S. have substantially increased since the 1980s, a trend often attributed to ongoing climate change. Now, new research finds fires are not only becoming more common in the western U.S. but the area burned at high severity is also increasing, a trend that may lead to long-term forest loss.

The new findings show warmer temperatures and drier conditions are driving an eight-fold increase in annual area burned by high severity fire across western forests from 1985-2017. In total, annual area burned by high severity wildfires -- defined as those that kill more than 95% of trees -- increased by more than 450,000 acres.

"As more area burns at high severity, the likelihood of conversion to different forest types or even to non-forest increases," said Sean Parks, a research ecologist with the U.S. Forest Service Rocky Mountain Research Station and lead author of the new study. "At the same time, the post-fire climate is making it increasingly difficult for seedlings to establish and survive, further reducing the potential for forests to return to their pre-fire condition."

Parks will present the results Wednesday, 9 December at AGU's Fall Meeting 2020. The findings are also published in AGU's journal Geophysical Research Letters, which publishes high-impact, short-format reports with immediate implications spanning all Earth and space sciences.

Scientists have known for years that wildfires are on the rise in the western U.S., coincident with recent long-term droughts and warmer temperatures. Many western states, especially parts of California, have undergone several multi-year droughts over the past four decades, a fact scientists attribute to human-caused changes to the climate. However, it is less clear how fire severity has changed over the past half century.

In the new study, Parks and John Abatzoglou, an atmospheric scientist at the University of California Merced, used satellite imagery to assess fire severity in four large regions in the western U.S. from 1985 to 2017. Rather than analyze the amount of area burned each year, they instead looked at the area burned at high severity, which is more likely to adversely impact forest ecosystems and human safety and infrastructure.

"The amount of area burned during a given year is an imperfect metric for assessing fire impacts," Parks said. "There was a substantial amount of fire in the western U.S. prior to Euro-American colonization, but that fire did not likely have the extreme effects that we're seeing now."

Beneficial fires

Wildfires were historically a common component of many forest ecosystems, especially in dry areas that receive little or sporadic rainfall. Fire was such a common occurrence in some regions that many tree species - especially certain species of pine - evolved traits that allow them to not only survive fires but to facilitate their ignition as well.

In the mountainous slopes of California, for example, ponderosa pines, sugar pines and giant sequoias sport thick bark that keeps the living tissue underneath insulated from extreme heat. Some tree species also drop the branches growing closest to the ground, which might otherwise allow fires to climb up into the canopy.

Species like jack pines are so dependent on fire that their seeds are unable to effectively disperse until a passing blaze melts the resinous coating surrounding their cones. And the slender, needle-like leaves of pines dry out more quickly than the broad leaves of deciduous hardwoods, making them excellent kindling.

The catch is that these trees evolved to cope with frequent, low-intensity fires. During a severe fire, even the most fire-adapted trees can die. If too many trees die, forest regrowth can be impeded by the lack of viable seeds.

"Forest burned at high severity bears the biggest ecological impacts from a fire," said Philip Dennison, a fire scientist at the University of Utah who was unaffiliated with the study. "These are the areas that are going to take the longest to recover, and in many places that recovery has been put into question due to higher temperatures and drought."

A 2019 study authored by Parks found up to 15% of intermountain forests in the western U.S. are at risk of disappearing. In dry regions, such as the southwestern U.S., that number increases to 30% when assuming fires burn under extreme weather.

As western North America continues to reel from the vice-like grip of droughts and increasing temperatures, scientists expect severe fires will become even more common.

"One take home message is that fire severity is elevated in warmer and drier years in the western U.S., and we expect that climate change will result in even warmer and drier years in the future," Parks said.

Credit: 
American Geophysical Union

HIV-like virus edited out of primate genome

image: Dr. Kamel Khalili and Dr. Tricia Burdo at the Lewis Katz School of Medicine at Temple University

Image: 
Temple University

(Philadelphia, PA) - Taking a major step forward in HIV research, scientists at the Lewis Katz School of Medicine at Temple University have successfully edited SIV - a virus closely related to HIV, the cause of AIDS - from the genomes of non-human primates. The breakthrough brings Temple researchers and their collaborators closer than ever to developing a cure for human HIV infection.

"We show for the first time that a single inoculation of our CRISPR gene-editing construct, carried by an adeno-associated virus, can edit out the SIV genome from infected cells in rhesus macaque monkeys," said Kamel Khalili, PhD, Laura H. Carnell Professor and Chair of the Department of Neuroscience, Director of the Center for Neurovirology, and Director of the Comprehensive NeuroAIDS Center at the Lewis Katz School of Medicine at Temple University (LKSOM).

Dr. Khalili was a senior co-investigator on the new study, with Tricia H. Burdo, PhD, Associate Professor and Associate Chair of Education in the Department of Neuroscience at LKSOM, who is an expert on the utilization of the SIV (simian immunodeficiency virus)-infected antiretroviral therapy (ART)-treated rhesus macaque model for HIV pathogenesis and cure studies; and with Andrew G. MacLean, PhD, Associate Professor at the Tulane National Primate Research Center and the Department of Microbiology and Immunology at Tulane University School of Medicine, and Binhua Ling, PhD, Associate Professor at the Southwest National Primate Research Center, Texas Biomedical Research Institute. Dr. Ling was previously Associate Professor at the Tulane National Primate Research Center and the Department of Microbiology and Immunology at Tulane University School of Medicine. Pietro Mancuso, PhD, an Assistant Scientist in Dr. Khalili's laboratory in the Department of Neuroscience at LKSOM, was first author on the report, which was published online November 27 in the journal Nature Communications.

Of particular significance, the new work shows that the gene-editing construct developed by Dr. Khalili's team can reach infected cells and tissues known to be viral reservoirs for SIV and HIV. These reservoirs, which are cells and tissues where the viruses integrate into host DNA and hide away for years, are a major barrier to curing infection. SIV or HIV in these reservoirs lies beyond the reach of ART, which suppresses viral replication and clears the virus from the blood. As soon as ART is stopped, the viruses emerge from their reservoirs and renew replication.

In non-human primates, SIV behaves very much like HIV. "The SIV-infected rhesus macaque model studied in Dr. Burdo's lab is an ideal large animal model for recapitulating HIV infection in humans," explained Dr. Khalili.

For the new study, the researchers began by designing an SIV-specific CRISPR-Cas9 gene-editing construct. Experiments in cell culture confirmed that the editing tool cleaved integrated SIV DNA out of host cell DNA at the correct location, with limited risk of potentially harmful gene editing at off-target sites. The research team then packaged the construct into an adeno-associated virus 9 (AAV9) carrier, which could be injected intravenously into SIV-infected animals.

Dr. Burdo, in collaboration with colleagues at Tulane National Primate Research Center, randomly selected three SIV-infected macaques to each receive a single infusion of AAV9-CRISPR-Cas9, with another animal serving as a control. After three weeks, the researchers harvested blood and tissues from the animals. Analyses showed that in AAV9-CRISPR-Cas9-treated macaques, the gene-editing construct had been distributed to a broad range of tissues, including the bone marrow, lymph nodes, and spleen, and had reached CD4+ T cells, which are a significant viral reservoir.

Moreover, the Temple researchers demonstrated that the SIV genome was effectively cleaved from infected cells, based on genetic analyses of tissues from treated animals. "The step-by-step excision of SIV DNA occurred with high efficiency from tissues and blood cells," Dr. Mancuso explained. Excision efficiency varied by tissue but reached notably high levels in the lymph nodes.

The new study is a continuation of efforts by Dr. Khalili and colleagues to develop a novel gene-editing system using CRISPR-Cas9 technology - the subject of the 2020 Nobel Prize in Chemistry - to specifically remove HIV DNA from genomes harboring the virus. The researchers have shown previously that their system can effectively eliminate HIV DNA from cells and tissues in HIV-infected small animal models, including HIV-1 humanized mice.

Co-corresponding author Dr. MacLean is encouraged by the findings. "This is an important development in what we hope will be an end to HIV/AIDS," says MacLean. "The next step is to evaluate this treatment over a longer period of time to determine if we can achieve complete elimination of the virus, possibly even taking subjects off of ART."

Dr. MacLean is hopeful that this treatment strategy will translate to the human population. The biotech company Excision BioTherapeutics, of which Dr. Khalili is a scientific founder and where Dr. Burdo contributes to preclinical research and development and serves on the Scientific Advisory Board, will assist with funding and infrastructure for larger scale studies and future clinical trials after approval by the Food and Drug Administration.

"We hope to soon move our work into clinical studies in humans as well," Dr. Khalili added. "People worldwide have been suffering with HIV for 40 years, and we are now very near to clinical research that could lead to a cure for HIV infection."

Credit: 
Temple University Health System

Linking medically complex children's outpatient team with hospitalists improved care

image: Ricardo Mosquera, MD, with one of his patients at the UT Physicians High Risk Children's Clinic, which is linking with hospitalists for continuity of care during a child's hospitalization.

Image: 
Dwight Andrews, UTHealth

When medically complex children are hospitalized, linking hospitalists with the children's regular outpatient providers through an inpatient consultation service was likely to improve outcomes, according to researchers at The University of Texas Health Science Center at Houston (UTHealth).

Results from the quality improvement trial, which showed the inpatient consultation service was more likely to reduce total hospital days, hospital admissions and readmissions, days in the pediatric intensive care unit (PICU), and health care system costs, were published today in JAMA Pediatrics.

"We expected to see an improvement in parent satisfaction, but I was surprised to see how significantly an inpatient consultation from the outpatient providers reduced admissions, readmission, and PICU days, as well as total hospital days and health system costs," said Ricardo Mosquera, MD, first and corresponding author and associate professor of pediatrics at McGovern Medical School at UTHealth. Mosquera is director of the UTHealth High Risk Children's Program, a collaboration between UTHealth and Children's Memorial Hermann Hospital.

Jon E. Tyson, MD, MPH, professor and assistant dean in the Department of Pediatrics, was the senior author.

During a one-year period from 2016 to 2017, 167 children were randomized to the inpatient consultation service and 175 to usual hospital care for the trial. Results showed that with the inpatient consultation service, the probability of reduced hospital days was 91%, of reduced PICU days 89%, and of reduced mean total health care system costs 94%.

In total numbers comparing inpatient consultation service to usual care, there were 296 versus 636 total hospital days; and $24,928 versus $42,276 per child year in costs. The economic analysis was done by health care economist and second author Elenir Avritscher, MD, PhD, associate professor in the Department of Pediatrics at McGovern Medical School.

The trial also revealed that parents were more likely to give an overall rating of 9 or 10 (with 10 being the highest) to the providers of inpatient care.

"We knew that families were concerned about not seeing their regular health care provider while their children were in the hospital," Mosquera said. "It was reassuring for them to see us when necessary for visits in the hospital."

Children with complex medical needs account for 0.4% of U.S. children but are responsible for approximately 40% of pediatric deaths and 54% of all pediatric hospital charges. To help these fragile children have the best outcomes, the UT Physicians High Risk Children's Clinic was established by Mosquera and a multidisciplinary team of health care specialists 10 years ago, growing from one patient to 800.

A UTHealth study published in JAMA in 2014 showed that the outpatient comprehensive care program for high-risk children with medical complexity reduced emergency department visits, hospital and pediatric intensive care unit admissions and stays, and health systems costs. In 2018, the program was named to the national network for Children and Youth Special Health Care Needs, established by the U.S. Maternal and Child Health Bureau.

The UTHealth team recognized a gap: when patients did need to be hospitalized for an acute illness, the hospitalists were unfamiliar with the child and with the outpatient care that could provide key insight into the child's needs during the hospital stay.

"In order to avoid fragmented care, we recognized that we should extend comprehensive care to all settings, including hospitalization," said Mosquera. The clinic also offers telemedicine and, when possible, in-person home visits for its patients.

During the trial, parents of children randomized to the inpatient consultation service were asked to contact the outpatient care team when an emergency department doctor was considering hospitalization for the patient. A study nurse also reviewed the daily log to identify admissions of all study children, including those randomized to usual hospital care.

For the inpatient consultation service, an outpatient clinic provider spoke with a member of the child's in-hospital team before or soon after admission, at discharge, and intermittently during the stay for patients with more complicated care, if needed. The clinic team also participated in the post-hospitalization plan, called the patient within 36 hours of discharge, and scheduled a clinic appointment for no more than 10 days after discharge.

"Our physicians consulted with the hospitalists or emergency department physician to determine if the patient should be hospitalized, the course of treatment and care, and discharge and transition back to the outpatient setting," Mosquera said. "But the hospitalist team still retains full responsibilities. I think this model is important not just for children, but for any risk population regardless of age."

Credit: 
University of Texas Health Science Center at Houston

Worst-case emissions projections are already off-track

image: A plenary at the 25th Conference of the Parties, held by the United Nations Framework Convention on Climate Change in December 2019.

Image: 
UNFCCC / UNclimatechange, Flickr

Under the worst-case scenarios laid out in the United Nations’ climate change projections, global temperatures could increase as much as 7.2 degrees Fahrenheit (more than 4 degrees Celsius) by 2100, leading to as much as 3 feet (0.98 meters) in global sea level rise and an array of disastrous consequences for people and planet. But new research from CU Boulder finds that these high-emissions scenarios, used as baseline projections in the UN’s Intergovernmental Panel on Climate Change (IPCC) global assessments, have not accurately reflected the slowing rate of growth in the global economy and we are unlikely to catch up to them anytime soon.

The new study, published today in Environmental Research Letters, is the most rigorous evaluation of how projected climate scenarios established by the IPCC have evolved since they were established in 2005.

The good news: Emissions are not growing nearly as fast as IPCC assessments have indicated, according to the study's authors. The bad news: The IPCC is not using the most accurate and up-to-date climate scenarios in its planning and policy recommendations.

"If we're making policy based on anticipating future possibilities, then we should be using the most realistic scenarios possible," said Matt Burgess, lead author on the study and a fellow at the Cooperative Institute for Research in Environmental Sciences (CIRES) at CU Boulder. "We'll have better policies as a result."

The IPCC was established in 1988 and provides policymakers around the globe with regular research-based assessments on the current and projected impacts of climate change. Its reports, the sixth of which is due out in 2022, play an instrumental role in shaping global climate policy.

To see if IPCC scenarios are on track, the researchers compared projections from the latest report, published in 2014, and data used to prepare the upcoming report, to data gathered from 2005 to 2017 on country-level gross domestic product (GDP), fossil-fuel carbon dioxide emissions, likely energy use and population trends during this century. Burgess and his co-authors show that even before the pandemic, due to slower-than-projected per-capita GDP growth, as well as a declining global use of coal, these high-emissions scenarios were already well off-track in 2020, and look likely to continue to diverge from reality over the coming decades and beyond. The COVID-19 pandemic's dampening effect on the global economy only accentuates their findings, they said.

As a result, they contend that these high-emissions scenarios should not be used as the baseline scenarios in global climate assessments, which aim to represent where the world is headed without additional climate mitigation policy.

When it comes to climate change scenarios, some scientists and climate experts fear that economic growth will be higher than the projected scenarios, and we'll be taken by surprise by climate changes. But that is unlikely to happen, according to Burgess, assistant professor in environmental studies and faculty affiliate in economics.

This new research adds to a growing literature that argues that economic growth and energy use are currently over-projected for this century. The research also points out that the high-emissions scenarios used by the IPCC don't fully account for economic damages from climate change.

The researchers recommend that these policy-relevant scenarios should be frequently recalibrated to reflect economic crashes, technological discoveries, or other real-time changes in society and Earth's climate. Anticipating the future is difficult and updates are to be expected, according to Roger Pielke Jr., co-author on the paper and professor of environmental studies.

Their study does not mean that people can let their guard down when it comes to addressing climate change, the authors stress. No matter the scenario, the only way to get to net zero emissions as a society is to dramatically reduce carbon dioxide emissions from our energy sources.

"We're still affecting the climate and the challenge of reducing emissions is as hard as ever," said Pielke Jr. "Just because it's not the worst-case scenario doesn't mean that the problem goes away."

Credit: 
University of Colorado at Boulder

Preschool children can't see the mountains for the cat

COLUMBUS, Ohio - Imagine seeing an image of a cat in front of a wide scene of mountains and being told just to remember the mountains if you saw them in a later picture. As an adult, that's not hard to do.

But a new study shows that, even when told to pay attention to the mountain, preschool children focus so much on the cat that they won't later recognize the same mountain.

The results suggest that young children have a bias toward paying attention to objects rather than scenes, even when the task is to attend to the scenes, said Kevin Darby, lead author of the study, who received his doctorate in psychology at The Ohio State University.

"Children really struggled to ignore objects that were irrelevant to what we told them to do," said Darby, who is now a postdoctoral researcher in psychology at the University of Virginia.

The study was published recently in the journal Child Development.

It's not that children can't focus on scenes, said study co-author Vladimir Sloutsky, professor of psychology at Ohio State. They can, when objects do not compete for attention with the scenes.

"Our findings suggest that children failed to filter irrelevant objects, rather than failed to focus on relevant scenes," Sloutsky said.

The study involved 69 preschool children with an average age of 5 and 80 adults.

Participants were first shown a photograph of an object (tree, cat, car, slide or person) superimposed on a scene (beach, street, office, mountain or kitchen). They were told to remember either the object or the scene.

They then viewed a series of photos showing more objects and scenes and were told to indicate if they again saw the scene or object that they were told to remember from the first photo.

Adults had little problem attending to either the relevant object or to the scene and remembering when they saw it again. But children had trouble recognizing a scene when it appeared in the second photo.

"If we showed a child a tree in front of a beach scene and told them to remember the beach, they couldn't do that," Darby said.

"When the child saw a later image with a car in front of the beach, they would be paying attention to the car and would not recognize the beach," Darby said.

So why do children have a bias toward paying attention to objects? This study can't say, but other research offers some possible explanations, according to the researchers.

For one, objects are the main things children focus on early in development, beginning in infancy with everything from parents' faces to the toys they are given, Darby said.

"Objects take up a lot of visual space when you're young and hold them in your hands, so they are easy to focus on," he said.

In addition, objects play a big role in language development. Many of the first words children learn are labels for objects.

There is also evidence that during brain development, regions supporting object recognition develop earlier than regions supporting scene recognition, the researchers said.

Credit: 
Ohio State University

Unexpected similarity between honey bee and human social life

image: An image obtained from the system showing barcoded bees inside the observation hive. Outlines reflect whether a barcode could be decoded successfully (green), could not be decoded (red), or was not detected (no outline). The hive entrance is in the lower-right corner, and the inset reveals two bees that were automatically detected performing trophallaxis.

Image: 
Tim Gernat, University of Illinois

Bees and humans are about as different as two organisms can be. Yet despite their many differences, surprising similarities in the ways that they interact socially have begun to be recognized in the last few years. Now, a team of researchers at the University of Illinois Urbana-Champaign, building on their earlier studies, has experimentally measured the social networks of honey bees and how they develop over time. They discovered detailed similarities with the social networks of humans, and these similarities are completely explained by new theoretical modeling that adapts the tools of statistical physics for biology. The theory, confirmed in experiments, implies that there are individual differences between honey bees, just as there are between humans.

The study, which measures the extent of individual differences in honey bee networking for the first time, was carried out by first author physics PhD student Sang Hyun Choi, postdocs Vikyath D. Rao, Adam R. Hamilton and Tim Gernat, Swanlund Chair of Physics Nigel Goldenfeld and Swanlund Chair of Entomology Gene E. Robinson (GNDP). Goldenfeld and Robinson are also faculty at the Carl R. Woese Institute for Genomic Biology at Illinois, of which Robinson is the director. The collaboration comprised experimental measurements of honey bee social behavior performed by Hamilton, Gernat and Robinson, with data analysis by Rao and theoretical modeling and interpretation by Choi and Goldenfeld. Their findings were published in a recent article in the journal Proceedings of the National Academy of Sciences.

"Originally, we wanted to use honey bees as a convenient social insect to help us find ways to measure and think about complex societies," said Goldenfeld. "A few years ago, Gene, Tim, Vikyath and I collaborated on a large project that put "bar codes" on bees so that we could automatically monitor everywhere they went in the hive, every direction in which they pointed, and every interaction partner. In this way, we could build a social network in time, something known as a temporal network."

That study, done a few years ago, involved high-resolution imaging of barcode-fitted honey bees, with algorithms detecting interaction events by mapping the position and orientation of the bees in the images. In those studies, researchers focused on trophallaxis -- the act of mouth-to-mouth liquid food transfer -- when measuring the social interactions between honey bees. Trophallaxis is used not only for feeding but for communication, making it a model system for studying social interactions.

"We chose to look at trophallaxis because it is the type of honey bee social interaction that we can accurately track," said Choi. "Since honey bees are physically connected to each other by proboscis contact during trophallaxis, we can tell whether they are actually engaging in an interaction or not. In addition, each honey bee is tagged so we can identify each individual engaged in each interaction event."

"In our earlier work, we asked how long bees spend between events where they meet other bees, and we showed that they interact in a non-uniform way," said Goldenfeld. "Sang Hyun and I took the same data set, but now asked a different question: What about the duration of interaction events, not the time between interactions?"

Looking at individual interactions, the researchers saw that the time bees spent together varied widely, from brief contacts to long exchanges. Based on these observations, Choi developed a theory in which each bee has an individual trait of attractiveness, a property that can be likened to human social preferences. For example, humans might prefer to interact with friends or family members rather than strangers.

"We developed a theory for this based on a very simple idea: if a bee is interacting with another bee, you can think of that as a sort of "virtual spring" between them," said Goldenfeld. "The strength of the spring is a measure of how attracted they are to each other so if the spring is weak, then the bees will quickly break the spring and go away, perhaps to find another bee with whom to interact. If the spring is strong, they may stay interacting longer. We call this theoretical description a minimal model, because it can quantitatively capture the phenomenon of interest without requiring excessive and unnecessary microscopic realism. Non-physicists are often surprised to learn that detailed understanding and predictions can be made with a minimum amount of descriptive input."

Goldenfeld explained that the mathematical framework for their theory originated from a branch of physics called statistical mechanics, originally developed to describe gas atoms in a container, and since extended to encompass all states of matter, including living systems. Choi and Goldenfeld's theory made correct predictions about the experimental honey bee dataset that was previously collected.

Out of curiosity, the theory was then applied to human datasets, revealing patterns similar to those seen in the honey bee dataset. Choi and Goldenfeld then applied an economic measure of wealth and income disparities in humans -- termed the Gini coefficient -- to show that bees display disparities in attractiveness in their social interactions, although these disparities are not as pronounced as those among humans. The results indicate a surprising universality in the patterns of social interactions in honey bees and humans.

"It is obvious that human individuals are different, but it is not so obvious for honey bees," said Choi. "Therefore, we examined the inequality in the activity level of the honey bees in a way that is independent of our theory to verify that honey bee workers are indeed different. Previous work done in our group has used the Gini coefficient to quantify the inequality in honey bee foraging activity so we thought that this method would also work to examine the inequality in trophallaxis activity."

"Finding such striking similarities between bees and humans spark interest in discovering universal principles of biology, and the mechanisms that underlie them," said Robinson.

The researchers' findings suggest that complex societies may have surprisingly simple and universal regularities, which can potentially shed light on the way that resilient and robust communities emerge from very different social roles and interactions. The researchers predict that their minimal theory could be applied to other eusocial insects since the theory does not involve honey bee-specific features.

In future studies, the same techniques from statistical mechanics can be applied to understand the cohesiveness of communities through well-characterized pair interactions, said Choi and Goldenfeld.

"This was my first project after I joined Nigel's group, and it took a long time for me to figure out the right way to approach the problem," said Choi. "It was fun and challenging to work on such an interdisciplinary project. As a physics student studying biological systems, I had never expected myself to use concepts from economics."

"It was very exciting to see how simple physical ideas could explain such a seemingly complex and widespread social phenomenon, and also give some organismal insights," said Goldenfeld. "I was very proud of Sang Hyun for having the persistence and insights to figure this out. Like all transdisciplinary science, this was a really tough problem to solve, but incredibly fascinating when it all came together. This is the sort of advance that arises from the co-location of different scientists within the same laboratory -- in this case the Carl R. Woese Institute for Genomic Biology."

Credit: 
Carl R. Woese Institute for Genomic Biology, University of Illinois at Urbana-Champaign

New method identifies adaptive mutations in complex evolving populations

RIVERSIDE, Calif. -- A team co-led by a scientist at the University of California, Riverside, has developed a method to study how HIV mutates to escape the immune system in multiple individuals, which could inform HIV vaccine design.

HIV, which can lead to AIDS, evolves rapidly and attacks the body's immune system. Genetic mutations in the virus can prevent it from being eliminated by the immune system. While there is no effective cure for the virus currently available, it can be controlled with medication.

"Understanding the genetic drivers of disease is important in the biomedical sciences," said John P. Barton, an assistant professor of physics and astronomy at UCR, who co-led the study with Matthew R. McKay, a professor of electronic and computer engineering and chemical and biological engineering at the Hong Kong University of Science and Technology. "Being able to identify genomic rearrangements is key to understanding how illnesses occur and how to treat them."

Barton explained that notable examples of genetic drivers of disease include mutations that allow viruses to escape from immune control, while others confer drug resistance to bacteria.

"It can be difficult, however, to differentiate between real, adaptive mutations and random genetic variation," he added. "The new method we developed allows us to identify such mutations in complex evolving populations."

Evolutionary history, he added, contains information about which mutations affect survival and which simply reflect random variation.

"However, it is computationally difficult to extract this information from data," he said. "We used methods from statistical physics to overcome this computational challenge. Our method can be applied generally to evolving populations and is not limited to HIV."

McKay explained the new method provides a means to efficiently infer selection from observations of complex evolutionary histories.

"It enables us to sort out which genetic changes provide an evolutionary advantage from those that offer no advantage or have a deleterious effect," he said. "The method is quite general and could be potentially used to study diverse evolutionary processes, such as the evolution of drug resistance of pathogens and the evolution of cancers. The accuracy and high efficiency of our approach enable the analysis of selection in complex evolutionary systems that were beyond the reach of existing methods."

Some well-known diseases that have known genetic causes are cystic fibrosis, sickle cell anemia, Duchenne muscular dystrophy, colorblindness, and Huntington's disease.

"In the case of HIV, an understanding of the genetic mutations that lead to HIV resistance could help researchers determine the most appropriate treatment for patients," Barton said. "Our approach isn't limited to HIV, but there are a few reasons why we focused on HIV as a test system. HIV is highly mutable and genetically diverse. It also mutates within humans to escape from the immune system. Understanding the details of how HIV evolves could therefore help to develop better treatments against the virus."

Barton was supported by a grant from the National Institutes of Health. Study results appear in Nature Biotechnology. The title of the research paper is "MPL resolves genetic linkage in fitness inference from complex evolutionary histories."

Barton and McKay were joined in the study by Muhammad Saqib Sohail and Raymond H. Y. Louie of Hong Kong University of Science and Technology and the University of New South Wales.

The University of California, Riverside (http://www.ucr.edu) is a doctoral research university, a living laboratory for groundbreaking exploration of issues critical to Inland Southern California, the state and communities around the world. Reflecting California's diverse culture, UCR's enrollment is more than 24,000 students. The campus opened a medical school in 2013 and has reached the heart of the Coachella Valley by way of the UCR Palm Desert Center. The campus has an annual statewide economic impact of almost $2 billion. To learn more, email news@ucr.edu.

Credit: 
University of California - Riverside

Earthquake scenario for large German city

image: City of Cologne: probabilities of occurrence of no (left), weak (middle), or strong (right) damage per building. Of the city's roughly 170,000 residential buildings, more than 10,000 are estimated to suffer moderate to severe damage.

Image: 
GFZ

What if there is a major earthquake near Cologne? This scenario is the subject of the "Risk Analysis in Civil Protection 2019", whose report was recently submitted to the German Bundestag (document: Bundestag Drucksache 19/23825). In the 125-page document, a group of experts has listed in detail, on the basis of extensive research work, what effects can be expected in the event of strong ground movements. What Germans usually know only from TV and media reports about other countries is here the result of modeling a strong earthquake near the major city of Cologne: ground shaking, damaged and destroyed houses, blocked roads, many injured and dead.

The German Research Center for Geosciences GFZ and its researchers played a central role in this analysis. The GFZ had the task of modeling the ground movements caused by such an earthquake and quantifying possible damage to the city's buildings. In particular, new geophysical models for the Lower Rhine Bay were developed to estimate the influence of the near-surface layers of the subsoil on ground movements. The researchers created a "building-by-building" model of the city in order to quantify the number and vulnerability of buildings that could be affected by the earthquake.

A strong earthquake in the Lower Rhine Bay with a magnitude of 6.5, as assumed for the underlying scenario, is quite possible. The GFZ expert for historical earthquakes, Gottfried Grünthal, says: "Statistical analyses show that an earthquake with a magnitude of 5.5 is to be expected in the Lower Rhine Bay approximately every hundred to three hundred years. A quake with a magnitude of 6.5 is to be expected approximately every 1000 to 3000 years."
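
Recurrence intervals like these can be turned into probabilities over a planning horizon. The small calculation below treats the quoted intervals as the mean waiting time of a Poisson process, a common simplifying assumption that is not part of the Bundestag report itself.

```python
import math

def exceedance_probability(mean_recurrence_years, horizon_years):
    """Probability of at least one event within the horizon,
    assuming events follow a Poisson process with the given mean recurrence."""
    return 1.0 - math.exp(-horizon_years / mean_recurrence_years)

# Recurrence intervals quoted for the Lower Rhine Bay (midpoints used here).
for label, recurrence in [("M 5.5, every ~100-300 years", 200),
                          ("M 6.5, every ~1000-3000 years", 2000)]:
    p50 = exceedance_probability(recurrence, 50)
    print(f"{label}: roughly {p50:.0%} chance in the next 50 years")
```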

Marco Pilz, scientist of the GFZ section earthquake hazard and dynamic risks, describes the fictitious initial situation: "At a depth of only a few kilometers, a tectonic fault ruptures in the Lower Rhine Bay. Only seconds later the shock waves reach the surface and the nearby city of Cologne. The ground starts to shake, buildings creak and sometimes collapse, streets are blocked by falling debris. Good knowledge of the local underground conditions has shown us that these conditions must be taken into account for an accurate modeling of the shaking".

Based on this, a building-related damage assessment suggests that major impacts can be expected in the city of Cologne. "Old buildings are likely to be particularly affected, so that the distribution of damage in the city area could be quite heterogeneous," adds Cecilia Nievas, a researcher from the same section. "Of the estimated 170,000 residential buildings in the city, more than 10,000 could suffer moderate to severe damage according to our calculations".

The further effects, for example on utilities, are more difficult to assess and require detailed investigations: How many hospitals are affected, what capacities remain for the treatment of the injured, and how well do emergency services reach affected regions? GFZ-researcher Pilz: "Although we at GFZ had contributed a large part to this risk analysis, what was remarkable about the cooperation was the involvement of many experts from federal and state authorities, the district government, the affected districts, the cities and their immediately affected services such as the fire department, THW, railroads and energy suppliers. Everyone has worked together, from the very top down to the local level".

Section head Fabrice Cotton adds: "It was a very productive exchange of information. The elaboration of such scenarios is important because they provide an effective tool for dialogue with the authorities and for understanding their needs when planning relief operations. Such exercises can also help to gain a complete overview of the entire seismic risk chain (from the physics of the earthquake to its effects) and to work at the interface between different scientific disciplines (e.g. here between seismology and civil engineering)".

Credit: 
GFZ GeoForschungsZentrum Potsdam, Helmholtz Centre

Research unlocks new information about reading through visual dictionary in the brain

image: Nitin Tandon, MD, and his research team identified a crucial region in the temporal lobe which appears to act as the brain's visual dictionary.

Image: 
James LaCombe

The uniquely human ability to read is the cornerstone of modern civilization, yet very little is understood about the effortless ability to derive meaning from written words. Scientists at The University of Texas Health Science Center at Houston (UTHealth) have now identified a crucial region in the temporal lobe, known as the mid-fusiform cortex, which appears to act as the brain's visual dictionary. While reading, the ability of the human brain to distinguish between a real word such as "lemur" and a made-up word like "urmle" appears to lie in the way that region processes information.

These findings were published today in Nature Human Behaviour.

"How much the mid-fusiform responds to a word and how quickly it can distinguish between a real and made-up word is highly dependent on how frequently the real word is encountered in everyday language," said Nitin Tandon, MD, senior author, professor and vice chair in the Vivian L. Smith Department of Neurosurgery at McGovern Medical School at UTHealth. "So short, common words like 'say' can be identified quickly but long, infrequent words like 'murmurings' take longer to be identified as real words."

For the study, Nitin Tandon and postdoctoral fellow Oscar Woolnough, PhD, the lead author, used recordings from patients who had electrodes temporarily placed in their brains while undergoing treatment for epilepsy. These recordings were then used to create a visual representation of the early neural processing of written words.

They found that this region, which has been overlooked in many previous studies of reading, compares incoming strings of letters encountered while reading with stored patterns of learned words. After words are identified in this area of the brain, this information is spread to other visual-processing regions of the brain.

"Since word frequency is one of the main factors that determines how fast people can read, it is likely that the mid-fusiform acts as the bottleneck to reading speed," Tandon said. "We showed that if we temporarily disrupt activity in the mid-fusiform cortex using briefly applied electrical pulses, it causes a temporary inability to read, a dyslexia, but doesn't disrupt other language functions like naming visual objects or understanding speech."

Tandon said the study serves to improve understanding of how people read and can help people with reading disorders such as dyslexia, the most common learning disability.

Credit: 
University of Texas Health Science Center at Houston

Report assesses promises and pitfalls of private investment in conservation

image: A shade-grown coffee farm near the town of Jardín in the Antioquia department of Colombia. Coffee beans grown under trees are higher quality, supporting the livelihoods of farmers and their families. Trees on shade-coffee farms are beneficial for soil health and provide habitat to migratory birds. Photo courtesy of Amanda Rodewald.

Image: 
Amanda Rodewald

The Ecological Society of America (ESA) today released a report entitled "Innovative Finance for Conservation: Roles for Ecologists and Practitioners" that offers guidelines for developing standardized, ethical and effective conservation finance projects.

Public and philanthropic sources currently supply most of the funds for protecting and conserving species and ecosystems. However, the private sector is now driving demand for market-based mechanisms that support conservation projects with positive environmental, social and financial returns. Examples of projects that can support this triple bottom line include green infrastructure for stormwater management, clean transport projects and sustainable production of food and fiber products.

"The reality is that public and philanthropic funds are insufficient to meet the challenge to conserve the world's biodiversity," said Garvin Professor and Senior Director of Conservation Science at Cornell University Amanda Rodewald, the report's lead author. "Private investments represent a new path forward both because of their enormous growth potential and their ability to be flexibly adapted to a wide variety of social and ecological contexts."

Today's report examines the legal, social and ethical issues associated with innovative conservation finance and offers resources and guidelines for increasing private capital commitments to conservation. It also identifies priority actions that individuals and organizations working in conservation finance will need to adopt in order to "mainstream" the field.

One priority action is to standardize the metrics that allow practitioners to compare and evaluate projects. While the financial services and investment sectors regularly employ standardized indicators of financial risk and return, it is more difficult to apply such indicators to conservation projects. Under certain conservation financing models, for example, returns on investment are partially determined by whether the conservation project is successful - but "success" can be difficult to quantify when it is defined by complex social or environmental changes, such as whether a bird species is more or less at risk of going extinct as a result of a conservation project.

Another priority action is to establish safeguards and ethical standards for involving local stakeholders, including Indigenous communities. In the absence of robust accountability and transparency measures, mobilizing private capital in conservation can result in unjust land grabs or in unscrupulous investments where profits flow disproportionately to wealthy or powerful figures. The report offers guidelines for ensuring that conservation financing improves the prosperity of local communities.

According to co-author Peter Arcese, a professor at the University of British Columbia and adjunct professor at Cornell University, opportunities in conservation finance are growing for patient investors who are interested in generating modest returns while simultaneously supporting sustainable development.

"Almost all landowners I've worked with in Africa and North and South America share a deep desire to maintain or enhance the environmental, cultural and aesthetic values of the ecosystems their land supports," Arcese said. "By creating markets and stimulating investment in climate mitigation, and forest, water and biodiversity conservation projects, we can offer landowners alternative income sources and measurably slow habitat loss and degradation."

Rodewald sees a similar landscape of interest and opportunity. "No matter the system - be it a coffee plantation in the Andes, a timber harvest in the Pacific Northwest, or a farm in the Great Plains - I am reminded again and again that conservation is most successful when we safeguard the health and well-being of local communities. Private investments can be powerful tools to do just that," said Rodewald.

Credit: 
Ecological Society of America

App predicts risk of developing Alzheimer's

A new study from Lund University in Sweden shows that validated biomarkers can reveal an individual's risk of developing Alzheimer's disease. Using a model that combines the levels of two specific proteins in the blood of those with mild memory impairment, the researchers are able to predict the risk of developing Alzheimer's. The researchers have also developed an app that doctors can use to give patients a risk assessment.

Oskar Hansson and his colleagues have been researching different biomarkers for a long time to produce better diagnostics at an early stage of Alzheimer's disease. Over the past year, they have also developed accurate markers in blood tests for Alzheimer's. The aim has been to identify the disease at an early stage of its progression, before the actual dementia stage, in order to begin treatment to ease symptoms, avoid unnecessary examinations and create a sense of security among patients.

"Many people with Alzheimer's disease currently seek care when they have only developed mild memory impairment, meaning many years before the dementia stage of the disease. It is often difficult for doctors to give the correct diagnosis in people with milder memory impairment, as many different conditions other than Alzheimer's can be the cause. In this study we developed a model that is based on the results of a simple blood test and that with a high degree of validity can predict who will develop Alzheimer's dementia within four years", explains Oskar Hansson, professor in neurology at Lund University and consultant at the clinical memory research unit at Skåne University Hospital.

Among the many biomarkers the researchers have investigated over the years, the current study shows that a model combining the blood levels of the two proteins 'phosphorylated tau' and 'neurofilament light' is the one that gives the most reliable result, with a prognosis comparable to today's cerebrospinal fluid (CSF) analyses. The model also provides a more reliable answer than the current basic model involving age, gender, education and basic memory tests.

"In addition to this initial evaluation, the methods that are currently on offer for diagnosing Alzheimer's disease are costly and time-consuming methods using PET cameras and CSF analyses, which are only available in certain specialist healthcare settings. Our goal over the last few years has been to find simple methods that can be used in primary care to make an early diagnosis and to begin treatment to relieve symptoms at an earlier stage. This will require more studies, but we have absolutely come one major step closer to our goal", adds Hansson.

The researchers have also developed an online tool - an app - that combines basic data (age, gender, education and basic cognition tests) with results from the individual's biomarkers in the blood. Taken together, these provide information about an individual patient's risk of developing Alzheimer's disease within two or four years. The app is currently only intended for research and must be validated in more studies before it can be used in healthcare. The app is available at: https://brainapps.shinyapps.io/plasmaatnapp/
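
As a rough sketch of how a risk tool of this kind can combine basic data with plasma biomarkers, the code below fits a logistic regression on simulated data with features mirroring the inputs listed above. The synthetic data, the logistic form and every coefficient are illustrative assumptions, not the published model behind the app.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 573  # same cohort size as the study; everything below is synthetic

# Synthetic features loosely mirroring the app's inputs (all values made up).
age = rng.normal(71, 5, n)
sex = rng.integers(0, 2, n)
education_years = rng.normal(12, 3, n)
memory_score = rng.normal(25, 4, n)          # basic cognitive test
plasma_ptau = rng.lognormal(1.0, 0.5, n)     # phosphorylated tau
plasma_nfl = rng.lognormal(3.0, 0.4, n)      # neurofilament light

X = np.column_stack([age, sex, education_years, memory_score, plasma_ptau, plasma_nfl])

# Synthetic outcome: higher biomarkers and lower memory scores raise conversion risk.
logit = (-4.2 + 0.05 * age - 0.1 * memory_score
         + 0.8 * np.log(plasma_ptau) + 0.6 * np.log(plasma_nfl))
converted = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(X, converted, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

risk = model.predict_proba(X_test)[:, 1]      # per-person predicted conversion risk
print(f"AUC on held-out synthetic data: {roc_auc_score(y_test, risk):.2f}")
```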

The study involved a total of 573 people with mild cognitive impairment and an average age of 71. The participants were drawn from two major multi-centre studies: the Swedish BioFinder Study and ADNI, the Alzheimer's Disease Neuroimaging Initiative. The study was led by Oskar Hansson and Niklas Mattsson-Carlgren from Lund University and is a collaboration with Henrik Zetterberg and Kaj Blennow at the Sahlgrenska Academy and Sahlgrenska University Hospital, the Clinical Neurochemistry Laboratory in Mölndal and the American pharmaceutical company Eli Lilly, among others.

Credit: 
Lund University

Study reveals new findings on nature's UV sunscreens

image: New research provides an insight into the behaviour of mycosporine-like amino acids which protect living organisms in our oceans and lakes from the damaging effects of ultraviolet radiation.

Image: 
Swansea University

Swansea University research has provided a new insight into the behaviour of nature's own UV sunscreens when they are exposed to other parts of the light spectrum.

Mycosporine-like amino acids (MAAs) provide screening against the damaging effects of ultraviolet radiation in living organisms in our oceans and lakes.

These compounds are known to increase in the environment where levels of UV are high. The uniqueness of these compounds has led to interest from the healthcare industry in the development of more natural sunscreen formulations.

But a new study by scientists at Swansea University has found that these compounds are also increased when living cells of algae are exposed to light at the far-red end of the light spectrum.

Whilst it is widely known that UV light increases the concentrations of MAAs, no-one had investigated the effects of other regions of the light spectrum.

The study's author Professor Carole Llewellyn, from the Faculty of Engineering and Science, said: "It is interesting that the far-red light plays a role in the production of these UV sunscreen compounds and highlights how different regions of the light spectrum play an important role in maintaining a healthy balance within cells."

"Our discovery also highlights the complex interplay in nature and throws into question the role of these sunscreen compounds with the possibility that that they could also be important in contributing to temperature regulation of the earth's surface."

Credit: 
Swansea University

Study reveals connection between gut bacteria and vitamin D levels

Our gut microbiomes -- the many bacteria, viruses and other microbes living in our digestive tracts -- play important roles in our health and risk for disease in ways that are only beginning to be recognized.

University of California San Diego researchers and collaborators recently demonstrated in older men that the makeup of a person's gut microbiome is linked to their levels of active vitamin D, a hormone important for bone health and immunity.

The study, published November 26, 2020 in Nature Communications, also revealed a new understanding of vitamin D and how it's typically measured.

Vitamin D can take several different forms, but standard blood tests detect only one, an inactive precursor that can be stored by the body. To use vitamin D, the body must metabolize the precursor into an active form.

"We were surprised to find that microbiome diversity -- the variety of bacteria types in a person's gut -- was closely associated with active vitamin D, but not the precursor form," said senior author Deborah Kado, MD, director of the Osteoporosis Clinic at UC San Diego Health. "Greater gut microbiome diversity is thought to be associated with better health in general."

Kado led the study for the National Institute on Aging-funded Osteoporotic Fractures in Men (MrOS) Study Research Group, a large, multi-site effort that started in 2000. She teamed up with Rob Knight, PhD, professor and director of the Center for Microbiome Innovation at UC San Diego, and co-first authors Robert L. Thomas, MD, PhD, fellow in the Division of Endocrinology at UC San Diego School of Medicine, and Serene Lingjing Jiang, graduate student in the Biostatistics Program at Herbert Wertheim School of Public Health and Human Longevity Sciences.

Multiple studies have suggested that people with low vitamin D levels are at higher risk for cancer, heart disease, worse COVID-19 infections and other diseases. Yet the largest randomized clinical trial to date, with more than 25,000 adults, concluded that taking vitamin D supplements has no effect on health outcomes, including heart disease, cancer or even bone health.

"Our study suggests that might be because these studies measured only the precursor form of vitamin D, rather than active hormone," said Kado, who is also professor at UC San Diego School of Medicine and Herbert Wertheim School of Public Health. "Measures of vitamin D formation and breakdown may be better indicators of underlying health issues, and who might best respond to vitamin D supplementation."

The team analyzed stool and blood samples contributed by 567 men participating in MrOS. The participants live in six cities around the United States; their mean age was 84 and most reported being in good or excellent health. The researchers used a technique called 16S rRNA sequencing to identify and quantify the types of bacteria in each stool sample based on unique genetic identifiers. They used a method known as LC-MS/MS (liquid chromatography-tandem mass spectrometry) to quantify vitamin D metabolites (the precursor, active hormone and the breakdown product) in each participant's blood serum.
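
As a rough sketch of what this kind of analysis looks like in practice, the snippet below computes a common alpha-diversity measure (the Shannon index) from a toy 16S count table and tests its rank correlation with active vitamin D levels. The data and variable names are illustrative only, not the study's actual pipeline or results.

```python
# Toy example: per-sample microbiome diversity vs. active vitamin D.
import numpy as np
from scipy.stats import spearmanr

# Toy 16S count table: rows = participants, columns = bacterial taxa
counts = np.array([
    [120,  30,  5,  0, 45],
    [ 60,  60, 40, 20, 20],
    [200,   5,  0,  0, 10],
    [ 50,  55, 45, 30, 25],
])

# Matching serum active vitamin D measurements (toy values)
active_vit_d = np.array([35.0, 52.0, 28.0, 60.0])

def shannon_diversity(row):
    """Shannon index H = -sum(p * ln p) over taxa with nonzero counts."""
    p = row / row.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

diversity = np.array([shannon_diversity(r) for r in counts])

# Rank-based correlation, a common choice for microbiome-metabolite associations
rho, p_value = spearmanr(diversity, active_vit_d)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```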

In addition to discovering a link between active vitamin D and overall microbiome diversity, the researchers also noted that 12 particular types of bacteria appeared more often in the gut microbiomes of men with lots of active vitamin D. Most of those 12 bacteria produce butyrate, a beneficial fatty acid that helps maintain gut lining health.

"Gut microbiomes are really complex and vary a lot from person to person," Jiang said. "When we do find associations, they aren't usually as distinct as we found here."

Because they live in different regions of the U.S., the men in the study are exposed to differing amounts of sunlight, a source of vitamin D. As expected, men who lived in San Diego, California, got the most sun, and they also had the highest levels of the precursor form of vitamin D.

But the team unexpectedly found no correlations between where men lived and their levels of active vitamin D hormone.

"It seems like it doesn't matter how much vitamin D you get through sunlight or supplementation, nor how much your body can store," Kado said. "It matters how well your body is able to metabolize that into active vitamin D, and maybe that's what clinical trials need to measure in order to get a more accurate picture of the vitamin's role in health."

"We often find in medicine that more is not necessarily better," Thomas added. "So in this case, maybe it's not how much vitamin D you supplement with, but how you encourage your body to use it."

Kado pointed out that the study relied on a single snapshot in time of the microbes and vitamin D found in participants' blood and stool, and those factors can fluctuate over time depending on a person's environment, diet, sleep habits, medications and more. According to the team, more studies are needed to better understand the part bacteria play in vitamin D metabolism, and to determine whether intervening at the microbiome level could be used to augment current treatments to improve bone and possibly other health outcomes.

Credit: 
University of California - San Diego

New tech can get oxygen, fuel from Mars's salty water

image: This illustration shows Jezero Crater -- the landing site of the Mars 2020 Perseverance rover -- as it may have looked billions of years ago on Mars, when it was a lake. Vijay Ramani's lab has developed a way to extract hydrogen and oxygen out of the briny water that may remain under the Martian surface.

Image: 
NASA/JPL-Caltech

When it comes to water and Mars, there's good news and not-so-good news. The good news: there's water on Mars! The not-so-good news?

There's water on Mars.

The Red Planet is very cold; water that isn't frozen is almost certainly full of salt from the Martian soil, which lowers its freezing temperature.

You can't drink salty water, and the usual method of using electricity (electrolysis) to break it down into oxygen (to breathe) and hydrogen (for fuel) requires removing the salt, a cumbersome and costly endeavor in a harsh, dangerous environment.

If oxygen and hydrogen could be directly coerced out of briny water, however, that brine electrolysis process would be much less complicated -- and less expensive.

Engineers at the McKelvey School of Engineering at Washington University in St. Louis have developed a system that does just that. Their research was published today in the Proceedings of the National Academy of Sciences (PNAS).

The research team, led by Vijay Ramani, the Roma B. and Raymond H. Wittcoff Distinguished University Professor in the Department of Energy, Environmental & Chemical Engineering, didn't simply validate its brine electrolysis system under typical terrestrial conditions; the system was examined in a simulated Martian atmosphere at -33 °F (-36 °C).

"Our Martian brine electrolyzer radically changes the logistical calculus of missions to Mars and beyond" said Ramani. "This technology is equally useful on Earth where it opens up the oceans as a viable oxygen and fuel source"

In the summer of 2008, NASA's Phoenix Mars Lander "touched and tasted" Martian water, vapors from melted ice dug up by the lander. Since then, the European Space Agency's Mars Express has discovered several underground ponds of water which remain in a liquid state thanks to the presence of magnesium perchlorate -- salt.

In order to live -- even temporarily -- on Mars, not to mention to return to Earth, astronauts will need to manufacture some of the necessities, including water and fuel, on the Red Planet. NASA's Perseverance rover is en route to Mars now, carrying instruments that will use high-temperature electrolysis. However, the Mars Oxygen In-Situ Resource Utilization Experiment (MOXIE) will produce oxygen only, from the carbon dioxide in the Martian atmosphere.

The system developed in Ramani's lab can produce 25 times more oxygen than MOXIE using the same amount of power. It also produces hydrogen, which could be used to fuel astronauts' trip home.
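
For a sense of the underlying arithmetic, Faraday's law ties the amount of oxygen and hydrogen an electrolyzer produces to the electric charge passed through the cell (four electrons per O2 molecule at the anode, two per H2 at the cathode). The sketch below uses illustrative current and voltage values as assumptions; it is not the team's reported performance data.

```python
# Back-of-the-envelope electrolysis output via Faraday's law.
# The current and cell voltage below are illustrative assumptions only.
FARADAY = 96485.0          # coulombs per mole of electrons
M_O2, M_H2 = 32.0, 2.016   # molar masses in g/mol

def electrolysis_output(current_a, hours=1.0):
    """Grams of O2 and H2 produced by a cell drawing `current_a` amps for `hours`."""
    charge = current_a * hours * 3600.0   # total charge in coulombs
    mol_e = charge / FARADAY              # moles of electrons transferred
    o2_g = (mol_e / 4.0) * M_O2           # 4 e- per O2 (anode reaction)
    h2_g = (mol_e / 2.0) * M_H2           # 2 e- per H2 (cathode reaction)
    return o2_g, h2_g

# Example: 10 A for one hour at an assumed ~2 V cell voltage (~20 W of power)
o2, h2 = electrolysis_output(10.0)
print(f"O2: {o2:.1f} g per hour, H2: {h2:.2f} g per hour")
```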

"Our novel brine electrolyzer incorporates a lead ruthenate pyrochlore anode developed by our team in conjunction with a platinum on carbon cathode" Ramani said. "These carefully designed components coupled with the optimal use of traditional electrochemical engineering principles has yielded this high performance."

The careful design and unique anode allow the system to function without the need for heating or purifying the water source.

"Paradoxically, the dissolved perchlorate in the water, so-called impurities, actually help in an environment like that of Mars," said Shrihari Sankarasubramanian, a research scientist in Ramani's group and joint first author of the paper.

"They prevent the water from freezing," he said, "and also improve the performance of the electrolyzer system by lowering the electrical resistance."

Typically, water electrolyzers use highly purified, deionized water, which adds to the cost of the system. A system that can work with "sub-optimal" or salty water, such as the technology demonstrated by Ramani's team, can significantly enhance the economic value proposition of water electrolyzers everywhere - even right here on planet Earth.

"Having demonstrated these electrolyzers under demanding Martian conditions, we intend to also deploy them under much milder conditions on Earth to utilize brackish or salt water feeds to produce hydrogen and oxygen, for example through seawater electrolysis," said Pralay Gayen, a postdoctoral research associate in Ramani's group and also a joint first author on this study.

Such applications could be useful in the defense realm, creating oxygen on demand in submarines, for example. It could also provide oxygen as we explore uncharted environments closer to home, in the deep sea.

The underlying technologies enabling the brine electrolyzer system are the subject of a patent filing through the Office of Technology Management and are available for licensing from the university.

Credit: 
Washington University in St. Louis