Tech

Researchers identify an action mechanism for a drug against Alzheimer's disease

image: From left to right, the experts Carmen Escolano (IBUB-UB) and Mercè Pallàs (UBNeuro-UB).

Image: 
UNIVERSITY OF BARCELONA

A study in mice, published in the journal Geroscience, has identified the mechanism of action of a promising compound against Alzheimer's disease developed by the Medicinal Chemistry and Pharmacology team at the University of Barcelona. The new drug belongs to a family of molecules that, when bound to imidazoline I2 receptors, reduce neuroinflammation and improve cognition and other markers of the progression of this disease, the most prevalent among dementias. The results show that these beneficial effects occur through modulation of the calcineurin pathway. According to the researchers, this preclinical study opens the door to the development of new therapies against Alzheimer's, a disease that still has no cure, and against other neurodegenerative diseases.

The article results from the collaboration of two research teams from the Faculty of Pharmacy and Food Sciences, led by Mercè Pallàs, member of the Institute of Neurosciences (UBNeuro), and Carmen Escolano, from the Institute of Biomedicine of the University of Barcelona (IBUB). The study is also signed by UB researchers Christian Griñán Ferré, Foteini Vasilopoulou, Sergio Rodríguez Arévalo, Andrea Bagán and Sònia Abás.

Late-onset Alzheimer's disease murine model

The new compound, which shows high affinity and selectivity for imidazoline I2 receptors, was designed and synthesized by the medicinal chemistry group led by Carmen Escolano. These receptors are found in several organs and take part in multiple physiological processes (analgesia, inflammation, nervous system diseases, etc.). They are also linked to neurodegenerative processes and appear to increase in the brains of people with Alzheimer's disease.

Previous studies by this research group had shown the positive effect of this family of compounds on the evolution of Alzheimer's. "Following these results, our goal was to determine the mechanism and parameters that change when the drug is given to animal models, specifically to mice with neurodegeneration linked to aging, which is considered comparable to late-onset Alzheimer's, that is, the form in which symptoms start around the age of 65," notes Carmen Escolano.

In the experiment, the researchers analysed different markers of disease progression and ran short- and long-term behavioural and memory tests to study the effects of the treatment on the mice. The results show a significant improvement in the animals that received the drug compared to the control group. "The new molecule improved cognition and alleviated anxiety in the mice. In addition, we were able to confirm at the molecular level that the treatment reduced the neuroinflammation and oxidative stress typical of Alzheimer's, and it decreased specific markers of the pathology, such as tau protein and beta-amyloid," says Mercè Pallàs.

The study also enabled the researchers to deduce the mechanism of action of the new compound. "Our findings provide evidence that the molecular changes that take place after treatment are related to the calcineurin pathway; calcineurin is a phosphatase enzyme involved in the production of inflammatory mediators such as cytokines and in the reduction of neuronal plasticity," notes Carmen Escolano.

"These results", continues the researcher, "open up new possibilities for this family of imidazole I2 receptor ligands, as the cognitive improvement they produce in animal models of neurodegeneration is determined by the mechanism of action described".

Credit: 
University of Barcelona

Breakthrough in nuclear physics

image: Using collision data from the ALICE detector at the Large Hadron Collider at CERN, the strong interaction between a proton (right) and the rarest of the hyperons, the omega hyperon (left), which contains three strange quarks, was successfully measured with high precision.

Image: 
Daniel Dominguez / CERN

The positively charged protons in atomic nuclei should actually repel each other, and yet even heavy nuclei with many protons and neutrons stick together. The so-called strong interaction is responsible for this. Prof. Laura Fabbietti and her research group at the Technical University of Munich (TUM) have now developed a method to precisely measure the strong interaction utilizing particle collisions in the ALICE experiment at CERN in Geneva.

The strong interaction is one of the four fundamental forces in physics. It is essentially responsible for the existence of atomic nuclei that consist of several protons and neutrons. Protons and neutrons are made up of smaller particles, the so-called quarks. And they too are held together by the strong interaction.

As part of the ALICE (A Large Ion Collider Experiment) project at CERN in Geneva, Prof. Laura Fabbietti and her research group at the Technical University of Munich have now developed a method to determine with high precision the forces that act between protons and hyperons, unstable particles comprising so-called strange quarks.

The measurements are not only groundbreaking in the field of nuclear physics, but also a key to understanding neutron stars, among the most enigmatic and fascinating objects in our universe.

Comparison between theory and experiment

One of the biggest challenges in nuclear physics today is understanding the strong interaction between particles with different quark content from first principles, that is, starting from the strong interaction between the particles' constituents, the quarks and the gluons that convey the interaction force.

The theory of the strong interaction can be used to calculate the strength of the interaction. However, these calculations provide reliable predictions not for normal nucleons made of up and down quarks, but rather for baryons containing heavier quarks, such as hyperons, which contain one or more strange quarks.

Experiments to determine the strong interaction are extremely difficult because hyperons are unstable particles that decay rapidly after production. This difficulty has so far prevented a meaningful comparison between theory and experiment. The research method deployed by Prof. Laura Fabbietti now opens a door to high-precision studies of the dynamics of the strong force at the Large Hadron Collider (LHC).

Measurement of the strong force even for the rarest hyperon

Four years ago, Prof. Fabbietti, professor for Dense and Strange Hadronic Matter at TUM, proposed to employ a technique called femtoscopy to study the strong interaction at the ALICE experiment. The technique allows investigating spatial scales close to 1 femtometer (10^-15 meter) - about the size of a proton - and the spatial range of the strong-force action.
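
For readers who want the quantitative idea: femtoscopy extracts the interaction from a two-particle momentum correlation function. The relation below is the standard Koonin-Pratt form from the femtoscopy literature, given here for orientation rather than quoted from the publication:

```latex
% Koonin-Pratt relation (standard femtoscopy form, not quoted from the paper):
%   k^*  : relative momentum of the particle pair
%   S(r) : source function, the distribution of pair emission distances
%   \psi : two-particle relative wave function, which encodes the strong interaction
C(k^{*}) \;=\; \int \mathrm{d}^{3}r \; S(r)\,\bigl|\psi(k^{*},r)\bigr|^{2},
\qquad C(k^{*}) \;\longrightarrow\; 1 \quad \text{for } k^{*} \to \infty .
```

Deviations of C(k*) from unity at small relative momentum are sensitive to the short-range force, so measuring the correlation of proton-Omega pairs, with the source size constrained independently, gives direct access to their interaction.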

Meanwhile, Prof. Fabbietti's group at TUM has managed not only to analyse the experimental data for most of the hyperon-nucleon combinations, but also to measure the strong interaction for the rarest of all hyperons, the Omega, which consists of three strange quarks. Furthermore, the group developed its own framework for producing theoretical predictions.

"My TUM group has opened a new avenue for nuclear physics at the LHC, one which involves all types of quarks, reaching an unexpected precision in a place nobody has looked so far," says Prof. Fabbietti. The work published now in "nature" presents only some of the many interactions measured for the first time.

Do neutron stars contain hyperons?

Understanding the interaction between hyperons and nucleons is also extremely important for testing the hypothesis of whether neutron stars contain hyperons. The forces that exist between the particles have a direct influence on the size of a neutron star.

So far, the relationship between the mass and the radius of a neutron star is unknown. In the future, Prof. Fabbietti's work will therefore also help to solve the riddle of neutron stars.

Credit: 
Technical University of Munich (TUM)

Time to lower body temperature is critical in out-of-hospital cardiac arrest

image: Therapeutic Hypothermia and Temperature Management, a peer-reviewed journal providing clinical advances, best practices, and protocols on this critical, life-saving technology, including its application in cardiac arrest, spinal cord and traumatic brain injury, stroke, and burns.

Image: 
Mary Ann Liebert Inc., publishers

New Rochelle, NY, December 8, 2020--Time to reach the target body temperature was a significant factor in achieving favorable neurological outcomes in patients with witnessed out-of-hospital cardiac arrest. Significantly more favorable neurological outcomes occurred when the target temperature was reached sooner, according to a new study published in the peer-reviewed journal Therapeutic Hypothermia and Temperature Management.

Furthermore, the effectiveness of extracorporeal cardiopulmonary resuscitation (ECPR) compared to conventional cardiopulmonary resuscitation (CCPR) increased as the interval from witnessed out-of-hospital cardiac arrest to target temperature decreased. ECPR with extracorporeal membrane oxygenation (ECMO) is a more promising treatment for out-of-hospital cardiac arrest than CCPR.

Comparing the results of this study to previous analyses, Tadashi Kaneko, Mie University Hospital, and coauthors conclude that "target temperature management may improve the neurological outcomes of witnessed out-of-hospital cardiac arrest." They report higher favorable neurological outcomes than in previous studies of patients with either ECPR or CCPR and no body temperature lowering.

"This article is a significant contribution to the field of therapeutic hypothermia therapy in out-of-hospital cardiac arrest patients emphasizing again the importance of time to treatment and the benefits of ECPR in combination with ECMO," says W. Dalton Dietrich, III, PhD, Editor-in-Chief of Therapeutic Hypothermia and Temperature Management, Scientific Director of The Miami Project to Cure Paralysis, and Kinetic Concepts Distinguished Chair in Neurosurgery, University of Miami Leonard M. Miller School of Medicine.

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Beating the heat: Oxidation in novel coating material for aircraft gas turbine engines

image: The oxidation processes in ytterbium silicide greatly depend on the amount of air in the environment, as evidenced by scanning electron microscopy images and X-ray diffraction peaks

Image: 
Ryo Inoue from Tokyo University of Science

Certain sections of aero gas-turbine engines, which are widely used in aircraft, regularly reach temperatures above 1,200 °C. Needless to say, any materials used in such harsh environments must be durable and up to the task. Ceramic matrix composites made of silicon carbide (SiC) have recently garnered interest as promising candidates for gas-turbine engines. However, these materials require a heat-resistant coating layer to prevent the oxidation of SiC and the subsequent evaporation of SiO2, a process that reduces the material volume and thereby causes structural defects such as large cracks or flaking of the topmost layer.
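
For context, the degradation sequence described above is commonly written as two steps in the coatings literature: oxidation of the SiC matrix, followed by volatilization of the silica scale in water-vapor-rich combustion gas. These are textbook reactions, not equations quoted from the study:

```latex
\mathrm{SiC} + \tfrac{3}{2}\,\mathrm{O_{2}} \;\longrightarrow\; \mathrm{SiO_{2}} + \mathrm{CO}
\qquad\qquad
\mathrm{SiO_{2}} + 2\,\mathrm{H_{2}O(g)} \;\longrightarrow\; \mathrm{Si(OH)_{4}(g)}
```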

Unfortunately, existing coating layers cannot fully prevent this oxidation to SiO2 because oxygen can permeate through microscopic cracks in these layers or by simple diffusion.

To address this issue, some scientists have focused on using ytterbium silicide (Yb-Si) as a coating material because Yb-Si alloys have high melting points and their oxides are mainly Yb-silicates, which remain attached as an oxide layer and do not evaporate easily. However, not much is known about the fundamental phenomena that take place in these materials at high temperatures in either air or water vapor environments.

In a recent study published in Intermetallics, a team of scientists, including Junior Associate Professor Ryo Inoue, Assistant Professor Yutaro Arai and Professor Yasuo Kogo from Tokyo University of Science, and Senior Researcher Takuya Aoki from the Japan Aerospace Exploration Agency (JAXA), set out to understand the oxidation mechanisms in Yb-Si. They conducted a variety of experiments to gain insight into the oxidation behavior (and degradation) of different Yb-Si coatings at high temperatures under three types of atmospheres: air, water vapor, and a mixture of both.

Through X-ray diffraction analysis, energy dispersive spectroscopy, and scanning electron microscopy, the scientists were able to accurately visualize and quantify the morphology and composition of the Yb-Si samples before and after the heat exposure tests. One of the main findings was that the Yb to Si ratio was a major player in defining the oxidation behavior of the material; Yb5Si3 oxidized more than Yb3Si5 because of the preferential oxidation of Yb in silicide. Moreover, the amount of oxide decreased considerably in more water vapor-rich atmospheres.

Most importantly, the researchers explored the mechanisms by which ytterbium content can affect the formation of SiO2. "After heat exposure of both silicides in steam, we found SiO2 in Yb5Si3, whereas Si was actually still present in Yb3Si5," remarks Dr Inoue, who led the study. "Our analyses indicate that SiO2 growth is suppressed in Yb3Si5 because SiO2 partakes in, and is the limiting factor of, reactions that form Yb-silicates," he adds. Though the exact intermediate reactions that lead to the formation of the various Yb-silicates are not yet completely understood, the team proposed two plausible reaction pathways, which will likely be clarified through future studies with even more detailed characterization techniques.
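
Dr Inoue's point that SiO2 is consumed by silicate-forming reactions can be pictured with the commonly cited stepwise scheme for the Yb2O3-SiO2 system below; whether this matches the two pathways the team proposed is not specified in this summary:

```latex
\mathrm{Yb_{2}O_{3}} + \mathrm{SiO_{2}} \;\longrightarrow\; \mathrm{Yb_{2}SiO_{5}}
\qquad\qquad
\mathrm{Yb_{2}SiO_{5}} + \mathrm{SiO_{2}} \;\longrightarrow\; \mathrm{Yb_{2}Si_{2}O_{7}}
```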

Overall, this study provides meaningful insight into what happens during the oxidation of Yb-Si, which will help in the development of protective coatings for aero gas-turbine engines. "If a coating that can withstand harsher environments can be realized, engine parts will become more heat resistant, which naturally leads to higher engine efficiency," remarks Dr Inoue.

Hopefully, further advances in coating technology will reduce aerial transportation costs and fuel consumption, making flying cheaper and less harmful to the environment.

Credit: 
Tokyo University of Science

Ancient alliance

"Happy families are all alike; each unhappy family is unhappy in its own way." So goes the first line of Leo Tolstoy's "Anna Karenina." Little did the Russian novelist know his famous opening line would one day be used to describe microbial communities, their health and their relationships to their hosts.

"It's this idea that an unhealthy or stressed host to a microbiome has a more diverse microbiome than its healthy counterpart," said UC Santa Barbara ecologist An Bui, a graduate student researcher in the lab of theoretical ecologist Holly Moeller. The diversity, she said, is a response to variable conditions that may in turn indicate an unstable or stressed environment. "Healthy hosts are probably going to have very similar microbiomes," she said, "while unhealthy hosts are different in their own ways."

Bui and colleagues recently put the Anna Karenina hypothesis to the test in California's Tehachapi mountains as they sought to understand how climate change might affect fungal communities in woodland soil in a future California.

"Fungi are really important for woodland systems," said Bui, the lead author of a study that appears in the journal FEMS Microbiology Ecology. "But we don't necessarily know how they will change with climate change."

As the global average temperature rises, forests and woodlands around the world are under increasing threat, she explained.

"It's not just about temperature and rainfall, but also the organisms the trees and plants associate with," she said. Soil fungi have a variety of relationships with woodland plants. Saprotrophic fungi, for instance, decompose dead organic matter, while pathotrophs eat live organic matter.

And then there are the symbiotrophs, which engage in mutually beneficial relationships with their plant hosts via their roots. Attaching to roots and extending threadlike hyphae in every direction underground -- the so-called "Wood Wide Web" -- mycorrhizae give the woodland tree and plant community access to nutrients from faraway places.

"They get all of their energy in an exchange for carbon from trees and other plants," Bui said. "And then they give their hosts nitrogen and phosphorus from the soil." These fungi provide almost half of a tree's organic nitrogen budget, according to the study, and contribute the bulk of new carbon into the soil.

To get a sense of how warming could affect California's woodland soil fungal community, the team sampled soils at sites along an arid (dry) to mesic (moderately moist) climatic gradient at the Tejon Ranch in the Tehachapi mountains.

"The sites we worked at were a proxy for what we think California would look like with future climate change," Bui said. As one ascends from the warmer, drier base of the mountains into the cooler, moister elevations, the landscape changes with the temperature and relative humidity, giving the researchers a glimpse of what California woodlands might look like as climate change forces them to retract.

Of particular interest to the team were the soils around the oak trees that dot the landscape, where, in addition to the decomposers and pathogenic fungi in the soil, tree-mutualist mycorrhizae create their vast networks. The researchers were interested in how the number of species and their abundance might change between sites.

"As it turns out, the fungal communities are completely different," Bui said. "And the hottest, driest sites have the highest number and the greatest diversity in fungal species." True to the Anna Karenina hypothesis, the trees under the more arid, stressful conditions had the most diverse and dispersed fungal communities.

But, while the larger fungal communities varied from site to site, Bui said, the communities of mutualists within them tended to remain the same, save for small shifts within the mutualist populations to select for traits that could be more useful under the circumstances.

"When we looked at ectomycorrhizae and arbuscular mycorrhizae, those communities were more similar across climactic conditions than the whole fungal community," she said. "So there's a possibility that host association for mutualists at least buffers that shift in community structure the whole fungal community experiences."

If so, the benefit could be reciprocal, according to the researchers. Buffering the fungi from climate change preserves their function, which could, in turn, conserve their host trees' function in the face of a changing California woodland ecosystem.

More work would need to be done to understand how far this buffering effect would extend, but the results are a positive bit of news for the future of California woodlands. Further studies could broaden the scope to include how these relationships and other adaptations might affect tree health, according to Bui.

"I think this gives us a little bit of hope that the players in this ecosystem that are crucial for the survival of the habitat for many species -- like the oaks -- might be able to keep doing what they're doing," she said. "Even though we do need to do a lot of work in terms of conservation and mitigation, there's a possibility for them to persist. And I think that's hopeful and exciting."

Credit: 
University of California - Santa Barbara

Glyphosate can create biomarkers predicting disease in future generations

PULLMAN, Wash. - Exposure to the widely used weed-killer glyphosate induces epigenetic changes in rats that can be linked to increased disease in their grandchildren and great-grandchildren, a new study has found.

The study provides evidence that glyphosate-induced changes to sperm from exposed rats could be used as biomarkers for determining propensity in subsequent generations for prostate and kidney diseases, as well as obesity and having multiple diseases at once. In fact, by the time third- and fourth-generation rats whose predecessors had been exposed to the chemical were middle-aged, 90% had one or more of these health problems, a dramatically higher rate than in the control group.

While limited in scope, the study, which tested generational groups of around 50 rats each, provides a proof of concept that could lead to a new medical diagnostic tool, said Michael Skinner, the corresponding author on the study published in the journal Epigenetics on Dec. 9.

"While we can't fix what's wrong in the individual who is exposed, we can potentially use this to diagnose if someone has a higher chance of getting kidney or prostate disease later in life, and then prescribe a therapeutic or lifestyle change to help mitigate or prevent the disease," said Skinner, a professor of biological sciences at Washington State University.

This study follows a 2019 paper in Scientific Reports in which Skinner's lab demonstrated the ability of glyphosate to promote the transgenerational inheritance of disease in rats.

Glyphosate is widely used in agriculture and common in the human food supply. Previous research has indicated that the chemical has limited toxicology for those that ingest it since it has a short half-life and breaks down in the body quickly. However, Skinner's research and other animal studies have provided evidence that health effects from glyphosate and other chemicals can be inherited by subsequent generations.

In the current study, the research team took those findings further by identifying epigenetic changes in the rats' sperm caused by the chemical. Sperm DNA is packaged with a group of proteins called histones, which are connected like beads on a string. Skinner and his colleagues found that glyphosate exposure causes the sperm to accumulate hundreds of new histone retention sites, and they correlated those sites with specific diseases in subsequent generations.

"We need to change how we think about toxicology," Skinner said. "Today worldwide, we only assess direct exposure toxicology; we don't consider subsequent generational toxicity. We do have some responsibility to our future generations."

The study focuses on sperm, but the researchers anticipate the same ability to find markers in the female germline, or eggs. Additional research is needed, first to replicate this study in larger groups of animals, which would help pinpoint the disease susceptibility rates more precisely. Ultimately, the goal would be to produce diagnostic tests for humans, but replicating the studies in humans will be challenging simply because glyphosate is so ubiquitous in our diet, Skinner said.

"Right now, it's very difficult to find a population that is not exposed to glyphosate to have a control group for comparison," he said.

Credit: 
Washington State University

Research shows disparities in how communities respond to cardiac arrest

Black neighborhoods had a significantly lower rate of bystander automated external defibrillator (AED) use relative to non-Hispanic/Latino white communities, according to researchers at The University of Texas Health Science Center at Houston (UTHealth).

Hispanic/Latino neighborhoods also had lower rates of AED use, according to the study, which was published in a recent edition of Circulation, a journal of the American Heart Association (AHA).

First author Ryan Huebinger, MD, assistant professor of emergency medicine at McGovern Medical School at UTHealth, and his team conducted the research by analyzing data from more than 18,000 out-of-hospital cardiac arrests. The research earned Huebinger an AHA Young Investigator Award.

The researchers wrote that the findings identified an important opportunity to improve training and access to information about AEDs to better serve minority and underrepresented neighborhoods in Texas. "Through these efforts we can hopefully close the gap and save lives," Huebinger said.

Credit: 
University of Texas Health Science Center at Houston

Religious discrimination particularly high for Jews and Muslims, study shows

HOUSTON - (Dec. 9, 2020) - Although people of all faiths report growing religious discrimination during the past few years, the phenomenon is most common among Jews and Muslims, according to a new study from researchers at Rice University and West Virginia University (WVU). In addition, Jews and Muslims are much more likely to become victims of violence because of their religious beliefs.

"Individuals' Experiences with Religious Hostility, Discrimination, and Violence: Findings from a New National Survey" was recently published in Socius, a journal of the American Sociological Association. Researchers Elaine Howard Ecklund, director of Rice's Religion and Public Life Program and the Herbert S. Autrey Chair in Social Sciences, and Christopher P. Scheitle, an associate professor of sociology at WVU, included samples of religious groups that are in the minority in the United States (Muslims, Jews, Buddhists, Hindus and atheists) as well as Christians in their study.

About a quarter of those surveyed reported encountering hostility or disrespect toward their religion when interacting with other people, but over one-third of Jews and almost two-thirds of Muslims reported having such experiences.

Jews and Muslims were also more likely to report religious harassment, threats and violence. While 8.7% of all the people surveyed reported being threatened with physical violence due to their religion, threats were reported by 16.7% of Jews and 20.3% of Muslims.

Muslims and Jews were also the most likely to report organizational discrimination. Only 1.7% of those surveyed overall reported being denied services in a place of business because of their religion, but 5.9% of Jews and 5.8% of Muslims did.

All of these discriminatory experiences occurred regardless of an individual's race or ethnicity, national origin and other characteristics, the researchers wrote.

There were some noteworthy differences between the experiences of Jews and Muslims. "Muslim adults were much more likely to report being harassed by the police (21%), while only 2.2% of Jewish adults say they have experienced such harassment," Scheitle said. "This highlights that Muslim adults face some unique forms of discrimination."

"I think that, at some level, leaders do not think of religious discrimination as a problem," Ecklund said. "Unfortunately, our results show that religious discrimination is alive and well."

Credit: 
Rice University

Harvesting the sun's energy for clean drinking water: Where we are, where we need to be

image: A solution to the growing clean drinking water availability problem is direct solar steam generation technology, which can remove harmful soluble pollutants from water

Image: 
Lei Miao from SIT

Without drinkable water there is no life. Yet nearly 1.1 billion people worldwide lack access to fresh water, and another 2.4 billion suffer from diseases caused by unclean drinking water. While science has yielded advanced water treatment methods such as membrane distillation and reverse osmosis, these are often difficult to implement in developing countries owing to their high cost and low productivity.

A more nascent technology shows promise as an alternative for such regions of the world: direct solar steam generation (DSSG). DSSG involves harvesting the heat from the sun to convert water into vapor, thereby desalinating it or ridding it of other soluble impurities. The vapor is then cooled and collected as clean water for use.

This is a simple technology, but a key step, evaporation, presents roadblocks to its commercialization. With existing technology, evaporation performance has hit the theoretical limit, which is not sufficient for practical implementation. Measures such as improving device design to minimize solar heat loss before it reaches the bulk water, recycling the latent heat in the vapor, and absorbing and utilizing energy from the surroundings have therefore been taken to push evaporation performance beyond the theoretical limit and make this technology viable.
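
For a sense of scale, the theoretical limit referred to here is the evaporation rate a perfect absorber could sustain if every joule of incident sunlight went into latent heat. With standard one-sun values (an illustrative estimate, not a figure from the review):

```latex
% One-sun bound on the evaporation rate (illustrative textbook values):
%   q_solar : incident solar flux, about 1000 W/m^2 ("one sun")
%   h_lv    : latent heat of vaporization of water, about 2.26 x 10^6 J/kg
\dot{m}_{\max} \;=\; \frac{q_{\mathrm{solar}}}{h_{lv}}
\;\approx\; \frac{1000\ \mathrm{W\,m^{-2}}}{2.26\times10^{6}\ \mathrm{J\,kg^{-1}}}
\;\approx\; 1.6\ \mathrm{kg\,m^{-2}\,h^{-1}}
```

Evaporation rates above this bound, like those cited below, are only possible by drawing extra energy from the surroundings, recycling latent heat, or lowering the effective energy required for evaporation.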

In a new paper published in Solar Energy Materials and Solar Cells, Professor Lei Miao from Shibaura Institute of Technology, Japan, along with colleagues Xiaojiang Mu, Yufei Gu, and Jianhua Zhou from Guilin University of Electronic Technology, China, review strategies formulated in the last two years to surpass this theoretical limit. "Our aim is to summarize the story of the development of new evaporation strategies, point out current deficiencies and challenges, and lay out future research directions to hasten the practical application of the DSSG purification technology", says Prof. Miao.

A pioneering strategy with which this evolutionary saga begins is the volumetric system, which, in lieu of bulk heating, uses a suspension of noble metal or carbon nanoparticles to absorb the sun's energy, transfer heat to the water surrounding these particles, and generate steam. While this increases the absorbed energy of the system, there is much heat loss.

To address this issue, the "direct contact type" system was developed, in which a double-layer structure with pores of different sizes covers the bulk water. The top layer with larger pores serves as a heat absorber and vapor escape route and the bottom layer with smaller pores is used to transport water up from the bulk to the top layer. In this system, the contact between the heated top layer and the water is concentrated, and heat loss is reduced to about 15%.

The "2D water path" or "indirect contact type" system came next, which further lowered heat loss by avoiding contact between the solar energy absorber and bulk water. This paved the way for the eventual development of the "1D water path" system, which is inspired by the natural capillary-action-based water transport process in plants. This system displays an impressive evaporation rate of 4.11 kg m-2h-1, nearly thrice the theoretical limit, along with a heat loss of only 7%.

This was followed by the injection-control technique in which the controlled sprinkling of water as rain on the solar energy absorber allows its absorption in a manner mimicking that in soil. This results in an evaporation rate of 2.4 kg m-2h-1 with a conversion efficiency of 99% from solar energy to water vapor.

In parallel, strategies to gain additional energy from the environment or from the bulk water itself, and to recover the latent heat from high-temperature steam, have been under development to improve the evaporation rate. Techniques to reduce the energy required for evaporation in the first place are also being developed, such as hydratable and light-absorbing aerogels, polyurethane sponges with carbon black nanoparticles, and carbon dot (CD)-coated wood to hold the sun's energy and the water to be evaporated.

Several other such design strategies exist, and several more are to come. Many pertinent issues, such as the collection of the condensed water, the durability of the materials, and stability during outdoor applications under fluctuating wind and weather conditions, remain to be addressed.

Yet the pace at which work on this technology is progressing makes it one to look forward to. "The path to the practical implementation of DSSG is riddled with problems," says Prof. Miao. "But given its advantages, there is a chance that it will be one of the frontrunning solutions to our growing drinking water scarcity problem."

Credit: 
Shibaura Institute of Technology

How soil fungi respond to wildfire

image: Oak and evergreen trees in Hood Mountain Regional Park and Preserve in Santa Rosa in August 2019.

Image: 
Gabriella Selva

In the wake of the 2017 North Bay fires, the golden hills of Santa Rosa, California, were unrecognizable. Smoky, seared and buried under ash, the landscape appeared desolate, save for some ghostly, blackened - but still alive - oak trees. For Stanford University graduate student Gabriel Smith, whose family lives in Santa Rosa, the devastation was heartbreaking, but it also offered a unique scientific opportunity: a natural experiment on the effects of wildfires on the microbes that live in soil, which Smith studies in the form of fungi.

So, Smith and his mother spent his winter break collecting soil samples from burned areas near trees in Santa Rosa's Trione-Annadel State Park and Hood Mountain Regional Park and Preserve. For comparison, they also gathered samples from unburned locations.

"I wanted to know how these ecosystems that, on the outside, looked so burned and so destroyed might have been affected at a level that is not so obvious - the soil fungi that I study," said Smith, who is a member of the lab of Kabir Peay, an associate professor of biology in the School of Humanities and Sciences. Most people know soil fungi by their fruit - mushrooms - but there's much more to these organisms, both physically and functionally. Working alongside plant roots and other microbes that live in the soil, soil fungi play important roles in their ecosystems, including helping trees grow and aiding in decomposition.

The research, which was published Dec. 9 in Molecular Ecology, focused on two ecosystems in these parks, oak woodland and mixed evergreen forest. As the researchers expected, analysis of dozens of soil samples established that, among the areas that had not burned, the ecosystems contained a different mix of soil fungi. The analysis also showed that, when comparing burned and unburned areas, the oak woodland soil fungal community was less altered by the fires than those in the evergreen forests. This aligns with the fact that oak woodlands depend on regular fire to thrive, whereas evergreen forests are less dependent on fire to survive. The researchers have continued this work by planting seedlings in some of the soil samples - those results will be detailed in a future paper. They are also hoping to find out more about the physiological mechanisms that could explain the responses of the fungi.

"There has been renewed interest in how climate change is influencing the frequency of fires and how that's going to affect fire-mediated ecological processes in California going forward," said Peay, who is senior author of the research. "So it's important to have specific details about how changes in the fire regimes in California, and the West Coast in general, are going to be influencing ecosystems."

Looking deeper

Oak woodlands benefit from fire to the extent that many parks, including Trione-Annadel, are treated with prescribed burns to keep their oaks healthy. Fire clears leaf litter and dead branches, creates improved conditions for some seeds, and controls insects and pathogens that might otherwise cause disease. Most importantly, fire can prevent other trees - such as those found in evergreen forests - from invading the oak forests. While mature evergreens can survive, and even benefit from, fires, encroaching seedlings may not.

To understand how the 2017 fires altered soil fungal communities in these two ecosystems, Smith and his mother dug up the top 10 centimeters of soil from 12 sites in Trione-Annadel and six at Hood Mountain, with guidance from the California Park Service. While Smith was home for break, the samples had to be kept at a regulated temperature.

"We ended up filling not only my parents' fridge but also my grandmother's fridge and my aunt's fridge. We also rigged a top-loading chest freezer to keep the right temperature," said Smith, who is lead author of the research. "There was a great deal of family support that went into this research."

Back at the Stanford lab, Smith and Lucy Edy, a co-term student in earth systems who worked on this project as part of the Stanford Biology Summer Undergraduate Research Program, determined what fungi resided in each sample through DNA analysis. What they found suggests that how fungal communities respond to fire belowground mirrors how other parts of their ecosystems respond to fire above ground.

"There was a much greater difference between the burned and unburned points in evergreen forests than there was in the oak woodland communities," said Smith. "We predicted there would be a difference between the two ecosystems, but the extent of that difference was actually more than we expected."

It will take additional research to understand why this is the case, but the researchers hypothesize that part of the reason may be that the soil fungal community "resets" when it burns. This would mean that the soil fungi associated with the oaks, which burn more often, have less time between fires to change from their reset form, while the evergreen soil fungi have longer, leading to the greater differences seen in the soil of burned and unburned evergreen forests.

The future forest

For much of the history of studying fungi, researchers had to depend on what they could see above ground, including mushrooms. But increased access to DNA sequencing has opened up the field, helping scientists detail the complex relationships between various soil microbes, plants and ecosystem functions. Still, many questions remain concerning the effects of microbial diversity in the soil - for example, the consequences of losing half the population of one microbe versus two-thirds or all of it, and the net impact of losing microbes that could cause disease in certain plants in addition to losing microbes that benefit those plants.

"As fire regimes increase in intensity and frequency with climate change, we must understand the ecological responses of these ecosystems in order to determine our necessary responses in relation to them," said Edy, who is a co-author of the paper. "Fungal ecology is perhaps outside the realm of first consideration when people think about the impact of wildfire, but these below-ground microbial interactions fuel and sustain entire ecosystems."

This project, born from terrible circumstances, will likely produce many more studies, like the seedling experiments, and further investigations into how the fungal communities in the oak woodlands withstand fire.

"This was not originally part of Gabriel's PhD project. He had the foresight to recognize that this is not just something that was interesting on a personal level, but also that there's nice intellectual potential here," said Peay. "Works like this can advance our understanding of how the changes we see in the soil might then play a role in changing what future ecosystem types look like."

Credit: 
Stanford University

A balancing act: Improved water treatment technique using 'energy matching'

image: Scarcity of freshwater in many parts of the world calls for improved and sustainable methods of wastewater treatment

Image: 
PS Photography on Pexels

Today, a large number of people worldwide suffer from a shortage of fresh drinking water, especially in remote rural regions, posing a significant threat to human life and society. While techniques such as membrane distillation and reverse osmosis have been used to treat saline water and alleviate the situation, they suffer from limitations like low productivity, high cost, and high energy consumption.

In recent years, "direct solar steam generation" (DSSG) has emerged as a viable technique for water purification that utilizes "photothermal" materials capable of absorbing high amounts of solar energy. These materials are then made to float in water, which helps to maintain localized heating and generate water vapor that is subsequently condensed to obtain clean water. Current DSSG methods have reached the limits of solar thermal efficiency and evaporation rate; however, given the demand for high-flux clean water in large-scale commercialization, further enhancement in evaporation rate is necessary. Previous studies have tried to do this by exploring different absorbers to manipulate the "input energy" (IE) and "required energy" (RE) needed for evaporation, but the relationship between IE and RE has not been studied yet.

To this end, Prof Lei Miao from Shibaura Institute of Technology, Japan, along with co-authors Xiaojiang Mu and Jianhua Zhou from Guilin University of Electronic Technology, China, aimed to find a balance between IE and RE to optimize evaporation performance in DSSG. According to them, the trick to this was to reduce the RE to match the IE, a unique concept called "energy matching." For this, they came up with an innovative evaporation system based on bilayer structures of carbon nanotube aerogel-coated wood (CACW). The design provided three layers of thermal insulation, which (1) minimized heat loss and prevented a sudden temperature drop in the absorber and (2) regulated water transport to the evaporation surface. Prof Miao explains, "Water speed regulation is key to the 'energy matching' strategy employed in our design. By controlling the speed of water transport, we ensure that the RE for evaporation is balanced with the IE to the absorber." The findings of their study are published in Solar RRL.
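
A back-of-envelope way to see the "energy matching" idea: pick the water supply rate so that the energy required for evaporation (RE) equals the energy actually delivered to the absorber (IE). All values in the sketch are illustrative assumptions, not numbers from the paper:

```python
# Back-of-envelope sketch of "energy matching" (illustrative values only).
Q_SOLAR = 1000.0     # incident solar flux, W/m^2 (one sun)
ABSORPTANCE = 0.95   # assumed fraction of sunlight absorbed
HEAT_LOSS = 0.10     # assumed fraction lost by conduction/convection/radiation
H_EVAP = 2.26e6      # evaporation enthalpy of bulk water, J/kg

ie = Q_SOLAR * ABSORPTANCE * (1 - HEAT_LOSS)  # input energy flux IE, W/m^2
m_dot = ie / H_EVAP                           # matched water supply, kg/(m^2 s)

print(f"IE = {ie:.0f} W/m^2")
print(f"matched supply rate = {m_dot * 3600:.2f} kg m^-2 h^-1")
# Feeding water faster than this wastes IE on heating excess water; feeding it
# slower leaves absorbed energy unused. The reported 2.22 kg m^-2 h^-1 exceeds
# this bulk-water estimate, consistent with a lowered effective evaporation
# enthalpy in the hydratable structure (our reading, not an explicit claim).
```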

To test the water transport speed in the CACW system, the scientists evaluated the evaporation rates for different concentrations of carbon nanotubes and for wood sheets of different thicknesses. In addition, they used the system to treat liquid samples emulating sewage and estimated their quality post treatment in terms of ion concentration, oil content, and bacterial levels. Finally, they estimated the IE and evaporation rates under varying water transport speeds.

The analysis revealed that the best evaporation performance and highest solar-to-vapor energy conversion efficiency achieved with this system were 2.22 kg m-2h-1 and 93.2%, respectively, which are higher than those of other carbon-based materials. Moreover, the evaporator showed sufficient self-cleaning ability along with excellent stability after ten cycles. The treated water exhibited significantly reduced metal ion concentrations, bacterial levels, and oil content compared to the input samples, suggesting that it was suitable for drinking.

With such encouraging results, Prof Miao considers it a triumph for the "energy-matching" strategy and believes it has broken new ground. She concludes, "Our strategy yielded a 40% improvement in the evaporation rate along with a high solar-to-vapor conversion efficiency of 93%. We now look forward to the practical implementation of DSSG in desalination of seawater and sewage treatment. In the future, we hope to come up with new ideas to develop this technology further until we have eradicated water scarcity."

Credit: 
Shibaura Institute of Technology

Discovery suggests new promise for nonsilicon computer transistors

For decades, one material has so dominated the production of computer chips and transistors that the tech capital of the world -- Silicon Valley -- bears its name. But silicon's reign may not last forever.

MIT researchers have found that an alloy called InGaAs (indium gallium arsenide) could hold the potential for smaller and more energy efficient transistors. Previously, researchers thought that the performance of InGaAs transistors deteriorated at small scales. But the new study shows this apparent deterioration is not an intrinsic property of the material itself.

The finding could one day help push computing power and efficiency beyond what's possible with silicon. "We're really excited," said Xiaowei Cai, the study's lead author. "We hope this result will encourage the community to continue exploring the use of InGaAs as a channel material for transistors."

Cai, now with Analog Devices, completed the research as a PhD student in the MIT Microsystems Technology Laboratories and Department of Electrical Engineering and Computer Science (EECS), with Donner Professor Jesús del Alamo. Her co-authors include Jesús Grajal of Polytechnic University of Madrid, as well as MIT's Alon Vardi and del Alamo. The paper will be presented this month at the virtual IEEE International Electron Devices Meeting.

Transistors are the building blocks of a computer. Their role as switches, either halting electric current or letting it flow, gives rise to a staggering array of computations -- from simulating the global climate to playing cat videos on YouTube. A single laptop could contain billions of transistors. For computing power to improve in the future, as it has for decades, electrical engineers will have to develop smaller, more tightly packed transistors. To date, silicon has been the semiconducting material of choice for transistors. But InGaAs has shown hints of becoming a potential competitor.

Electrons can zip through InGaAs with ease, even at low voltage. The material is "known to have great [electron] transport properties," says Cai. InGaAs transistors can process signals quickly, potentially resulting in speedier calculations. Plus, InGaAs transistors can operate at relatively low voltage, meaning they could enhance a computer's energy efficiency. So InGaAs might seem like a promising material for computer transistors. But there's a catch.

InGaAs' favorable electron transport properties seem to deteriorate at small scales -- the scales needed to build faster and denser computer processors. The problem has led some researchers to conclude that nanoscale InGaAs transistors simply aren't suited for the task. But, says Cai, "we have found that that's a misconception."

The team discovered that InGaAs' small-scale performance issues are due in part to oxide trapping. This phenomenon causes electrons to get stuck while trying to flow through a transistor. "A transistor is supposed to work as a switch. You want to be able to turn a voltage on and have a lot of current," says Cai. "But if you have electrons trapped, what happens is you turn a voltage on, but you only have a very limited amount of current in the channel. So the switching capability is a lot lower when you have that oxide trapping."

Cai's team pinpointed oxide trapping as the culprit by studying the transistor's frequency dependence -- the rate at which electric pulses are sent through the transistor. At low frequencies, the performance of nanoscale InGaAs transistors appeared degraded. But at frequencies of 1 gigahertz or greater, they worked just fine -- oxide trapping was no longer a hindrance. "When we operate these devices at really high frequency, we noticed that the performance is really good," she says. "They're competitive with silicon technology."
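
A toy model, not the authors' analysis, helps make the frequency argument concrete: if traps have a characteristic response time and their occupancy follows the gate signal like a first-order low-pass filter, the measured on-current looks degraded at low frequency but recovers in the gigahertz range. The response time and current-loss fraction below are assumed values:

```python
# Toy model of oxide trapping vs. measurement frequency (assumed parameters).
import math

TAU = 2e-9              # assumed trap response time, seconds
MAX_CURRENT_LOSS = 0.4  # assumed on-current fraction lost when traps fully respond

def on_current_fraction(freq_hz):
    """Fraction of ideal on-current at a given pulse frequency."""
    # First-order low-pass: traps follow slow signals, miss fast ones.
    trap_follow = 1.0 / math.sqrt(1.0 + (2 * math.pi * freq_hz * TAU) ** 2)
    return 1.0 - MAX_CURRENT_LOSS * trap_follow

for f in (1e3, 1e6, 1e9, 1e10):
    print(f"{f:.0e} Hz: {on_current_fraction(f):.2f} of ideal on-current")
# Low-frequency tests understate the transistor; at >= 1 GHz the traps can no
# longer capture electrons within a cycle and the intrinsic behavior shows.
```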

Cai hopes her team's discovery will give researchers new reason to pursue InGaAs-based computer transistors. The work shows that "the problem to solve is not really the InGaAs transistor itself. It's this oxide trapping issue," she says. "We believe this is a problem that can be solved or engineered out of." She adds that InGaAs has shown promise in both classical and quantum computing applications.

"This [research] area remains very, very exciting," says del Alamo. "We thrive on pushing transistors to the extreme of performance." One day, that extreme performance could come courtesy of InGaAs.

This research was supported in part by the Defense Threat Reduction Agency and the National Science Foundation.

Credit: 
Massachusetts Institute of Technology

Focus on human factor in designing systems

video: A new study has found one of the challenges in designing systems that involve people interacting with technology is to tackle the human trait of overconfidence.

Image: 
QUT

A new study has found one of the challenges in designing systems that involve people interacting with technology is to tackle the human trait of overconfidence.

The study, published in the journal IEEE Control Systems, takes a novel multidisciplinary approach to studying "cyberphysical human systems". The research considers the relationship between people and computer systems from the perspectives of both control system engineering and behavioural economics.

The research by QUT's Cyberphysical Systems Professor Daniel Quevedo, and Marius Protte and Professor René Fahr, both from Paderborn University in Germany, looks at the impact that human decisions can make on an engineered system.

Professor Quevedo said control system engineers generally did not examine the interaction between people and the systems they were in, and how their choices could impact on the system.

To explain how unpredictable human decisions could impact on a controlled system, Professor Quevedo said an example was if he was planning a drive using a navigation system and was offered alternative routes.

"I make my own decision based on the information and drive. And that affects the whole traffic system," Professor Quevedo said.

"There is this problem about what information does the car system give me so that I behave in one way or another.

"That's just for one car. With traffic, there are many cars. What information should we get so that we behave in one way or another? How do our actions work?"

While the system's designer expects humans to take the fastest route, they might take a different route. If enough people decided to take an alternative route, then the traffic flow predictions of the system would need to be reconsidered.
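
A toy two-route example, ours rather than the study's, shows why: if travel times depend on how many drivers choose each route, any deviation from the designer's assumed choices shifts the whole system away from its predicted state. Route names and numbers below are made up for illustration:

```python
# Toy congestion model (illustrative numbers, not from the study).
def travel_times(frac_on_a):
    """Travel times in minutes on routes A and B, given the fraction of drivers on A."""
    time_a = 10 + 15 * frac_on_a         # short route, congests as it fills
    time_b = 28 + 5 * (1 - frac_on_a)    # long route, lightly loaded
    return time_a, time_b

# The designer predicts 25 minutes on A, assuming everyone follows the advice...
print("all drivers on A:", travel_times(1.0))   # (25.0, 28.0)
# ...but if 40% of drivers deviate, both predictions are off and the
# information the system gives drivers needs to be rethought.
print("40% deviate to B:", travel_times(0.6))   # (19.0, 30.0)
```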

Professor Quevedo said successful design of "human-in-the-loop" control systems required an understanding of how humans behaved.

He said an interesting issue was that people, unlike machines, did not necessarily improve their performance through immediate and frequent feedback.

"Given the immense complexity of human behaviour, there's no clear way to create appropriate models for human decision making," Professor Quevedo said.

In the study, the researchers looked at how people behaved when given the task of piloting a drone and found that frequent feedback about the quality of the piloting decisions made may lead to poor performance.

"While more information is commonly considered to result in better decisions, human susceptibility for perceptual biases in response to high information supply must be considered," Professor Quevedo said.

"Otherwise, individuals might take unnecessarily high risks, rendering thoughtfully designed policies inefficient.

The study highlights that people often overestimate their ability at a task, such as believing they are better than average drivers, or succumb to the "hot hand fallacy" from basketball, which links the likelihood that a player will score in the future to their past successes in shooting.

"If you win you think you're doing really well, you fall in love with yourself," Professor Quevedo said.

"As a control engineer, I always tended to assume that cooperative people somehow just do what they're told because they're part of a system.

"We need to incorporate a model of human behaviour, but human behaviour is a difficult thing.

"You don't want to overload people with information because they can't process all of it. But it's much more refined than that."

This multidisciplinary study of human behaviour through behavioural economics and control system engineering is a start for future research.

"Putting the worlds together is the first step for us. Now we want to continue," Professor Quevedo said.

"The current work exposes the human as an under observed source of errors in human-in-the-loop control systems.

"Future areas of research need to be how to design mechanisms on when to pass on information and how to pass on information to human decision makers."

Credit: 
Queensland University of Technology

A colossal step for electronics

image: Strongly Correlated Oxide Proton Resistor devices

Image: 
Osaka University

Osaka, Japan - Researchers at Osaka University demonstrated a new technique for modifying the hydrogen concentration of resistors by applying an electrical voltage. The generated electric field drove the diffusion of hydrogen ions deeper into the perovskite rare-earth nickelate lattice, which led to a tunable "colossal" increase in electrical resistance. This research can lead to new gas sensors and electrically switchable smart materials.

Computer chips depend on the careful control of electrical signals through semiconductors. Conventionally, the conductivity of silicon chips is modified by intentionally "doping" them with impurity ions. However, this process is usually done once at the factory, and cannot be changed later. Thus, the ability to dynamically control the doping of materials would open the way for novel switches and potentially even entirely new kinds of computer circuits.

Now, scientists at Osaka University have created thin films of neodymium nickel oxide (NdNiO3) whose electrical resistance can be changed dramatically by controlling the distribution of hydrogen ions (protons) in the film. The hydrogen was added in a process called "gas-phase annealing," in which the thin film, which has a perovskite crystal structure, was exposed to hydrogen gas in the presence of an electric field, splitting hydrogen molecules into protons. This reaction was sped up by platinum electrodes, which act as catalysts.

Increasing the annealing temperature caused more protons to diffuse into the film. At room temperature, the resistance of the films doubled from the original value, but jumped by a factor of 30 at 200°C. "We call such a large increase in resistance 'colossal,' because it is easily detected in electronic devices," first author Umar Sidik explains.
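
A back-of-envelope reading of those two numbers, ours and not the paper's: if the resistance increase is taken as proportional to proton uptake and uptake follows an Arrhenius law, the reported factors of 2 and 30 imply an effective activation energy of roughly 0.2 eV:

```python
# Back-of-envelope Arrhenius estimate from the two reported data points,
# assuming resistance increase ~ exp(-Ea / kT). Our assumption, not the paper's.
import math

K_B = 8.617e-5        # Boltzmann constant, eV/K
t1, r1 = 298.0, 2.0   # room temperature: resistance doubled
t2, r2 = 473.0, 30.0  # 200 degrees C: thirty-fold increase

ea = K_B * math.log(r2 / r1) / (1.0 / t1 - 1.0 / t2)
print(f"effective activation energy ~ {ea:.2f} eV")  # about 0.19 eV
```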

In this way, the combination of electric field and gas-phase annealing at a desired temperature was shown to enable control of the diffusional doping, which led to electrically tunable colossal resistive devices. The crystal structures were confirmed using X-ray diffraction and optical microscopy. The change was visible because the hydrogen-doped region became optically transparent.

"In addition to the large resistance modulation, ion doping also has potential to reversibly change the structural and electronic properties of correlated materials via an electric field by manipulating the ion diffusion process into or out of a material," senior author Azusa N. Hattori says. In fact, this can lead to the whole area of "iontronic" devices that rely on ion motion within a solid lattice to function.

Credit: 
Osaka University

NUS engineers discover new microbe for simpler, cheaper and greener wastewater treatment

image: The team led by Associate Professor He Jianzhong (left) and Research Fellow Dr Wang Qingkun (right) discovered Thauera sp. strain SND5 after they isolated and tested various strains of bacteria from wastewater samples. Dr Wang is holding a wastewater sample containing the unique SND5 bacterium.

Image: 
National University of Singapore

Researchers from the National University of Singapore (NUS) have developed a new way to treat sewage that is much simpler, cheaper and greener than existing methods.

Led by Associate Professor He Jianzhong from the Department of Civil and Environmental Engineering at the Faculty of Engineering, the NUS team found a new strain of bacterium called Thauera sp. strain SND5 that can remove both nitrogen and phosphorus from sewage.

This discovery, which was first reported in the journal Water Research on 15 October 2020, significantly reduces the high operational costs and emission of greenhouse gases associated with traditional wastewater treatment methods.

The team's new treatment method is also in the running for the International Water Association Project Innovation Awards 2021.

2-in-1 pollutant remover

In sewage, nitrogen is present as ammonia while phosphorus is present as phosphates. Too much of either compound risks polluting the environment, so both must be removed before the treated water can be released.

Most existing sewage treatment systems use separate reactors for removing nitrogen and phosphorus, with different conditions for different microbes. Such a process is both bulky and expensive.

Some existing systems use a single reactor, but they are inefficient because different microbes in the same reactor will compete with one another for resources. This makes it difficult to maintain the delicate balance among the microbes, resulting in an overall lower efficiency.

Another problem with some existing sewage treatment methods is that they release nitrous oxide, a greenhouse gas. The NUS team's new microbe avoids this problem by converting the ammonia into harmless nitrogen gas instead. Additionally, it was found to remove the phosphates originally present in the sewage.

Faster, cheaper and greener approach

The unique SND5 bacterium was discovered in a wastewater treatment plant in Singapore. When the NUS research team was carrying out routine monitoring, they observed an unexpected removal of nitrogen in the aerobic tanks, as well as better-than-expected phosphate removal despite the absence of known phosphorus-removing bacteria.

"This leads us to hypothesise the occurrence of a previously undescribed biological phenomenon, which we hope to understand and harness for further applications," said Assoc Prof He.

The NUS researchers then took wastewater samples from a tank, isolated various strains of bacteria, and tested each of them for their ability to remove nitrogen and phosphorus.

One of the strains, which appeared as sticky, creamy, light yellow blobs on the agar medium, surprised the researchers with its ability to remove both nitrogen and phosphorus from water. In fact, it did the job faster than the other microbes that were tested. The NUS team sequenced its genes, compared them to those of related bacteria in a global database, and established it to be a new strain.

Compared to conventional nitrogen removal processes of nitrification and denitrification, the NUS team's way of using the newly identified microbe can save about 62 per cent of electricity due to its lower oxygen demand. This is of great significance as the aeration system in a wastewater treatment plant can consume nearly half of the plant's total energy.
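
The arithmetic behind that significance claim, under one reading of the two figures given (our combination, not a number from the study):

```python
# Combining the article's two figures (illustrative reading, not from the study):
# a 62% saving on aeration electricity, where aeration is ~half of plant energy.
aeration_share = 0.5     # aeration ~ half of a plant's total energy use
aeration_saving = 0.62   # reported electricity saving from lower oxygen demand

plant_level_saving = aeration_share * aeration_saving
print(f"plant-level electricity saving ~ {plant_level_saving:.0%}")  # ~31%
```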

Assoc Prof He explained, "Population and economic growth have inevitably led to the production of more wastewater, so it is important to develop new technologies that cost less to operate and produce less waste overall - all while meeting treatment targets."

Meanwhile, the NUS researchers are looking to test their process at a larger scale, and formulate a "soup" of multiple microbes to boost SND5's performance even further.

Credit: 
National University of Singapore