Tech

Antibiotics protect apples from fire blight, but do they destroy the native microbiome?

image: Anna Wallis in apple orchard

Image: 
Anna Wallis

Like humans, certain plants are treated with antibiotics to ward off pathogens and protect the host. Antibiotics, which have saved millions of lives, are one of the 20th century's greatest scientific discoveries, but repeated use and misuse of these life-saving compounds can disrupt the human microbiome and have severe effects on an individual's health. Overuse has also led to microbes developing resistance to antibiotics, rendering them useless and creating "superbugs" that overpower medication. But do we find the same phenomenon in plants and our food industry?

This was the question Dr. Anna Wallis and colleagues investigated in their recent research "Endophytic bacterial communities in apple leaves are minimally impacted by streptomycin use for fire blight management," published in Phytobiomes Journal in April 2021.

Pathogens like Erwinia amylovora, the causal agent of fire blight in apples, can have severe impacts on fruit production. Current management practices include using the antibiotic streptomycin to protect apples from this disease, but the long-term impacts on the microbiome are poorly understood. Antibiotics are often broad-spectrum, meaning they destroy all susceptible bacteria, both good and bad. In many cases, antibiotics give relief from an immediate problem but can have long-term negative effects as beneficial microbes are eliminated from the environment.

To assess whether this was true in apple orchards, Wallis and her advisor Dr. Kerik D. Cox analyzed the microbiome of apple leaves over two years in two orchards in Geneva, New York. Some trees were treated with various amounts of streptomycin, while others were under organic management without any antibiotics. Cox and a former student, Dr. Kiersten Tancos, had previously looked at the surface (epiphytic) microbiome of apple trees treated with streptomycin and saw large effects from the application. When Wallis and Cox decided to look at the endophytic microbiome (microbes living within the plant), they expected to see a similar trend.

Surprisingly, Wallis and Cox found little evidence that streptomycin altered the leaf bacterial microbiome. Instead, they found that geographical location, even though the orchards were in close proximity, played a bigger role in bacterial composition than management strategy. While it is well known that geography shapes microbial community assemblage, it is surprising that this factor influenced the microbial composition more strongly than an antibiotic.

"Our work adds to a growing body of literature that demonstrates the sustainability of current methods [such as antibiotics] of disease control used by apple growers," said Wallis. While previous research investigated this question by looking at just the soil or microbes living on the plant surface, this is the first study to look at the impact of streptomycin on the endophytic leaf microbiomes, which are likely attributing more to host health than the surface microbes.

The authors hypothesize that the endophytic microbiome is more resilient to streptomycin application because it may have naturally acquired resistance to the antibiotic. Soil and plant microbiomes are filled with microbes that have the innate ability to produce antibiotics. So, while Erwinia amylovora remains susceptible to streptomycin, the soil and endophytic microbiomes likely acquired resistance to streptomycin well before commercial applications began.

Interestingly, several other papers have shown that soil amendments and other crop manipulations dramatically alter microbial communities and may make them more beneficial to the crop. Because streptomycin occurs naturally in soils and agricultural environments, it is not known whether the endophytic microbiome has gained resistance to streptomycin only in recent years or whether the innate apple microbiome has long carried resistance to this antibiotic. Current evidence suggests streptomycin is a sustainable management strategy for fire blight of apple.

Credit: 
American Phytopathological Society

California's worst wildfires are helping improve air quality prediction

UC Riverside engineers are developing methods to estimate the impact of California's destructive wildfires on air quality in neighborhoods affected by smoke from these fires. Their research, funded by NASA and published in Atmospheric Pollution Research, fills gaps in current methods by providing air quality information at the neighborhood scale required by public health officials to make health assessments and evacuation recommendations.

Measurements of air quality depend largely on ground-based sensors that are typically spaced many miles apart. Determining how safe the air is to breathe is straightforward in the vicinity of the sensors but becomes unreliable in the areas between them.

Akula Venkatram, a professor of mechanical engineering in UC Riverside's Marlan and Rosemary Bourns College of Engineering, directed a group that developed a method to interpret fine particulate matter concentrations observed by ground-based sensors during the 2017 fire complex that included the Atlas, Nuns, Tubbs, Pocket, and Redwood Valley fires, and the 2018 Camp Fire.

Their method fills the gaps in air quality information obtained from ground-level monitors and satellite images using a mathematical model that simulates the transport of smoke from the fires. The approach provides estimates of particulate emissions from wildfires, which are the most uncertain input in other methods of interpreting the same data. These emissions, combined with the physics embodied in the smoke transport model, allowed the group to estimate the variation of particulate concentrations over distances as small as one kilometer.
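
The article does not give the model's equations, but the basic idea of a smoke transport (dispersion) model can be illustrated with a generic Gaussian plume estimate. The sketch below is a simplified illustration only, not the UC Riverside method; the emission rate, wind speed, dispersion coefficients and plume height are all assumed values.

```python
import numpy as np

def gaussian_plume_pm25(q_emission, wind_speed, x, y, z=2.0, plume_height=50.0):
    """Illustrative ground-level PM2.5 estimate (ug/m^3) downwind of a single smoke
    source, using a textbook Gaussian plume formula. All parameters are assumptions
    for illustration, not values from the UC Riverside study."""
    # Simple rural dispersion coefficients (Briggs-style power laws, neutral stability)
    sigma_y = 0.08 * x / np.sqrt(1 + 0.0001 * x)
    sigma_z = 0.06 * x / np.sqrt(1 + 0.0015 * x)
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - plume_height)**2 / (2 * sigma_z**2))
                + np.exp(-(z + plume_height)**2 / (2 * sigma_z**2)))  # ground reflection
    return q_emission / (2 * np.pi * wind_speed * sigma_y * sigma_z) * lateral * vertical

# Concentration 1 km downwind, on the plume centerline, at breathing height,
# for an assumed emission rate of 1 g/s (1e6 ug/s) and a 3 m/s wind
print(gaussian_plume_pm25(q_emission=1.0e6, wind_speed=3.0, x=1000.0, y=0.0))
```

In the study's actual workflow, the emission rate is the quantity being estimated from the monitor and satellite data rather than assumed, and the transport physics fills in concentrations between sensor locations.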

"We need better ways to measure air quality so we can let people know when and where it's safe to go out and exercise, or go stay somewhere else, for example," Venkatram said. "In addition to filling in the gaps in the data from monitoring stations and satellite images, our method can also be used to predict the next day's air quality by estimating wildfire emissions for tomorrow based on today's observations."

While any smoke can make air unpleasant to breathe, it is the tiniest particles, called PM2.5, that can penetrate lung tissue and cause the most health problems. The UC Riverside model is specifically designed to predict PM2.5 concentrations in areas with insufficient coverage by air quality monitoring stations.

The authors hope their work will help efforts to protect public health during California's inevitable annual wildfires.

Credit: 
University of California - Riverside

Fast mitigation of power grid instability risks

image: Fast mitigation of power grid instability risks

Image: 
Pavel Odinev / Skoltech

Skoltech scientists, in collaboration with researchers from the University of Arizona and the Los Alamos National Laboratory, have developed an approach that allows power grids to return to stability quickly after a demand response perturbation. Their research at the crossroads of demand response, smart grids, and power grid control was published in the journal Applied Energy.

Power grids are complex systems that manage the generation, transmission and distribution of electrical power to consumers, also called loads. As it is not possible to store electrical energy along the transmission lines, grid operators must ensure, ideally at all times, the balance between production and consumption of electrical energy, i.e. the stability of the power grid. While it is essential to guarantee the provision of electricity to consumers and meet their needs, random events, such as electric faults, fluctuations due to renewables penetration or a sudden excessive demand, may perturb grid stability, possibly leading to brownouts or, worse, blackouts. An illustrative real-life situation is a heat wave, when the electricity demand for air conditioning is high and cannot be met. If the electrical power reserve is insufficient, rolling blackouts may be implemented on part or all of the distribution network; this management strategy, or, worse, failures that lead to uncontrolled blackouts, profoundly affects customers' comfort and well-being, as well as the utility companies' operation and profitability.

Careful scheduling by electricity providers is crucial to manage peaks of demand. To avoid contingencies, solutions exist: generator reserves, ancillary services, and demand response. The latter involves customers, who may reduce or shift their electricity consumption at times that suit the company in exchange for lower prices. Companies may also leverage direct load control programs acting on, e.g., air-conditioner and water-heater switches.

"Demand response is a low-cost and flexible solution that relies on the customers' tolerance to a temporary perturbation of their comfort, but it may also cause problems such as the parasitic synchronization of individual devices. Put simply, after some demand response operation like turning all devices on, the devices will all be in the same state, consuming energy; then after some time they will all be off, and then on again almost simultaneously, and so on and so forth, if no action is taken to avoid such a kind of collective behavior. In other words, if all devices become synchronized, the power grid has to bear the stress of consumption oscillations possibly causing instability, and certainly delaying the ensemble's availability for a subsequent demand response period," professor Henni Ouerdane comments.

In the new study, the scientists analyzed the relaxation of the energy consumption dynamics of an ensemble of thermostatically controlled loads (air conditioners) after a demand response perturbation. Relaxation must occur rapidly enough to ensure energy efficiency, grid stability, and return to the consumers' "comfort zone".

"The focus was on a discrete model that captures the actual discrete dynamics of the system as information and control signals are sent over discrete time intervals. While it was shown previously using a continuous in-time model that relaxation may be accelerated through mean-field control, it is essential for practical load management to rather rely on discretized control models. The study demonstrates that super-relaxation is a genuine feature of discretized models, but also how these models based on a few control parameters provide energy systems operators with an efficient way to effectively manage the consumption dynamics of aggregated loads", adds lead author of the study and Skoltech PhD student Ilia Luchnikov.

"Our work essentially deals with the optimal use of energy resources and the sustainability of power systems, but the scope of demand response extends beyond power systems management, as it may be applied in other settings like the water sector, the oil industry, and waste management," professor Ouerdane concludes.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

New conductive polymer ink opens for next-generation printed electronics

image: Researchers at Linköping University, Sweden, have developed a stable high-conductivity polymer ink. The new n-type material comes in the form of ink with ethanol as the solvent.

Image: 
Thor Balkhed

Researchers at Linköping University, Sweden, have developed a stable high-conductivity polymer ink. The advance paves the way for innovative printed electronics with high energy efficiency. The results have been published in Nature Communications.

Electrically conducting polymers have made possible the development of flexible and lightweight electronic components such as organic biosensors, solar cells, light-emitting diodes, transistors, and batteries.

The electrical properties of conducting polymers can be tuned using a method known as "doping". In this method, various dopant molecules are added to the polymer to change its properties. Depending on the dopant, the doped polymer conducts electricity through the motion of either negatively charged electrons (an "n-type" conductor) or positively charged holes (a "p-type" conductor). Today, the most commonly used conducting polymer is the p-type conductor PEDOT:PSS, which has several compelling features: high electrical conductivity, excellent ambient stability and, most importantly, commercial availability as an aqueous dispersion. However, many electronic devices require a combination of p-type and n-type conductors to function. At the moment, there is no n-type equivalent to PEDOT:PSS.

Researchers at Linköping University, together with colleagues in the US and South Korea, have now developed a conductive n-type polymer ink, stable in air and at high temperatures. This new polymer formulation is known as BBL:PEI.

"This is a major advance that makes the next generation of printed electronic devices possible. The lack of a suitable n-type polymer has been like walking on one leg when designing functional electronic devices. We can now provide the second leg", says Simone Fabiano, senior lecturer in the Department of Science and Technology at Linköping University.

Chi-Yuan Yang is a postdoc at Linköping University and one of the principal authors of the article published in Nature Communications. He adds:

"Everything possible with PEDOT:PSS is also possible with our new polymer. The combination of PEDOT:PSS and BBL:PEI opens new possibilities for the development of stable and efficient electronic circuits", says Chi-Yuan Yang.

The new n-type material comes in the form of ink with ethanol as the solvent. The ink can be deposited by simply spraying the solution onto a surface, making organic electronic devices easier and cheaper to manufacture. Also, the ink is more eco-friendly than many other n-type organic conductors currently under development, which instead contain harmful solvents. Simone Fabiano believes that the technology is ready for routine use.

"Large-scale production is already feasible, and we are thrilled to have come so far in a relatively short time. We expect BBL:PEI to have the same impact as PEDOT:PSS. At the same time, much remains to be done to adapt the ink to various technologies, and we need to learn more about the material", says Simone Fabiano.

Credit: 
Linköping University

Augmented reality in retail and its impact on sales

Augmented reality (AR) is a technology that superimposes virtual objects onto a live view of physical environments, helping users visualize how these objects fit into their physical world. Researchers from City University of Hong Kong and Singapore Management University published a new paper in the Journal of Marketing that identifies four broad uses of AR in retail settings and examines the impact of AR on retail sales.

The study, forthcoming in the Journal of Marketing, is titled "Augmented Reality in Retail and Its Impact on Sales" and is authored by Yong-Chin Tan, Sandeep Chandukala, and Srinivas Reddy. The researchers discuss the following uses of AR in retail settings:

* To entertain customers. AR transforms static objects into interactive, animated three-dimensional objects, helping marketers create fresh experiences that captivate and entertain customers. Marketers can use AR-enabled experiences to drive traffic to their physical locations. For example, Walmart collaborated with DC Comics and Marvel to place special thematic displays with exclusive superhero-themed AR experiences in its stores. In addition to creating novel and engaging experiences for customers, the displays also encouraged customers to explore different areas in the stores.

* To educate customers. Due to its interactive and immersive format, AR is also an effective medium to deliver content and information to customers. To help customers better appreciate their new car models, Toyota and Hyundai have utilized AR to demonstrate key features and innovative technologies in a vivid and visually appealing manner. AR can also be used to provide in-store wayfinding and product support. Walgreens and Lowe's have developed in-store navigation apps that overlay directional signals onto a live view of the path in front of users to guide them to product locations and notify them if there are special promotions along the way.

* To facilitate product evaluation. By retaining the physical environment as a backdrop for virtual elements, AR also helps users visualize how products would appear in their actual consumption contexts to assess product fit more accurately prior to purchase. For example, Ikea's Place app uses AR to overlay true-to-scale, three-dimensional models of furniture onto a live view of customers' rooms. Customers can easily determine if the products fit in a space without taking any measurements. Uniqlo and Topshop have also deployed the same technology in their physical stores, offering customers greater convenience by reducing the need to change in and out of different outfits. An added advantage of AR is its ability to accommodate a wide assortment of products. This capability is particularly useful for made-to-order or bulky products. BMW and Audi have used AR to provide customers with true-to-scale, three-dimensional visual representations of car models based on customized features such as paint color, wheel design, and interior aesthetics.

* To enhance the post-purchase consumption experience. Lastly, AR can be used to enhance and redefine the way products are experienced or consumed after they have been purchased. For example, Lego recently launched several specially designed brick sets that combine physical and virtual gameplay. Through the companion AR app, animated Lego characters spring to life and interact with the physical Lego sets, creating a whole new playing experience. In a bid to address skepticism about the quality of its food ingredients, McDonald's has also used AR to let customers discover the origins of ingredients in the food they purchased via story-telling and three-dimensional animations.

The research also focuses on the promising application of AR to facilitate product evaluation prior to purchase and examines how it impacts sales in online retail. For example:

* The availability and usage of AR has a positive impact on sales. The overall impact appears to be small, but certain products are more likely to benefit from the technology than others.

* The impact of AR is stronger for products and brands that are less popular. Thus, retailers carrying wide product assortments can use AR to stimulate demand for niche products at the long tail of the sales distribution. AR may also help to level the playing field for less-popular brands. With the launch of AR-enabled display ads on advertising platforms such as Facebook and YouTube, less-established brands could consider investing in this new ad format because they stand to benefit most from this technology.

* The impact of AR is also greater for products that are more expensive, indicating that AR could increase overall revenues for retailers. Retailers selling premium products may also leverage AR to improve decision comfort and reduce customers' hesitation in the purchase process.

* Customers who are new to the online channel or product category are more likely to purchase after using AR, suggesting that AR has the potential to promote online channel adoption and category expansion. As prior research has shown that multichannel customers are more profitable, omni-channel retailers can use AR to encourage their offline customers to adopt the online channel.

Taken together, these findings provide converging evidence that AR is most effective when product-related uncertainty is high. Managers can thus use AR to reduce customer uncertainty and improve sales.

Credit: 
American Marketing Association

No-Cath forecast

image: Elucid's vascuCAP software gives physicians a color-coded map of cardiac vessels that shows different plaque types and indicates which deserve the most attention.

Image: 
Laboratory of U. Joseph Schoepf

Coronary artery disease (CAD) is the most common form of heart disease and is present in about 18.2 million American adults. This disease is defined by narrowing of the vessels that supply the heart with critical oxygen and nutrients, typically caused by plaque blockages and inflammation. But not all plaques have the same composition, and while blockages in the heart vessels can cause heart attacks and cardiac arrest, not all areas with blockage create problems with blood flow.

A recent study published in the International Journal of Cardiology examined a noninvasive method to predict which blockages require surgical intervention (stent placement or bypass surgery) and which can be treated without surgery. With the help of a multinational research team, Medical University of South Carolina researchers Akos Varga-Szemes, M.D., Ph.D., and U. Joseph Schoepf, M.D., performed a validation study with a novel artificial intelligence (AI) program to compare predictions made by the software to previously logged patient measurements.

The baseline measurements were determined by the currently accepted standard of care, which requires catheterization to check the flow rate inside vessels. With the new program, Schoepf and Varga-Szemes aim to characterize the blood flow in the heart vessels noninvasively to decide which patients are good candidates for having their vessels reopened and which are better left alone.

The study analyzed data from 113 patients with suspected CAD who had undergone noninvasive coronary CT scanning as well as invasive catheterization to measure blood flow, specifically a number called the fractional flow reserve (FFR). For coronary CT scanning, patients received a dye injection for imaging of the coronary blood vessels. For catheterization, a catheter with a pressure sensor was inserted through a vessel in the arm or leg and steered to the heart and the target blockage, a procedure that carries all of the discomfort and many of the risks of invasive surgery.

The compiled data set was used as the standard baseline, and the researchers deployed the AI software to make its predictions based on CT data for the three major coronary arteries (left anterior descending artery, left circumflex artery and right coronary artery) and the branching arteries. They then compared the data sets to determine the software's prediction accuracy.

The team used the commercially available software vascuCAP, a system based on AI that they are developing with the biotech company Elucid. The software looks at coronary CT images of the heart vessels and plaques and determines the makeup of the plaques. This information itself can be useful to clinicians because the composition of a plaque influences the elasticity of vessels and the risk of altered blood flow -- for example, fatty plaques with a lot of lipids are more fragile and more likely to eventually lead to a heart attack, whereas plaques made up of calcium are more stable and less likely to be associated with a heart attack. And importantly, the software's AI algorithms then use the plaque composition to estimate the blood flow through that region.

"What we are talking about is an artificial intelligence algorithm which learns the different characteristics or different components of the blockage," Varga-Szemes said. "Based on those results, it predicts whether this limits the blood flow."

The algorithm calculates the FFR by estimating the flow rates upstream and downstream of the plaque and determining the difference. While this is the same measurement performed by catheterization, a catheter is a stiff instrument that must be forced through the vessels to reach and pass beyond the plaques. Because plaques are often fragile and unstable and the procedure itself carries some risk of complications, avoiding that risk is preferable whenever possible.
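
For reference, the quantity being estimated has a standard clinical definition: FFR compares the pressure beyond the blockage with the aortic pressure during maximal (hyperemic) blood flow, and values at or below roughly 0.80 are commonly taken as evidence of a flow-limiting lesion. That conventional definition (not the vascuCAP algorithm itself) is

$$\mathrm{FFR} = \frac{P_d}{P_a},$$

where $P_d$ is the mean pressure distal to the stenosis and $P_a$ is the mean aortic (proximal) pressure measured under hyperemia.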

Furthermore, the software can examine the blood flow through multiple target areas and slide up and down a vessel without the manipulation that would be required with catheterization. It also provides clinicians a map of the entire cardiac vessel network, revealing any areas of concern and providing numeric values to indicate the flow rates.

The results of the validation study showed that the software worked well and provided results that correlated very closely with the results obtained by catheterization, but without the risks of an invasive procedure and at a fraction of the time and cost.

The potential to predict the significance of CAD blockages noninvasively is great news for patients, who may be able to avoid the discomfort and risks of invasive catheterization while still minimizing their cardiac risks and receiving the treatment they need. It also benefits clinicians and hospitals by saving valuable catheter suite time and costs while still providing the information physicians need to choose the best treatment plans for their patients.

"There are high-risk features that can be seen on CT that help us to predict which patients are more prone to have a heart attack in the future, regardless of what the blood flow looks like at the time," Schoepf said. "So physicians can identify which patients need their help right now but also which patients need more TLC in their treatment to prevent heart attacks further down the road."

The next step will be to work with a larger patient population and continue to validate the software. If all goes well, clinicians may soon have help deciding how to optimally treat their patients without needing to perform invasive procedures.

Credit: 
Medical University of South Carolina

Gold digger: Neural networks at the nexus of data science and electron microscopy

image: (a) A cropped FRIL image used for input. (b) Ground truth image annotated by human experts. (c) Network-generated image with annotations. (d) Discriminator network to discern between real and fake images.

Image: 
Max Planck Florida Institute for Neuroscience

From sample preparation to image acquisition, electron microscopy (EM) requires precise and time-consuming steps to produce the clarity and detail needed to visualize small cell structures at high resolution. Moreover, once EM images are created, extracting the biological information from them through analysis can be an even more laborious and time-intensive task, especially because current EM analysis software often requires the skilled eye of a scientist to manually review hundreds of images.

With a bit of ingenuity and the application of cutting-edge neural networks, an interdisciplinary team of scientists at the Max Planck Florida Institute for Neuroscience (MPFI) has created powerful new analysis software aimed at streamlining part of this lengthy process. In collaboration with the Electron Microscopy Core Facility and the Christie Lab, the project tasked two high school students with dramatically improving upon established computer-based techniques for the analysis of protein distribution in EM images. Unlike traditional light microscopy, which uses fluorescent labeling, EM requires proteins to be labeled with gold nanoparticles in order to visualize them within a cell. Playfully named "Gold Digger", the software uses a deep learning approach to identify gold particles bound to specific proteins of interest.

In their new publication in Scientific Reports, the MPFI team has engineered an adaptable, deep learning-based algorithm capable of accurately identifying different sizes of gold particles. This fully automated approach will speed up the counting process and generate more precise location information of protein distributions across a membrane, expediting new breakthroughs.

Deep learning, a computational strategy built on neural networks, allows software to progressively learn over time. Much like the human brain, these algorithms can deconstruct a visual scene into individual components and be taught to recognize certain aspects. By supplying pre-annotated "training data," the software learns to copy and mimic human actions for a given task, something computers weren't able to do in the not-so-distant past.

"One of the challenges of the project, was figuring out a way to train our software to recognize only gold particles which appear dark on an electron micrograph, as opposed similarly looking shadows caused by the uneven surface of a cell; something that only trained EM experts could do previously" explains Dr. Michael Smirnov, Neural Data Scientist at MPFI and corresponding author of the publication. "We found that by feeding enough training data and correcting errors that pop up in our algorithms, our software could distinguish gold particles from these shadow artifacts with near human level accuracy. I think this really demonstrates the robustness and utility of our technique."

This project started with the curiosity of two high school data science students, Diego Jerez and Eleanor Stuart, but it quickly developed into a more complex and interdisciplinary project. "I feel very lucky to get the unique opportunity to apply what we've learned in the classroom to real-world scientific pursuit, and to see first-hand how data science can help address scientific questions," explained Diego Jerez, first author of the publication. "These young students showed a real aptitude for this type of coding and conceptual work, and I couldn't be more proud of what they have accomplished. I can't wait to see the contributions they'll make to the scientific community in the future," added Dr. Smirnov.

The small, compact architecture of the Gold Digger software was primarily used for freeze-fracture replica EM, but it was specifically designed to be generalizable across various EM applications, accommodating changes in magnification, image area, cell type and gold particle size. The software will soon be distributed open source and will include a user-friendly interface, giving scientists everywhere the opportunity to take full advantage of and improve upon this innovative algorithm.

"The synergy of the collaborative work of our team was crucial to bridge the gap between these areas of expertise" explained Naomi Kamasawa, Ph.D., the Head of the Electron Microscopy Core Facility at MPFI. "But this is what the Max Planck Society does - bring together people who are passionately curious about a variety of subjects, and allow them to be creative together. When you do that, anything is possible."

Credit: 
Max Planck Florida Institute for Neuroscience

Texas A&M study: Racial, ethnic diversity in schools influence mental health

A Texas A&M researcher is discovering the demographic characteristics that can produce or lessen stress for racial and ethnic minority students in school settings.

The study, recently published in the journal Ethnicity and Disease, collected mental health survey assessments among 389 sixth-graders from 14 Texas public schools in urban areas. Melissa DuPont-Reyes, assistant professor at the Texas A&M University School of Public Health, led the investigation of self-reported depressive-anxious symptoms over a two-year period. This issue of the journal highlighted research by early stage investigators, especially scholars of color, to advance new knowledge and action to address social inequities in health.

Overall, the study found that a higher percentage of non-Latinx white students in a school increases mental health risk for non-Latinx Black and Latinx students, while more racial and ethnic diversity decreases mental health risk for some Latinx students.

DuPont-Reyes built upon the data collected from the Texas Stigma Study (2011-2015), a longitudinal evaluation of a mental illness anti-stigma intervention, by adding publicly available data on the participating Texas public schools. The two data sources allowed for data points on sex, household income, parental educational attainment, family history of mental illness and past mental health service use, as well as school factors such as enrollment, socioeconomic status and performance.

Each school's racial and ethnic density and diversity were measured as well. In this study, density refers to the percentage of non-Latinx white enrollment. Diversity signifies the range and size of all racial and ethnic groups enrolled. Dissimilarity in school racial and ethnic enrollment can produce challenges unique to racial and ethnic minorities, such as harassment, marginalization, feeling faced with different expectations and social isolation, that can significantly influence mental health. DuPont-Reyes' team adds new knowledge about younger adolescents and Latinx groups, as well as simultaneous analysis of both diversity and density measures.

Non-Latinx Black and Latinx students, according to the results, reported double the rate of depressive-anxious symptoms compared to their non-Latinx white counterparts in schools with greater non-Latinx white enrollment. In terms of diversity, high-stress Latinx students -- those who tend to experience greater levels of discrimination -- saw about a fifth the rate of depressive-anxious symptoms compared to their non-Latinx white counterparts in schools with greater racial and ethnic diversity. Non-Latinx white students saw greater symptoms with increasing diversity in schools.

These findings are important, the researchers say, as school-aged populations in the United States are ethnically diverse, yet integrative curriculum and enrollment policies have remained at a standstill or worsened in some areas. DuPont-Reyes' team examined these students at the age when mental health symptoms typically begin to emerge.

Credit: 
Texas A&M University

Earthquakes continued after COVID-19-related oil and gas recovery shutdown

When hydraulic fracturing operations ground to a halt last spring in the Kiskatinaw area of British Columbia, researchers expected seismic quiescence in the region. Instead, hundreds of small earthquakes occurred for months after operations shut down, according to a new study.

In her presentation at the Seismological Society of America (SSA)'s 2021 Annual Meeting, Rebecca Salvage of the University of Calgary said about 65% of these events could not be attributed to either natural seismicity or active fluid injection from hydraulic fracturing operations.

Salvage and her colleagues instead suggest the lingering earthquakes may be the result of aseismic slip, driven by fluid from previous hydraulic fracturing injections keeping rock pore pressures elevated.

"Because there are lots of faults in that area, the fluid is becoming trapped in these zones," Salvage explained. "And as aseismic deformation occurs, which leads to very, very slow slip in these zones, then you get seismicity generated from that process."

The study by Salvage and her University of Calgary colleague David Eaton offers an unusual glimpse at how hydraulic fracturing may alter the rate of seismicity in a region long after active operations cease.

Their findings may reflect a new background rate of seismicity for the area, Salvage said, "but since this is such an unprecedented situation, we have no idea whether that is the case, and we won't know until all hydraulic fracturing ceases in the Kiskatinaw Seismic Monitoring and Mitigation Area (KSMMA) entirely, which is unlikely to occur any time in the near future."

Hydraulic fracturing operations are thought to be the main cause of seismic activity in the Kiskatinaw region, causing thousands of small earthquakes over the past two decades. The area, along with most of western Canada, has very few natural earthquakes. Between 1984 and 2008, before oil and gas operations began in the area, seismologists detected only 20 earthquakes in the Kiskatinaw region, Salvage said.

Researchers had just finished installing a new seismic array in the KSMMA in January 2020, hoping to learn more about how active operations were related to earthquakes, particularly those of very small magnitudes.

Operators in the area have their own small private arrays, "but the public sensors in the area were much more scattered and sparse," said Salvage. "We installed this array thinking, this is going to be great, we're going to capture all this hydraulic fracturing."

But when COVID-19 reached the region, operations came to a halt amid a government lockdown and the plummeting oil and gas prices that followed.

Then the researchers noticed things weren't entirely quiet. Between April and August 2020, they detected 389 earthquakes during a period of almost no hydraulic fracturing. All of the earthquakes were magnitude 1.2 or smaller, "so it wouldn't be noticeable to anybody that the background seismicity had increased without a seismologist doing the analysis," Salvage said.

The earthquakes didn't fall into the same patterns that would be expected of hydraulic fracturing-induced seismicity, according to the researchers. The rate of earthquakes persisted over time, instead of declining, and there was no pattern of earthquakes moving away from an initial source, as is often observed during active fluid injection.

Hydraulic fracturing has resumed in the area, along with an uptick in earthquakes, Salvage said.

Credit: 
Seismological Society of America

Understanding our restoring force

image: A restored savanna with only a few longleaf pine trees supports much greater biodiversity than the unrestored woodland in the background that is packed tightly with trees.

Image: 
Nash Turley

An expansive project led by Michigan State University's Lars Brudvig is examining the benefits, and limits, of environmental restoration on developed land after humans are done with it.

Experts estimate there are up to 17 million square miles of land worldwide that have been altered by humans -- through cultivation, say -- and then abandoned. That's more than four times the size of the continental United States.

Once humans change a landscape, their impacts linger long after they've moved on. However, humans can heal some of that damage by working to restore the land to its natural state.

But questions remain about how far restoration can go in overcoming a land's past, how much it can move the needle back toward normal. Brudvig and his collaborators now have some answers that they've published April 19 online in the Proceedings of the National Academy of Sciences.

"Restoration benefited sites regardless of their land-use history. The benefits are clear," said Brudvig, an associate professor of plant biology in MSU's College of Natural Science.

For this project, researchers compared land that had been used for farming with land without a history of agriculture. By working to restore habitats on both types of plots, the team could paint a clearer picture of how a habitat's history affects restoration efforts.

The researchers found that the effects of restoration outweighed the detriments from a plot's previous land use two-to-one. Despite the benefits, however, restoration could not erase all of farming's lasting effects.

"Agricultural sites were different to begin with and they remained different after restoration," Brudvig said. "It does beg the question of what else we should be doing."

Though this project does not answer that question, it does offer many insights that can help ecologists decide where and how to target their restoration efforts.

In the study, the team observed dozens of different ecological properties across more than 300 acres for several years following a restoration treatment developed for longleaf pine savannas in the Southeast U.S.

"The longleaf pine is the tree of the South. It's this charismatic, beautiful, really, really cool species of tree," Brudvig said. "There's also incredible biodiversity in this region. There's on the order of 900 different species of plants that are found here and nowhere else."

This work required a large experimental site with a well-known land-use history. Fortunately, Brudvig was part of a multiuniversity collaboration that had been working at such a site for years: the Savannah River Site in South Carolina.

The site is a U.S. Department of Energy complex and its natural ecosystems are managed by the U.S. Department of Agriculture's Forest Service.

"I don't know of another place on Earth where we could have set up this research project and pulled this off," Brudvig said.

The site's history is well documented, but also complicated and painful, Brudvig said. The site has a long history of agriculture, with farmers replacing open, grassy savannas with fields of corn, cotton and other crops. But as the Cold War raged in the mid-20th century, the U.S. government commandeered the land and shut down those farms.

In the time since, people have turned the farmland into tree plantations, densely packed with longleaf and other pines. The few remaining natural savannas also transitioned into thick forest because people began suppressing fires in the region, too.

Longleaf pines, which thrive in the savanna setting, have evolved to be resilient to fires caused by lightning strikes, for example. Suppressing fires allowed tree species better adapted to crowded forest conditions to fill in the open spaces.

Counterintuitively, then, restoring the savanna meant removing trees in areas with and without histories of agriculture.

"I get that question a lot: If you're trying to restore an ecosystem, shouldn't you be planting trees?" Brudvig said. "But by removing trees, you keep dense canopies from growing, giving opportunities to other plants on the ground. There's a whole suite of plants and animals adapted to the conditions of savannas."

And thinning trees also created valuable lumber, so the U.S. Forest Service was able to take bids from contractors to carefully thin the trees, meaning this restoration effort was also a revenue generating one.

To compare the effects of restoration and past land use, the team used vetted statistical tools to put different factors on the same mathematical footing. For example, they could assign comparable values to soil quality, plant diversity and how different species were interacting with each other, such as how effective bees were at pollinating plants.

The researchers could then analyze how each category was affected by land use and restoration in a quantitative way.
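
The paper's exact statistics are not described in this article; one common way to put such disparate measurements on the same footing is a standardized effect size. The sketch below illustrates that generic approach with invented plot values, not the study's actual data or model.

```python
import numpy as np

def standardized_effect(treated, control):
    """Standardized mean difference (Cohen's d style): how far treated plots shift
    relative to pooled variability, so soil, plant and pollinator metrics can be
    compared on one scale. Illustrative, not the PNAS paper's exact analysis."""
    treated, control = np.asarray(treated, float), np.asarray(control, float)
    pooled_sd = np.sqrt((treated.var(ddof=1) + control.var(ddof=1)) / 2)
    return (treated.mean() - control.mean()) / pooled_sd

# Hypothetical plot-level values for one of the 45 variables (e.g. plant richness)
restored   = [24, 31, 28, 35, 30, 27]
unrestored = [18, 22, 20, 25, 19, 21]
print(round(standardized_effect(restored, unrestored), 2))
```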

image: A black and yellow carpenter bee collects pollen from little purple flowers. Interactions between pollinators and plants were one of the dozens of ecological properties researchers monitored in this study.

Image: 
Nash Turley

"Past studies have looked at more narrow sets of characteristics -- such as plant properties or animal diversity," Brudvig said. "And we kind of did that, too, but collectively as a group, we have 45 different ways we've queried the ecosystem."

Researchers on this project came from seven different universities, including the University of Wisconsin-Madison.

"For me, the most important takeaway from this project is that the past matters for present-day restoration," said John Orrock, a collaborator on the project and a professor of integrative biology at UW-Madison. "The success of current restoration is haunted by the ghost of land-use past."

Knowing this helps ecologists make the most effective use of limited resources, he said, adding that teamwork was critical to performing a study of this magnitude.

"Conducting experiments at landscape scales is incredibly challenging," said Ellen Damschen, a co-investigator on the project from UW-Madison, where she is a professor of integrative biology. "We have had the great fortune of partnering with the U.S. Forest Service and Department of Energy at Savannah River Site to test key restoration questions at scales relevant to management."

"What makes this work possible is that partnership and the trust built over decades of collaborating together, as well as welcoming new members," Damschen said. "It is wonderful when students and collaborators can ask new and different questions that can shed light on how the system works."

One of those contributors was Nash Turley, who's now a postdoctoral researcher at Pennsylvania State University. But, back in 2014, he was joining MSU as a postdoc and helping create the project before setting foot on campus.

"I turned in my Ph.D. thesis at the University of Toronto and started driving down to work in the field with Lars the next day," he said. "On the way, I wrote in my notepad that we should look at all these factors associated with land-use legacy."

Because Brudvig, Orrock, Damschen and their colleagues had been working at the Savannah River Site for years, they had an abundance of data already available.

"But we needed more," Turley said.

The team's study area consisted of 126 plots, each larger than a football field. Researchers measured each of the 45 ecological variables -- say, the number of plant species -- across the study area.

"It's easy to say we measured 45 variables, but it was just an immense, immense effort," Turley said. "It takes one crew of people an entire summer, making measurements all day long to monitor one of those properties."

But the payoff was worth it. Although restoration didn't undo land-use legacies, its benefits were clear no matter a plot's history. And the restoration itself was relatively simple. The team was able to return land to a more natural state by merely thinning trees.

Granted, this particular restoration won't work for every ecosystem. But understanding an ecosystem's history and performing similar studies in the future will help identify ways to improve those 17 million square miles of abandoned land, Turley said -- especially in the 2020s, which the United Nations has declared the Decade on Ecosystem Restoration.

"There's a great opportunity to do a lot of restoration and do a lot of good," Turley said. "But because our human interactions have done so much and last so long, we'll have to put in even more effort to undo those."

And those opportunities don't just live in the South or in other places around the globe. Motivated Spartans can find them in their own backyard.

"In Michigan and the Midwest, there's a ton of abandoned land, mostly left to do its own thing," Turley said. "If we had the motivation to do better with it, we could. It just takes the will."

Credit: 
Michigan State University

Helpful, engineered 'living' machines in the future?

image: New soft, mechanical metamaterials can "think" about how forces are applied to them and respond via preprogrammed reactions.

Image: 
ELIZABETH FLORES-GOMEZ MURRAY/ PENN STATE

Engineered, autonomous machines combined with artificial intelligence have long been a staple of science fiction, and often in the role of villain like the Cylons in the "Battlestar Galactica" reboot, creatures composed of biological and engineered materials. But what if these autonomous soft machines were ... helpful?

This is the vision of a team of Penn State and U.S. Air Force researchers, outlined in a recent paper in Nature Communications. These researchers produced a soft, mechanical metamaterial that can "think" about how forces are applied to it and respond via programmed reactions. This platform holds great potential for a variety of applications from medical treatments to improving the environment.

"We created soft, mechanical metamaterials with flexible, conductive polymer networks that can compute all digital logic computations," said Ryan Harne, James F. Will Career Development Associate Professor, Penn State. "Our paper reports a way to create decision-making functionality in engineered materials in a way that could support future soft, autonomous engineered systems that are invested with the basic elements of lifeforms yet are programmed to perform helpful services for people. These could include helping maintain sustainable and robust infrastructure, monitoring of airborne and waterborne contaminants and pathogens, assisting with patient wound healing, and more."

Human thought processes are based on logic, Harne notes, which is similar to Boolean logic from mathematics. This approach uses binary inputs to produce binary control outputs -- using only "on" and "off" sequences to represent all thought and cognition. The soft materials that the research team created "think" using the reconfiguration of their conductive polymer networks. Mechanical force applied to the materials connects and reconnects the network.

Using a low voltage input into the materials, the research team created a way for the soft material to decide how to react according to the output voltage signal from the reconfigured conductive polymer network.

The type of logic that Harne and the team use goes beyond pure mechanical logic, which is a way of using combinations of bistable switches -- switches with two stable states -- to represent the "0s" and "1s" of a binary number sequence. When they used pure mechanical logic, the researchers got stuck because certain logical operations cannot be constructed.

"You hit a point where you can't actually process all of the eight logic gates," Harne said. "You can process four of them, but you can't process the last four. We discovered the way to incorporate electrical signals along with mechanical signals, allowing us to process all of the logic gates used in modern digital computing."

The key to realizing all the logic gates was in the combination of the electrical polymer network with the soft, deformable material. The researchers created the logic operations by simultaneously reconfiguring the soft material and the electrically conductive network.

This also ensures that the binary output is in the form of electricity, which is needed to drive an actuation mechanism that makes the material respond to the applied mechanical force. The combination of electrical and mechanical signals allows the machine to move to get out of the way or to push back in a certain direction.

Harne and the team want to go beyond a single material and design something more complex.

"I have a vision for how scientists and engineers can create engineered living systems that help society," Harne said. "All you need to do is bring together all of the functions of life forms. And when you do that, you have at your disposal the building blocks of engineered life."

While this all seems like science fiction, Harne believes it has tremendous potential.

"It is somewhat sci-fi, I do have to admit that, and I will say, I've had colleagues think I'm a little crazy," Harne said. "But if we as engineers and scientists understand all of the things that make up life, why aren't we trying to make engineered living things that can help people?"

Credit: 
Penn State

Rock glaciers will slow Himalayan ice melt

image: A rock glacier in the Khumbu Valley, Nepal.

Image: 
Stephan Harrison

Some Himalayan glaciers are more resilient to global warming than previously predicted, new research suggests.

Rock glaciers are mixtures of ice and rock that, like "true" ice glaciers, move downhill under gravity - but the enhanced insulation provided by surface rock debris means rock glaciers will melt more slowly as temperatures rise.

Rock glaciers have generally been overlooked in studies about the future of Himalayan ice.

The new study, led by Dr Darren Jones at the University of Exeter, shows rock glaciers already account for about one twenty-fifth of Himalayan glacial ice - and this proportion will rise as exposed glaciers continue to melt and some transition to become rock glaciers.

"Glaciers play a vital role in regulating water supplies, and Himalayan glaciers regulate water for hundreds of millions of people," said Professor Stephan Harrison, of the University of Exeter.

"Over the past century, these glaciers have lost about 25% of their mass due to climate change, and they are predicted to lose more in the future.

"However, glacier models have treated glaciers as uniform lumps of ice - and our study shows not all glaciers will melt at the same rate.

"Many are covered in rock and are in various stages of the transition to rock glaciers.

"These slow-moving glaciers are well insulated, and as a result they are more resilient to global warming than 'true' glaciers."

The study has provided the first estimate of the number and importance of rock glaciers in the Himalayas.

It shows that there are about 25,000 rock glaciers in the region, containing a total of about 51 cubic kilometres of ice - or 41-62 trillion litres of water.
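
For scale, the conversion from ice volume to water volume is straightforward; the 41-62 trillion litre range reflects the study's uncertainty, while the ice-to-water density ratio of roughly 0.9 used below is a standard assumption rather than a figure quoted from the paper:

$$51\ \mathrm{km^3} = 51 \times 10^{12}\ \mathrm{litres\ of\ ice} \;\approx\; 51 \times 10^{12} \times 0.9 \;\approx\; 4.6 \times 10^{13}\ \mathrm{litres\ of\ water}.$$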

Despite this, lead author Dr Darren Jones cautioned: "Although we find that rock glaciers are more resilient to warming, it remains clear that all Himalayan glaciers are in long-term decline, with enormous implications for the people who rely on them for water supplies."

"Further research into Himalayan rock glaciers is critical for underpinning climate change adaptation strategies and to ensure that this highly populated region is in a strong position to meet sustainable development goals," said Professor Richard Betts, of the Met Office Hadley Centre and the University of Exeter, who was also involved in the study.

The research team included Dr Karen Anderson at the University of Exeter and Dr Sarah Shannon at the University of Bristol.

Funders included the Natural Environment Research Council, GW4 PhD funding to Dr Darren Jones, and the BEIS/Defra Met Office Hadley Centre Climate Programme.

The paper, published in the journal Science of the Total Environment, is entitled: "Rock glaciers represent hidden water stores in the Himalaya."

Credit: 
University of Exeter

Designing healthy diets - with computer analysis

image: A new mathematical model for the interaction of bacteria in the gut could help design new probiotics and specially tailored diets to prevent diseases. The research, from Chalmers University of Technology in Sweden, was recently published in the journal PNAS.

Image: 
Creative Commons

A new mathematical model for the interaction of bacteria in the gut could help design new probiotics and specially tailored diets to prevent diseases. The research, from Chalmers University of Technology in Sweden, was recently published in the journal PNAS.

"Intestinal bacteria have an important role to play in health and the development of diseases, and our new mathematical model could be extremely helpful in these areas," says Jens Nielsen, Professor of Systems Biology at Chalmers, who led the research.

The new paper describes how the mathematical model performed when making predictions relating to two earlier clinical studies, one involving Swedish infants, and the other adults in Finland with obesity.

The studies involved regular measurements of health indicators, which the researchers compared with the predictions made from their mathematical model - the model proved to be highly accurate in predicting multiple variables, including how a switch from liquid to solid food in the Swedish infants affected their intestinal bacterial composition.

They also measured how the obese adults' intestinal bacteria changed after a move to a more restricted diet. Again, the model's predictions proved to be reliably accurate.

"These are very encouraging results, which could enable computer-based design for a very complex system. Our model could therefore be used to for creating personalised healthy diets, with the possibility to predict how adding specific bacteria as novel probiotics could impact a patient's health," says Jens Nielsen.

Many factors at play

There are many different things that affect how different bacteria grow and function in the intestinal system. For example, which other bacteria are already present and how they interact with each other, as well as how they interact with the host -- that is to say, us. The bacteria are also affected by environmental factors, such as the diet we eat.

All of these variables make it difficult to predict the effect that adding bacteria or making dietary changes will have. One must first understand how these bacteria are likely to act when they enter the intestine or how a change in diet will affect the intestinal composition. Will they be able to grow there or not? How will they interact with and possibly affect the bacteria that are already present in the gut? How do different diets affect the intestinal microbiome?

"The model we have developed is unique because it accounts for all these variables. It combines data on the individual bacteria as well as how they interact. It also includes data on how food travels through the gastrointestinal tract and affects the bacteria along the way in its calculations. The latter can be measured by examining blood samples and looking at metabolites, the end products that are formed when bacteria break down different types of food," says Jens Nielsen.

The data used to build the model were gathered from many years of pre-existing clinical studies. As more data become available, the model can be updated with new features, such as descriptions of hormonal responses to dietary intake.

A potentially huge asset for future healthcare

Research on diet and the human microbiome, or intestinal bacterial composition, generates great interest among both researchers and the general public. Jens Nielsen explains why:

"Changes in the bacterial composition can be associated with or signify a great number of ailments, such as obesity, diabetes, or cardiovascular diseases. It can also affect how the body responds to certain types of cancer treatments or specially developed diets."

Working with the bacterial composition therefore offers the potential to influence the course of diseases and overall health. This can be done through treatment with probiotics - carefully selected bacteria that are believed to contribute to improved health.

In future work, Jens Nielsen and his research group will use the model directly in clinical studies. They are already participating in a study together with Sahlgrenska University Hospital in Sweden, in which older women are being treated for osteoporosis with the bacterium Lactobacillus reuteri. Some patients respond better to the treatment than others, and the new model could be used as part of the analysis to understand why.

Cancer treatment with antibodies is another area where the model could be used to analyse the microbiome, helping to understand its role in why some patients respond well to immunotherapy, and some less so.

"This would be an incredible asset if our model can begin to identify bacteria that could improve the treatment of cancer patients. We believe it could really make a big difference here," says Jens Nielsen.

Credit: 
Chalmers University of Technology

Our attention is captured by another person's gaze

Eyes play an important role in social communication by expressing the intentions of our interlocutors, all the more so in times of pandemic when half of the face is hidden. But is this response to eye contact automatic and rapid? Does it rest on a prioritised attentional reaction or, on the contrary, on a specific emotional reaction? To answer these questions, researchers at the University of Geneva (UNIGE), Switzerland, examined how we process human gaze, focusing on how we estimate the duration of social interactions. They discovered that when we make eye contact with another person, our attention is directly engaged, distorting our perception of time: time seems shorter than it really is. This underestimation of time does not occur when we look at a non-social object. These results, published in the journal Cognition, could lead to a diagnostic tool for evaluating the mechanisms at work in people who are sensitive to social gaze, and for intervening if disorders in the processing of social stimuli are detected, for instance in autism or schizophrenia.

The way we look at others and the way we perceive others' gaze have a major impact on social communication, a fundamental function known as social cognition. "From an early age, we learn to decipher the feelings and intentions of our interlocutors through their eyes. Thus, meeting someone's gaze is a very common social situation, but it always leads to a particular feeling," notes Nicolas Burra, a researcher at the Psychology Section of the Faculty of Psychology and Educational Sciences (FPSE) at the UNIGE and first author of the study. Two hypotheses have been put forward to explain this: the first holds that eye contact directly generates an emotional reaction, without engaging our attention; the second, that eye contact activates rapid and automatic attentional processing, which subsequently generates an emotional response.

Our perception of time influenced by emotion and attention

To test these hypotheses, the UNIGE researchers turned to our perception of time, which varies depending on whether a visual stimulus is processed emotionally or attentionally. "It has been shown that when our emotional capacities have to process an unpleasant visual stimulus - for example, if we are asked to estimate how long a large spider was displayed - we overestimate the time that passes, giving the impression that it is flowing more quickly than it actually does," explains Nicolas Burra. The emotional charge thus disturbs our evaluation of time and accelerates it. When a visual stimulus is processed by attention, the opposite occurs: focused on a stimulus that strongly engages our attention, we underestimate the time that elapses and look at the object for longer than we imagined. "By analysing how long a person estimates that he or she has been looking at an object, we can determine whether the eye contact between two people is driven more by attention or by emotion," says the Geneva-based researcher.

Deviated gazes versus eye contact

To assess the impact of eye contact on our perception of time, 22 participants watched a series of nearly 300 faces moving their eyes: either gazes establishing direct eye contact - the eyes look into the void and then meet the participant's gaze - or deviated gazes, in which the same eye movement is made but the two gazes never meet. Over a period of 20 minutes, the participants subjectively estimated the various durations of these interactions (between 1 and 2 seconds, close to everyday social exchanges). "While deviated gazes do not distort our perception of time, we found that, on the contrary, when gazes crossed, participants systematically underestimated the duration of these eye contacts," says Nicolas Burra. This experiment reveals that eye contact does not primarily engage the emotional system, but rather the attentional system, which draws resources away from our ability to evaluate time.
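
The release reports only the direction of the effect; as a rough sketch of how such duration estimates could be compared across the two gaze conditions, the snippet below runs a paired t-test on simulated data. The numbers, the size of the underestimation and the analysis itself are hypothetical placeholders, not the authors' data or pipeline.

```python
# Minimal analysis sketch with simulated (hypothetical) data: compare subjective
# duration estimates for direct-gaze vs. averted-gaze trials across 22 participants.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_participants = 22
true_duration = 1.5                                    # seconds, within the 1-2 s range used

averted = rng.normal(true_duration, 0.15, n_participants)          # roughly accurate estimates
direct = rng.normal(true_duration - 0.12, 0.15, n_participants)    # assumed underestimation

t_stat, p_value = stats.ttest_rel(direct, averted)
print(f"Mean estimate, direct gaze:  {direct.mean():.2f} s")
print(f"Mean estimate, averted gaze: {averted.mean():.2f} s")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```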

To confirm these results, the UNIGE researchers ran the same experiment with other participants, using non-social objects that made the same movements as the eyes. In that case, no distortion of time perception was observed, and the same was true when the face was static. "It seems that not only a gaze, but also a movement is required," the neuroscientist points out. The time-distortion effect did appear, however, when participants were shown only schematic eye movements, or moving eyes without the rest of the face - a situation similar to social interaction with a mask. The effect was also found in an online experiment with a hundred people, corroborating and generalising the results obtained in the laboratory.

Eye contact captures attention

This series of experiments shows that eye contact and social stimuli preferentially engage attention. "This explains the sensation we feel when someone looks at us even though we have not yet really met their gaze," explains Nicolas Burra. This work will make it possible to evaluate, and then intervene more precisely on, the attentional or emotional processes in people with disorders in the processing of social stimuli - characterised by a lack of interest in others' gaze, a misinterpretation of it, or an extreme emotional reaction to it - such as people with autism, schizophrenia or social anxiety. The research team is currently running the experiment with children and elderly people, in order to observe how this preferential attentional processing of eye contact evolves over the lifespan. Nicolas Burra concludes: "This study gives meaning to the sensation that time stops when we meet another's gaze."

Credit: 
Université de Genève

AI agent helps identify material properties faster

A team headed by Dr. Phillip M. Maffettone (currently at the National Synchrotron Light Source II in Upton, USA) and Professor Andrew Cooper from the Department of Chemistry and Materials Innovation Factory at the University of Liverpool joined forces with the Bochum-based group of Lars Banko and Professor Alfred Ludwig from the Chair of Materials Discovery and Interfaces, and with Yury Lysogorskiy from the Interdisciplinary Centre for Advanced Materials Simulation. The international team published its report in the journal Nature Computational Science on 19 April 2021.

Previously manual, time-consuming, error-prone

Efficient analysis of X-ray diffraction (XRD) data plays a crucial role in the discovery of new materials, for example for the energy systems of the future. It is used to analyse the crystal structures of new materials in order to find out which applications they might be suitable for. XRD measurements have already been significantly accelerated in recent years through automation and provide large amounts of data when material libraries are measured. "However, XRD analysis techniques are still largely manual, time-consuming, error-prone and not scalable," says Alfred Ludwig. "In order to discover and optimise new materials faster in the future using autonomous high-throughput experiments, new methods are required."

In their publication, the team shows how artificial intelligence can be used to make XRD data analysis faster and more accurate. The solution is an AI agent called the Crystallography Companion Agent (XCA), which collaborates with the scientists. XCA can perform autonomous phase identification from XRD data while the data are still being measured. The agent is suitable for both organic and inorganic material systems. This is made possible by the large-scale simulation of physically correct X-ray diffraction data, which is used to train the algorithm.

Expert discussion is simulated

What is more, a unique feature of the agent, which the team adapted for the current task, is that it overcomes the overconfidence of traditional neural networks: such networks deliver a final decision even when the data do not support a definite conclusion, whereas a scientist would communicate their uncertainty and discuss the results with other researchers. "This process of decision-making in the group is simulated by an ensemble of neural networks, similar to a vote among experts," explains Lars Banko. In XCA, an ensemble of neural networks forms the expert panel, so to speak, which submits a recommendation to the researchers. "This is accomplished without manual, human-labelled data and is robust to many sources of experimental complexity," says Banko.
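
The release describes the ensemble only in outline. As a toy sketch of the "expert panel" idea, the Python snippet below trains several small networks that differ only in their random initialisation and lets them vote on the phase of a simulated one-dimensional pattern; low agreement signals uncertainty. The data, network sizes and phase definitions are invented for illustration and are not part of XCA.

```python
# Toy sketch of ensemble voting for phase identification (not the actual XCA code).
# All patterns, phases and hyperparameters are hypothetical.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)
n_phases, n_points = 3, 200                      # 3 candidate phases, 200-point patterns

def toy_pattern(phase):
    """Simulate a 1D 'diffraction pattern' with a phase-dependent peak position."""
    x = np.linspace(0, 1, n_points)
    peak = [0.2, 0.5, 0.8][phase]
    return np.exp(-((x - peak) ** 2) / 0.002) + 0.05 * rng.standard_normal(n_points)

labels = rng.integers(0, n_phases, 600)          # simulated training labels
X = np.array([toy_pattern(p) for p in labels])

# The "expert panel": identical networks with different random initialisations.
ensemble = [
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=s).fit(X, labels)
    for s in range(10)
]

# For a new pattern, each expert votes; disagreement flags the result for a human.
test = toy_pattern(1).reshape(1, -1)
votes = np.array([clf.predict(test)[0] for clf in ensemble])
vote_share = np.bincount(votes, minlength=n_phases) / len(ensemble)
print("Vote share per phase:", vote_share)       # a low top share signals uncertainty
```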

XCA can also be expanded to other forms of characterisation such as spectroscopy. "By complementing recent advances in automation and autonomous experimentation, this development constitutes an important step in accelerating the discovery of new materials," concludes Alfred Ludwig.

Credit: 
Ruhr-University Bochum