
Study demonstrates how to collect true incidents from head impact sensors in youth sports

Philadelphia, March 10, 2020 - An increased awareness of concussion risks in young athletes has prompted researchers to use a variety of head impact sensors to measure frequency and severity of impacts during sports. A new study from Children's Hospital of Philadelphia (CHOP) shows these head sensors can record a large number of false positive impacts during real game play. The CHOP team's study emphasizes that an extra step to video-confirm the sensor data is essential for research and for use of this data in injury prevention strategies for player safety.

The findings were published online this month by the American Journal of Sports Medicine.

Approximately 1 in 5 high school athletes who play a contact sport - such as soccer, lacrosse, and American football - suffers a concussion each year. To understand the frequency, magnitude and direction of head impacts that athletes sustain, a wide variety of sensors have been developed to collect head impact biomechanics data, including instrumented helmets, skull caps, headbands, mouthguards and skin patches.

However, when data are collected during game play rather than in a controlled laboratory environment, there is potential for false positives and false negatives. In this study, CHOP researchers used data collected from headband-based head impact sensors worn by male and female soccer players to determine the proportion of false positives within the data and if video confirmation improved the quality of the data.

"Head impact sensors are a readily accessible tool for studying the mechanics of head impacts," said Declan Patton, PhD, lead author of the study and a research associate at the Center for Injury Research and Prevention at CHOP. "However, in order for researchers to have reliable data to analyze, they first need to verify whether sensor-recorded events are actual head impacts using either video- or observer-confirmation."

In this study, researchers fitted 72 high school varsity and junior varsity soccer players (23 female and 49 male) with headband-mounted impact sensors during 41 games over two seasons to capture sensor-recorded events during competition. All the games were video-recorded. The research team analyzed the video to quantify the percentage of events recorded by the sensors that corresponded to an observable head impact event. In addition, video-verified sensor-recorded events were compared against the manufacturer's filtering algorithm, which was developed to eliminate false positives.

The sensors recorded 9,503 events during scheduled game times, a number that fell to 6,796 once the verified game time was identified on video. Of the 6,796 events during verified game time, 4,021, or approximately 60%, were removed because they were associated with players who were not on the field at the time of recording and therefore could not represent actual head impacts. This indicates that prior studies, which used head impact sensor data without these important methodological steps, probably had a high proportion of non-head-impact events in their datasets.

Video footage of the 1,893 sensor-recorded events for players on the field and within the frame of the camera was reviewed, and the events were categorized into three types: impact events (69.5%), trivial events such as adjusting a headband (20.9%), and non-events such as a player remaining stationary (9.6%). The most common impact event was head-to-ball contact, which represented 78.4% of all impact events. Other impact events included player contact (10.9%), falls (9.8%) and ball-to-head contact (0.8%).
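As a rough sketch, the filtering steps described above can be expressed in a few lines (the counts come from the study's reported figures; the variable names are illustrative):

```python
# Illustrative sketch of the video-confirmation filtering pipeline;
# counts are taken from the figures reported in the study.
scheduled_events = 9503        # all sensor-recorded events during scheduled game time
in_game = 6796                 # events remaining once verified game time was found on video
off_field = 4021               # events from players not on the field (not head impacts)
on_field = in_game - off_field # 2,775 events from players actually on the field
on_camera = 1893               # on-field events also visible within the camera frame

removed = off_field / in_game  # the "approximately 60%" quoted in the text
print(f"{removed:.0%} of verified in-game events came from off-field players")
```

This makes the scale of the problem concrete: well over half of the raw in-game sensor events never corresponded to a player who could have sustained a head impact.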

Additionally, this study looked at both male and female athletes. Female athletes had a lower proportion of impact events (48.7% vs. 78.4%) and a higher proportion of trivial events (36.6% vs. 14.2%), which may be due to more frequent adjustments of the headband. However, among the actual impact events, the breakdown of types of impacts was similar between the genders.

"This study enables the field to build on previous research and improve the accuracy of data collected from head impact sensors by emphasizing the importance of detailed confirmation of head impact sensor data," said Kristy Arbogast, PhD, Co-Scientific Director and Director of

Credit: 
Children's Hospital of Philadelphia

Knowing more about a virus threat may not satisfy you

COLUMBUS, Ohio - People who rate themselves as highly knowledgeable about a new infectious disease threat could also be more likely to believe they don't know enough, a new study suggests.

In the case of this study, the infectious disease threat was the Zika virus. But the authors of the new study, published recently in the journal Risk Analysis, say the results could apply to the recent novel coronavirus (COVID-19) outbreak.

"The Zika virus and the coronavirus have important things in common," said Shelly Hovick, co-author of the study and assistant professor of communication at The Ohio State University.

"In both cases, they are shrouded in uncertainty and have received a lot of media attention. Our research looks at how people seek and process information when there is so much uncertainty."

One of the key findings of the new study: With limited information about Zika available, more knowledge was not that comforting.

"We found that the more people thought they knew, the more they realized they didn't know enough," said Austin Hubner, lead author of the study and a doctoral student in communication at Ohio State.

"With the Zika virus, even the experts themselves didn't know much at the time. That's the same thing we're seeing with the coronavirus, and that's scary for people who believe they are at risk."

For the study, the researchers conducted an online survey of 494 people of childbearing age living in Florida in December 2016.

Florida residents were recruited for the study because the state had the highest number of locally transmitted cases of Zika in the United States at the time.

Although most people infected with Zika don't have symptoms, pregnant women with the virus have a higher likelihood of their child being born with a specific birth defect.

Zika is primarily spread by mosquitoes, but it can also be transmitted from men and women to their sexual partners and through blood transfusions.

In the survey, respondents were asked a variety of questions about their knowledge and attitudes toward seeking information, how they processed what they learned about the Zika virus, and their plans for seeking more information.

As expected, participants who were pregnant or wanted to get pregnant (and men whose wives were in those situations) felt more at risk from Zika and were more likely to say they felt scared of Zika. But they weren't the only ones who felt worried about Zika.

"Novel risks like Zika or coronavirus may make some people react differently than well-known risks like cancer or the flu," Hovick said.

"Even if the data suggest someone is at low risk, the lack of information may make some people feel they are at high risk."

The findings showed that people who felt they didn't know enough about Zika didn't intend to spend more time than others seeking information. That was probably because they realized that there wasn't more information available, Hovick said.

But they did spend more time processing the information they uncovered and were more likely to agree with statements like "After I encounter information about Zika, I am likely to stop and think about it."

These findings suggest it is important for public health agencies to continuously update the public, Hovick said. Those who are worried or concerned about risks such as Zika are likely to process the information they encounter deeply, but they may not seek information on their own.

Participants were also more likely to intend to seek information about Zika if they believed other people expected them to do so. They were more likely to want to search for information if they agreed with statements like "People in my life whose opinions I value seek information about Zika."

"We should aim not just to provide information, but also shape messages that encourage people to stay on top of the situation, particularly in high-uncertainty environments," Hovick said.

"You have to make it clear that seeking more knowledge is something that their friends and family expect of them."

Hovick said they have considered trying to replicate the study with the current coronavirus outbreak, but noted that the Zika outbreak developed much more slowly.

"The coronavirus outbreak is moving so much more quickly. I'm not sure we could get the approvals and conduct the study in time," she said.

Credit: 
Ohio State University

Researchers create a new acoustic smart material inspired by shark skin

video: This video demonstrates the on-demand tuning of an acoustic 'diode' using a remote magnetic field. When the magnetic field is applied, the pillars bend, allowing acoustic waves to pass through.

Image: 
Qiming Wang

From the headphones we use to listen to our favorite songs or podcasts, to sonic camouflage employed by submarines, how we transmit and experience sound is an essential part of how we engage with our surrounding world. Acoustic metamaterials are materials designed to control, direct and manipulate soundwaves as they pass through different mediums. As such, they can be designed and inserted into a structure to dampen or transmit sound.

The problem is, traditional acoustic metamaterials have complex geometries. Often made of metal or hard plastic, once they are created, they cannot be changed. Take, for example, an acoustic device constructed to dampen outgoing sound in a submarine so that it can achieve stealth. If a different condition arose - for instance, if an allied vessel the submarine wanted to communicate with passed by - the same acoustic device would not allow sound to be transmitted externally.

A team of USC researchers, led by Qiming Wang, assistant professor in the Sonny Astani Department of Civil and Environmental Engineering, created a new smart material that accommodates shifts in acoustic transmission on demand. "With traditional acoustic metamaterials, you create one structure and you achieve one property. With this new smart material, we can achieve multiple properties with just one structure," Wang said. In studying this new material, Wang and his team discovered that their smart material had the capability of re-creating properties intrinsic to electronic devices such as switches, thus showing promise of smart sound transmission--a sound "computer."

Wang and his team, including USC Viterbi Ph.D. candidates Kyung Hoon Lee, Kunhao Yu, An Xin and Zhangzhengrong Feng, and postdoctoral scholar Hasan Al Ba'ba'a, detailed their findings in their paper "Sharkskin-Inspired Magnetoactive Reconfigurable Acoustic Metamaterials," recently published in Research. Inspired by the dual properties created by the dermal denticles on the surface of a shark's skin, the team created a new acoustic metamaterial that contains magneto-sensitive nanoparticles that will bend under the force of magnetic stimuli. This magnetic force can change the structure remotely and on-demand, accommodating different transmission conditions.

Modulating Multiple Acoustic Properties in One Device

The acoustic metamaterial created by the researchers is made of rubber and a mix of iron nanoparticles. The rubber offers flexibility, allowing the materials to bend and flex reversibly and repeatedly, while the iron makes the material responsive to the magnetic field.

To make the structures responsive to acoustic inputs, Wang and his team had to assemble the materials such that the resonance between them--Mie resonance--allowed for changes in acoustic transmission--either blocking or conducting an acoustic input. If the pillars are closer together, the acoustic wave will be effectively trapped and prevented from propagating through to the other side of the structure. Conversely, if the pillars are further apart, the acoustic wave will easily pass through. "We use the external magnetic field to bend and unbend the pillars to achieve this sort of state switching," said lead author Lee. The result is a shift from a position that blocks acoustic transmission to one that effectively conducts the acoustic waves. Unlike traditional acoustic metamaterials, no direct contact or pressure is required to change the architecture of the materials.

A Sound "Computer"

Wang and his team were able to demonstrate how their smart material could mimic three key electronic devices: a switch, a logic gate, and a diode. The interaction of the magneto-sensitive materials with the magnetic field manipulates acoustic transmission in such a way as to create functions like those of an electrical circuit.

To understand this better, let's look at how each of these three electronic devices works.

A switch allows for a channel to be turned on and off, for example, in noise-canceling headphones. In this example, using a structure built of the smart acoustic metamaterial, you can tune the magnetic field so that the Mie resonator pillars bend and allow for external noise to pass through. In another instance, you can turn off the magnetic field and the pillars will stay vertical, blocking external noise from passing through, Wang said.

A logic gate builds on this idea, by triggering decision making based on stimuli incoming to different input channels. In the case of a submarine, perhaps you want the acoustic device to modulate multiple conditions, instead of a singular one: attack when it receives one weak signal and one strong signal, but flee when it receives two strong signals. In order to allow for multiple scenarios to be a part of decision-making, you would traditionally need multiple devices, each architected for a different scenario. An AND gate operator describes an acoustic device that would trigger a certain response only when the input channels are both strong. An OR gate operator describes an acoustic device that would trigger a certain decision when either of the two signals is strong. With traditional acoustic metamaterials, you can only create one operator and thus respond to only one condition. With the new smart acoustic metamaterial developed by the researchers, Wang says you can switch from an AND gate to an OR gate operator on demand. In the case of the submarine, that means using the magnetic field, you could change the conditions for which an attack command is triggered without building a new acoustic device.
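The gate-switching behavior described above can be captured in a toy model (purely illustrative; the threshold and signal values are assumptions, not from the paper - in the physical device the "mode" is set by the external magnetic field):

```python
# Toy model of a reconfigurable acoustic logic gate: the same "device"
# evaluates as an AND or an OR gate depending on its (magnetically set) mode.
THRESHOLD = 0.5  # assumed signal strength above which an input counts as "strong"

def acoustic_gate(signal_a: float, signal_b: float, mode: str) -> bool:
    a_strong = signal_a > THRESHOLD
    b_strong = signal_b > THRESHOLD
    if mode == "AND":   # trigger only when both inputs are strong
        return a_strong and b_strong
    if mode == "OR":    # trigger when either input is strong
        return a_strong or b_strong
    raise ValueError(f"unknown mode: {mode}")

# One weak and one strong signal: the AND gate stays silent, the OR gate fires.
print(acoustic_gate(0.2, 0.9, mode="AND"))  # False
print(acoustic_gate(0.2, 0.9, mode="OR"))   # True
```

The point of the material is that this `mode` switch happens in hardware, on demand, without fabricating a second device.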

Finally, there is a diode. A diode is a device in which acoustic intensity is high in one direction and low in another, thus it offers one-way transportation of the acoustic wave. Traditional acoustic metamaterials will allow you to do this, but again, you cannot change states. Using the new smart acoustic metamaterial, you can change from a diode state to a conductor state, which allows transmission in both directions, instead of just one direction. This comes into play in the example of sonic camouflage in the submarine, where sometimes you will want the acoustic device to allow for sound to travel in only one direction and other times, you want it to be transmittable in both directions.

"Such a change has never been achieved by traditional acoustic metamaterials," Wang said.

Next Steps

So far, Wang and his team have tested their material in air. Next, they hope to test the same properties under water, to see if they can achieve the same characteristics at an ultrasound range.

"Rubber is hydrophobic, so the structure won't change, but we need to test if the materials will still have tunability under an external magnetic field," Wang said, noting that water will offer more resistance and thus add more friction.

Credit: 
University of Southern California

Method yielding high rate of D-lactate using cyanobacteria could revolutionize bioplastic production

image: Synechocystis sp. PCC 6803 produces glycogen inside its cells from CO2 and light. It was revealed that when Synechocystis sp. PCC 6803 that has accumulated glycogen is placed in anoxic dark conditions, this activates the metabolic pathway (indicated by the red arrows) which promotes D-lactate production.

Image: 
Kobe University

A Kobe University-led research team has illuminated the mechanism by which the cyanobacterium Synechocystis sp. PCC 6803 produces D-lactate, showing that malic enzyme facilitates this production. Subsequently, the team succeeded in producing the world's highest reported yield (26.6 g/L) of D-lactate directly from CO2 and light by modifying the D-lactate synthesis pathway using genetic engineering.

It is expected that this success will contribute towards the development of important process technologies for producing polylactic acid, which is used to manufacture biodegradable plastics. This could help make the concept of a sustainable, low carbon society a reality.

The research group consisted of Professor HASUNUMA Tomohisa (of Kobe University's Engineering Biology Research Center), Project Associate Professor HIDESE Ryota (of Kobe University's Graduate School of Science, Technology and Innovation) and Associate Professor OSANAI Takashi (of Meiji University's School of Agriculture).

The results of this research were published online in the international scientific journal 'ACS Synthetic Biology' on January 31, 2020.

Main points

There are few reports of microbes being utilized to produce efficient yields of D-lactate with high purity.

Currently, D-lactate is produced from edible biomass, such as corn or sugarcane, using heterotrophic microorganisms. However, this method places a high burden on resources and the environment.

A methodology to produce D-lactate from cyanobacteria via CO2 photosynthesis exists as an environmentally friendly alternative; however, the production yield is extremely low (1 g/L).

It was found that the cyanobacteria's sugar metabolism is activated when the amount of malic acid in the cells is reduced by malic enzyme.

The research group succeeded in producing a high yield of D-lactate by placing genetically optimized cyanobacteria that had accumulated sugar (glycogen) via photosynthesis in dark, oxygen-deprived conditions (anoxic dark fermentation).

Summary

The utilization of bioproduction to synthesize versatile chemical compounds and functional raw materials that are usually derived from oil is vital for both the environment and resource sustainability. In recent years, bioproduction methods using microbes have gained attention. Among these microbes are microalgae. It is possible to produce various useful substances such as oils and pigments from microalgae using sunlight and CO2.

Cyanobacteria is a type of quick-growing microalga that is easy to genetically modify. Cyanobacteria has been used to produce D-lactate before; however, the low yield has been an obstacle preventing the practical application of these methods.

Cyanobacteria turns CO2 into the sugar glycogen via photosynthesis. If cyanobacteria that has accumulated glycogen inside its cells is then placed in a dark environment with no oxygen, the glycogen is metabolized and the cyanobacteria secretes organic acids (such as succinic and lactic acids) into the growth medium.

To synthesize D-lactate from cyanobacteria, there needs to be an increase in pyruvate production. This research group discovered that the malic enzyme, which converts malic acid into pyruvate, is vital for D-lactate production. They used dynamic metabolomics to illuminate the mechanism behind D-lactate production. Through this analysis, they discovered that when excessive malic enzyme is produced inside the cells, not only is malic acid converted into pyruvate, but also the pathway to produce pyruvate from glycogen is activated (Figure 1). D-lactate is biosynthesized from pyruvate by D-lactate dehydrogenase. The research group succeeded in producing 26.6 g/L of D-lactate with a conversion rate of 94.3% from the accumulated glycogen by genetically engineering the D-lactate dehydrogenase to optimize its function (Figure 2).

This research represents an important advancement towards the development of an industrial process to produce D-lactate from CO2. The group aims to continue to boost D-lactate production through metabolic pathway optimization and analysis of cultivation conditions.

Research Background

There is a large market for D-lactate, which can be used as a raw material in the production of stereocomplex PLA, a biodegradable plastic. On the other hand, high purity and productivity are required in order to make biologically synthesizing D-lactate using microbes viable. Bioproduction methodologies that use heterotrophic microbes like E. coli exist; however, these use sugar (glucose) from corn or sugarcane as a source of energy for production. This means that growing these plants for bioproduction causes a multitude of issues, such as competition with food sources, use of arable land and freshwater resources, and contribution to environmental destruction (for example, deforestation).

Cyanobacteria, on the other hand, is an ideal microbe for producing useful substances, since it can convert CO2 fixed via photosynthesis into various target compounds. In addition, cyanobacteria has a much higher photosynthesizing ability than plants, meaning that it can even be grown under strong light. It does not require soil and many varieties can be cultivated in seawater. Therefore it is hoped that cyanobacteria can provide the ultimate basis for bioproduction as it only requires sunlight, CO2 and seawater.

It is widely known that cyanobacteria could provide a way to synthesize D-lactate, and attempts have been made to boost D-lactate production using genetic modification. However, almost all systems that produce D-lactate are coupled to growth via photosynthesis, so only low amounts of the target substance are synthesized. The reason for this is that the mechanism of D-lactate production in cyanobacteria has not been well understood.

Metabolome analysis techniques allow researchers to both identify and calculate the multitude of compounds found inside cells. This research group developed 'Dynamic Metabolomics' which allowed them to observe the amount of substances metabolized over time.

The Synechocystis sp. PCC 6803 cyanobacteria used in this study is one of the most commonly researched cyanobacteria worldwide. It is a model organism for photosynthate production because it is easy to genetically modify and grows fast. Previous research conducted by this group using dynamic metabolomics showed that succinic acid is mainly produced via malic acid in Synechocystis sp. PCC 6803. The current study focused on malic enzyme, which converts malic acid into pyruvate. First, they aimed to elucidate the effects of malic enzyme on the metabolism of Synechocystis sp. PCC 6803 through dynamic metabolomics. Their subsequent objective was to increase D-lactate production using metabolic engineering.

Research Methodology

Two types of cells were created in order to comprehensively investigate the mechanism behind D-lactate production: 1. cells that had no malic enzyme function, and 2. cells in which this function was optimized, leading to an overexpression of malic enzyme.

Dynamic metabolomics was used to analyze the difference in metabolism between these two cell types. It was found that more pyruvate was produced from glycogen when the level of malic acid in the cells was low (Figure 1).

The research group further genetically modified malic enzyme optimized cells to overexpress D-lactate dehydrogenase, and enhanced the D-lactate dehydrogenase's function of producing D-lactate from pyruvate. In addition, the group genetically engineered the cells to remove the enzyme acetate kinase in order to suppress the production of byproduct acids.

The modified Synechocystis sp. PCC 6803 was then cultivated in a dark, anoxic environment (fermentation conditions). Under these conditions, the cells reached the optimum density. The research group far exceeded the world's previous highest yield of D-lactate (10.7 g/L) by producing 26.6 g/L at a rate of 0.185 g/L/h (Figure 2). It is thought that this finding can contribute towards a low-cost process to produce high levels of D-lactate.
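For a rough sense of scale, the reported concentration and rate imply a fermentation time of roughly six days (back-of-the-envelope arithmetic from the figures above, not a calculation from the paper):

```python
# Back-of-the-envelope check on the reported figures (values from the text).
titer = 26.6           # g/L of D-lactate produced
rate = 0.185           # g/L/h average production rate
hours = titer / rate   # implied fermentation time
print(f"{hours:.0f} h (~{hours / 24:.1f} days)")

previous_best = 10.7   # g/L, the previous highest reported yield
print(f"{titer / previous_best:.1f}x the previous best")
```

The ratio also shows the claimed improvement directly: about a 2.5-fold increase over the previous record.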

Further Research

Cyanobacteria can be utilized to produce many versatile chemical compounds and functional raw materials; however, this technology is not yet well enough developed to be implemented on an industrial scale. The big problem is that lower levels of the target compound are produced using cyanobacteria, compared to the amounts produced when using heterotrophic microorganisms. The current research has shown that dynamic metabolomics analysis is highly effective for evaluating the function of Synechocystis. Based on the results of the dynamic metabolomics, the group enabled the Synechocystis to perform at its full potential by genetically modifying its metabolism.

It is hoped that increasing the photosynthetic productivity of cyanobacteria through dynamic metabolomics and metabolic engineering could contribute towards the realization of a sustainable, low carbon society.

Credit: 
Kobe University

IKBFU Physicists keep improving 'smart' composites for biomedical sensors

image: The new composites belong to the multiferroic class of materials, which have mutually controlled magnetic and electric properties. The effects observed in the composites are considered a promising platform for creating new devices, from energy converters to highly sensitive sensors.

Image: 
Immanuel Kant Baltic Federal University

IKBFU physicists have successfully tested a new concept for producing "smart" composites based on magnetic microwires. The new composites belong to the multiferroic class of materials, which have mutually controlled magnetic and electric properties. The effects observed in the composites are considered a promising platform for creating new devices, from energy converters to highly sensitive sensors.

These magnetoelectric sensors are in high demand in biomedicine because they are used in health monitoring devices. Prototypes of the sensors already exist, and further research is ongoing to improve the sensors' functional characteristics.

Scientists at the IKBFU Laboratory of Novel Magnetic Materials had the idea of using an amorphous microwire composite as the magnetic component. This approach will allow the scientists to obtain a compact, human-hair-thick sensor capable of registering even very weak magnetic fields.

Karim Amirov, a senior researcher at the IKBFU Laboratory of Novel Magnetic Materials, said:

"This new concept has already proved to be effective. We aim our research both at verifying the concept and at obtaining the composition with the maximal magnetoelectric effect. This means that a series of experiments was conducted to search for a microwire preparation protocol (annealing, removal of the glass shell) that would ultimately give us the maximum output. This is very important because, after we understand how it works, what microwire to use and how to prepare it, the next step is to obtain such a composite in a flexible piezopolymer matrix. This will be another step towards the magnetoelectric sensor being developed for monitoring human parameters."

The article with the results of the research was published in the scientific journal Materials. The article's authors are scientists at the IKBFU Laboratory of Novel Magnetic Materials: Ph.D. candidates in physics and mathematics Karim Amirov, Valeria Rodionova and Irina Baraban, and Ph.D. in physics and mathematics Larisa Panina.

Credit: 
Immanuel Kant Baltic Federal University

Disturbed retinal gene function underlying canine blindness

A canine study carried out at the University of Helsinki has described a gene variant in a regulatory region that results in the abnormal function of retinal genes and, eventually, in the loss of vision in dogs. The study can benefit the diagnostics and treatment of retinitis pigmentosa, a disease affecting two million people globally.

"For now, changes in the regulatory regions of genes as causes of disease are relatively poorly known. In fact, our recently published study offers, in addition to diagnostics, a valuable model for understanding the biology of the retina and associated diseases," says Professor Hannes Lohi, who headed the study.

Since the 1990s, attempts to identify the genetic cause of retinal atrophy in Miniature Schnauzers have been made all over the world. The breed suffers from progressive retinal atrophy (PRA), which results in complete loss of vision for the dog. Lohi's group identified a number of dogs from the same pedigree that were in the process of losing their vision, an indication of a hereditary disease.

"We started compiling research data by comparing eye examination statements, quickly suspecting that more than one form of the disease occurred in the breed. This was an observation central to the progress of the project," says Maria Kaukonen, DVM, the principal author of the article and postdoctoral researcher at the University of Helsinki.

Based on their symptoms, the researchers divided the dogs into two groups, after which subsequent gene analyses demonstrated that the diseases were also different on the basis of the underlying genes. Dogs in group 1 lost their vision by the age of five, while the age of vision loss among group 2 varied more, in addition to which male dogs were overrepresented among the sick dogs.

In addition to regular eye examinations, the study was supplemented by optical coherence tomography (OCT), a technique not previously used in veterinary medicine in Finland. The imaging revealed the complete destruction of photoreceptors, or cells that respond to light stimuli, in the dogs suffering from the type 1 atrophy.

Disease caused by a peculiar gene defect of the regulatory region

"For the longest time, looking for the gene defect felt like looking for a needle in a haystack. We finally made a breakthrough after finding out that some of the healthy dogs have an almost identical locus on chromosome 15, only missing the mutation causing blindness. This implies that the mutation is a recent development," Kaukonen explains.

"The mutation we identified in the regulatory region results in hyperactivity in at least two retina genes previously associated with blindness. This is an interesting and uncommon find, as well as one of the first regulatory region mutations linked to retinal atrophy. Most likely, there are many more similar mutations, but they remain underrepresented in the literature due to the challenge of identifying them," Lohi speculates.

Rectifying previous research findings

"An American study published last year identified the same chromosomal locus as did our analysis, but the PPT1 gene in their proposal is actually associated with a very different disease of the nervous system, leaving us unconvinced. Nevertheless, a commercial gene test was developed on the basis of these ambiguous findings, which has caused a great deal of confusion among dog owners and breeders. The gene variant identified by us has now been tested for in over 1,600 dogs, with the results matching the disease perfectly. Now, we will be able to design a reliable gene test for the breed to support breeding and diagnostics," Lohi enthuses.

The study was part of Kaukonen's recently approved doctoral thesis, which investigated the genetic causes of canine hereditary ocular diseases, and also part of a bigger research effort by Professor Lohi's research group looking into the heredity of ocular diseases. This was also the first study to utilise findings generated by the international DoGA research project headed by Professor Lohi, aimed at expanding the understanding of the canine genome. The DoGA project also employs researchers from the Karolinska Institutet in Stockholm, Sweden.

Credit: 
University of Helsinki

Study reveals grasshopper declines associated with declines in quality of prairie grasses

image: This a Plains Lubber grasshopper (Brachystola magna) at Konza Prairie.

Image: 
Ellen Welti

A University of Oklahoma-led study shows that grasshopper numbers have declined over 30% in a Kansas grassland preserve over the past two decades. Published in the Proceedings of the National Academy of Sciences (USA), the paper, "Nutrient dilution and climate cycles underlie declines in a dominant herbivore," reveals a new potent and potentially widespread threat to Earth's plant feeders: the dilution of nutrients like nitrogen, phosphorus, and sodium in the plants themselves due to increasing levels of atmospheric CO2.

Ellen Welti, of the Geographical Ecology Group in the Department of Biology at OU, led the collaboration of ecologists from OU, the University of Illinois and Kansas State University in this National Science Foundation-funded study.

Grasshoppers are abundant consumers in grasslands - a habitat that covers more than 30% of Earth's land mass and is the source of the majority of human crops. The same decline in plant quality revealed by Welti and her colleagues has recently raised alarms about the global human food supply.

"This decline in plant nutrient concentration poses a challenge for all animals that consume plants, including humans," Welti said.

The OU-based study adds to the growing evidence that some insect groups are declining in abundance. Such long-term data are rare but are a primary function of the National Science Foundation's LTER (Long Term Ecological Research) sites, including Konza Prairie - a large protected tallgrass prairie reserve in northeast Kansas that provided the study's key data.

"One surprise was that grasshopper abundances in this large native tallgrass prairie reserve are declining," Welti said. "This grassland appears to be a stable and prime habitat for grasshoppers and yet even here, we are seeing 2% annual declines."

The grasshoppers have been surveyed at Konza for nearly 30 years, providing a rare and detailed record of this important group of insects.

The study is unique, not only in the length of the record, but in the sophisticated set of mathematical tools Welti and her colleagues implemented to account for two drivers of grasshopper populations.

"I used tools developed by geologists to look at orbital cycles, to identify cyclic patterns and understand how climate oscillations such as El Niño that may shape grasshopper abundances," Welti said.

Michael Kaspari, George Lynn Cross Research Professor in the OU College of Arts and Sciences, was the study's senior author.

"Where some folks look at these data - with their wide yearly swings in grasshopper numbers - and see only noise, Dr. Welti had the tools and the insight to reveal the music in the data," Kaspari said. "That music consisted of five-year cycles in precipitation and temperature that drove changes in grasshopper numbers, as well as the plants they feed on."

Welti and her colleagues' second discovery - that plant quality is declining even as plant growth has nearly doubled - highlights the paradoxical nature of nutrient dilution.

"The greenhouse gas CO2 is heating the Earth and acidifying its oceans, but it is also the main ingredient in the sugars, starches, and cellulose of plants," Kaspari said. "When we pump the atmosphere full of CO2, we build more plants. But, with no additional nutrients to fertilize them, the nutritional value of each bite is diluted. Mouthful by mouthful, the prairie provides less and less food to the grasshoppers. Hence, their decline."

This new cause of insect declines is particularly problematic.

"The mechanism of grasshopper declines that we propose in this study - declining plant quality with increasing atmospheric CO2 - is expected to be global in scope and pose the largest challenge to herbivores. It is notable that a large number of previous studies documenting insect declines were on another herbivorous group (butterflies and moths), but few of these papers identified a mechanism causing declines," Welti said.

The worldwide increase in CO2 highlights a potential global threat to Earth's largest group of animals: its plant feeders.

"Of late, much has been made of 'greening the Earth' as a tool to fight climate change," Kaspari said. "We show that while growing plants may indeed help scrub CO2 from the atmosphere, those same plants are likely becoming and less nutritious. It is as if, by burning fossil fuels, humans are transforming all of our kale into iceberg lettuce: still edible, but less and less sustaining."

Credit: 
University of Oklahoma

Cancerous tumors, surrounding cells illuminated by new imaging agent

image: A new imaging agent, developed at Washington University School of Medicine in St. Louis, illuminates cancerous cells of a breast tumor. The new agent lights up cancer cells and the supporting cells that act as a shield, protecting the tumor from various treatment strategies. The new investigational agent is being tested in small clinical trials.

Image: 
Achilefu lab

Scientists at Washington University School of Medicine in St. Louis have developed a new imaging agent that could let doctors identify not only multiple types of tumors but the surrounding normal cells that the cancer takes over and uses as a shield to protect itself from attempts to destroy it.

The study appears March 9 in the journal Nature Biomedical Engineering.

The imaging agent, referred to as LS301, has been approved for investigational use in small clinical trials at Siteman Cancer Center at Barnes-Jewish Hospital and Washington University School of Medicine. The first trial will investigate its use in imaging breast cancer.

"This unique imaging agent identifies cancer cells as well as other compromised cells surrounding the tumor," said Samuel Achilefu, PhD, the Michel M. Ter-Pogossian Professor of Radiology. "Cancer transforms surrounding cells so that it can proliferate, spread to other parts of the body and escape treatment. This imaging compound can detect cancer cells and their supporting cast, the diseased cells that are otherwise invisible."

The compound binds to the activated form of a protein called annexin A2, which is present in many types of solid tumors but not healthy tissue. The activated form of the protein promotes inflammation and invasiveness of these tumors, which allows the cancer to spread.

Solid tumors that contain activated annexin A2 are found in breast, colon, liver, pancreatic, head and neck, and brain cancers. Since the activated form of the protein also is present in the cells that surround the tumor -- and not normal, healthy cells -- doctors potentially could use this imaging agent to identify cells the tumor has hijacked. Despite their benign status, these hijacked cells protect the tumor from chemotherapy, radiation and other attempts to kill the cancer cells. Such co-opted cells also conceal cancer stem cells, whose stealth presence can lead to a recurrence of the tumor.

"We are coming to the realization that to eradicate cancer, we also need to focus on the microenvironment of the tumor," said Achilefu, who also directs the university's Optical Radiology Lab at the Mallinckrodt Institute of Radiology and is co-leader of the Oncologic Imaging Program at Siteman. "Most cancer drugs are designed to target cancer cells. But cancer cells create their own fiefdom, where they impose their own rules. If a normal cell nearby wants to continue living, it must follow the new rules. And slowly these cells come to identify with the tumor rather than their normal identity."

Achilefu expects that with a tumor and its surrounding fiefdom illuminated by the new imaging agent, doctors would have a better chance of removing the entire tumor as well as any areas that are likely to harbor microscopic cancer cells. In past work, Achilefu's team has developed cancer goggles that allow surgeons to visualize cancer cells in real time during surgery to remove a tumor. The new imaging agent can be used with these goggles, which are being evaluated in clinical trials. The researchers also are working on a version of the compound that could be used in positron emission tomography (PET) scans, which many cancer patients undergo to assess whether cancer has spread.

As Achilefu and his colleagues saw that the compound lit up the hijacked cells on the periphery of the tumor, they were surprised to see the imaging agent light up parts of the central core of the tumor as well.

"We were amazed when we saw this because it's extremely difficult to access anything inside a tumor," Achilefu said. "There seems to be a type of immune cell that carries the imaging agent into the core of the tumor. So we now see the tumor margin and the core light up. This allows us to imagine a situation in which we could deliver a drug to the outside and the inside of the tumor at the same time. This dual targeting is not something we purposefully designed -- it's not something we ever anticipated."

With this in mind, Achilefu's team conducted mouse studies to show that the researchers can attach a chemotherapy drug to the compound and use it to image the tumor and treat the disease simultaneously.

"Attaching a chemotherapy drug to this targeted imaging agent could reduce side effects as we are delivering the drug directly to the tumor," he said. "If the clinical trials are successful with the imaging, we will move into therapy."

Credit: 
Washington University School of Medicine

Around 100,000 convicted felons across US likely still own guns, say researchers

Around 100,000 convicted felons across the US still likely own a gun, despite being banned from doing so, concludes the first study of its kind, published online in Injury Prevention.

There's no nationwide programme to recover these weapons; California is the only state with one. But such an initiative might go some way to curbing firearm violence in the US, suggest the researchers.

Relatively little attention has been paid to people who legally purchase guns but who are subsequently banned from firearm ownership because of a conviction for violent crime; or an admission to hospital for a mental health emergency; or a domestic abuse restraining order.

And there are likely to be many of them, say the researchers, because millions of people buy guns legally every year in the US and bans on ownership are common.

In California alone, 5-10% of 21- to 49-year-old gun owners with a history of arrest are convicted of a felony within five years of their purchase.

But as yet, California is the only US state with a large-scale enforcement programme for recovering firearms from people who have been banned from ownership.

To find out more about the number and profile of these people, the researchers drew on information entered into the Armed and Prohibited Persons System (APPS) database as of 1 February 2015.

The APPS cross references data from the California Department of Justice's archive of firearm ownership transfers with data on criminal convictions and domestic violence restraining orders. Specially trained enforcement officers then seek to recover the illegally owned weapons.
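
The cross-referencing step can be pictured as a simple join between two record sets. The sketch below is a toy illustration of the idea only; the identifiers and fields are invented and do not reflect the actual APPS schema.

```python
# Toy cross-reference: match firearm-transfer records against
# prohibiting events, keeping only owners who appear in both.
# All names and values here are made up for illustration.
transfers = {
    "A123": ["handgun"],
    "B456": ["rifle", "handgun"],
}
prohibitions = {
    "B456": "felony conviction",
}

armed_and_prohibited = {
    owner: (guns, prohibitions[owner])
    for owner, guns in transfers.items()
    if owner in prohibitions
}
print(armed_and_prohibited)
```

The real system performs this matching at scale against the Department of Justice's ownership-transfer archive, after which enforcement officers follow up on the matches.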

On 1 February 2015, nearly 19,000 people in California owned just short of 50,000 firearms between them, despite being banned from doing so. Most of these weapons (92%) were handguns.

This group was compared with 2,400 randomly selected firearm owners who had not been banned from ownership.

Men predominated in both groups: 93% of those who had been banned and 85% of those who hadn't. And both groups of owners owned an average of 2.6 guns each. But banned firearm owners were more likely to be aged between 35 and 54 (60% vs 42.5%), and to be of black (11% vs 4%) and Hispanic ethnicities (27.5% vs 16%).

The average number of bans was two per person, but ran as high as 85. Nearly half the lifelong bans (48%; 7,711) had been prompted by a conviction for a felony. A 'Brady only' ban, resulting from federal rather than state law, applied to 29% (4,623).

Bans due to violence, such as assault and battery; restraining orders; and mental health emergencies accounted for around 15% each of the total.

Most of those who illegally owned firearms lived in and around densely populated areas of the state, such as the LA and San Francisco metropolitan areas.

The researchers acknowledge that the data on firearms bans in California aren't complete, because they only date back to 1996, transactions for rifles and shotguns were only mandated in 2014, and the database doesn't include illegal purchases.

But, based on the number of firearm owners in California (4.2 million) and the proportion banned because of a felony conviction (7,711; 0.18%), the researchers estimate that 98,500 people across the US own a firearm despite being convicted felons.
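
The arithmetic behind such a scaling can be sketched as follows. The national owner count used here is an assumption chosen only to show the calculation; the paper derives its own nationwide figure from survey data.

```python
# Back-of-envelope reproduction of the scaling (illustrative only).
ca_owners = 4_200_000        # firearm owners in California
ca_felony_banned = 7_711     # owners banned after a felony conviction

prop = ca_felony_banned / ca_owners
print(f"{prop:.2%}")         # roughly 0.18% of owners

# Assumed national owner population (not a figure from the study):
us_owners = 53_600_000
estimate = prop * us_owners
print(f"~{estimate:,.0f}")   # on the order of the paper's 98,500
```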

"The true number of prohibited firearm owners nationwide cannot be determined with certainty using existing data, as many states do not keep records of firearm transactions and no state other than California has made these data available to researchers," they point out.

But they add: "The numerous federal prohibitions on firearm ownership, combined with the fact that no other state is engaging in comparable efforts to reduce the prevalence of armed and prohibited persons through firearm recovery, suggest there are large numbers of prohibited firearm owners across the country, particularly in states where there are higher rates of firearm ownership and fewer barriers to purchasing a firearm."

And they conclude: "The evidence supporting firearm prohibitions and interventions to reduce access to firearms among prohibited populations provides a prima facie reason to expect that recovering firearms from prohibited persons could reduce firearm violence."

Credit: 
BMJ Group

Are non-smoking young adults who use e-cigarettes more likely to smoke in the future?

Young people who have tried e-cigarettes but have never smoked before are nearly five times more likely to go on to try smoking, a new study has found. However, the findings do not provide clear support for the claim that e-cigarettes cause young people to start smoking (the so-called possible "gateway effect").

Researchers from the University of Bristol's Tobacco and Alcohol Research Group (TARG), with support from Bristol's MRC Integrative Epidemiology Unit (IEU) and the NIHR Bristol Biomedical Research Centre (BRC), combined the results of 17 studies to investigate whether e-cigarette use compared to non-use in young non-smokers is associated with subsequent cigarette smoking.

The research, funded by the MRC IEU and NIHR BRC, is published today [11 March 2020] in Tobacco Control.

Existing evidence suggests that e-cigarette use is less harmful than smoking and is an effective aid to stopping smoking, but there are concerns that e-cigarettes may be a route into cigarette smoking, especially among young people. If this is correct, smoking rates might remain stable or even increase, rather than fall, as e-cigarettes usher in a new generation of smokers.

In this study, the researchers combined evidence from 17 studies that had investigated e-cigarette use and subsequent smoking, where an odds ratio could be calculated, to explore whether e-cigarette use, compared to non-use, in young non-smokers is associated with subsequent cigarette use.
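
Combining odds ratios across studies is typically done by inverse-variance weighting of the log odds ratios. The sketch below illustrates the fixed-effect version of this calculation with made-up study values, not the figures from the 17 studies reviewed here.

```python
import math

# Hypothetical (odds ratio, standard error of log-OR) pairs,
# invented purely to demonstrate the pooling arithmetic.
studies = [
    (4.2, 0.30),
    (5.1, 0.25),
    (3.8, 0.40),
]

# Each study is weighted by the inverse variance of its log-OR.
weights = [1 / se**2 for _, se in studies]
log_ors = [math.log(odds) for odds, _ in studies]

pooled_log_or = sum(w * l for w, l in zip(weights, log_ors)) / sum(weights)
pooled_or = math.exp(pooled_log_or)
print(f"pooled OR = {pooled_or:.2f}")
```

Random-effects models, which allow the true effect to vary between studies, add a between-study variance term to each weight but follow the same structure.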

Jasmine Khouja, a PhD student in TARG based in the School of Psychological Science, said: "Policymakers have used the findings of studies, including the studies we reviewed in this research, to support the heavy regulation of e-cigarettes, including restrictions on flavours and even total bans, but the evidence that e-cigarette use might cause young people to take up smoking is not as strong as it might appear."

The study found that young people who had never smoked before but had used e-cigarettes were four-and-a-half times more likely to go on to smoke cigarettes. However, the research team also identified a number of issues with the studies included in this analysis, which makes them cautious about concluding that e-cigarette use is causing young people to start smoking.

Whilst the association between e-cigarette use among non-smokers and subsequent smoking appears strong, the available evidence is limited by its reliance on self-reported measures of smoking history without biochemical verification. None of the studies included negative controls, which would provide stronger evidence of whether the association is causal. Much of the evidence also failed to consider the nicotine content of the e-liquids used by non-smokers, making it difficult to draw conclusions about whether nicotine is the mechanism driving the association.

The researchers recommend future studies should address the issues which have been highlighted by using more advanced tests to confirm whether or not young people are smokers or e-cigarette users, using different statistical analyses, and considering whether the e-cigarettes contain nicotine or not.

A previous meta-analysis study carried out in 2016 combined the results of nine studies, which explored whether e-cigarette use among non-smoking adolescents and young adults was linked to later smoking. The study found that young people who had used an e-cigarette were nearly four times more likely to go on to try smoking than those who hadn't.

Credit: 
University of Bristol

Inverse design software automates design process for optical, nanophotonic structures

image: Photonic inverse design produces an assortment of nonintuitive designs that can achieve better performance in smaller footprints than their traditionally designed counterparts can. Despite the multitude of possible designs for any particular task, analysis of the designs for a beam splitter reveals how the algorithm produces devices that can actually be classified into different types based on their structure and underlying physical principles.

Image: 
Logan Su

WASHINGTON, March 10, 2020 -- Stanford University researchers created an inverse design codebase called SPINS that can help researchers explore different design methodologies to find fabricable optical and nanophotonic structures.

In the journal Applied Physics Reviews, from AIP Publishing, Logan Su and colleagues review inverse design's potential for optical and nanophotonic structures, as well as present and explain how to use their own inverse design codebase.

"The idea of inverse design is to use more sophisticated optimization algorithms and automate the search for a structure," Su explained. "The ultimate goal is to have a designer input their desired performance metrics and simply wait for the algorithm to generate the best possible device."

Integrated photonics has many potential applications, ranging from optical interconnects to sensing to quantum computing.

Inspired by popular machine learning libraries such as TensorFlow and PyTorch, SPINS is a photonic design framework that emphasizes flexibility and reproducible results. SPINS has been used internally by the group to design an assortment of devices, and the group is making it available for other researchers to use.

"The mathematics behind our optimization techniques come from the mathematical optimization community," Su said. "But we also borrow ideas from the optimization community in mechanical and fluid mechanics, where they use similar optimization methods to design mechanical structures and airfoils before their adoption in photonics."

Inverse design "automates the design process for optical and photonic elements," he said. "Traditionally, photonic devices are hand designed, in the sense that a designer first comes up with the basic geometric shape of the structures, such as a circle, and then performs a few parameter sweeps of the radius of the circle to improve device performance."

This process is labor intensive and tends to ignore a large class of devices with more complicated shapes that have the potential for much better performance.

"Replacing electrical interconnects with photonic interconnects within data centers, for example, could enable an increase in memory bandwidth while substantially decreasing energy costs," Su said.

Photonic neural networks also promise faster operation speeds with lower energy requirements compared to electronic hardware, and metasurface optics promise novel optical functionalities that are cheaper and orders of magnitude smaller than their traditional bulky optical elements.

"Part of the barrier to the adoption of these technologies is the performance of the photonic components that comprise that system," Su said. "By developing a better optimization method for designing these photonic components, we hope to not only improve the performance of these technologies to the point of commercial viability but also open up new possibilities for integrated photonics."

Credit: 
American Institute of Physics

How intermittent fasting changes liver enzymes and helps prevent disease

image: Dr Mark Larance from the Charles Perkins Centre and School of Life and Environmental Sciences at the University of Sydney.

Image: 
Stefanie Zingsheim/University of Sydney

Researchers in Australia have used state-of-the-art analytical tools to understand how intermittent fasting works on the liver to help prevent disease. The findings will help medical scientists working in cancer, cardiovascular and diabetes research develop new interventions to lower disease risk and discover the optimum intervals for fasting.

In experiments with mice, researchers led by Dr Mark Larance at the University of Sydney identified how every-other-day fasting affected proteins in the liver, showing unexpected impact on fatty acid metabolism and the surprising role played by a master regulator protein that controls many biological pathways in the liver and other organs.

"We know that fasting can be an effective intervention to treat disease and improve liver health. But we haven't known how fasting reprograms liver proteins, which perform a diverse array of essential metabolic functions," said Dr Larance, a Cancer Institute of NSW Future Research Fellow in the Charles Perkins Centre and School of Life and Environmental Sciences at the University of Sydney.

"By studying the impact on proteins in the livers of mice, which are suitable human biological models, we now have a much better understanding of how this happens."

In particular, the researchers found that the HNF4-(alpha) protein, which regulates a large number of liver genes, plays a previously unknown role during intermittent fasting.

"For the first time we showed that HNF4-(alpha) is inhibited during intermittent fasting. This has downstream consequences, such as lowering the abundance of blood proteins in inflammation or affecting bile synthesis. This helps explain some of the previously known facts about intermittent fasting," Dr Larance said.

The researchers also found that every-other-day-fasting - where no food was consumed on alternate days - changed the metabolism of fatty acids in the liver, knowledge that could be applied to improvements in glucose tolerance and the regulation of diabetes.

"What's really exciting is that this new knowledge about the role of HNF4-(alpha) means it could be possible to mimic some of the effects of intermittent fasting through the development of liver-specific HNF4-(alpha) regulators," Dr Larance said.

The research, published today in Cell Reports, was done in collaboration with the Heart Research Institute and Dr John O'Sullivan at Royal Prince Alfred Hospital. Dr O'Sullivan is an Adjunct Professor in the Faculty of Medicine & Health and a Senior Lecturer at the Sydney Medical School.

A technique known as multi-Omics, which considers multiple data sets such as the total collection of proteins and genes, was used in the study, allowing for the integration of large amounts of information to discover new associations within biological systems.

Dr O'Sullivan said: "These multi-Omics approaches give us unprecedented insight into biological systems. We are able to build very sophisticated models by bringing together all the moving parts."

The multi-Omics data was obtained at Sydney Mass Spectrometry, part of the University of Sydney's Core Research Facilities.

Dr Larance said that the information can now be used in future studies to determine optimum fasting periods to regulate protein response in the liver.

"Last year we published research into the impact of every-other-day-fasting on humans. Using these mouse data, we can now build up improved models of fasting for better human health."

Credit: 
University of Sydney

Intel processors are still vulnerable to attack

image: The Load Value Injection attack on Intel processors uses the vulnerability of SGX enclaves to smuggle or 'inject' attacker-controlled data into a software program that the victim is running on their computer.

Image: 
KU Leuven

Computer scientists at KU Leuven have once again exposed a security flaw in Intel processors. Jo Van Bulck, Frank Piessens, and their colleagues in Austria, the United States, and Australia gave the manufacturer one year's time to fix the problem.

Plundervolt, Zombieload, Foreshadow: in the past couple of years, Intel has had to issue quite a few patches for vulnerabilities that computer scientists at KU Leuven have helped to expose. "All measures that Intel has taken so far to boost the security of its processors have been necessary, but they were not enough to ward off our new attack," says Jo Van Bulck from the Department of Computer Science at KU Leuven.

Like the previous attacks, the new technique - dubbed Load Value Injection - targets the 'vault' of computer systems with Intel processors: SGX enclaves (see below).

"To a certain extent, this attack picks up where our Foreshadow attack of 2018 left off. A particularly dangerous version of this attack exploited the vulnerability of SGX enclaves, so that the victim's passwords, medical information, or other sensitive information was leaked to the attacker. Load Value Injection uses that same vulnerability, but in the opposite direction: the attacker's data are smuggled - 'injected' - into a software program that the victim is running on their computer. Once that is done, the attacker can take over the entire program and acquire sensitive information, such as the victim's fingerprints or passwords."

The vulnerability was discovered back on 4 April 2019. Nevertheless, the researchers and Intel agreed to keep it a secret for almost a year. Responsible disclosure embargoes are not unusual in cybersecurity, although they are usually lifted after a shorter period of time. "We wanted to give Intel enough time to fix the problem. In certain scenarios, the vulnerability we exposed is very dangerous and extremely difficult to deal with because, this time, the problem did not just pertain to the hardware: the solution also had to take software into account. Therefore, hardware updates like the ones issued to resolve the previous flaws were no longer enough. This is why we agreed upon an exceptionally long embargo period with the manufacturer."

"Intel ended up taking extensive measures that force the developers of SGX enclave software to update their applications. However, Intel has notified them in time. End-users of the software have nothing to worry about: they only need to install the recommended updates."

"Our findings show, however, that the measures taken by Intel make SGX enclave software up to 2 to even 19 times slower."

What are SGX enclaves?

Computer systems are made up of different layers, making them very complex. Every layer also contains millions of lines of computer code. As this code is still written manually, the risk for errors is significant. If such an error occurs, the entire computer system is left vulnerable to attacks. You can compare it to a skyscraper: if one of the floors becomes damaged, the entire building might collapse.

Viruses exploit such errors to gain access to sensitive or personal information on the computer, from holiday pictures and passwords to business secrets. In order to protect its processors against this kind of intrusion, IT company Intel introduced an innovative technology in 2015: Intel Software Guard eXtensions (Intel SGX). This technology creates isolated environments in the computer's memory, so-called enclaves, where data and programs can be used securely.

"If you look at a computer system as a skyscraper, the enclaves form a vault", researcher Jo Van Bulck explains. "Even when the building collapses the vault should still guard its secrets - including passwords or medical data."

The technology seemed watertight until August 2018, when researchers at KU Leuven discovered a breach. Their attack was dubbed Foreshadow. In 2019, the Plundervolt attack revealed another vulnerability. Intel has released updates to resolve both flaws.

Credit: 
KU Leuven

New research finds infant cereal consumption is associated with improved nutrient intake

Arlington, VA, Mar. 10, 2020 - An investigation of infant feeding patterns found infants and toddlers consuming baby cereal, such as rice cereal, had higher intakes of key nutrients of concern, such as calcium, magnesium, iron, zinc and vitamin E. Baby cereal consumers were also found to be less likely to have inadequate intakes of iron, calcium and vitamin E - important nutrients for developing infants. The study, Nutrient intake, introduction of baby cereals and other complementary foods in the diets of infants and toddlers from birth to 23 months of age, published in AIMS Public Health, illustrates the importance of rice baby cereal in the diets of infants and toddlers in achieving proper nutrition.

This study examined National Health and Nutrition Examination Survey (NHANES) data, a national survey of food intake, from 2001 to 2014 to assess food intake in infants and toddlers from birth to 23 months. The study evaluated four age ranges - ages 0 to 3 months, 4 to 6 months, 7 to 11 months and 12 to 23 months - and the role of cereal consumption such as rice cereal. Researchers investigated whether baby cereal consumption (e.g. rice cereal) was related to different eating patterns, nutrient status and intakes of added sugars, saturated fat and sodium, when compared to non-cereal consumers.

When introduced as early as 4 months, baby cereal, including rice cereal, is associated with improved nutrient status. From 4 to 6 months of age, babies who ate cereal took in more calories, carbohydrates, whole grains and key nutrients, like vitamin B6, calcium, iron and magnesium. As infants got older, the results remained similar. From 7 to 11 months, cereal consumers were found to have higher intakes of carbohydrates, vitamin E, calcium, iron, zinc and magnesium. Beyond the first year of life, baby cereal consumption continued to be associated with greater iron, zinc and vitamin E intake.

"Based on the results, diet recommendations for infants from birth to 23 months should include baby cereal - like rice cereal - due to its role in maintaining nutrient status which supports growth and development," suggests study author, Theresa Nicklas, DrPH.

Overall, the study demonstrates a strong public health benefit to feeding infants aged 4 to 11 months baby cereal. The data indicate that feeding cereal, such as rice cereal, as one of a baby's first foods has a positive impact on nutrient status. Babies who consumed rice and other infant cereals during the first two years of life had more complete and balanced nutrition than those who were not fed baby cereal. While more research is needed, this study demonstrates a link between feeding infants and toddlers baby cereal and an improved overall nutrition profile.

"The results of this study build a strong case for the benefits of feeding your infant baby cereal," says Nicklas. "As the Dietary Guidelines Advisory Committee looks to evaluate food patterns and nutrient status of infants aged 0 to 23 months, it is important to consider the role of baby cereal in the diets of infants and toddlers," she continues.

Credit: 
Pollock Communications

Injection strategies are crucial for geothermal projects

image: Drilling rig of the geothermal project in Helsinki, Finland. The project aims to supply the university campus with heat from a depth of more than 6 kilometres

Image: 
G. Dresen

Geothermal energy, with its significant baseload capacity, has long been investigated as a potential complement to and long-term replacement for traditional fossil fuels in electricity and heat production. In order to develop deep geothermal reservoirs where there are not enough natural fluid pathways, the formation needs to be hydraulically stimulated. The creation of so-called Enhanced Geothermal Systems (EGS) opens fluid-flow paths by injecting large quantities of water at elevated pressures. This is typically accompanied by induced seismicity. Some especially large induced earthquakes have led to the termination or suspension of several EGS projects in Europe, such as the deep heat mining projects in Basel and St. Gallen, both in Switzerland. Recently, an MW 5.5 earthquake in 2017 near Pohang, South Korea, has been linked to a nearby EGS project. As a result, there is now substantial public concern about EGS projects in densely populated areas. Developing new coupled monitoring and injection strategies to minimize the seismic risk is therefore key to the safe development of urban geothermal resources and to restoring public faith in this clean and renewable energy.

In a new study published in Geophysical Research Letters, Bentz and co-workers analyzed the temporal evolution of seismicity and the growth of maximum observed moment magnitudes for a range of past and present stimulation projects. Their results show that the majority of the stimulation campaigns investigated reveal a clear linear relation between injected fluid volume, or hydraulic energy, and the cumulative seismic moment. For most projects studied, the observations are in good agreement with existing physical models that predict a relation between injected fluid volume and the maximum seismic moment of induced events. This suggests that seismicity in most cases results from a stable, pressure-controlled rupture process, at least for an extended injection period, and that induced seismicity and magnitudes could therefore be managed by changes in the injection strategy. Stimulations that reveal an unbounded increase in seismic moment suggest that, in these cases, the evolution of seismicity is mainly controlled by regional tectonics. During injection, a pressure-controlled rupture may become unstable, with the maximum expected magnitude then being limited only by the size of tectonic faults and fault connectivity. Close near-real-time monitoring of the evolution of seismic moment with injected fluid volume could help to identify such stress-controlled stimulations at the early stages of injection, or diagnose critical changes in the stimulated system during injection, allowing an immediate adjustment of the stimulation strategy.
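The volume-moment relation referenced above can be illustrated with a back-of-the-envelope calculation. The sketch below is not the study's own model fit; it uses one widely cited bound of this type (a McGarr-type cap, in which the maximum seismic moment scales with shear modulus times injected volume) together with the standard moment-magnitude conversion. The shear modulus and injected volume are illustrative assumptions.

```python
import math

def max_moment_mcgarr(injected_volume_m3, shear_modulus_pa=3.0e10):
    """McGarr-type upper bound on seismic moment: M0 [N*m] <= G * dV.

    shear_modulus_pa defaults to ~30 GPa, a typical value for crystalline
    crustal rock (an assumed value, not from the study).
    """
    return shear_modulus_pa * injected_volume_m3

def moment_magnitude(m0_newton_metres):
    """Convert seismic moment M0 [N*m] to moment magnitude Mw."""
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

# Example: an assumed net injected volume of 10,000 m^3
m0 = max_moment_mcgarr(1.0e4)   # -> 3.0e14 N*m
mw = moment_magnitude(m0)       # -> ~Mw 3.6
print(f"Bound: M0 = {m0:.2e} N*m, Mw = {mw:.2f}")
```

Under this kind of bound, an event that substantially exceeds the magnitude predicted from the injected volume (as at Pohang) points to a tectonically controlled, runaway rupture rather than a pressure-controlled one, which is the distinction the monitoring strategy above is designed to catch.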

Credit: 
GFZ GeoForschungsZentrum Potsdam, Helmholtz Centre