
Significant otter helps couples communicate from the heart

Image: Sensed states available from Significant Otter
Credit: Significant Otter

Even though people stayed in touch during the pandemic's stay-at-home orders and social distancing, it was easy to feel out of touch with loved ones.

Technology and the internet have expanded the way humans communicate and added much to that communication -- think emojis, GIFs and memes. But they can still fall short of being physically with someone.

"Our social cues are limited online," said Fannie Liu, a research scientist at Snap Inc. who earned her Ph.D. from the Human-Computer Interaction Institute in Carnegie Mellon University's School of Computer Science. "We're exploring a new way to support digital connection through a deeper and more internal cue."

Liu was part of a team from CMU, Snap and the University of Washington that built Significant Otter, an app designed primarily for smart watches that allows couples to communicate with each other based on their sensed heart rate. The team presented their work this month at the Association for Computing Machinery (ACM) Computer-Human Interaction (CHI) Conference.

As the app's name suggests, it uses otters to communicate. The app allows couples to send animated otters to one another that represent emotions and activities. For example, otters can be sad, excited, calm or angry, or they can be working, exercising, eating or tired. The app senses a person's heart rate and then suggests otters with the emotion or activity that may correspond to it. A fast heart rate could prompt the app to suggest an excited or angry otter, or an otter that is exercising or eating.
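The suggestion step described above can be sketched as a simple mapping from sensed heart rate to candidate otter states. This is a minimal, hypothetical sketch: the thresholds, resting-rate baseline and state names are our illustrative assumptions, not the app's actual algorithm.

```python
# Hypothetical sketch of biosignal-based suggestion: map a sensed heart
# rate to candidate otter states (emotions/activities). Thresholds and
# the resting baseline are illustrative assumptions, not the app's logic.

def suggest_otters(heart_rate_bpm, resting_bpm=65):
    """Return otter states plausibly consistent with a sensed heart rate."""
    if heart_rate_bpm >= resting_bpm * 1.5:
        # Strongly elevated: high-arousal emotions or vigorous activity
        return ["excited", "angry", "exercising"]
    if heart_rate_bpm >= resting_bpm * 1.15:
        # Moderately elevated: everyday activity
        return ["working", "eating"]
    # Near or below resting: low-arousal states
    return ["calm", "tired", "sad"]

print(suggest_otters(110))  # strongly elevated heart rate
print(suggest_otters(60))   # near resting
```

The partner would then pick from preset supportive reactions, which (per the article) are not driven by their own heart rate.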

The partner can then respond with preset reactions. The reactions aren't based on the responding partner's heart rate but are instead designed to offer support to the person communicating based on theirs. Example reactions include otters hugging, holding each other's hands or even giving an encouraging thumbs up.

The team tested the app in April and May 2020 with 20 couples separated by the pandemic and found that the use of biosignals -- in this case, heart rate -- made for easier and more authentic communication. Liu and the team didn't intend to test the app during the pandemic, but couples who participated said the app gave them a sense of their partner's physical state even when they couldn't be physically together.

"It's coming from your heart," Liu said. "It can be a very intimate gesture to see or feel someone's heartbeat. It's a signal that you're living."

Credit: Carnegie Mellon University

eDNA analysis could contribute towards more effective pest control

Image: A. Argentine ant and B. a map of the sampling areas.
Credit: A. Prof. Mamiko Ozaki; B. Minamoto et al., Scientific Reports, 2021

Researchers have successfully detected the environmental DNA (eDNA *1) of the Argentine ant (*2) in surface soil samples from sites on Kobe's Port Island and in Kyoto's Fushimi District, two areas that have a long history of destruction caused by this invasive species. The research group included then graduate student YASASHIMOTO Tetsu and Associate Professor MINAMOTO Toshifumi of Kobe University's Graduate School of Human Development and Environment, Visiting Professor OZAKI Mamiko of the Graduate School of Engineering, and NAKAJIMA Satoko, formerly of the Kyoto Prefectural Institute of Public Health and Environment.

This method enables scientists to easily gain an accurate understanding of the habitat distribution and hotspots of globally invasive ant species (*3), such as the fire ant, which cause significant damage. Combining this method with pest control plans against invasive ant species will contribute to the formulation of targeted measures and successful elimination efforts.

These research results are due to be published in Scientific Reports on May 26, 2021.

Main points

Invasive ant species are causing serious damage worldwide. Early detection and rapid elimination are essential for controlling their populations.

However, current pest control methods involve a series of direct detection techniques (such as visual observation, capture, classification, elimination, follow-up observation and evaluation), which require specialist knowledge, labor and time. This is inefficient considering the widespread damage caused worldwide by these invasive species.

By focusing on one species that is difficult to eradicate (the Argentine ant), the group demonstrated that eDNA analysis can provide a useful tool for observing and evaluating the invasion, establishment and proliferation of invasive ant species.

The research group developed an Argentine ant-specific real-time PCR assay. Using this new assay, they detected, with high accuracy, eDNA originating from this species in surface soil samples collected from invasion sites.

The researchers compared the presence of eDNA with the last decade of pest control records. They demonstrated the efficiency of eDNA analysis for monitoring populations of the target species and reported for the first time that this method could lead to rapid improvements in the accuracy and effectiveness of invasive ant extermination.

Research Background

In the midst of globalization, the transport of goods and commodities between nations is increasing. Consequently, the arrival, subsequent establishment and widespread proliferation of invasive ant species that are inadvertently transported to other countries has developed into a worldwide problem. In Japan, Argentine ant (Figure 1A) colonies continue to proliferate widely, and the invasion and establishment of highly poisonous fire ant populations have also been reported.

The research group chose two areas with a long history of damage caused by the Argentine ant: (1) Kyoto's Fushimi district and (2) Kobe's Port Island. Fushimi has applied insecticide measures continuously for almost 10 years and has had consistent success in suppressing Argentine ant populations in built-up areas. In Kobe, although the Argentine ant first invaded over 20 years ago, the situation has yet to change for the better.

In cases like (1), where insecticide-based methods have been carried out for a long period of time, it is difficult to justify stopping insecticide use without a scientific basis for doing so, even though continued use presents a problem from the perspective of conserving the natural environment and ecosystem.

In cases like (2), there are no pest control plans in place because the extent of the species' distribution is not readily understood.

Environmental DNA analysis is a biological monitoring method that was introduced into the fields of ecology and conservation biology around 2008. It has brought about revolutionary changes in the conduct of biological surveys, especially in aquatic environments. It is hoped that applying this technique to surface soil samples instead of water samples will play a key role in eliminating invasive ants, providing a breakthrough solution to this problem.

Research Methodology and Findings

Argentine ant-specific DNA assay

The researchers designed a real-time PCR assay specific to the Argentine ant and experimentally confirmed the specificity of the assay. The researchers then performed the following experiments using this method.

Selection of surface sample sites and their respective pest control histories

Surface samples were collected from a total of four sites (FM-1 to 4) in the Fushimi district of Kyoto in order to test them for the eDNA of the Argentine ant, as shown in Figure 1B. The characteristics of efforts to control invasive ant species over the past decade are different for each site (Figure 2).

Detecting the eDNA of Argentine ants using surface soil samples

Environmental DNA from the Argentine ant was found in the samples from the FM-1 and FM-2 sites, where its presence had previously been confirmed by traditional bait trap surveys. However, eDNA from this invasive species was not detected in the samples taken from sites FM-3 and FM-4. These results were consistent with those obtained using the bait trap method, suggesting that eDNA analysis can be at least as reliable as surveys based on visual observation (Table 1). In addition, native ant species were observed at each of the survey sites. At sites FM-1 and FM-2, where Argentine ant eDNA was detected, 7 and 3 native species were observed respectively, whereas at sites FM-3 and FM-4, where the invasive ant's DNA was not detected, 15 and 10 native species were found respectively. This indicates that the invasive species may be driving out native ant species.
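The site-level results above can be arranged as a small data structure for comparison. A sketch in Python using the figures reported in the paragraph (the field names are our own; the detection flags and native-species counts come from the text):

```python
# Site-level results from the Fushimi survey, arranged for comparison:
# eDNA detection agrees with bait-trap surveys at every site, and sites
# where Argentine ant eDNA was detected host fewer native ant species.
sites = {
    "FM-1": {"edna": True,  "bait_trap": True,  "native_species": 7},
    "FM-2": {"edna": True,  "bait_trap": True,  "native_species": 3},
    "FM-3": {"edna": False, "bait_trap": False, "native_species": 15},
    "FM-4": {"edna": False, "bait_trap": False, "native_species": 10},
}

# Do the two survey methods agree at every site?
concordant = all(s["edna"] == s["bait_trap"] for s in sites.values())

# Mean native-species richness with vs. without the invader detected
with_invader = [s["native_species"] for s in sites.values() if s["edna"]]
without_invader = [s["native_species"] for s in sites.values() if not s["edna"]]
mean_with = sum(with_invader) / len(with_invader)
mean_without = sum(without_invader) / len(without_invader)

print(f"Methods concordant: {concordant}")
print(f"Mean native species (invader present): {mean_with}")
print(f"Mean native species (invader absent):  {mean_without}")
```

The gap in native-species counts is what motivates the displacement interpretation in the text, though four sites are of course too few for a formal test.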

Conclusions and Further Developments

1. By comparing and analyzing the results of environmental DNA analysis of various ant species against the habitat data of Argentine ants and native ant species accumulated over the years, a method for estimating ant habitat from environmental DNA can be developed for practical use. In this way, eDNA could provide a scientific basis for reconsidering previously ineffective pest control methods, for deciding whether to continue control measures, and for confirming extermination.

2. The environmental DNA method is a fundamental, all-purpose technique, which enables further research into other invasive ant species, such as fire ants, to be conducted in the same way.

3. A policy model for the control of invasive species, based on the Sustainable Development Goal (SDG) to 'Conserve and restore terrestrial ecosystems and halt biodiversity loss', can be drawn up and implemented, beginning with invasive ant species (including Argentine ant and fire ant).

Credit: Kobe University

Amazon indigenous group's lifestyle may hold a key to slowing down aging

Image: A Tsimane child in a canoe
Credit: Chapman University

A team of international researchers has found that the Tsimane indigenous people of the Bolivian Amazon experience less brain atrophy than their American and European peers. The decrease in their brain volumes with age is 70% slower than in Western populations. Accelerated brain volume loss can be a sign of dementia.

The study was published May 26, 2021, in The Journals of Gerontology, Series A: Biological Sciences and Medical Sciences.

Although people in industrialized nations have access to modern medical care, they are more sedentary and eat a diet high in saturated fats. In contrast, the Tsimane have little or no access to health care but are extremely physically active and consume a high-fiber diet that includes vegetables, fish and lean meat.

"The Tsimane have provided us with an amazing natural experiment on the potentially detrimental effects of modern lifestyles on our health," said study author Andrei Irimia, an assistant professor of gerontology, neuroscience and biomedical engineering at the USC Leonard Davis School of Gerontology and the USC Viterbi School of Engineering. "These findings suggest that brain atrophy may be slowed substantially by the same lifestyle factors associated with very low risk of heart disease."

The researchers enrolled 746 Tsimane adults, ages 40 to 94, in their study. To acquire brain scans, they provided transportation for the participants from their remote villages to Trinidad, Bolivia, the closest town with CT scanning equipment. That journey could last as long as two full days with travel by river and road.

The team used the scans to calculate brain volumes and then examined their association with age for the Tsimane. Next, they compared these results with those of three industrialized populations in the U.S. and Europe.

The scientists found that the difference in brain volumes between middle age and old age is 70% smaller in the Tsimane than in Western populations. This suggests that the Tsimane's brains likely experience far less atrophy than Westerners' as they age; atrophy is correlated with risk of cognitive impairment, functional decline and dementia.
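The 70% figure can be read as a comparison of the slopes of brain volume regressed against age in the two groups. A minimal sketch of that comparison, using entirely made-up numbers (the cohort size, baseline volume, decline rates and noise level here are illustrative assumptions, not study data):

```python
# Illustrative sketch (made-up numbers) of how a "70% smaller decrease"
# can be computed: fit brain volume against age in each population and
# compare the age-related slopes.
import numpy as np

rng = np.random.default_rng(0)
ages = rng.uniform(40, 94, 200)  # hypothetical participant ages

# Hypothetical decline: Western cohort loses ~2.0 cm^3/year,
# Tsimane-like cohort ~0.6 cm^3/year (a 70% smaller decline).
western = 1300 - 2.0 * ages + rng.normal(0, 10, ages.size)
tsimane = 1300 - 0.6 * ages + rng.normal(0, 10, ages.size)

# Ordinary least-squares slope of volume vs. age for each group
slope_w = np.polyfit(ages, western, 1)[0]
slope_t = np.polyfit(ages, tsimane, 1)[0]

reduction = 1 - slope_t / slope_w  # fraction by which decline is smaller
print(f"Western slope:  {slope_w:.2f} cm^3/yr")
print(f"Tsimane slope:  {slope_t:.2f} cm^3/yr")
print(f"Age-related decline is {reduction:.0%} smaller")
```

With these synthetic numbers the fitted reduction comes out near the 70% reported in the study, simply because the data were generated that way; the point is only to show what the comparison measures.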

The researchers note that the Tsimane have high levels of inflammation, which is typically associated with brain atrophy in Westerners. But their study suggests that high inflammation does not have a pronounced effect upon Tsimane brains.

According to the study authors, the Tsimane's low cardiovascular risks may outweigh their infection-driven inflammatory risk, raising new questions about the causes of dementia. One possible reason is that, in Westerners, inflammation is associated with obesity and metabolic causes whereas, in the Tsimane, it is driven by respiratory, gastrointestinal, and parasitic infections. Infectious diseases are the most prominent cause of death among the Tsimane.

"Our sedentary lifestyle and diet rich in sugars and fats may be accelerating the loss of brain tissue with age and making us more vulnerable to diseases such as Alzheimer's," said study author Hillard Kaplan, a professor of health economics and anthropology at Chapman University who has studied the Tsimane for nearly two decades. "The Tsimane can serve as a baseline for healthy brain aging."

Healthier hearts and -- new research shows -- healthier brains

The indigenous Tsimane people captured scientists' -- and the world's -- attention when an earlier study found them to have extraordinarily healthy hearts in older age. That prior study, published in The Lancet in 2017, showed that the Tsimane have the lowest prevalence of coronary atherosclerosis of any population known to science and that they have few cardiovascular disease risk factors. The very low rate of heart disease among the roughly 16,000 Tsimane is very likely related to their pre-industrial subsistence lifestyle of hunting, gathering, fishing and farming.

"This study demonstrates that the Tsimane stand out not only in terms of heart health, but brain health as well," Kaplan said. "The findings suggest ample opportunities for interventions to improve brain health, even in populations with high levels of inflammation."

Credit: University of Southern California

Global cardiovascular organizations release joint opinion on achieving the 'tobacco endgame'

Image: Illustration of the dangers of tobacco and the marketing innovations associated with it, as well as a proposed pathway to eradication of the tobacco epidemic.

Tobacco use continues to be a primary contributor to the global burden of disease, causing an estimated 12% of deaths worldwide among people aged 30 and over. Four leading cardiovascular organizations - American Heart Association, American College of Cardiology, European Society of Cardiology and World Heart Federation - today released a joint opinion calling for greater action at the global scale to end the tobacco epidemic once and for all.

The organizations are urging governments to take immediate action to implement the World Health Organization's MPOWER framework, which outlines six essential policy approaches proven to reduce tobacco use: Monitor tobacco use and prevention policies; Protect people from tobacco smoke; Offer help to quit tobacco use; Warn about the dangers of tobacco; Enforce bans on tobacco advertising, promotion and sponsorship; and Raise taxes on tobacco.

The joint opinion outlines comprehensive tobacco prevention strategies that are necessary to fully implement the MPOWER framework, including:

Lowering the nicotine concentrations in all combustible tobacco products.

Further research to understand the health impacts of nicotine on the cardiovascular system and the long-term effects of electronic cigarettes.

Enforcement of strong systems and premarket assessments of all tobacco products.

Strong regulation of tobacco industry marketing to ensure false health claims are not made about products that have not been thoroughly researched and authorized through regulatory review.

Greater global action to remove all non-tobacco flavored products from the market.

Raising the price of all tobacco products, through excise taxes and other means.

Youth-targeted counter-marketing campaigns to effectively reduce tobacco use among youth.

Access to comprehensive, evidence-based cessation services as a safer alternative for adults who wish to quit smoking combustible cigarettes.

Despite global reductions in tobacco use, the growing popularity of electronic cigarettes and other newer tobacco products that appeal to youth with flavorings threatens progress toward ending tobacco use and nicotine addiction - the "tobacco endgame." Countries must effectively regulate electronic cigarettes and other emerging tobacco products to protect young people and improve public health worldwide.

The joint opinion is being published simultaneously in the flagship journals of all four organizations: the Journal of the American College of Cardiology (JACC), the Journal of the American Heart Association (JAHA), the European Heart Journal (EHJ) and Global Heart.

Organizational Quotes:

"We are proud to join with our global public health colleagues to call for swift action to end tobacco use and nicotine addiction worldwide," said Mitchell S. V. Elkind, M.D., MS, FAAN, FAHA, president of the American Heart Association. "The evidence-based strategies that have been successfully implemented in countries around the world, from government regulation to tobacco taxes to funding for prevention and cessation programs, would make an enormous difference if implemented on a global scale. The time is now to redouble our efforts to reach the tobacco endgame by ending tobacco use and nicotine addiction worldwide."

"Nicotine can cause serious health risks to the cardiovascular system at all stages of life," said Athena Poppas, MD, MACC, immediate past president of the American College of Cardiology. "Nicotine may increase a person's blood pressure, heart rate and flow of blood to the heart, narrow the arteries and harden the arterial walls, which in turn can lead to a heart attack. Nicotine also impacts brain development and poses dangers to youth, pregnant women and the developing fetus. There needs to be a greater understanding of the impacts of nicotine on cardiovascular health and nicotine delivery products on children and youth to inform further treatment and regulatory approaches to nicotine."

Professor Stephan Achenbach, President of the European Society of Cardiology stated: "Today the ESC joins other leading professional organizations in cardiovascular healthcare to send a strong, global message calling for public health campaigns and legislation to fight tobacco and, in particular, to deter vaping. There is increasing evidence on the adverse effects of e-cigarettes. New measures are needed to stop marketing campaigns for e-cigarettes and flavored tobacco, especially those targeting young people."

"Tobacco use is the single greatest preventable cause of death in the world today, with the majority of deaths occurring in low- and middle-income countries as a result of aggressive marketing campaigns by the tobacco industry in these regions," said Fausto Pinto, President of the World Heart Federation. "The World Heart Federation is fully committed to fighting the global epidemic of tobacco consumption and tobacco addiction, and we encourage governments to accelerate implementation of the World Health Organization Framework Convention for Tobacco Control and the MPOWER package. Most importantly, governments must take steps to increase taxes on tobacco and nicotine products - the single most effective measure to reduce the consumption of these deadly products."

Credit: American College of Cardiology

Glioblastoma study discovers protective role of metabolic enzyme, revealing a novel therapeutic target

Image: From left to right: Francesca Puca, Ph.D., Andrea Viale, M.D., and Giulio Draetta, M.D., Ph.D.
Credit: MD Anderson Cancer Center

HOUSTON - Researchers at The University of Texas MD Anderson Cancer Center have discovered a novel function for the metabolic enzyme medium-chain acyl-CoA dehydrogenase (MCAD) in glioblastoma (GBM). In addition to its normal role in energy production, MCAD prevents toxic lipid buildup in GBM cells, so targeting MCAD causes irreversible damage and cell death specifically in cancer cells.

The study was published today in Cancer Discovery, a journal of the American Association for Cancer Research. Preclinical findings reveal an important new understanding of metabolism in GBM and support the development of MCAD inhibitors as a novel treatment strategy. The researchers currently are working to develop targeted therapies against the enzyme.

"With altered metabolism being a key feature of glioblastoma, we wanted to better understand these processes and identify therapeutic targets that could have real impact for patients," said lead author Francesca Puca, Ph.D., instructor of Genomic Medicine. "We discovered that glioblastoma cells rely on MCAD to detoxify and protect themselves from the accumulation of toxic byproducts of fatty acid metabolism. Inhibiting MCAD appears to be both potent and specific in killing glioblastoma cells."

To uncover metabolic genes that are key to GBM survival, the research team performed a functional genomic screen in a unique preclinical model system that permitted an in vivo study using patient-derived GBM cells. After analyzing 330 metabolism genes in this model, they discovered that several enzymes involved in fatty acid metabolism were important for GBM cells.

The team focused on MCAD because it was identified in multiple GBM models and found at high levels in GBM cells relative to normal brain tissue. In-depth studies determined that blocking MCAD in GBM cells resulted in severe mitochondrial failure caused by the toxic buildup of fatty acids, which normally are degraded by MCAD.

This resulted in a catastrophic and irreversible cascade of events from which GBM cells could not recover, explained senior author Andrea Viale, M.D., assistant professor of Genomic Medicine.

"It appears that the downregulation of this enzyme triggers a series of events that are irreversible, and the cells are poisoned from the inside," Viale said. "Usually, tumor cells are able to adapt to treatments over time, but, based on our observations, we think it would be very difficult for these cells to develop resistance to MCAD depletion."

While blocking MCAD appears to be detrimental to the survival of GBM cells, the research team repeatedly found that normal cells in the brain were not affected by loss of the enzyme, suggesting that targeting MCAD could be selective in killing only cancer cells. Supporting this observation is the fact that children and animals born with an MCAD deficiency are able to live normally with an altered diet.

"It has become clear that MCAD is a key vulnerability unique to glioblastoma, providing us a novel therapeutic window that may eliminate cancer cells while sparing normal cells," said senior author Giulio Draetta, M.D., Ph.D., chief scientific officer and professor of Genomic Medicine. "We are looking for discoveries that will have significant benefits to our patients, and so we are encouraged by the potential of these findings. We are actively working to develop targeted therapies that we hope will one day provide an effective option for patients."

The research team has characterized the three-dimensional structure of the MCAD protein in a complex with novel small molecules designed to block the activity of the enzyme. As promising drug candidates are discovered, the researchers will work in collaboration with MD Anderson's Therapeutics Discovery division to study these drugs and advance them toward clinical trials.

Credit: University of Texas M. D. Anderson Cancer Center

Recent warming weakens global dust storm activity

Image: An overview map showing the locations of Lake Karakul (blue pentagon), Lake Daihai (blue star), and dust storm meteorological stations (purple circles) around the primary dust sources in northern China, including 7 stations in the western Tarim Basin (TB), 7 stations around the Qaidam Basin (QB), 8 stations in the northern Qilian Mountains (QM, i.e., the southern Badain Jaran Desert and Tengger Desert), and 9 stations around Lake Daihai. The red arrow denotes the westerly.
Credit: ©Science China Press

Dust storms are often defined as catastrophic weather events in which large amounts of dust particles are raised and transported by strong winds, sharply reducing horizontal visibility. Dust storm occurrence is generally a function of natural climatic factors (temperature, precipitation and surface wind speed) and human activity, but it remains unclear which is the dominant factor controlling dust storm frequency, magnitude and extent. The available instrumental records are too short to answer these questions. Dust storms registered in natural geological archives can extend eolian dust records beyond the limited temporal range of meteorological observations, and thus can be used to reconstruct long-term dust storm history.

Geological records extracted from natural archives, such as lake sediments and ice cores, also support the view that dust storms are closely correlated with both climatic factors (Wang et al., 2006; Chen et al., 2013) and human activity (Neff et al., 2008; Chen et al., 2020). However, it remains a puzzle whether, and when, human activity surpassed natural climatic forcing. For example, in a recent study, Chen et al. (2020) argued that as early as 2,000 years ago, human activity may have exceeded natural climatic change as the dominant control on dust storm activity in eastern China. Reliable long-term dust storm records are therefore needed to address these questions.

Recently, researchers from Tianjin University and their colleagues reconstructed dust storm history based on well-dated lake sediment grain-size records from northern China, namely Lake Karakul (Zhang et al., 2020) and Lake Daihai (Zhang et al., 2021). They found that the sedimentary sandy fraction (>63 μm) in northern China was a robust indicator of past dust storms, and the reconstructed dust storms correlate well with those recorded in modern observations, historical documents, and other robust geological archives (Fig. 3). The reconstructed dust storms generally occurred during cold intervals on annual/decadal scales over the past several centuries (Fig. 3), and such dusty-cold patterns can also be observed in geological archives on even longer timescales; for instance, the dust fluxes in central Chinese loess (Sun and An, 2005), marine sediments (Rea, 1994), and Antarctic ice cores (Lambert et al., 2008) were considerably higher during glacial than interglacial stages. Prof. Hai Xu and colleagues contended that changes in the intensities of the Siberian High and the westerly, modulated by temperature variations, were most likely responsible for the observed dust storm patterns.
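The proxy described above is essentially the proportion of sand-sized grains in each sediment layer. A minimal sketch of that calculation, with hypothetical grain-size measurements (the sample values are invented for illustration; only the 63 μm threshold comes from the text):

```python
# Minimal sketch of the grain-size proxy: the share of grains coarser
# than 63 micrometres ("sandy" fraction) in a sediment sample serves as
# an indicator of past dust storm intensity. Sample data are hypothetical.

def sand_fraction_percent(grain_sizes_um):
    """Percent of measured grains coarser than 63 micrometres."""
    if not grain_sizes_um:
        raise ValueError("empty sample")
    coarse = sum(1 for g in grain_sizes_um if g > 63)
    return 100.0 * coarse / len(grain_sizes_um)

# Two hypothetical layers: a dusty cold-interval layer vs. a quiet one
stormy_layer = [12, 80, 95, 70, 30, 110, 64, 5]
quiet_layer = [10, 15, 22, 8, 30, 40, 70, 12]

print(sand_fraction_percent(stormy_layer))  # higher sandy fraction
print(sand_fraction_percent(quiet_layer))   # lower sandy fraction
```

In practice such fractions are measured by laser granulometry or sieving and reported by weight or volume rather than grain count, but the downcore logic (higher sandy fraction, stronger dust storm activity) is the same.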

One striking feature of the reconstructions is the substantial intensification of dust storm activity after ~1870 AD, roughly coinciding with the beginning of the Industrial Revolution (Fig. 3). The increased supply of dust particles induced by significantly intensified human activity could be responsible for this abrupt increase. Another interesting feature is that, although dust storm activity has remained systematically elevated during the most recent century or so of warming, an obvious decreasing trend can be seen within this interval (Fig. 3). The authors proposed that this weakening trend was most likely due to a decrease in mean wind speed in response to recent global warming. "In contrast to natural, solar-forced warming, greenhouse gas-triggered warming may lead to a decrease in the global zonal temperature gradient and then a general increase in atmospheric static stability, which is potentially conducive to decreasing global wind speed and dust storm frequency/intensity," said Prof. Xu. Given this causal relationship between recent warming and decreasing wind speed, Prof. Hai Xu and his colleagues infer that "dust storm activity in northern China is expected to further weaken or remain at its present low level in the near future".

Credit: Science China Press

Dry metastable olivine and slab deformation in a wet subducting slab

Image: Slab deformation and formation of a stagnant slab in a wet descending slab and their possible linkage with dehydration of hydrous phases.
Credit: Takayuki Ishii

While subducting plates carry water into the Earth's interior, phase transitions of dry olivine, the main mineral in the plates, are thought to be responsible for deep-focus earthquakes and plate deformation. This study resolves the contradiction posed by the presence of dry olivine even in wet plates. Takayuki Ishii, a researcher at the Center for High Pressure Science & Technology Advanced Research (HPSTAR), China, and the Bavarian Institute of Geosciences, University of Bayreuth, Germany, and Eiji Ohtani, a professor emeritus at Tohoku University, used high-pressure, high-temperature experiments to determine the water content of olivine under the conditions of a subducting plate containing water. The results show that hydrous minerals absorb the water, while the coexisting olivine contains no water at all. This experimental result overturns previous theories on the role of hydrous minerals and reveals that deep-focus earthquakes and large plate deformation can occur even in a wet plate. The results of this research will be published in Nature Geoscience.

The most widely accepted explanation for deep-focus earthquakes is that they are caused by the delayed transformation kinetics of dry olivine, which would seem to require that subducting slabs are dry. Metastable persistence of olivine to great depths (~630 km) leads to a wedge of olivine in the slab, and seismic observations of such wedges add weight to the proposed mechanism. However, in direct opposition to the dry wedge hypothesis, water is circulating not only at the Earth's surface but also in the interior through subducting oceanic lithosphere: many geochemical and geophysical observations and mineral physics data indicate that 'water', in the form of hydroxyl groups, is present within both hydrous and nominally anhydrous minerals, implying that subducting slabs are hydrated. The presence of dry metastable olivine in wet subducting slabs is therefore paradoxical, and the hydration state of the slabs remains a topic of debate.

Most previous studies focused on the maximum solubility of water in olivine coexisting with hydrous melts under water-saturated conditions, outside the conditions where deep-focus earthquakes occur. However, natural wet subducting slabs may consist of olivine and hydrous minerals under water-undersaturated conditions.

Here, the researchers determined the water content of olivine coexisting with hydrous phase A, a major dense hydrous magnesium silicate produced by dehydration of serpentine, under water-undersaturated conditions. They show that olivine is dry even under wet conditions. Instead, a minor mineral, a dense hydrous silicate, preferentially crystallises as the host of water. This is the opposite of the previous understanding of water partitioning, in which olivine preferentially accommodates water and hydrous minerals form only when excess water exists. The finding changes our knowledge of the role of water in the deep Earth, implying that hydrous minerals play a more important role in the water cycle of the Earth's interior.

"We thus solve the paradox: dry olivine undergoes delayed transformation even in hydrated slabs, causing deep-focus earthquakes. Furthermore, this result suggests that dehydration of hydrous minerals, which is usually considered an origin of intermediate-depth earthquakes, may also cause deep-focus earthquakes," said Dr. Ishii.

Their finding also suggests a novel hypothesis: the mysterious phenomena of slab bending and stagnation in the deep interior are caused jointly by dehydration of hydrous silicates and the subsequent rapid phase transformation of olivine enhanced by the released water, owing to hydrolytic weakening of olivine and its high-pressure polymorphs.

"These results suggest that hydrous minerals not only play an important role in transporting water to the Earth's interior, but are also responsible for the occurrence of deep earthquakes and large plate deformations," Dr. Ishii added. "Dehydration of hydrous minerals has been thought to be one of the causes of shallower earthquakes than deep earthquakes, but our results suggest that it can also be a cause of deep-focus earthquakes deeper than 660 km, which cannot be explained by phase transitions in metastable olivine. The results are expected to provide an important clue to a full understanding of plate behaviour, including the occurrence of deep earthquakes and large deformations in the deep mantle."

Credit: 
Center for High Pressure Science & Technology Advanced Research

Otago study aids understanding of invisible but mighty particles

Tiny charged electrons and protons which can damage satellites and alter the ozone layer have revealed some of their mysteries to University of Otago scientists.

In a study, published in Geophysical Research Letters, the group looked at charged particles interacting with a type of radio wave called 'EMIC' - a wave generated in Earth's radiation belts (invisible rings of charged particles orbiting the Earth).

Lead author Dr Aaron Hendry, of the Department of Physics, says it is important to understand how these waves affect the belts - which are filled with expensive and important satellites - and Earth's climate.

"Much like the Earth's atmosphere, the Earth's magnetosphere - the region around the Earth where our magnetic field is stronger than the Sun's - sometimes experiences strong 'storms', or periods of high activity. These storms can cause significant changes to the number of particles in the radiation belts and can accelerate some of them to very high speeds, making them a danger to our satellites. Knowing how many of these particles there are, as well as how fast they're moving, is very important to us, so that we can make sure our satellites keep working.

"Activity within the radiation belts can sometimes cause the orbits of these particles to change. If these changes bring the particles low enough to reach the Earth's upper atmosphere, they can hit the dense air, lose all of their energy and fall out of orbit.

"EMIC waves are known to be able to cause these changes and drive the loss of particles from the radiation belts. As well as causing beautiful light displays that we call aurora, this rain of particles can also cause complex chemical changes to the upper atmosphere that can in turn cause small, but important, changes in the amount of ozone present in the atmosphere.

"Although these changes are small, understanding them is very important to properly understanding how the chemistry of the atmosphere works, how it is changing over time, and the impact it is having on the climate," Dr Hendry says.

For their latest study, the researchers used data from GPS satellites to look at how many electrons EMIC waves can knock into the Earth's atmosphere.

A general rule in the radiation belts is that at slower speeds, you have many more electrons. So, if the minimum speed of the EMIC wave interaction is lowered, there are a lot more electrons around to interact with waves.
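This scaling can be pictured with a toy power-law spectrum; the spectral index and energies below are illustrative assumptions, not values from the study:

```python
# Toy power-law electron spectrum: the number of electrons above an
# energy threshold E_min grows steeply as E_min is lowered, which is
# why a lower minimum interaction speed exposes many more electrons
# to EMIC waves.  GAMMA and the energies are illustrative assumptions.
GAMMA = 3.0  # assumed integral spectral index, N(>E) ~ E**-GAMMA

def electrons_above(e_min_kev: float, n0: float = 1.0) -> float:
    """Relative number of electrons with energy above e_min_kev."""
    return n0 * e_min_kev ** -GAMMA

# Halving the minimum energy of the interaction multiplies the
# available electron population by 2**GAMMA = 8 in this toy model.
ratio = electrons_above(500.0) / electrons_above(1000.0)
print(f"halving E_min multiplies the affected population by {ratio:.0f}")
```

In such a steep spectrum, even a modest drop in the minimum interaction speed sharply enlarges the population of electrons the waves can scatter.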

By looking at data from satellites that monitor how many electrons there are in the radiation belts and how fast they're going, the researchers have been able to show that you can see the number of electrons in the radiation belts go down significantly when EMIC waves are around.

"Excitingly, we have also seen changes in the number of electrons at speeds significantly lower than the current 'accepted' minimum speed. This means that EMIC waves can affect much larger numbers of electrons than we previously thought possible. Clearly, we need to rethink how we're modelling this interaction and the impact it has on the radiation belts. There are a lot of electrons in the radiation belts, so being able to knock enough of them into the atmosphere to make a noticeable change is quite remarkable.

"This has shown that we need to take these EMIC waves into account when we're thinking about how the radiation belts change over time, and how these changes in the radiation belt affect the climate on Earth."

Dr Hendry says the impact of EMIC-driven electrons on atmospheric chemistry is not currently being included by major climate models, which try to predict how the Earth's climate will change over time, so making sure this process is understood and included in these models is very important.

"The changes are very small compared to things like the human impact on climate, but we need to understand the whole picture in order to properly understand how everything fits together."

Credit: 
University of Otago

Base level and lithology affect fluvial geomorphic evolution at a tectonically active area

image: Topography, faults and drainage features of the study area in the northeastern Tibetan Plateau. (a) Haiyuan Fault zone and the Yellow River with the shaded relief map of the Tibetan Plateau; (b) morphologic features expressed DEM data, and drainage characteristics of the study area. LHM: Laohu Mountain; HSM: Hasi Mountain; means of other abbreviations see in the full text

Image: 
©Science China Press

The evolution of fluvial landforms is the consequence of the combined effects of tectonics, climate, lithology and base level. Previous research has emphasized tectonic impacts on fluvial systems in tectonically active regions, while lithology and base level have received little attention. In addition, resistant lithology can create knickpoints and control the evolution of landscapes in relatively stable areas, and differences in local base level are sufficient to induce drainage reorganization. Nevertheless, it remains unclear to what extent lithology and base level affect the evolution of fluvial landforms in tectonically active areas.

In this study, researchers chose an area in the northeastern Tibetan Plateau (the Laohu and Hasi mountains; Figure 1) where the development of fluvial landforms is affected by both the activity of the Haiyuan Fault and the aggradation/incision of the Yellow River. They aimed to untangle the roles of lithology and base level in fluvial processes in a tectonically active region, calculating and investigating geomorphic indices such as drainage pattern and χ anomalies.

Basin-mountain structures controlled by the tectonic activity of the left-lateral Haiyuan Fault formed the Laohu and Hasi mountains in the study area (Figure 1), and two radial drainage systems developed around these mountains (Figure 2). However, since broad valleys exist upstream while canyons develop at the river mouths (Figure 3), local fluvial landforms cannot always be explained by tectonic movements alone. Other factors, such as lithology and local base level, may play crucial roles in shaping regional landforms at various spatial scales.

Our results show that differences in base level and/or bedrock lithology have significant impacts on drainage reorganization (Figure 3) and on the development of specific fluvial landforms at different spatial scales in this tectonically active environment. First, instead of flowing into the southward river along the shorter path, channels on the southeastern side of the Laohu Mountain have changed their courses from roughly south to north before joining the Yellow River (Figure 2). This change in flow direction was triggered by river piracy and drainage network reorganization due to a significant altitude difference between local base levels (the confluences of different tributaries with the Yellow River) (Figure 1). Second, lithological differences lead to alternating distributions of canyons and wide valleys. In addition, long rivers with higher steepness indices capture shorter ones, enlarging their drainage area and further increasing their erosive capacity. This positive feedback may gradually transform an unstable parallel drainage pattern into a stable dendritic one.
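The χ index mentioned above has a standard definition, χ(x) = ∫ (A0/A(x'))^(m/n) dx' integrated upstream from base level; basins whose divide-side χ values differ are out of equilibrium, and the low-χ side tends to capture the other. A minimal sketch, using an assumed Hack's-law area function and a textbook concavity index rather than this study's values:

```python
import numpy as np

# chi(x) = integral from the outlet to x of (A0 / A(x'))**(m/n) dx'.
# A lower base level and larger drainage area both reduce chi; a
# cross-divide chi contrast predicts capture by the low-chi side.
M_OVER_N = 0.45  # assumed concavity index (textbook value)
A0 = 1.0         # reference drainage area, km^2

def chi_profile(x_km: np.ndarray, area_km2: np.ndarray) -> np.ndarray:
    """Cumulative chi from the outlet (x_km[0]) to each upstream point."""
    integrand = (A0 / area_km2) ** M_OVER_N
    dx = np.diff(x_km, prepend=x_km[0])
    return np.cumsum(integrand * dx)

# Synthetic river: drainage area shrinks upstream via an assumed
# Hack's-law relation, A ~ L**1.67.
x = np.linspace(0.0, 50.0, 501)            # distance upstream, km
area = np.maximum(1e-2, 50.0 - x) ** 1.67  # drainage area, km^2

chi = chi_profile(x, area)
print(f"chi at the divide: {chi[-1]:.1f}")
```

Starting the integral from a lower base level (a lower confluence with the Yellow River) lowers χ throughout a basin, which is the sense in which the altitude difference between local base levels drives the capture described above.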

Credit: 
Science China Press

Warm ice may fracture differently than cold ice

image: The displacement measuring instruments at Aalto University's Ice Tank, the largest of its kind in the world, detect the crack opening to the level of microns. In this image the crack has split the ice completely into two pieces.

Image: 
Iman El Gharamti/Aalto University

Researchers at Aalto University in Finland have found strong evidence that warm ice - that is, ice very close in temperature to zero degrees Celsius - may fracture differently than the kinds of ice typically studied in laboratories or nature. A new study published in The Cryosphere takes a closer look at the phenomenon, studied at the world's largest indoor ice tank on Aalto's campus.

Understanding how ice breaks is crucial for ensuring safe harbours and bridges in cool climates, as well as transportation through historically ice-heavy regions. As global warming brings changes to once-predictable seasonal conditions, the rules underpinning infrastructure engineering are being tested across borders and continents.

'We need to study warm ice because it's what we're seeing in nature; global warming is happening. The mechanical properties of ice and how it responds to force may be fundamentally different when it's warm rather than cold, as we traditionally study it,' says Iman El Gharamti, lead author of the paper and doctoral student at Aalto University.

To study how warm ice responds to repeated rounds of force - known in the field as cyclical mechanical loading, which simulates conditions in nature - the team made use of Aalto University's Ice Tank. Measuring 40 metres wide by 40 metres long, the 2.8-metre-deep basin is considered to be the largest of its kind.

Typically, ice fracture is studied at small scales, often just 10-20 centimetres in length, at temperatures of -10 degrees Celsius or colder. In this study, the team used fresh-water ice sheets more than 30 centimetres (one foot) thick and measuring 3 by 6 metres. They also precisely controlled the ambient air temperature, and the ice was, in frozen terms, warm at a balmy -0.3 degrees Celsius.

With a hydraulic loading device the team applied multiple rounds of loading and unloading to the ice. Current understanding in the field suggests that ice shows viscoelastic recovery - a time-dependent, delayed elastic response distinct from the immediate elastic one - between loads, at least until the device exerts enough force to completely split the ice.
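That conventional expectation can be sketched with a simple elastic plus delayed-elastic (Kelvin-Voigt) model; the moduli, retardation time and stress below are illustrative assumptions, not measurements from this study:

```python
import math

# Conventional creep-recovery picture: an immediate elastic strain
# plus a delayed-elastic (Kelvin-Voigt) strain that should relax
# away after unloading.  All parameter values are assumptions.
E1 = 9.0e9   # instantaneous elastic modulus, Pa (assumed)
E2 = 3.0e9   # delayed-elastic modulus, Pa (assumed)
TAU = 50.0   # retardation time, s (assumed)

def strain(t: float, sigma: float, t_load: float) -> float:
    """Strain at time t for a constant stress sigma held for t_load seconds."""
    if t <= t_load:
        # loading: elastic term plus growing delayed-elastic term
        return sigma / E1 + (sigma / E2) * (1.0 - math.exp(-t / TAU))
    # after unload: elastic term snaps back instantly, delayed term decays
    delayed_at_unload = (sigma / E2) * (1.0 - math.exp(-t_load / TAU))
    return delayed_at_unload * math.exp(-(t - t_load) / TAU)

sigma, t_load = 1.0e6, 200.0   # 1 MPa held for 200 s (assumed)
print(f"strain just before unload: {strain(t_load, sigma, t_load):.2e}")
print(f"residual after 10 retardation times: {strain(t_load + 10 * TAU, sigma, t_load):.2e}")
```

In this conventional model the residual strain between loads decays toward zero; the warm-ice experiments described below instead recorded a permanent deformation that grew with each load.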

Under the conditions provided, however, the ice behaved in an unexpected way: it showed some elastic recovery but no significant viscoelastic recovery at all. In fact, the ice was permanently deformed.

'What we typically see between mechanical loads is that the ice recovers - it springs back to normal formation until we intentionally apply so much force that it permanently cracks. In our research, the ice was increasingly deformed after each load and we detected no significant delayed elastic recovery,' explains El Gharamti.

The main contributing factor seems to be the temperature of the ice. This research is the first to show warm ice may behave in a fundamentally different way than the cold ice normally studied.

'The fact that the ice didn't show delayed elastic response doesn't fit our conventional understanding of how ice copes with repeated rounds of force. We believe that this is because of how the granular level of ice behaves when warm, but we still need to do more research to find out what's going on,' says Jukka Tuhkuri, professor of solid mechanics at Aalto University.

As warmer conditions are increasingly expected in previously frigid regions like the Great Lakes or Baltic Sea - one of the world's busiest marine areas - Tuhkuri says it's crucial to understand the mechanics of warm ice.

'A long-term ice load measurement on an icebreaker in the Baltic Sea has previously shown, surprisingly, that the largest ice load occurred during spring, when weather warms up. If our ships and infrastructure like bridges and wind turbines have been designed for fairly predictable seasons, we need to know what happens when global warming brings new conditions. It looks like the old rules may not hold up,' Tuhkuri says.

Credit: 
Aalto University

Better understanding membranes

image: The new class of membranes could be successfully used in mass separation.

Image: 
Graphic: Authors of the paper

Whether in desalination, water purification or CO2 separation, membranes play a central role in technology. The Helmholtz-Zentrum Hereon has been working for several years on a new variant: it consists of special polymers that form pores of uniform size on the nanometer scale. The materials to be separated, such as certain proteins, can literally slip through these pores. Because these separation layers are very thin and thus very fragile, they are bound to a spongy structure with much coarser pores, which provides the necessary mechanical stability.

"A special aspect is that these structures form in an act of self-organization," says Prof. Volker Abetz, director of the Hereon Institute of Membrane Research and professor of physical chemistry at the University of Hamburg. "In contrast to comparable membranes, which are partially manufactured through a complex process using particle accelerators, this promises relatively inexpensive production." Because the polymer membranes combine high throughput with strong separation selectivity, they could be interesting in the future for biotechnology and pharmaceutical production, but also in wastewater treatment, for example for filtering out unwanted dyes.

Advances through computer simulations

Experts have made considerable progress in the development of these new membranes in recent years. However, to tailor them for specific applications, a comprehensive theoretical understanding is still lacking. "So far, there has been a lot of trial and error as well as gut feeling involved," says Abetz. "Now it should be about fundamentally understanding these systems as much as possible." For this reason, Marcus Müller, professor of theoretical physics at the University of Göttingen and Volker Abetz have published a review article in the scientific journal Chemical Reviews. The work summarizes the current state of knowledge in the field of polymer membranes and identifies the most promising research approaches that can close existing gaps in knowledge.

Computer simulations play an important role here: they can be used to digitally model in detail what happens during the manufacturing process. "The problem is that these processes are exceedingly complex, and we are dealing with completely different length and time scales," explains Müller. "And we have not yet been in the position to cover all of these scales with a single description." There are, however, computer models that can simulate individual aspects. While some of these models describe the behavior of individual polymer molecules, others reproduce the membrane on a much coarser grid. These different approaches have so far only been rather weakly linked, and describing the time sequence of the various processes also poses a challenge. For a deeper understanding, it would be beneficial if the models were better interlinked than they are now.

Polymer membranes from the drawing board

"Polymer membrane production can be compared to making a soufflé," says Müller. "Both are about stabilizing the tiny pores that matter, before the entire thing collapses again." One of the aspects that is unclear is how and if the simultaneous formation of the separation layer and carrier layer influence each other and how this can be controlled. Another question concerns how the pores can be arranged and aligned in such a way that they allow the highest possible flow rate through the membrane--a decisive criterion for the membrane's profitability. "Fortunately, both computers and models are getting better and better, and that should facilitate considerable progress", Müller adds. "We can access the JUWELS supercomputer in Jülich, which is one of the fastest in the world." Machine learning algorithms could also possibly help in the future; there could be undiscovered potential here.

Not only theory is required, however. There is also work to be done in the experiments. "One big unknown, for example, is the humidity," explains Abetz. "We know that it can decisively influence the formation of a polymer membrane. But in order to better understand this influence, we need systematic tests." If hurdles like these can be overcome, it will bring the long-term research aim a little closer: "Our dream is to design and optimize a polymer membrane for a specific application as a 'digital twin' on the computer first so that it can later be produced in a targeted manner in the laboratory," says Abetz. "And perhaps we could even discover entirely new structures on the computer, ones that we never would have encountered in the experiment."

Credit: 
Helmholtz-Zentrum Hereon

Wireless broadband connectivity enhanced by a new communication design

Current wireless networks such as Wi-Fi and LTE-Advanced operate in the lower radio spectrum, below 6 GHz. Experts warn that this band will soon become congested by mushrooming data traffic: it is estimated that by 2024, 17,722 million devices will be connected.

To meet the growing, ubiquitous demand for wireless broadband connectivity, communication via the terahertz (THz) band (0.1 to 10 THz) is seen as a necessary choice for 6G networks and beyond, due to the large amount of spectrum available at these frequencies.

A study published in the IEEE Journal on Selected Areas in Communications presents a new communication design that improves broadband wireless connectivity. It has involved Konstantinos Dovelos and Boris Bellalta of the Wireless Networking research group at the UPF Department of Information and Communication Technologies (DTIC) and of the IoT Lab, with the collaboration of Michail Matthaiou and Hien Quoc Ngo, researchers at Queen's University Belfast (UK).

Mitigating THz signal propagation loss

Despite the potential of THz-band wireless links to achieve terabit-per-second data rates, THz signals suffer severe propagation losses due to their short wavelength. However, using multiple antennas for transmission and reception with massive multiple-input multiple-output (MIMO) techniques can compensate for these losses, while extending the range of communication through beamforming.

Beamforming concentrates and directs the radiated electromagnetic signal optimally between transmitter and receiver. To use beamforming, however, the channel between transmitter and receiver must be accurately known, hence the various techniques for estimating it.

A new design that mitigates signal delay

The ultra-large bandwidths of THz-band transmissions render standard techniques of beamforming and channel estimation ineffective. In the article published in the IEEE Journal on Selected Areas in Communications (JSAC), "we have shown that when the signal propagation time across the transmitter antennas exceeds the symbol period of the data being sent, the response of the antenna array ceases to be homogeneous. This homogeneity is typically assumed in the design of current beamforming techniques, rendering them ineffective for use in the THz band; our proposal solves this by adjusting these delays in a controlled manner," points out Konstantinos Dovelos, first author of the article.
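One common way to picture this inhomogeneous array response (often called beam squint) is to compare carrier-tuned phase shifters with true-time delays on a wideband array; the array size, carrier and bandwidth below are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

# Uniform linear array steered toward THETA.  Phase shifters are set
# once for the carrier FC, so subcarriers at other frequencies see a
# mismatched (inhomogeneous) array response; true-time delays match
# every frequency.  All values here are illustrative assumptions.
N = 256                    # antennas
FC = 300e9                 # carrier frequency, Hz
BW = 30e9                  # signal bandwidth, Hz
C = 3e8                    # speed of light, m/s
D = C / (2 * FC)           # half-wavelength spacing at the carrier
THETA = np.deg2rad(30.0)   # steering direction

tau = np.arange(N) * D * np.sin(THETA) / C   # per-antenna propagation delays, s

def array_gain(f: float, true_time_delay: bool) -> float:
    """Normalized gain toward THETA at frequency f (1.0 = perfect)."""
    a = np.exp(-2j * np.pi * f * tau)      # array response at f
    ref = f if true_time_delay else FC     # TTD tracks f; phase shifters do not
    w = np.exp(-2j * np.pi * ref * tau)
    return abs(np.vdot(w, a)) / N

f_edge = FC + BW / 2                       # subcarrier at the band edge
print(f"phase shifters at band edge:   {array_gain(f_edge, False):.3f}")
print(f"true-time delays at band edge: {array_gain(f_edge, True):.3f}")
```

In this sketch, delay-based steering holds full gain across the whole band, while carrier-tuned phase shifters lose most of the array gain at the band edge, which is the kind of wideband effect the controlled delay adjustment described above is designed to counter.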

And Dovelos adds: "In addition, with the subtle design of the channel estimator we presented, the transmitter can obtain reliable information on the state of the channel at a low estimation cost, minimizing the impact on the link capacity gain".

The numerical results obtained in this study show the performance gains of the researchers' design compared with techniques developed without considering the essential characteristics of the THz band, opening the way to multi-Gbps speeds over distances of several metres.

Credit: 
Universitat Pompeu Fabra - Barcelona

What causes the deep Earth's most mysterious earthquakes?

image: This close-up view of a super-deep diamond highlights its inclusions, seen here as black spots. Inclusions like these provide geochemical evidence that a sinking oceanic plate can carry water and other fluids deep into the mantle.

Image: 
Photo by Evan Smith/© 2021 GIA

Washington, DC-- The cause of Earth's deepest earthquakes has been a mystery to science for more than a century, but a team of Carnegie scientists may have cracked the case.

New research published in AGU Advances provides evidence that fluids play a key role in deep-focus earthquakes--which occur between 300 and 700 kilometers below the planet's surface. The research team includes Carnegie scientists Steven Shirey, Lara Wagner, Peter van Keken, and Michael Walter, as well as the University of Alberta's Graham Pearson.

Most earthquakes occur close to the Earth's surface, down to about 70 kilometers. They happen when stress builds up at a fracture between two blocks of rock--known as a fault--causing them to suddenly slide past each other.

However, deeper into the Earth, the intense pressures create too much friction to allow this kind of sliding to occur, and the high temperatures enhance the ability of rocks to deform to accommodate changing stresses. Although such events are therefore theoretically unexpected, scientists have been identifying earthquakes that originate more than 300 kilometers below the surface since the 1920s.

"The big problem that seismologists have faced is how it's possible that we have these deep-focus earthquakes at all," said Wagner. "Once you get a few tens of kilometers down, it becomes incredibly difficult to explain how we are getting slip on a fault when the friction is so incredibly high."

Ongoing work over the past several decades has shown us that water plays a role in intermediate-depth earthquakes--those that occur between 70 and 300 kilometers below Earth's surface. In these instances, water is released from minerals, which weakens the rock around the fault and allows the blocks of rock to slip. However, scientists didn't think this phenomenon could explain deep-focus earthquakes, largely because it was believed that water and other fluid-creating compounds couldn't make it far enough down into the Earth's interior to provide a similar effect.

This thinking changed for the first time when Shirey and Wagner compared the depths of rare deep-Earth diamonds to the mysterious deep-focus earthquakes.

"Diamonds form in fluids," explained Shirey. "If diamonds are there, fluids are there."

The diamonds themselves indicated the presence of fluids; they also brought samples of the deep Earth to the surface for the scientists to study. When diamonds form in the Earth's interior, they sometimes capture pieces of mineral from the surrounding rock. These minerals, called inclusions, may make your jewelry less expensive, but they are invaluable to Earth scientists: they are one of the only direct samples of our planet's deep interior that scientists can study.

The diamonds' inclusions had the distinct chemical signature of materials found in oceanic crust. This means the water and other materials weren't somehow created deep in the Earth's interior; instead, they were carried down by a sinking oceanic plate.

Said Wagner: "The seismology community had moved away from the idea that there could be water that deep. But diamond petrologists like Steve were showing us samples and saying, 'No, no, no. There's definitely water down here.' So then we all had to get together to figure out how it got down there."

To test the idea, Wagner and van Keken built advanced computational models to simulate the temperatures of sinking slabs at much greater depths than had been attempted before. In addition to the modeling, Walter examined the stabilities of the water-bearing minerals to show that under the intense heat and pressures of the Earth's deep interior, they would, indeed, be capable of holding on to water in certain conditions. The team showed that even though warmer plates didn't hold water, the minerals in the cooler oceanic plates could theoretically carry water to the depths we associate with deep-focus earthquakes.

To solidify the study, the team compared the simulations with real-world seismological data. They were able to show that the slabs that could theoretically carry water to these depths were also the ones experiencing the previously unexplained deep earthquakes.

This study is unusual in applying four different disciplines--geochemistry, seismology, geodynamics, and petrology--to the same question, all of which point to the same conclusion: water and other fluids are a key component of deep-focus earthquakes.

"The nature of deep earthquakes is one of the big questions in geoscience," said Shirey. "We needed all four of these different disciplines to come together to make this argument. It turned out we had them all in-house at Carnegie."

Credit: 
Carnegie Institution for Science

Primates change their 'accent' to avoid conflict

image: Pied tamarin (Saguinus bicolor) - photograph by Tainara Sobroza

Image: 
Please credit Tainara Sobroza

New research has discovered that monkeys will use the "accent" of another species when they enter its territory to help them better understand one another and potentially avoid conflict.

Published in the journal Behavioral Ecology and Sociobiology, the study is the first to show asymmetric call convergence in primates, meaning that one species chooses to adopt another species' call patterns to communicate.

The study, co-authored by Dr Jacob Dunn of Anglia Ruskin University (ARU), investigated the behaviour of 15 groups of pied tamarins (Saguinus bicolor) and red-handed tamarins (Saguinus midas) in the Brazilian Amazon.

Pied tamarins are critically endangered and have one of the smallest ranges of any primate in the world, much of it around the city of Manaus, while red-handed tamarins are found throughout the north-eastern Amazon region.

The researchers found that when groups of red-handed tamarins entered territory shared with pied tamarins, the red-handed tamarins adopted the long calls used by the pied tamarins.

Red-handed tamarins have greater vocal flexibility and use calls more often than pied tamarins, and the scientists believe they might alter their calls to avoid territorial disputes over resources.

Lead author Tainara Sobroza, of the Instituto Nacional de Pesquisas da Amazonia, said: "When groups of tamarins are moving quickly around mature Amazonian forest it can sometimes be difficult to tell the species apart, but during our research we were surprised to discover they also sound the same in the areas of the forest they cohabit.

"We found that only the red-handed tamarins change their calls to those of the pied tamarins, and this only happens in places where they occur together. Why their calls converge in this way is not certain, but it is possibly to help with identification when defending territory or competing over resources."

Co-author Dr Jacob Dunn, Associate Professor in Evolutionary Biology at Anglia Ruskin University (ARU), said: "We have long known that when closely related species overlap in their geographic ranges, we are likely to see interesting evolutionary patterns. One famous example is the Galapagos finches, studied by Darwin, whose beaks evolved to specialise on different foods on the islands to avoid competition.

"In some cases, rather than diverging to become more different from one another, some closely related species converge to show similar traits. Our study is the first to show asymmetric call convergence in primates, with one species' call becoming the 'lingua franca' in shared territories.

"Because these tamarin species rely on similar resources, changing their 'accents' in this way is likely to help these tiny primates identify one another more easily in dense forest and potentially avoid conflict."

Credit: 
Anglia Ruskin University

Better peatland management could cut half a billion tons of carbon

Half a billion tonnes of carbon emissions could be cut from Earth's atmosphere by improved management of peatlands, according to research partly undertaken at the University of Leicester.

A team of scientists, led by the UK Centre for Ecology and Hydrology (UKCEH), estimated the potential reduction of around 500 million tonnes in greenhouse gas (GHG) emissions by restoring all global agricultural peatlands.

Peatlands - a type of wetland, where dead vegetation is stopped from fully breaking down - cover just 3% of the global land surface, but store around 650 billion tonnes of carbon, around 100 billion tonnes more than all of the world's vegetation combined.

Dr Jörg Kaduk and Professor Sue Page, both from the University of Leicester's School of Geography, Geology and the Environment, are co-authors of the study published in Nature.

Professor Page said: "Our results present a challenge but also a great opportunity. Better water management in peatlands offers a potential 'win-win' - lower greenhouse gas emissions, improved soil health, extended agricultural lifetimes and reduced flood risk.

"For agricultural peatlands, the balance is between climate security, and livelihood and food security. Our study indicates that raising peatland water levels could allow peatland farmers to both reduce the climate impact of their activities and extend the usage of these very fertile organic soils through modified land management.

"However, this will not be possible in all locations, and will need to be considered alongside other options, including complete rewetting and ecosystem restoration."

In their natural state, peatlands can mitigate climate change by continuously removing the GHG carbon dioxide (CO2) from the atmosphere and storing it securely under waterlogged conditions. But many peatland areas have been substantially modified by human activity, including drainage for agriculture and forest plantations.

This results in the release of around 1.5 billion tonnes of CO2 into the atmosphere each year - which equates to three per cent of all global GHG emissions caused by human activities.

However, because large populations rely on these peatlands for their livelihoods, it may not be realistic to expect all agricultural peatlands to be fully returned to their natural condition in the near future.

The team therefore also analysed the impact of halving current drainage depths in croplands and grasslands on peat - which cover over 250,000 km2 globally - and showed that this could still bring significant, realistic benefits for climate change mitigation. The study estimates this could cut emissions by around 500 million tonnes of CO2 a year, which equates to one per cent of all global GHG emissions caused by human activities.
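The percentages quoted in this article are mutually consistent, as a quick check shows (the global total here is implied by the article's own figures, not an independent estimate):

```python
# Cross-check of the article's figures: 1.5 Gt CO2/yr from degraded
# peatlands is stated to be 3% of global GHG emissions, and a 0.5 Gt
# cut from halved drainage depths is stated to be 1%.
drained_emissions_gt = 1.5                       # Gt CO2 per year
implied_global_gt = drained_emissions_gt / 0.03  # implied total, Gt CO2e per year
halved_drainage_cut_gt = 0.5                     # Gt CO2 per year

share = halved_drainage_cut_gt / implied_global_gt
print(f"implied global GHG total: {implied_global_gt:.0f} Gt CO2e/yr")
print(f"share cut by halving drainage depth: {share:.0%}")
```

The implied global total of roughly 50 Gt CO2e per year makes the 0.5 Gt saving come out at one per cent, matching the figures above.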

Professor Chris Evans of UKCEH, who led the research, said: "Widespread peatland degradation will need to be addressed if the UK and other countries are to achieve their goal of net zero greenhouse gas emissions by 2050, as part of their contribution to the Paris climate agreement targets.

"Concerns over the economic and social consequences of rewetting agricultural peatlands have prevented large-scale restoration, but our study shows the development of locally appropriate mitigation measures could still deliver substantial reductions in emissions."

The scientists say potential reductions in GHG emissions from halving the drainage depth in agricultural peatlands are likely to be greater than estimated, given that they did not include changes in emissions of the potent GHG nitrous oxide (N2O), levels of which, like those of CO2, are also likely to be higher in deep-drained agricultural peatlands.

The University of Leicester plays a prominent role in peatland research, as policy-makers look to make better use of this highly efficient resource. The Department for Environment, Food and Rural Affairs this month published the England Peat Action Plan, which sets out the government's long-term vision for the management, protection and restoration of peatland. The plan utilises information derived from several research projects to which University of Leicester has made key contributions, particularly on the scale of GHG emissions from peatlands in eastern England.

Dr Kaduk and Professor Page are also working with the Department for Business, Energy and Industrial Strategy in order to better understand the role that agricultural management of peatlands plays in releasing N2O, as well as examining the long-term effects of agricultural use of peatlands.

Dr Kaduk added: "This study is just the first step towards fully exploring the emission reductions achievable through a whole range of differentiated local mitigation measures. For example, together with our farming partners we are determining the effects of farming practices on greenhouse gas emissions."

And earlier this month, Professor Page addressed the Climate Exp0 conference on Leicester's peatland work ahead of COP26, the 2021 UN Climate Change Conference due to be held in Glasgow this November, of which the University is part.

The study in Nature, 'Overriding water table control on managed peatland greenhouse gas emissions', involved authors from UKCEH, the Swedish University of Agricultural Sciences, the University of Leeds, The James Hutton Institute, Bangor University, Durham University, Queen Mary University of London, University of Birmingham, University of Leicester, Rothamsted Research and Frankfurt University.

Credit: 
University of Leicester