CABBI researchers challenge the CRP status quo to displace fossil fuels

image: Luoye Chen (pictured) is a Ph.D. student at the University of Illinois Urbana-Champaign in CABBI Sustainability Theme Leader Madhu Khanna's lab group. Alongside the research team, Chen worked to develop an integrated modeling approach for assessing the economic and environmental feasibility of transitioning land enrolled in the Conservation Reserve Program (CRP) to bioenergy agriculture. This swap, while economically advantageous for landowners and the government, also promises significant greenhouse gas mitigation in the long term.

Image: 
CABBI communications staff

Researchers at the Center for Advanced Bioenergy and Bioproducts Innovation (CABBI) found that transitioning land enrolled in the Conservation Reserve Program (CRP) to bioenergy agriculture can be advantageous for American landowners, the government, and the environment.

Land enrolled in the CRP cannot currently be used for bioenergy crop production, wherein high-yielding plants (like miscanthus and switchgrass) are harvested for conversion into marketable bioproducts that displace fossil fuel- and coal-based energy. Established by the U.S. Department of Agriculture in 1985, the CRP incentivizes landowners to retire environmentally degraded cropland, exchanging agricultural productivity for native habitats and accepting annual government payments in return.

As the world warms and its population rapidly expands, global demand for food is at odds with the reduced agricultural productivity threatened by extreme climate conditions. Allocating CRP land to high-yielding energy biomass might therefore eliminate the need for bioenergy crops and food crops to vie for space.

A team led by CABBI Sustainability Theme Leader Madhu Khanna and Ph.D. student Luoye Chen developed an integrated modeling approach to assess the viability of transitioning CRP land in the eastern U.S. to perennial bioenergy crops. Their paper, published in Environmental Science & Technology in January 2021, confirmed that the land-use transition is indeed viable provided that certain key conditions are met.

"As proponents of a safer, more sustainable bioeconomy, we must prioritize displacing fossil fuels," said Khanna, who is also Acting Director of the Institute for Sustainability, Energy, and Environment (iSEE) at the University of Illinois Urbana-Champaign. "As scientists, it is our responsibility to take a thoughtful, innovative approach to mitigating greenhouse gases in a way that will prove beneficial in the long term.

"The transportation and electricity sectors are looking to expand bioenergy production, and it is imperative that the agricultural sector do the same. This necessitates a program wherein bioenergy cropland and food cropland coexist rather than compete."

The CABBI team takes an integrated approach to weighing the costs and benefits of swapping the CRP status quo -- uncultivated acreage -- for bioenergy, combining the Biofuel and Environmental Policy Analysis Model (BEPAM) with the biogeochemical model DayCent (Daily Time Step Version of the Century Model).

BEPAM assesses net profitability, answering the key question: What precise economic conditions will incentivize CRP landowners to make the switch to bioenergy cropland? An environmental counterpoint to BEPAM, DayCent simulates the full ecosystem effects of the transition on a given county, providing a "sneak peek" into the future and shedding light on how this land-use change might affect factors like crop yield, nutrient exchange, and soil carbon sequestration.

A key component of this study aggregates data from both models to formulate a greenhouse gas (GHG) life-cycle assessment, which calculates the total GHGs mitigated by the process as a whole -- from the physical act of planting to the introduction of clean energy into the bioeconomy.
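
To make the accounting concrete, here is a minimal sketch, in Python, of how such a life-cycle GHG balance can be tallied per hectare. The structure follows the description above; the numbers and variable names are hypothetical placeholders, not values from the CABBI models.

    # Illustrative life-cycle GHG balance for one hectare of bioenergy cropland.
    # All figures are hypothetical placeholders (tonnes CO2e per hectare per year),
    # not outputs of BEPAM or DayCent.
    farming_emissions = 1.2       # planting, fertilizer, harvest, transport
    soil_carbon_change = -0.8     # negative value = net soil carbon sequestration
    fossil_displacement = -4.5    # credit for displacing fossil fuels and coal power

    net_ghg = farming_emissions + soil_carbon_change + fossil_displacement
    print(f"Net GHG balance: {net_ghg:+.1f} t CO2e/ha/yr")  # negative = net mitigation

A negative total indicates net mitigation; in the study's framing, the long-term displacement credit eventually outweighs the early soil carbon losses.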

"The full life-cycle assessment really is key to understanding the big-picture results of our research," Chen said. "We take everything into account -- the process of actually growing and harvesting the feedstocks, the carbon sequestered in the soil, and the fact that ultimately, we will be displacing fossil fuels with biofuels, and coal-based electricity with bioelectricity.

"Keeping that end result in mind anchors everything else to the ultimate goal of a net positive environmental impact."

The team concluded that converting 3.4 million hectares of CRP land to bioenergy from 2016 to 2030 is economically and environmentally viable -- under certain conditions.

Economically speaking, all systems are "go" if the market price of biomass is high and the government continues to distribute appropriate CRP land rental payments. These factors can function as counterweights: if biomass prices decrease, substantial land rental payments may alleviate financial stress for farmers and encourage their continued commitment to bioenergy; alternatively, soaring biomass prices would justify relaxed government support, saving taxpayers money. The team identified two ideal pairings: 1) landowners receive 100 percent of their original government payments and sell biomass at $75/metric ton; or 2) landowners receive 75 percent of their original payment and sell biomass for $100/metric ton. Either way, both parties benefit.
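
To illustrate the counterweight logic, the small sketch below compares a landowner's annual revenue per hectare under the two pairings. Only the payment shares and biomass prices come from the study; the baseline CRP payment and biomass yield are invented placeholders.

    # Hypothetical revenue comparison for the two price/payment pairings.
    base_payment = 250.0    # assumed baseline CRP rental payment, $/ha/yr (placeholder)
    biomass_yield = 10.0    # assumed biomass yield, metric tons/ha/yr (placeholder)

    scenarios = [
        {"payment_share": 1.00, "price": 75.0},   # pairing 1 from the study
        {"payment_share": 0.75, "price": 100.0},  # pairing 2 from the study
    ]
    for s in scenarios:
        revenue = s["payment_share"] * base_payment + s["price"] * biomass_yield
        print(f"{s['payment_share']:.0%} payment, ${s['price']:.0f}/t -> ${revenue:,.0f}/ha/yr")

Under these placeholder assumptions, the two pairings yield broadly comparable landowner revenue, which is the point of the counterweight design: higher biomass prices can offset reduced government support, and vice versa.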

Converting CRP land to bioenergy can also result in substantial GHG savings. Previous studies show that a large "soil carbon debt" is liable to accrue at the outset of the venture, during the planting years of miscanthus and switchgrass. However, taking into account the full life-cycle assessment mentioned above, the research team determined that the long-term effects of displacing fossil fuel- and coal-based energy with bioproducts would more than make up for this temporary loss.

Considering landowner income from biomass sales, savings in government payments to maintain existing CRP enrollment, and the monetized benefits of GHG mitigation through displacing fossil fuels (quantified using the "social cost of carbon"), the total net value of converting CRP land to bioenergy could be as high as $28 billion to $125 billion over the 2016-2030 period.

Credit: 
University of Illinois at Urbana-Champaign Institute for Sustainability, Energy, and Environment

Twist-n-Sync: Skoltech scientists use smartphone gyroscopes to sync time across devices

Skoltech researchers have designed a software-based algorithm for synchronizing time across smartphones that can be used in practical tasks requiring simultaneous measurements. This algorithm can essentially help turn several devices into a full-fledged network of sensors. The paper was published in the journal Sensors.

If you want a network of intelligent devices - say, an array of cameras capturing a dynamic scene or another kind of sensor network - to work properly, one of the fundamental tasks you have to solve is clock synchronization: all devices should share the same timeline, often to sub-millisecond precision for the more challenging tasks. Modern smartphones can easily be used as multipurpose sensors tied into a network, but they lack an interface for hardware clock synchronization, especially in environments where GPS, which can also serve as a global clock, is unavailable. And because all non-atomic clocks slowly but inevitably drift, they have to be resynchronized periodically.

"Smartphone networks can work as microphone arrays to capture sound waves and gather more information about not just sound but also direction. This is useful for noise cancellation techniques: noise cancellation algorithms pass the signal only from a specific direction, for instance, a voice of a person among office or city noise," Marsel Faizullin, Skoltech PhD student and a coauthor of the paper, says.

Microphone arrays can also be used for what's called sound-based trilateration: a user's smartphone produces ultrasound, and an array of other smartphones receives this signal. From the delays between the received signals, one can determine the user's position.

"You can also use this technique for soft-synchronization of mobile phones with hardware-based systems. One example is a flash in cameras; with our method, any mobile phone could become part of a professional photography system," Skoltech Assistant Professor Gonzalo Ferrer adds.

Faizullin, Ferrer and their colleagues from Skoltech and Saint Petersburg State University developed a clock synchronization method based on micro-electro-mechanical systems (MEMS) gyroscopes, now installed in every smartphone. They were able to design an algorithm that, in experiments with two smartphones capturing simultaneous photos, showed better performance than existing synchronization software, achieving an accuracy of several microseconds.
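
The paper describes a specific gyroscope-based algorithm; as a rough illustration of the underlying idea, the sketch below (not the authors' exact method) estimates the clock offset between two phones by cross-correlating the angular-velocity magnitudes they record while being twisted together in one hand.

    import numpy as np

    def estimate_offset(gyro_a, gyro_b, sample_rate_hz):
        # Estimate the clock offset (in seconds) between two equally sampled
        # angular-velocity magnitude traces recorded during a shared twist.
        a = gyro_a - gyro_a.mean()
        b = gyro_b - gyro_b.mean()
        corr = np.correlate(a, b, mode="full")   # cross-correlation over all lags
        lag = int(corr.argmax()) - (len(b) - 1)  # best-matching lag, in samples
        return lag / sample_rate_hz

This only resolves the offset to the nearest sample; reaching the microsecond accuracy reported in the paper requires further refinement, such as interpolating around the correlation peak, that is beyond this sketch.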

"This accuracy is enough for making a panoramic photo of a football or hockey game with a smartphone rig. An ice hockey puck achieves velocities of up to 160 kilometers per hour; in one millisecond it travels roughly for four centimeters, and in 20 microseconds it's 0.9mm. This is much less than a single-pixel field of view of any professional camera. It means that this is also enough for multi-camera synchronization capturing a hockey game. Definitely, an accuracy of microseconds is more than enough for any tasks involving consumer grade photo or video cameras," Marsel Faizullin says.

To use the algorithm, one needs to grab the smartphones in one hand, twist them a little and let the software handle all the processing and computation for clock synchronization - it is literally a "twist and sync" approach.

For future research, the team decided to apply their method to systems that include not just smartphones but other sensors such as lidars, depth cameras and so on. "This task is more complicated because of very different software and hardware compared to just several identical smartphones. We are developing our method in this direction to be more practically useful," Faizullin says.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Don't focus on genetic diversity to save our species

image: Once found throughout the semi-arid range country in South Australia, New South Wales and south-west Queensland, the yellow-footed rock wallaby is now endangered in Queensland and NSW and vulnerable in SA.

Image: 
Image by Philip Barrington from Pixabay.

Scientists at the University of Adelaide have challenged the common assumption that genetic diversity of a species is a key indicator of extinction risk.

In a study published in the journal PNAS, the scientists demonstrate that there is no simple relationship between genetic diversity and species survival. But, conclude Dr Joao Teixeira and Dr Christian Huber from the University of Adelaide's School of Biological Sciences, the focus shouldn't be on genetic diversity anyway; it should be on habitat protection.

"Nature is being destroyed by humans at a rate never seen before," says computational biologist Dr Huber. "We burn forests, over-fish our seas and destroy wild areas and it's estimated that about one million species are threatened with extinction, some within decades.

"Although researchers agree that this rapid decline of species numbers has to be stopped, how that's best tackled is still open to debate.

"Conservation geneticists consider genetic diversity as an important way to assess if a species is threatened by extinction. The view is that as long as individuals are genetically different from each other (having high genetic diversity), there will always be individuals with the right genetic makeup to survive under adverse conditions. On the other hand, if a species shows little genetic diversity, it's believed that the species is fragile and likely to become extinct."

Dr Teixeira and Dr Huber have compiled a wide range of evidence from laboratory experiments, field studies and evolutionary theory which suggests that how genetic diversity is measured and interpreted for conservation needs to be re-evaluated.

"In this paper, we've shown that this simple relationship between genetic diversity and survival is often wrong," says population geneticist Dr Teixeira. "Most of the genetic diversity within a genome is 'neutral', meaning that it neither improves nor diminishes an individual's ability to survive or produce offspring. On the other hand, the genetic diversity that does affect survival is found in very specific regions of the genome and is not at all correlated with genome-wide genetic diversity.

"Researchers need to investigate for each species individually which genetic mutations allow the species to thrive and which mutations lead to diseases that can threaten the species. There is certainly no simple 'one-size-fits-all' measure of extinction risk."

Finally, the authors warn that, although genetics can play an important role in certain cases, fixating on genetic diversity shifts much-needed focus away from the much bigger problem: habitat destruction.

"Since the year 2000, wildlife habitat about eight times the area of the UK has been lost," says Dr Huber. "Without habitat, there is no wildlife. And without wildlife and the ecosystem services that humans rely on, we are ultimately risking our own security and survival here on Earth."

Credit: 
University of Adelaide

Fibre-integrated, high-repetition-rate water window soft X-ray source

image: a, Input coupling of the 1.9 μm wavelength laser pulses with 100 fs duration within helium gas at atmospheric pressure. b, The ARHCF is differentially pumped with the output facet located in a high-pressure chamber. At first, the pulses self-compress within the fibre, reaching a peak intensity >10¹⁴ W/cm², which causes ionization of the gas. High-order harmonic radiation with photon energies in the soft X-ray spectral region is released due to the recombination of a fraction of the liberated electrons with their parent helium ions. c, The final vacuum chamber contains an annular mirror and a thin metal filter to separate the SWIR light from the generated high-order harmonics.

Image: 
by Gebhardt, M., Heuermann, T., Klas, R. et al.

Bright, coherent soft X-ray radiation (SXR) is used in many scientific applications such as advanced absorption spectroscopy or lens-less imaging, and in fundamental research, e.g. to produce extremely short isolated optical pulses. The generation, control and detection of this type of short-wavelength light is therefore highly important in fields like fundamental atomic physics, solid-state physics, the semiconductor industry, materials science and biology.

To date, high photon flux in the soft X-ray spectral region is mostly delivered by large-scale facilities like synchrotrons or free-electron lasers. An alternative is to use high-order harmonic generation (HHG) sources, which are currently driven by pulsed laser systems with very high peak powers. It is highly desirable to increase the repetition rate of such sources beyond 1000 shots per second (e.g. to allow faster data acquisition and a higher signal-to-noise ratio) and to make them simpler, more compact and easier to operate.

In a new paper published in Light: Science & Applications, a team of scientists led by Professor Jens Limpert from the Friedrich Schiller University Jena has developed a laser-driven soft X-ray source based on high-order harmonic generation inside an antiresonant gas-filled hollow-core fibre (ARHCF). They designed the source such that the intensity of the driving laser pulses is enhanced by temporal self-compression inside the hollow fibre before the soft X-rays are generated. The integration of laser-pulse enhancement and high-order harmonic generation in a single gas-filled hollow-core fibre allows "compact, high repetition rate laser technology, including commercially available systems, to drive simple and cost-effective, coherent high-flux SXR sources", the scientists state in their paper. In their work, they optimized the input laser parameters and the gas particle density inside the fibre such that a macroscopic HHG signal builds up towards the fibre end. This soft X-ray light emerges directly from the output of a potentially flexible hollow-core fibre. The reported method opens new avenues for simple and powerful laser-driven soft X-ray sources based on fibre technology, allowing moderate-peak-power laser systems with high repetition rates to drive HHG directly.

The authors state, "This enables the first 100 kHz-class repetition rate, table-top SXR source that delivers an application-relevant flux of 2.8×10⁶ photons/s/eV around 300 eV." They go on to demonstrate a proof-of-principle spectroscopy measurement at the carbon K-edge. Furthermore, they show a long-term run of the source over more than 20 minutes. While this source is based on a high repetition rate 2 μm wavelength fibre laser, the approach is applicable to any laser technology in the short-wavelength infrared.

The scientists forecast that their results are "most interesting for a variety of applications, which significantly benefit from compact and easy-to-use high repetition rate SXR sources". They add: "These sources could use ARHCFs for beam delivery, self-compression and HHG in a single apparatus, making them more affordable, and available to a much broader community in fundamental and applied sciences with medical applications in reach."

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

Life from Earth could temporarily survive on Mars

image: MARSBOx payload in the Earth's middle stratosphere (38 km altitude). The shutter is open, exposing the top-layer samples to UV radiation.

Image: 
NASA

Some microbes on Earth could temporarily survive on the surface of Mars, finds a new study by NASA and German Aerospace Center scientists. The researchers tested the endurance of microorganisms to Martian conditions by launching them into the Earth's stratosphere, as it closely represents key conditions on the Red Planet. Published in Frontiers in Microbiology, this work paves the way for understanding not only the threat of microbes to space missions, but also the opportunities for resource independence from Earth.

"We successfully tested a new way of exposing bacteria and fungi to Mars-like conditions by using a scientific balloon to fly our experimental equipment up to Earth's stratosphere," reports Marta Filipa Cortesão, joint first author of this study from the German Aerospace Center, Cologne, Germany. "Some microbes, in particular spores from the black mold fungus, were able to survive the trip, even when exposed to very high UV radiation."

Microbial hitchhikers

Understanding the endurance of microbes to space travel is vital for the success of future missions. When searching for extra-terrestrial life, we need to be sure that anything we discover has not just travelled with us from Earth.

"With crewed long-term missions to Mars, we need to know how human-associated microorganisms would survive on the Red Planet, as some may pose a health risk to astronauts," says joint first author Katharina Siems, also based at the German Aerospace Center. "In addition, some microbes could be invaluable for space exploration. They could help us produce food and material supplies independently from Earth, which will be crucial when far away from home."

Mars in a box

Many key characteristics of the environment at the Martian surface cannot be found or easily replicated at the surface of our planet. However, above the ozone layer in Earth's middle stratosphere, the conditions are remarkably similar.

"We launched the microbes into the stratosphere inside the MARSBOx (Microbes in Atmosphere for Radiation, Survival and Biological Outcomes experiment) payload, which was kept at Martian pressure and filled with artificial Martian atmosphere throughout the mission," explains Cortesão. "The box carried two sample layers, with the bottom layer shielded from radiation. This allowed us to separate the effects of radiation from the other tested conditions: desiccation, atmosphere, and temperature fluctuation during the flight. The top layer samples were exposed to more than a thousand times more UV radiation than levels that can cause sunburn on our skin."

"While not all the microbes survived the trip, one previously detected on the International Space Station, the black mold Aspergillus niger, could be revived after it returned home," explains Siems, who highlights the importance of this ongoing research.

"Microorganisms are closely-connected to us; our body, our food, our environment, so it is impossible to rule them out of space travel. Using good analogies for the Martian environment, such as the MARSBOx balloon mission to the stratosphere, is a really important way to help us explore all the implications of space travel on microbial life and how we can drive this knowledge towards amazing space discoveries."

Credit: 
Frontiers

Acid reflux disease may increase risk of cancers of the larynx and esophagus

Results from a large prospective study indicate that gastroesophageal reflux disease (GERD), which also causes heartburn symptoms, is linked with higher risks of various cancers of the larynx (or voice box) and esophagus. The study is published early online in CANCER, a peer-reviewed journal of the American Cancer Society.

GERD, a gastrointestinal disorder that affects approximately 20 percent of U.S. adults, occurs when stomach acid flows back into the esophagus, where it can cause tissue damage. Research indicates that this damage may put patients at risk of developing a type of cancer called esophageal adenocarcinoma.

To provide additional insights concerning this link and potential links to other types of cancer, a team led by Christian C. Abnet, PhD, of the National Cancer Institute, part of the National Institutes of Health (NIH), examined information on 490,605 adults enrolled in the NIH-AARP Diet and Health Study, a prospective study that in 1995-1996 mailed questionnaires to 3.5 million AARP members aged 50 to 71 years who were living in California, Florida, Louisiana, New Jersey, North Carolina, or Pennsylvania, or in the metropolitan areas of Atlanta, Georgia, and Detroit, Michigan.

Using Medicare claims data, the investigators estimated that 24 percent of participants had a history of GERD. Over the 16 years after participants joined the study, 931 developed esophageal adenocarcinoma, 876 developed laryngeal squamous cell carcinoma, and 301 developed esophageal squamous cell carcinoma. People with GERD had about a twofold higher risk of developing each of these types of cancer, and the elevated risk was similar across groups categorized by sex, smoking status, and alcohol consumption. The investigators were able to replicate the results when they restricted the analyses to the subset of 107,258 adults with Medicare data.

The team estimated that approximately 17 percent of these cancers in the larynx and esophagus are associated with GERD.

"This study alone is not sufficient to result in specific actions by the public. Additional research is needed to replicate these findings and establish GERD as a risk factor for cancer and other diseases," said Dr. Abnet. "Future studies are needed to evaluate whether treatments aimed at GERD symptoms will alter the apparent risks."

Credit: 
Wiley

Future ocean warming boosts tropical rainfall extremes

image: (Left) Predicted change of ocean surface temperature in 2050-2099 relative to 1950-1999 using an ensemble of climate models. (Right) Predicted change in amplitude of rainfall fluctuations (year-to-year standard deviation) in 2050-2099 relative to 1950-1999.

Image: 
Kyung-Sook Yun

The El Niño-Southern Oscillation (ENSO) is the most energetic naturally occurring year-to-year variation of ocean temperature and rainfall on our planet. The irregular swings between warm and wet "El Niño" conditions in the equatorial Pacific and the cold and dry "La Niña" state influence weather conditions worldwide, with impacts on ecosystems, agriculture and economies. Climate models predict that the difference between El Niño- and La Niña-related tropical rainfall will increase over the next 80 years, even though the temperature difference between El Niño and La Niña may change only very little in response to global warming. A new study published in Communications Earth & Environment uncovers the reasons for this surprising fact.

Using the latest crop of climate models, researchers from the IBS Center for Climate Physics at Pusan National University, the Korea Polar Research Institute, the University of Hawaiʻi at Mānoa, and Environment and Climate Change Canada worked together to unravel the mechanisms involved. "All climate models show a pronounced intensification of year-to-year tropical rainfall fluctuations in response to global warming," says lead author Dr. Kyung-Sook Yun from the IBS Center for Climate Physics (Image, right panel). "Interestingly, the year-to-year changes in ocean temperature do not show such a clear signal. Our study therefore focuses on the mechanisms that link future ocean warming to extreme rainfall in the tropical Pacific," she goes on to say.

The research team found that the key to understanding this important climatic feature lies in the relationship between tropical ocean surface temperature and rainfall. There are two important aspects to consider: 1) the ocean surface temperature threshold for rainfall occurrence, and 2) the rainfall response to ocean surface temperature change, referred to as rainfall sensitivity. "In the tropics, heavy rainfall is typically associated with thunderstorms and deep clouds shaped like anvils. These only form once the ocean surface is warmer than approximately 27.5 degrees Celsius or 81 degrees Fahrenheit in our current climate", says co-author Prof. Malte Stuecker from the University of Hawaiʻi at Mānoa.

This ocean surface temperature threshold for intense tropical rainfall shifts towards a higher value in a warmer world and does not contribute directly to an increase in rainfall variability. "However, a warmer atmosphere can hold more moisture which means that when it rains, rainfall will be more intense. Moreover, enhanced warming of the equatorial oceans leads to upward atmospheric motion on the equator. Rising air sucks in moist air from the off-equatorial regions, which can further increase precipitation, in case other meteorological conditions for a rain event are met." says co-lead author Prof. June-Yi Lee from IBS Center for Climate Physics.

This increase in rainfall sensitivity is the key explanation why there will be more extreme ENSO-related swings in rainfall in a warmer world.
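
A toy calculation helps to show why higher rainfall sensitivity alone increases rainfall variability. In the sketch below (an illustration only, not the study's climate models), rainfall responds to sea surface temperature (SST) only above the roughly 27.5-degree threshold mentioned above; all other numbers are invented. Scaling up the sensitivity amplifies the year-to-year rainfall swings even though the SST fluctuations themselves are unchanged.

    import numpy as np

    rng = np.random.default_rng(0)
    sst = 28.0 + rng.normal(0.0, 0.8, 100_000)  # ENSO-like SST values, deg C (placeholder)

    def rainfall(sst, threshold=27.5, sensitivity=1.0):
        # Rain falls only where SST exceeds the threshold, at a rate set by sensitivity.
        return np.maximum(0.0, sensitivity * (sst - threshold))

    for s in (1.0, 1.5):  # hypothetical present-day vs. warmer-world sensitivity
        print(f"sensitivity {s}: rainfall std = {rainfall(sst, sensitivity=s).std():.2f}")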

Credit: 
Institute for Basic Science

Rapid evolution may help species adapt to climate change and competition

image: Invasive and naturalized fruit fly species on a peach tree inside the experiment.

Image: 
Washington State University

VANCOUVER, Wash. - Loss of biodiversity in the face of climate change is a growing worldwide concern. Another major factor driving the loss of biodiversity is the establishment of invasive species, which often displace native species. A new study shows that species can adapt rapidly to an invader and that this evolutionary change can affect how they deal with a stressful climate.

"Our results demonstrate that interactions with competitors, including invasive species, can shape a species' evolution in response to climatic change," said co-author Seth Rudman, a WSU Vancouver adjunct professor who will join the faculty as an assistant professor of biological sciences in the fall.

Results were published in the Proceedings of the National Academy of Sciences as "Competitive history shapes rapid evolution in a seasonal climate."

Scientists have increasingly recognized that evolution is not necessarily slow and often occurs quickly enough to be observed in real time. These rapid evolutionary changes can have major consequences for things like species' persistence and responses to climatic change. The investigators chose to examine this topic in fruit flies, which reproduce quickly, allowing change to be observed over several generations in a matter of months. The team focused on two species: one naturalized in North American orchards (Drosophila melanogaster) and one that has recently started to invade North America (Zaprionus indianus).

The experiment first tested whether the naturalized species can evolve rapidly in response to exposure to the invasive species over the summer, then tested how adaptation in the summer affects the naturalized species' ability to deal with and adapt to the colder fall conditions.

"A cool thing about the way we conducted this study is that while most experiments that look at rapid evolution use controlled lab systems, we used an outdoor experimental orchard that mimics the natural habitat of our focal species," said Tess Grainger of the Biodiversity Centre at the University of British Columbia and the lead author on the paper. "This gives our experiment a sense of realism and makes our findings more applicable to understanding natural systems."

Over the course of just a few months, the naturalized species adapted to the presence of the invasive species. This rapid evolution then affected how the flies evolved when the cold weather hit. Flies that had been previously exposed to the invasive species evolved in the fall to be larger, lay fewer eggs and develop faster than flies that had never been exposed.

The study marks the beginning of research that may ultimately hold implications for other threatened species that are more difficult to study. "In the era of global environmental change in which species are increasingly faced with new climates and new competitors, these dynamics are becoming essential to understand and predict," Grainger said.

Rudman summarized the next big question: "As biodiversity changes, as climate changes and invaders become more common, what can rapid evolution do to affect outcomes of those things over the next century or two? It may be that rapid evolution will help biodiversity be maintained in the face of these changes."

In addition to Rudman and Grainger, the paper's co-authors are Jonathan M. Levine, Ecology and Evolutionary Biology Department, Princeton University (where Grainger was a postdoctoral fellow); and Paul Schmidt, Department of Biology, University of Pennsylvania (where Rudman was a postdoctoral fellow). The research was conducted in an outdoor field site near the University of Pennsylvania.

Credit: 
Washington State University

Investment needed to bring down pancreatic cancer death rates in Europe

Researchers have called on European policymakers to make adequate resources available to tackle pancreatic cancer, a disease that is almost invariably fatal and where little progress has been made over the past 40 years.

In the latest predictions for cancer deaths in the EU and UK for 2021, published in the leading cancer journal Annals of Oncology [1] today (Monday), researchers led by Carlo La Vecchia (MD), a professor at the University of Milan (Italy), say that pancreatic cancer death rates are predicted to remain approximately stable for men but to continue rising in women in most EU countries.

The researchers predict that 42,300 and 5,000 men in the EU and UK respectively will die from pancreatic cancer by the end of this year. After adjusting for differences in age distribution in the population, the age standardised rate (ASR) of deaths in men will be eight per 100,000 and 6.5 per 100,000 in the EU and UK respectively this year [2]. This represents a 0.8% decline in death rates since 2015. Among women, six per 100,000 are predicted to die from the disease in the EU, representing a 0.6% increase since 2015. In the UK, five women per 100,000 are predicted to die, representing a 4% decline in the death rate.
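
For readers unfamiliar with age standardisation, the following minimal example (with invented numbers, not the study's data) shows how an ASR is computed: age-specific death rates are averaged using fixed standard-population weights, so that populations with different age structures can be compared fairly.

    # Illustrative ASR calculation; the rates and weights are hypothetical.
    age_specific_rates = {"50-59": 3.0, "60-69": 9.0, "70+": 25.0}  # deaths per 100,000
    standard_weights = {"50-59": 0.5, "60-69": 0.3, "70+": 0.2}     # sum to 1

    asr = sum(age_specific_rates[g] * standard_weights[g] for g in standard_weights)
    print(f"ASR = {asr:.1f} per 100,000")  # 3*0.5 + 9*0.3 + 25*0.2 = 9.2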

In contrast, the researchers predict that death rates for nine out of ten of the other major cancers will decline by 7% in men and 5% in women between 2015 and 2021 in most EU countries and the UK.

Prof La Vecchia said: "Among the major cancers, pancreatic cancer is the fourth most common and remains the only one showing no overall fall in death rates over the past three decades in Europe in both sexes. It is important that governments and policymakers provide adequate resources for the prevention, early diagnosis and management of pancreatic cancer in order to improve these trends in the near future.

"If the cancer is detected early, it is easier to treat successfully, but most cases are advanced by the time of diagnosis. Avoiding smoking and excessive alcohol consumption, controlling weight and, hence, diabetes are the main ways we know to help to prevent the disease, but they only account for a proportion of cases. New, targeted drugs are leading to some improvement in treatment, but it's difficult to quantify their potential impact at present."

The researchers analysed cancer death rates in the EU 27 Member States [3] as a whole and added the UK in order to be able to compare with previous years when the UK was still a member of the EU. They also looked at the six most populous countries - France, Germany, Italy, Poland, Spain and the UK - for all cancers, and, individually, for stomach, intestines, pancreas, lung, breast, uterus (including cervix), ovary, prostate, bladder and leukaemias for men and women [4]. This is the eleventh consecutive year the researchers have published these predictions. Prof La Vecchia and his colleagues collected data on deaths from the World Health Organization and Eurostat databases from 1970 to 2016.

They predict there will be a total of 1,443,000 deaths from the ten cancers in the EU (1,267,000) and the UK (176,000) by the end of the year. This corresponds to age standardised death rates of 130 per 100,000 men (down 7% since 2015) and 81 per 100,000 women (down 5%) in the EU. In the UK, the death rates will be 114 per 100,000 men (down 7.5% since 2015) and 89 per 100,000 women (down 4.5%).

Compared to a peak rate of cancer deaths in 1988, over 4.9 million cancer deaths will be avoided in the EU and over one million deaths avoided in the UK during the 33-year period up to 2021. In 2021 alone, 348,000 and 69,000 cancer deaths will be avoided in the EU and UK respectively.

Changes in smoking patterns, improved food storage and better treatments are driving many of the reductions in death rates for cancers such as lung, stomach and breast. However, although lung cancer death rates are falling in men, they are still rising in women in many countries, reflecting the fact that women tended to start smoking later in the twentieth century than men. In the EU, death rates from lung cancer are estimated to be 32 per 100,000 in men (down 10%), but 15 per 100,000 in women (up 7%). The UK is different, with lung cancer death rates down 11.5% in men, at 24 per 100,000, and down 5% in women, at 19 per 100,000.

Co-author, Dr Fabio Levi (MD), emeritus professor at the Faculty of Biology and Medicine, University of Lausanne (Switzerland), said: "Lung cancer death rates in men are 25% lower in the UK than in the 27 European countries because of earlier and larger decreases in smoking prevalence in UK men. This is also reflected in the lower predicted death rates for all cancers in UK men. In the EU, men are stopping smoking, though later than in the UK, which explains the predicted fall in male lung cancer death rates in these countries.

"Lung cancer death rates in UK women are higher than those in the EU countries and this is mirrored in higher female death rates from all cancers in the UK. However, our predictions show a favourable downward trend in UK female lung cancer deaths, in contrast with persistent upward trends in EU women where rates could reach 16 or 18 per 100,000 women in the next decade."

Co-author, Professor Paolo Boffetta (MD), the Annals of Oncology associate editor for epidemiology, professor and associate director for population sciences at Stony Brook University, New York (USA), and professor at the University of Bologna (Italy), said: "Cancer remains the second major cause of death in Europe after cardiovascular disease. Although we predict that death rates in many cancers will decrease this year, the absolute number of deaths from the disease will continue to rise due to aging populations. This underlines the increasing public health importance of the issue. Delayed cancer diagnosis and treatment due to the COVID-19 pandemic may increase the cancer burden over the next several years.

"The results we report this year are particularly important because they stress the fact that trends in mortality from pancreatic cancer and female lung cancer do not show the positive pattern of other major cancers, underlying the need for further efforts for research and control of these neoplasms.

"Measures to continue to improve cancer death rates should include stopping smoking, particularly in women, controlling overweight and alcohol consumption, optimising screening and early diagnosis for breast, bowel and - in central and eastern Europe - cervical cancer too. Up-to-date data management needs to be adopted throughout Europe, particularly in central and eastern Europe, and vaccinations should be widely available for women to eliminate cervical cancer, which is caused by the human papilloma virus, and against hepatitis B, which is linked to liver cancer. Effective treatment of hepatitis C will also contribute to controlling liver cancer."

In an accompanying editorial [5], Professor José Martín-Moreno, from the University of Valencia, Spain, and Ms Suszy Lessof, from the European Observatory on Health Systems and Policies, Brussels, Belgium, write that Prof La Vecchia and his colleagues are to be commended for their 11 years of mortality predictions and that "the key to understanding the past and how to approach the future is data". They believe the analysis gives cause for hope; however, they highlight potential problems from COVID-19 as cancer is a "severe risk factor for COVID-19 infected patients, carrying as it does a higher probability of ICU admission, mechanical ventilation and mortality".

"The positives [from Prof La Vecchia's paper] - the concrete evidence that there is scope for effective action which, over time, leads to positive outcomes - should not mask the shadow of the COVID-19 pandemic. Its impact on cancer patients (and the fear of that impact) is looming. Beyond the direct harm of this new coronavirus to immunocompromised and particularly vulnerable people, there is the blow to comprehensive clinical care and the interruption of research. Perhaps most worrying for the long term is the paralysis of prevention programmes, screening and early diagnosis. Since March 2020, all of the activity linked to progress over recent decades has come to a screeching halt. It is, of course, too early to characterize the impacts, but it seems inevitable they will have marked, if not dramatic, consequences," they write. They conclude: "The possible impact of the COVID-19 pandemic on actual consolidated mortality for 2020, for 2021 and beyond, demands vigilance."

Credit: 
European Society for Medical Oncology

Pioneering research reveals gardens are secret powerhouse for pollinators

image: Residential gardens underpin the urban nectar supply, and many can be extremely rich in flowering plants.

Image: 
Nicholas Tew

Home gardens are by far the biggest source of food for pollinating insects, including bees and wasps, in cities and towns, according to new research.

The study, led by the University of Bristol and published today in the Journal of Ecology, measured for the first time how much nectar is produced in urban areas and discovered residential gardens accounted for the vast majority - some 85 per cent on average.

Results showed three gardens generated on average around a teaspoon of Nature's ambrosia daily - the unique sugar-rich liquid found in flowers which pollinators drink for energy. While a teaspoon may not sound like much to humans, to a bee it is the equivalent of more than a tonne to an adult human, and it is enough to fuel thousands of flying bees. The more bees and fellow pollinators can fly, the greater the diversity of flora and fauna that will be maintained.

Ecologist Nicholas Tew, lead author of the study, said: "Although the quantity and diversity of nectar has been measured in the countryside, this wasn't the case in urban areas, so we decided to investigate.

"We expected private gardens in towns and cities to be a plentiful source of nectar, but didn't anticipate the scale of production would be to such an overwhelming extent. Our findings highlight the pivotal role they play in supporting pollinators and promoting biodiversity in urban areas across the country."

The research, carried out in partnership with the universities of Edinburgh and Reading and the Royal Horticultural Society, examined the nectar production in four major UK towns and cities: Bristol, Edinburgh, Leeds, and Reading. Nectar production was measured in nearly 200 species of plant by extracting nectar from more than 3,000 individual flowers. The extraction process involves using a fine glass tube. The sugar concentration of the nectar was quantified with a refractometer, a device which measures how much light refracts when passing through a solution.

"We found the nectar supply in urban landscapes is more diverse, in other words comes from more plant species, than in farmland and nature reserves, and this urban nectar supply is critically underpinned by private gardens," said Nicholas Tew, who is studying for a PhD in Ecology.

"Gardens are so important because they produce the most nectar per unit area of land and they cover the largest area of land in the cities we studied."

Domestic gardens made up nearly a third (29 per cent) of the land in urban areas - six times the area of parks and 40 times the area of allotments.

"The research illustrates the huge role gardeners play in pollinator conservation, as without gardens there would be far less food for pollinators, which include bees, wasps, butterflies, moths, flies, and beetles in towns and cities. It is vital that new housing developments include gardens and also important for gardeners to try to make sure their gardens are as good as possible for pollinators," Nicholas Tew explained.

"Ways to do this include planting nectar-rich flowers, ensuring there is always something in flower from early spring to late autumn, mowing the lawn less often to let dandelions, clovers, daisies and other plant flowers flourish, avoiding spraying pesticides which can harm pollinators, and avoiding covering garden in paving, decking or artificial turf."

Dr Stephanie Bird, an entomologist at the Royal Horticultural Society, which helped fund the research, said: "This research highlights the importance of gardens in supporting our pollinating insects and how gardeners can have a positive impact through their planting decisions. Gardens should not be seen in isolation - instead they are a network of resources offering valuable habitats and provisions when maintained with pollinators in mind."

Credit: 
University of Bristol

Global study of 48 cities finds nature sanitizes 41.7 million tons of human waste a year

image: This photo shows an interview with a local household in peri-urban Hyderabad.

Image: 
Dilshaad Bundhoo

The first global-scale assessment of the role ecosystems play in providing sanitation finds that nature provides at least 18% of sanitation services in 48 cities worldwide, according to researchers in the United Kingdom and India. The study, published February 19 in the journal One Earth, estimates that more than 2 million cubic meters of the cities' human waste is processed each year without engineered infrastructure. This includes pit latrine waste that gradually filters through the soil--a natural process that cleans it before it reaches groundwater.

"Nature can, and does, take the role of sanitation infrastructure," said Alison Parker, a Senior Lecturer in International Water and Sanitation at Cranfield University in the United Kingdom and one of the authors of the study. "While we are not marginalizing the vital role of engineered infrastructure, we believe a better understanding of how engineered and natural infrastructure interact may allow adaptive design and management, reducing costs, and improving effectiveness and sustainability, and safeguard the continued existence of these areas of land."

Wastewater treatment infrastructure that converts human feces into harmless products is an important tool for global human health. However, more than 25% of the world's population did not have access to basic sanitation facilities in 2017 and another 14% used toilets in which waste was disposed of onsite. While some of this waste may be hazardous to local populations, previous research has suggested that natural wetlands and mangroves, for example, provide effective treatment services. The Navikubo wetland in Uganda processes untreated wastewater from more than 100,000 households, protecting the Murchison Bay and Lake Victoria from harmful contaminants, while in the United States coastal wetlands in the Gulf of Mexico remove nitrogen from the Mississippi River.

"We realized that nature must be providing sanitation services, because so many people in the world do not have access to engineered infrastructure like sewers," adds Simon Willcock, a Senior Lecturer in Environmental Geography in Bangor University, UK, and another author of the study. "But the role for nature was largely unrecognized."

To better understand how natural ecosystems process waste, the team from Bangor University, Cranfield University, Durham University, the University of Gloucestershire, the University of Hyderabad (India) and the Fresh Water Action Network, South Asia, quantified sanitation ecosystem services in 48 cities containing about 82 million people using Excreta Flow Diagrams, which leverage a combination of in-person interviews, informal and formal observations, and direct field measurements to document how human fecal matter flows through a city or town. The researchers assessed all diagrams that were available on December 17th, 2018, focusing on those coded as "fecal sludge contained not emptied" (FSCNE), in which the waste is contained in a pit latrine or septic tank below ground but does not pose a risk to groundwater, for example, because the water table is too deep.

Conservatively, Willcock and colleagues estimate that nature processes 2.2 million cubic meters of human waste per year within these 48 cities. Since more than 892 million people worldwide use similar onsite disposal toilet facilities, they further estimate that nature sanitizes about 41.7 million tons of human waste per year before the liquid enters the groundwater--a service worth about $4.4 billion per year. However, the authors note that these estimates likely undervalue the true worth of sanitation ecosystem services, since natural processes may contribute to other forms of wastewater processing, though these are harder to quantify.

Willcock and colleagues hope that their findings will shed light on an important but often unrecognized contribution that nature makes to many people's everyday lives, inspiring the protection of ecosystems such as wetlands that protect downstream communities from wastewater pollutants.

"We would like to promote a better collaboration between ecologists, sanitation practitioners and city planners to help nature and infrastructure work better in harmony, and to protect nature where it is providing sanitation services," said Parker.

Credit: 
Cell Press

42,000-year-old trees allow more accurate analysis of Earth's last magnetic field reversal

image: Ancient kauri tree log from Ngawha, New Zealand.

Image: 
Nelson Parker

The last complete reversal of the Earth's magnetic field, the so-called Laschamps event, took place 42,000 years ago. Radiocarbon analyses of the remains of kauri trees from New Zealand now make it possible for the first time to precisely time and analyse this event and its associated effects, as well as to calibrate geological archives such as sediment and ice cores from this period. Simulations based on this show that the strong reduction of the magnetic field had considerable effects in the Earth's atmosphere. This is shown by an international team led by Chris Turney from the University of New South Wales in Australia, with the participation of Norbert Nowaczyk from the German Research Centre for Geosciences Potsdam and Florian Adolphi from the Alfred Wegener Institute, in a study that now appears in the journal Science.

The Earth's magnetic field undergoes permanent fluctuations and occasionally even reversals of polarity occur. Their causes, course and effects are not yet fully understood. Researchers have now investigated the so-called Laschamps event in more detail. It refers to the last complete reversal of the polarity of the Earth's magnetic field around 42,000 years ago. Not only did the magnetic field change direction, it also dramatically lost strength over a period of several hundred years.

About 42,000 years ago, the magnetic north pole moved south. During this process, which lasted about 500 years, the magnetic field weakened to between zero and six per cent of its present-day strength. The poles then remained reversed for a period of about 500 years, with a field strength that varied below 28 per cent of today's value, only to reverse again over the course of about 250 years.

This exact chronological classification is now possible by linking different data sets. Firstly, the researchers used results on the Earth's magnetic field obtained in 2013 by Norbert Nowaczyk and his team from sediment cores from the Black Sea, which were matched with Greenland ice cores via climate variations documented in both archives at the same time.

Secondly, the exact analysis and dating of the events was only made possible by the radiocarbon (14C) analysis of a sub-fossil kauri tree that grew in the wetlands of Ngawha in northern New Zealand for around 1700 years during the period in question and was subsequently very well preserved in the swamps.

Chris Turney had reported on this finding from about 40,000 years ago during a visit to the German Research Centre for Geosciences in Potsdam (GFZ) a few years ago. "As a geomagnetic scientist, I immediately had a link to the Laschamps event in mind and suggested 14C analyses, which had not yet been done on trees from that time," says Nowaczyk, who heads the Laboratory for Palaeo- and Rock Magnetism at the GFZ.

The background: With the dwindling of the magnetic field, the Earth loses, at least in part, an important protective shield against cosmic radiation. This is reflected in increased levels of the radioactive carbon isotope 14C in the trees, the result of increased formation of 14C in the Earth's atmosphere as nitrogen is bombarded by high-energy, electrically charged cosmic particles.

"The sub-fossil kauri trees are an exciting archive of atmospheric composition," says Florian Adolphi, palaeoclimatologist at the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI). These trees can live for several thousand years and record annual variations in atmospheric radiocarbon content as they grow, which the research team measured precisely.

"These data improve the calibration curve for radiocarbon dating, allowing more accurate dating of a wide range of climate archives and fossils. They also allow a direct comparison to ice cores: beryllium isotopes measured there show similar variations to the radiocarbon in the trees, as the production of both isotopes in the Earth's atmosphere depends on the intensity of cosmic rays hitting the Earth," explains the study's co-author. He uses this effect to synchronise trees and ice cores with high precision and reduce the uncertainty of comparing the two archives from several thousand years to about 100 years.

To investigate further effects of the weak Earth's magnetic field on the atmosphere and thus also on the global climate, the researchers carried out simulations of atmospheric chemistry. Among other things, they found a decrease in ozone. "Unfiltered radiation from space was breaking up air particles in the Earth's atmosphere, separating electrons and emitting light - a process called ionisation," Turney explains. "The ionised air 'sizzled' the ozone layer." This triggered a wave of changes in the atmosphere, including increased dazzling light shows that we know as the aurora borealis, which at the time may have been observed not only near the poles but across the globe.

It is important to further analyse the effects of a weak magnetic field in view of current developments, says Nowaczyk, because the Earth's magnetic field has already been weakening for about 2,000 years. Compared to the first direct measurements 170 years ago, a weakening of nine per cent has been observed, and in the South Atlantic region as much as thirty per cent. Whether this means that a pole reversal is in the offing within the next one to two thousand years is debatable. However, a collapse of this natural radiation shield would pose a great challenge to our present-day society, which depends heavily on electronics.

On the basis of these new possibilities for the chronological classification of the events 42,000 years ago, the main authors of the study put forward even more far-reaching hypotheses about the effects of the Earth's magnetic field reversal - for example with regard to the extinction of the Neanderthals or the onset of cave paintings. Nowaczyk does not rule out the possibility that there are causal connections here, but considers it rather unlikely.

Credit: 
GFZ GeoForschungsZentrum Potsdam, Helmholtz Centre

Deep brain stimulation prevents epileptic seizures in mouse model

image: When the hippocampus was stimulated at a low frequency, epileptic seizures failed to occur in the mouse model.

Image: 
Image source: Medical Center - University of Freiburg / AG Haas

Epileptic activity originating from one or more diseased brain regions in the temporal lobe is difficult to contain. Patients with so-called temporal lobe epilepsy often do not respond to treatment with anti-epileptic drugs, and the affected brain areas must therefore be surgically removed. Unfortunately, this procedure brings seizure freedom to only about one third of patients, so the development of alternative therapeutic approaches is of great importance. Scientists led by neurobiologist Prof. Dr. Carola Haas, head of the research group at the Department of Neurosurgery at Medical Center - University of Freiburg and the BrainLinks-BrainTools research center, have investigated a new therapeutic approach to prevent epileptic seizures in temporal lobe epilepsy. They showed in mice that low-frequency stimulation of specific brain areas could completely stop epileptic activity. Instead of using electric current, the researchers stimulated the cells with light. To do this, they had previously introduced a light-sensitive molecule into the cells that allows particularly precise stimulation. They published the results in December 2020 in the scientific journal eLife.

"As soon as we stimulated the brain region with a frequency of one hertz, the epileptic seizures disappeared. This effect was stable over several weeks," Haas says. Habituation, which can occur with drug therapy, did not take place. The brain region was stimulated for one hour daily.

Circuits and cells identified

In temporal lobe epilepsy, the hippocampus is often pathologically altered and usually represents the so-called focus of epileptic activity. Previous studies have used precise genetic labeling techniques to map the fiber system and its synaptic contacts between the temporal lobe and hippocampus, which are typically preserved in temporal lobe epilepsy. The researchers used this fiber system to manipulate hippocampal activity in a specific and temporally precise manner using light-dependent proteins. Measuring brain waves showed that rhythmic activation of the diseased hippocampus at a low frequency of one hertz suppressed epileptic activity and prevented it from spreading.

Haas and her colleagues demonstrated that the anti-epileptic effect is largely due to the repeated activation of surviving granule cells in the seizure focus. Single cell studies confirmed the assumption that the granule cells are less excitable due to the stimulation, making the epileptic seizure less likely to spread. "It's also possible that we have a widespread network effect because the stimulation can spread through the hippocampal circuitry," Haas said.

In the future, the team, together with the medical physics department at the Medical Center - University of Freiburg, would like to use magnetic resonance imaging to observe the entire brain during stimulation. This technique could identify additional brain regions that are affected by the stimulation, providing information on how those regions are connected and what further consequences the stimulation has.

Credit: 
University of Freiburg

Communal activities boost rehabilitation for older adults in long term care

image: Participants cleaning a park and watering flower beds

Image: 
Tohoku University

A group of researchers has developed a new program showing that participation and activity are critical for the rehabilitation of older adults in long-term care.

The results of their research were published in the journal PLOS ONE on February 12, 2021.

"Our study shows participatory programs that encourage elderly patients to be active need greater emphasis in elderly care centers," said Yoshihiko Baba, lead author of the study.

In 2015, the Ministry of Health, Labour and Welfare of Japan launched a comprehensive plan to care for the country's aging population. Crucial to this was rehabilitation centered on promoting activities that elderly patients could actively take part in.

Baba, a former graduate student at the Department of Internal Medicine and Rehabilitation Science at Tohoku University Graduate School of Medicine, and his supervisor, Professor Masahiro Kohzuki, developed a program that fostered participation in activities such as park cleaning, gardening, and shopping. Called the Adachi Rehabilitation Program (ARP), it was implemented at 13 small-scale multifunctional at-home care (SMAC) facilities in Adachi Ward, Tokyo.

A round of ARP comprises four weekly sessions. In the first session, participants take a bus to buy cleaning tools and seeds. In the following three sessions, they spend one hour cleaning and maintaining flower beds in a nearby park. Participants are also encouraged to go to the park outside of the sessions.

The Japanese long-term care insurance system designates the amount of care needed according to seven levels: those at level one require minimal care, while those at level seven require chronic care. ARP focused on those at the lower end of the spectrum.

Baba and his team conducted a controlled study for three courses (12 weeks) of ARP at the SMAC facilities.

As expected, step counts increased on days when participants ventured out to parks and shopping centers. However, the research team also discovered that participation in ARP increased participants' step counts even on days when there were no sessions.

"ARP may have led to a behavioral change in which those under long-term care became more motivated to go out," added Baba. "Ultimately, community rehabilitation in long-term care insurance services can improve the physical activity of older adults."

Credit: 
Tohoku University

Sweet marine particles resist hungry bacteria

image: This Airyscan super-resolution image shows that the fucose-containing sulphated polysaccharide FCSP (in green) occurred around the cells of the chain-forming diatom Chaetoceros socialis and their spines. Sample collected during the 2016 spring diatom bloom off Helgoland.

Image: 
Max Planck Institute for Marine Microbiology/S. Vidal-Melgosa

A major pathway for carbon sequestration in the ocean is the growth, aggregation and sinking of phytoplankton - unicellular microalgae like diatoms. Just like plants on land, phytoplankton sequester carbon from atmospheric carbon dioxide. When algal cells aggregate, they sink and take the sequestered carbon with them to the ocean floor. This so-called biological carbon pump accounts for about 70 per cent of the annual global carbon export to the deep ocean. An estimated 25 to 40 per cent of the carbon dioxide emitted by humans through fossil fuel burning may have been transported by this process from the atmosphere to depths below 1,000 meters, where carbon can be stored for millennia.

Fast bacterial community

Yet, important as it is, how the carbon pump works at the molecular level is still poorly understood. Scientists of the Marine Glycobiology research group, based at the Max Planck Institute for Marine Microbiology and MARUM - Center for Marine Environmental Sciences at the University of Bremen, investigate in this context marine polysaccharides - compounds made of multiple sugar units - that are produced by microalgae. These marine sugars are structurally highly diverse and are among the most complex biomolecules found in nature. A single bacterium cannot process this complex sugar mix on its own; a whole suite of metabolic pathways and enzymes is needed. In nature, this is achieved by a community of different bacteria that work closely and very efficiently together - a perfectly coordinated team. This bacterial community works so well that the major part of the microalgal sugars is degraded before the cells aggregate and start to sink. A large share of the sequestered carbon is therefore released back into the atmosphere.

But how is it possible that a lot of carbon is nevertheless still transported to the deep sea? The scientists of the Marine Glycobiology group have now identified a component that may be involved in this process and published their results in the journal Nature Communications. "We found a microalgal fucose-containing sulphated polysaccharide, in short FCSP, that is resistant to microbial degradation," says Silvia Vidal-Melgosa, first author of the paper. "This discovery challenges the existing paradigm that polysaccharides are rapidly degraded by bacteria." That assumption is the reason why sugars have been overlooked as a carbon sink - until now. Analyses of the bacterial community, performed by scientists from the Department of Molecular Ecology at the MPI in Bremen and the University of Greifswald, showed that the bacteria had a low abundance of enzymes for degrading this sugar.

A crucial part of the finding is that this microbially resistant sugar formed particles. During growth and upon death, unicellular diatoms release large amounts of sticky, long-chained sugars of unknown structure. With increasing concentration, these sugar chains stick together and form molecular networks. Other components attach to these small sugar flakes, such as further sugar pieces, diatom cells or minerals. This makes the aggregates larger and heavier, so they sink faster than single diatom cells. Even so, such particles need about ten days to reach a depth of 1,000 meters - often much longer. This means that the sticky sugar core has to resist biodegradation for at least that long to hold the particle together - a tall order, as the sugar-eating bacteria are very active and always hungry.
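
For scale, the figures above imply a sinking speed on the order of

$v \approx \frac{1000\ \text{m}}{10\ \text{days}} = 100\ \text{m per day},$

so the sugar core must withstand bacterial attack for at least ten days or so before the particle reaches depths where carbon can be stored long-term. (This is a back-of-the-envelope estimate derived from the article's own numbers, not a value reported by the study.)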

New method to analyse marine sugars

In order to unravel the structures of microalgal polysaccharides and identify resistant sticky sugars, the scientists of the Marine Glycobiology group are testing new methods; this is necessary because marine sugars are found within complex mixtures of organic matter. For this study, they used a method that originates from medical and plant research and combines the high-throughput capacity of microarrays with the specificity of monoclonal antibody probes. The scientists extracted the sugar molecules from the seawater samples and fed them into a machine that works like a printer - one that uses molecules instead of ink. The molecules are "printed" separately onto nitrocellulose paper in the form of a microarray, which is like a microchip: small as a fingernail, yet able to hold hundreds of samples. Once the extracted molecules are printed onto the array, the sugars present on them can be analysed using the monoclonal antibody probes: single antibodies are added to the arrays, and since each reacts with only one specific sugar, the scientists can see which sugars are present in the samples.
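
A minimal sketch in Python of how such a microarray readout can be interpreted - with hypothetical probe names, samples and intensity values, not the study's actual data or pipeline: each printed sample is tested against antibody probes, each specific to one sugar, and a spot signal above background indicates that the sugar is present.

import numpy as np

probes = ["anti-FCSP", "anti-laminarin", "anti-mannan"]  # hypothetical probes
samples = ["bloom_week1", "bloom_week3", "bloom_week5"]  # hypothetical samples

# Spot intensities (rows: samples, columns: probes) - illustrative values
# echoing the study's pattern: the FCSP signal accumulates over the bloom,
# while other sugars are degraded.
intensity = np.array([
    [0.10, 0.80, 0.40],
    [0.45, 0.30, 0.20],
    [0.90, 0.05, 0.10],
])

background = 0.15  # threshold above which a sugar is called "present"

for i, sample in enumerate(samples):
    detected = [p for j, p in enumerate(probes) if intensity[i, j] > background]
    print(f"{sample}: {', '.join(detected) or 'no sugars detected'}")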

"The novel application of this technology enabled us to simultaneously monitor the fate of multiple complex sugar molecules during an algal bloom," says Silvia Vidal-Melgosa. "It allowed us to find the accumulation of the sugar FCSP, while many other detected polysaccharides were degraded and did not store carbon." This study proves the new application of this method. "Notably, complex carbohydrates have not been measured in the environment before at this high molecular resolution," says Jan-Hendrik Hehemann, leader of the group Marine Glycobiology and senior author of the study. "Consequently, this is the first environmental glycomics dataset and therefore the reference for future studies about microbial carbohydrate degradation".

Next step: Search for particles in the deep sea

The discovery of FCSP in diatoms, with its demonstrated stability and adhesive properties, provides a previously uncharacterised polysaccharide that contributes to particle formation and therefore, potentially, to carbon sequestration in the ocean. One of the next steps is "to find out if the particles of this sugar exist in the deep ocean," says Hehemann. "That would indicate that the sugar is stable and constitutes an important player in the biological carbon pump." Furthermore, the observed stability against bacterial degradation, together with the structure and physicochemical behaviour of diatom FCSP, points towards specific biological functions. "Given its stability against degradation, FCSP, which coats the diatom cells, may function as a barrier protecting the cell wall against microbes and their digestive enzymes," says Hehemann. Last but not least, an open question remains: these sugar particles were found in the North Sea near the island of Helgoland. Do they also exist in other regions of the world's oceans?

Credit: 
Max Planck Institute for Marine Microbiology