
IKBFU scientists create the first diamond X-ray microlens

image: Ivan Lyatun and Polina Medvedskaya

Image: 
Immanuel Kant Baltic Federal University

After fourth-generation synchrotrons were invented (particle accelerators that are, in effect, giant research facilities), there was an urgent need for fundamentally new optics that could withstand the high temperatures and radiation loads created by a powerful X-ray beam.

Scientists currently use metal and polymer lenses, but these are short-lived and distort the image they produce.

Several days ago, the journal Optics Express published an article by IKBFU scientists proposing an innovative method for producing diamond X-ray microlenses.

Polina Medvedskaya, a scientist at the "Coherent Optics for Megascience-Class Facilities" research center, told us:

"A diamond is a unique and expensive material. But it is almost indestructible which makes the lens made of it more economically profitable than metallic or polymeric ones in the long run. The problem is that a diamond is the hardest material on the planet and it is extremely difficult to process. But we have found a way to do it. The IKBFU scientists used an electron-ion microscope (FIB) to process it"

Another scientist from the center, Ivan Lyatun, explained:

"It is concerning for a layman how is it possible to process something by using a microscope? But we have the microscope set on certain configuration that allows it not only to be used as a tool for analysis, but also to shape objects in the necessary form. Using microscopes like the one we make ultra-thin, nano-level cuts. And so we decided to make a microlens"

The result exceeded the scientists' expectations. A series of lenses thinner than a human hair was produced, and they may be used in the most powerful synchrotrons and X-ray lasers. According to the scientists, the new lenses will make it possible to obtain more detailed information about any material: to study the structure of nanostructures and to obtain maximum information about protein crystals, which in turn will make it possible to synthesize new drugs.

In short, the new diamond lenses will allow us to penetrate deeper into the secrets of matter and reveal what was previously hidden from the human eye.

Credit: 
Immanuel Kant Baltic Federal University

Cell-based test shows potential to predict which drugs and chemicals cause birth defects

Madison, WI (March 3, 2020): Recently published results from an evaluation of 1,065 chemical and drug substances using the devTOX quickPredict (devTOXqP) screening platform developed by Stemina Biomarker Discovery, Inc. demonstrated the platform's ability to predict developmental toxicity in humans with high accuracy using a cell-based test. In a peer-reviewed article published in Toxicological Sciences, scientists from the U.S. Environmental Protection Agency (EPA) reported that Stemina's devTOXqP test predicted the potential for developmental toxicity in a blinded set of chemicals and drugs from the agency's ToxCast™ program with an accuracy of 82%, where there was clear evidence of toxicity in humans or in animal studies. The agency's research suggests that devTOXqP is a useful tool for predicting developmental toxicants in humans and reducing the need for animal testing.

"Through this EPA research, the devTOXqP test demonstrated its potential to detect developmental toxicity across a wide variety of chemicals and pharmaceutical compounds," said Elizabeth Donley, J.D., M.B.A., M.S., chief executive officer of Stemina. "In addition to helping meet the EPA's goal of reducing the use of animals in testing chemicals, devTOXqP offers the only species-specific, commercially available platform for evaluating a drug or chemical's potential to cause birth defects in the developing human embryo. We believe this test fits well into global initiatives such as Tox21 and REACH that are seeking to reduce and refine the number of animals used for toxicity testing."

The potential of chemical substances to cause prenatal developmental toxicity is commonly assessed based on observations of fetal malformations and variations in rodent or rabbit studies. Such animal studies are costly, resource-intensive, and the results seen in one species often differ from those seen in other species or from those that might be relevant in humans. Some of the most promising non-animal testing alternatives make use of the self-organizing potential of embryonic stem cells to recapitulate developmental processes that may be sensitive to chemical exposure. The Stemina devTOXqP test uses human embryonic stem cells (hESCs) or human induced pluripotent stem cells (iPSCs) to predict developmental toxicity based on changes in cellular metabolism following drug or chemical exposure. The iPSC is a reprogrammed cell that is able to recapitulate development into all cell types like hESCs but does not come from an embryo.

In the EPA study, 1,065 ToxCast chemicals were first screened at a single concentration for the targeted biomarker, the ratio of the amino acids ornithine (ORN) to cystine (CYSS), in response to each tested compound. Of the screened chemicals, 17% were predicted by the Stemina assay to cause developmental toxicity. These compounds were then tested at eight concentrations in the devTOXqP test to determine the exposure level at which each compound was considered toxic. The assay reached 82% accuracy with 67% sensitivity and 84% specificity. The sensitivity of the assay improved when more stringent evidence of toxicity was required in the animal studies. Statistical analysis of the most potent chemical hits on specific biochemical targets in ToxCast provided insights into the mechanistic underpinnings of the targeted endpoint of the devTOXqP platform. The researchers found that an imbalance in the ornithine/cystine ratio was highly predictive of a chemical's potential to disrupt the development of an embryo or fetus, halting the pregnancy or producing birth defects.
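For readers who want the arithmetic behind those headline figures, the sketch below shows how accuracy, sensitivity and specificity fall out of a confusion matrix. The counts are hypothetical placeholders chosen only to land near the reported rates; they are not the study's data.

```python
# Minimal sketch: binary-classification metrics from a confusion matrix.
# tp/fp/tn/fn counts are hypothetical, NOT the EPA study's data.
def classification_metrics(tp, fp, tn, fn):
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # share of true toxicants flagged
    specificity = tn / (tn + fp)   # share of non-toxicants cleared
    return accuracy, sensitivity, specificity

acc, sens, spec = classification_metrics(tp=40, fp=25, tn=135, fn=20)
print(f"accuracy={acc:.0%} sensitivity={sens:.0%} specificity={spec:.0%}")
```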

"The extensive nature of this research helps to define the applicability domain of the test -- in other words, where does it perform well and where will it need to be paired with other endpoints to generate a better understanding of the potential to cause birth defects", said Jessica Palmer, M.S., associate director of toxicology at Stemina. "This is just the first step in our longer-term goal of moving away from a reliance on animal tests to predict human response and provides a foundation for building integrated testing systems focused on human cells."

"To address concerns about the effects of drugs and chemicals on our health, we need more human-relevant methods to assess toxicity," said Kristie Sullivan, M.P.H., vice president for research policy at the Physician's Committee for Responsible Medicine. "We encourage regulatory agencies and companies to consider how the devTOXqP test can improve safety and help reduce animal testing."

Credit: 
Bioscribe

Unstable rock pillars near reservoirs can produce dangerous water waves

image: Three types of impulse waves created when unstable rock pillars collapse into a body of water

Image: 
Huang Bolin

WASHINGTON, March 3, 2020 -- In many coastal zones and gorges, unstable cliffs often fail when the foundation rock beneath them is crushed. Large water waves can be created, threatening human safety.

In this week's Physics of Fluids, from AIP Publishing, scientists in China reveal the mechanism by which these cliffs collapse and how large, tsunami-like waves, known as impulse waves, are created. Few experimental studies of this phenomenon have been carried out, so this work represents valuable new data that can be used to protect against impending disasters.

The experiments were carried out in a transparent, rectangular box. A large body of water was placed at one end of the box and a granular pile at the other, separated from the water by a movable gate. When the gate is lifted quickly, the pile collapses and slides into the water, generating waves.

The shape of the granular particles was chosen to resemble samples from the Three Gorges Reservoir area in China. The motion of the body of water was tracked using additional floating and suspended particles, whose movements were recorded by a camera outside the box at 100 frames per second.

The investigators varied the width and height of the granular pile and, separately, the height of the body of water. The pile height-to-width ratio was found to be critical, determining how the pile collapses as well as the types of impulse waves produced.

The pile collapsed in four stages, eventually pushing water away. The resulting water movements were transferred back to the pile under certain conditions, creating vortices. Three types of water waves were generated: transition waves, solitary waves and bores.

Transition waves decay gradually as they propagate. Solitary waves, though, consist of a single water crest that moves rapidly, without decreasing its amplitude. In a bore, the top of the wave breaks and spills forward.

By fitting the observed experimental data to a formula, the investigators developed a way to predict which type of wave would be produced.

"These formulas are highly suitable for the collapse of partially submerged granular piles," said co-author Huang Bolin.

Credit: 
American Institute of Physics

NYU Abu Dhabi researchers design new technology for targeted cancer drug delivery

Fast facts:

Conventional anticancer drugs (chemotherapeutics) suffer from a number of issues, including poor solubility, short blood circulation time, lack of selectivity, and toxicity to healthy tissue.

Nanomedicine has the potential to overcome these intrinsic limitations of conventional chemotherapy.

Nanomedicine involves the use of nanocarriers to deliver drugs to their target.

NYUAD researchers have developed nanocarriers to effectively deliver chemotherapeutics specifically to cancer cells, while minimizing exposure of healthy tissue.

Findings have been published in an article, pH-Responsive High Stability Polymeric Nanoparticles for Targeted Delivery of Anticancer Therapeutics, in the journal Communications Biology, and selected by the editors as a research highlight.

Abu Dhabi, UAE -- March 3, 2020: A team of researchers at NYU Abu Dhabi has developed a biocompatible, biodegradable, and economical nanocarrier for safer and more effective delivery of anticancer drugs. The researchers demonstrate that the novel pH-responsive hybrid (i.e. multi-component) nanoparticles can be loaded with a wide range of chemotherapeutics to effectively and specifically target cancer cells, as reported in their paper published on March 3, 2020, in the journal Communications Biology.

Conventional chemotherapy drugs work primarily by interfering with DNA replication and mitosis (a type of cell division) in order to induce cell death in rapidly dividing cancer cells, thereby minimizing tumor growth. The limitations of conventional chemotherapy include poor solubility, short blood circulation time, lack of selectivity, toxicity to healthy tissue, drug resistance, and tumor recurrence. Consequently, it becomes necessary to administer high doses of chemotherapeutics to ensure that a sufficient amount reaches the tumor to cause the desired effect in cancer cells. Unfortunately, the high drug doses lead to damage to healthy tissue, resulting in a range of side-effects that include nausea, hair loss, fatigue, decreased resistance to infection, infertility and organ damage.

Cancer nanomedicine - the use of nanocarriers to diagnose, track, and treat cancer - has the potential to overcome the limitations of conventional chemotherapeutics. However, the practical application of many nanocarriers as cancer drug delivery systems is often hampered by a number of issues, including poor circulation stability, inadequate accumulation in target tumor tissue and inefficient uptake and/or transport in target cancer cells.

As reported in the paper, pH-Responsive High Stability Polymeric Nanoparticles for Targeted Delivery of Anticancer Therapeutics, NYUAD's Magzoub lab, in collaboration with Professor Francisco N. Barrera's lab at the University of Tennessee at Knoxville, has developed nanocarriers that can overcome complications associated with conventional chemotherapeutics as well as with current nanocarriers.

"We used a simple approach and readily available low-cost materials to prepare biocompatible and biodegradable pH-responsive hybrid nanoparticles for the effective delivery of chemotherapeutics specifically to tumor cells," said Loganathan Palanikumar, a research associate in the Magzoub lab and first author of the study. "Thus, unlike many nanocarriers, which require complex chemistry and costly equipment and materials, our nanoparticles can be easily prepared and used by other researchers, even those with limited resources," added Palanikumar.

The nanoparticles consist of a US Food and Drug Administration (FDA)-approved polymer core wrapped with a biocompatible and biodegradable protein shell. The core can be loaded with a wide range of cancer therapeutics. Designed to prolong the blood circulation time, the protein shell also serves to ensure that the nanocarrier remains stable long enough to reach the target location. Finally, the nanocarrier is decorated with a pH-responsive peptide, developed by the Barrera lab, which facilitates cellular uptake specifically in cancer cells within the acidic environment of solid tumors. Following efficient cellular uptake, the unique conditions within cancer cells degrade the hybrid nanoparticles and release the loaded chemotherapeutic drugs.

Palanikumar, together with cancer researchers Sumaya Al-Hosani and Mona Kalmouni in the Magzoub lab, Vanessa Nguyen, at the time a graduate student in the Barrera lab, and Research Instrumentation Scientists Liaqat Ali and Renu Pasricha of the Core Technology Platforms at NYUAD, extensively characterized the properties of the nanocarrier in a wide range of cancer cell lines and in tumor-bearing mice. The drug-loaded hybrid nanoparticles showed potent anticancer activity, leading to a substantial reduction in tumor volume and mass and prolonged survival, while exhibiting no adverse effects on healthy tissue.

"These novel pH-responsive hybrid nanoparticles are a highly promising cancer drug delivery platform that combines high stability with effective tumor targeting and triggered release of chemotherapeutic agents in cancer cells," said NYUAD Assistant Professor of Biology Mazin Magzoub.

Credit: 
New York University

Boosting energy levels within damaged nerves may help them heal

When the spinal cord is injured, the damaged nerve fibers--called axons--are normally incapable of regrowth, leading to permanent loss of function. Considerable research has been done to find ways to promote the regeneration of axons following injury. Results of a study performed in mice and published in Cell Metabolism suggest that increasing energy supply within these injured spinal cord nerves could help promote axon regrowth and restore some motor functions. The study was a collaboration between the National Institutes of Health and the Indiana University School of Medicine in Indianapolis.

"We are the first to show that spinal cord injury results in an energy crisis that is intrinsically linked to the limited ability of damaged axons to regenerate," said Zu-Hang Sheng, Ph.D., senior principal investigator at the NIH's National Institute of Neurological Disorders and Stroke (NINDS) and a co-senior author of the study.

Like gasoline for a car engine, the cells of the body use a chemical compound called adenosine triphosphate (ATP) for fuel. Much of this ATP is made by cellular power plants called mitochondria. In spinal cord nerves, mitochondria can be found along the axons. When axons are injured, the nearby mitochondria are often damaged as well, impairing ATP production in injured nerves.

"Nerve repair requires a significant amount of energy," said Dr. Sheng. "Our hypothesis is that damage to mitochondria following injury severely limits the available ATP, and this energy crisis is what prevents the regrowth and repair of injured axons."

Adding to the problem is the fact that, in adult nerves, mitochondria are anchored in place within axons. This forces damaged mitochondria to remain in place while making it difficult to replace them, thus exacerbating the local energy crisis in injured axons.

The Sheng lab, one of the leading groups studying mitochondrial transport, previously created genetically engineered mice that lack the protein--called Syntaphilin--that tethers mitochondria in axons. In these "knockout mice" the mitochondria are free to move throughout axons.

"We proposed that enhancing transport would help remove damaged mitochondria from injured axons and replenish undamaged ones to rescue the energy crisis" said Dr. Sheng.

To test whether this has an impact on spinal cord nerve regeneration, the Sheng lab collaborated with Xiao-Ming Xu, M.D., Ph.D. and colleagues from the Indiana University School of Medicine, who are experts in modeling multiple types of spinal cord injury.

"Spinal cord injury is devastating, affecting patients, their families, and our society," said Dr. Xu. "Although tremendous progress has been made in our scientific community, no effective treatments are available. There is definitely an urgent need for the development of new strategies for patients with spinal cord injury."

When the researchers examined three injury models in the spinal cord and brain, they observed that Syntaphilin knockout mice had significantly more axon regrowth across the injury site compared to control animals. The newly grown axons also made appropriate connections beyond the injury site.

When the researchers looked at whether this regrowth led to functional recovery, they saw some promising improvement in fine motor tasks in mouse forelimbs and fingers. This suggested that increasing mitochondrial transport and thus the available energy to the injury site could be key to repairing damaged nerve fibers.

To test the energy crisis model further, mice were given creatine, a bioenergetic compound that enhances the formation of ATP. Both control and knockout mice that were fed creatine showed increased axon regrowth following injury compared to mice fed saline instead. More robust nerve regrowth was seen in the knockout mice that got the creatine.

"We were very encouraged by these results," said Dr. Sheng. "The regeneration that we see in our knockout mice is very significant, and these findings support our hypothesis that an energy deficiency is holding back the ability of both central and peripheral nervous systems to repair after injury."

Dr. Sheng also points out that these findings, while promising, are limited by the need to genetically manipulate mice. Mice that lack Syntaphilin show long-term effects on regeneration, while creatine alone produces only modest regeneration. Future research is required to develop therapeutic compounds that are more effective in entering the nervous system and increasing energy production for possible treatment of traumatic brain and spinal cord injury.

Credit: 
NIH/National Institute of Neurological Disorders and Stroke

Plastic from wood

image: Lignin is a promising raw material (left) for the production of thermoplastics (right).

Image: 
KTH Stockholm, Marcus Jawerth

The biopolymer lignin is a by-product of papermaking and a promising raw material for manufacturing sustainable plastic materials. However, the quality of this naturally occurring product is not as uniform as that of petroleum-based plastics. An X-ray analysis carried out at DESY reveals for the first time how the internal molecular structure of different lignin products is related to the macroscopic properties of the respective materials. The study, which has been published in the journal Applied Polymer Materials, provides an approach for a systematic understanding of lignin as a raw material to allow for production of lignin-based bioplastics with different properties, depending on the specific application.

Lignin is a class of complex organic polymers and responsible for the stability of plants, stiffening them and making them "woody" (i.e. lignification). During paper production, lignin is separated from cellulose. Lignin forms so-called aromatic compounds, which also play a key role in manufacturing synthetic polymers or plastics. "Lignin is the biggest source of naturally occurring aromatic compounds, but until now it has been viewed by the paper industry primarily as a by-product or a fuel," explains Mats Johansson from the Royal Institute of Technology (KTH) in Stockholm, who led the research team. "Millions of tonnes of it are produced every year, providing a steady stream of raw material for new potential products."

Some first applications of hard lignin-based plastics (thermosets) already exist. However, their properties often vary, and until now it has been difficult to control them precisely. The Swedish team has now shed light on the nanostructure of different fractions of commercially available lignin at DESY's X-ray source PETRA III. "It turns out that there are lignin fractions with larger and smaller domains," reports principal author Marcus Jawerth of KTH Stockholm. "This can offer certain advantages, depending on the particular application: it makes the lignin harder or softer by altering the so-called glass transition temperature at which the biopolymer adopts a viscous state."

Among other things, the X-ray analysis revealed that those types of lignin whose central benzene rings are arranged in the shape of a T are particularly stable. "The molecular structure affects the macroscopic mechanical properties," explains DESY's Stephan Roth, who is in charge of the P03 beamline at which the experiments were conducted and who co-authored the paper. "This is the first time this has been characterised." As a natural product, lignin comes in numerous different configurations. Further studies are needed to provide a systematic overview of how different parameters affect the properties of the lignin. "This is very important in order to be able to manufacture materials reproducibly, and in particular to predict their properties," says Roth, who is also a professor at KTH Stockholm. "If you want to use a material industrially, you need to understand its molecular structure and know how this is correlated with the mechanical properties."

According to Jawerth, up to two thirds of the lignin produced during the paper production process could be turned into polyesters and serve as a starting material for making plastics. "Along with cellulose and chitin, lignin is one of the most ubiquitous organic compounds on Earth and offers enormous potential for replacing petroleum-based plastics," says the scientist. "It's far too valuable to simply burn it."

Credit: 
Deutsches Elektronen-Synchrotron DESY

New version of Earth model captures detailed climate dynamics

image: Water vapor (gray) and sea surface temperature (blue to red) from the high-resolution E3SMv1. Just above center you can see a hurricane and the track of cold water (green) it produces trailing behind it.

Image: 
Mat Maltrud / Los Alamos National Laboratory

Earth supports a breathtaking range of geographies, ecosystems and environments, each of which harbors an equally impressive array of weather patterns and events. Climate is an aggregate of all these events averaged over a specific span of time for a particular region. Looking at the big picture, Earth’s climate just ended the decade on a high note — although not the type one might celebrate.

In January, several leading U.S. and European science agencies reported 2019 as the second-hottest year on record, closing out the hottest decade. July went down as the hottest month ever recorded.

Using new high-resolution models developed through the U.S. Department of Energy’s (DOE) Office of Science, researchers are trying to predict these kinds of trends for the near future and into the next century, hoping to provide the scientific basis to help mitigate the effects of extreme climate on energy, infrastructure and agriculture, among other essential services required to keep civilization moving forward.

Seven DOE national laboratories, including Argonne National Laboratory, are among a larger collaboration working to advance a high-resolution version of the Energy Exascale Earth System Model (E3SM). The simulations they developed can capture the most detailed dynamics of climate-generating behavior, from the transport of heat through ocean eddies — advection — to the formation of storms in the atmosphere.

“E3SM is an Earth system model designed to simulate how the combinations of temperature, winds, precipitation patterns, ocean currents and land surface type can influence regional climate and built infrastructure on local, regional and global scales,” explains Robert Jacob, Argonne’s E3SM lead and climate scientist in its Environmental Science division. “More importantly, being able to predict how changes in climate and water cycling respond to increasing carbon dioxide (CO2) is extremely important in planning for our future.”

“Climate change can also have big impacts on our need and ability to produce energy, manage water supplies and anticipate impacts on agriculture,” he adds, “so DOE wants a prediction model that can describe climate changes with enough detail to help decision-makers.”


Facilities along our coasts are vulnerable to sea level rise caused, in part, by rapid glacier melt, and many energy outages are the result of extreme weather and the precarious conditions it can create. For example, 2019’s historically heavy rainfalls caused damaging floods in the central and southern United States, and hot, dry conditions in Alaska and California resulted in massive wildfires.

And then there is Australia.

To understand how all of Earth’s components work in tandem to create these wild and varied conditions, E3SM divides the world into thousands of interdependent grid cells — 86,400 for the atmosphere to be exact. These account for most major terrestrial features from “the bottom of the ocean to nearly the top of the atmosphere,” collaboration members wrote in a recent article published in the Journal of Advances in Modeling Earth Systems.

“The globe is modeled as a group of cells with 25 kilometers between grid centers horizontally or a quarter of a degree of latitude resolution,” says Azamat Mametjanov, an application performance engineer in Argonne’s Mathematics and Computer Science division. “Historically, spatial resolution has been much coarser, at one degree or about 100 kilometers. So we’ve increased the resolution by a factor of four in each direction. We are starting to better resolve the phenomena that energy industries worry about most — extreme weather.”
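Those figures are easy to sanity-check. The sketch below is a back-of-the-envelope calculation under our own assumptions, not E3SM's actual grid generator.

```python
# One degree of latitude spans roughly 111 km, so a quarter degree is
# about 28 km, consistent with the quoted ~25 km between grid centers.
earth_circumference_km = 40_075
deg_km = earth_circumference_km / 360
print(deg_km / 4)            # ~27.8 km

# The quoted 86,400 atmosphere cells match a cubed-sphere mesh with
# 120 x 120 elements per cube face (an assumption on our part):
print(6 * 120**2)            # 86400
```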

Researchers believe that E3SM's higher-resolution capabilities will allow them to resolve geophysical features like hurricanes and mountain snowpack that appear less clearly in other models. One of the biggest improvements in the E3SM model concerned sea surface temperature and sea ice in the North Atlantic Ocean, specifically the Labrador Sea, which required an accurate accounting of air and water flow.

“This is an important oceanic region in which lower-resolution models tend to represent too much sea ice coverage,” Jacob explains. “This additional sea ice cools the atmosphere above it and degrades our predictions in that area and also downstream.”

Increasing the resolution also helped resolve the ocean currents more accurately, which helped make the Labrador Sea conditions correspond with observations from satellites and ships, as well as making better predictions of the Gulf Stream.

Another distinguishing characteristic of the model, says Mametjanov, is its ability to run over multiple decades. While many models can run at even higher resolution, they can only run for five to 10 years at most. Because it uses ultra-fast DOE supercomputers, the 25-km E3SM model was able to run a course of 50 years.

Eventually, the team wants to run 100 years at a time, interested mainly in the climate around 2100, which is a standard end date used for simulations of future climate.

Higher resolution and longer time sequences aside, running such a model is not without its difficulties. It is a highly complex process.

For each of the 86,400 cells related to the atmosphere, researchers run dozens of algebraic operations that correspond to various meteorological processes, such as calculating wind speed, atmospheric pressure, temperature, moisture or the amount of localized heating contributed by sunlight and condensation, to name just a few.

“And then we have to do it thousands of times a day,” says Jacob. “Adding more resolution makes the computation slower; it makes it harder to find the computer time to run it and check the results. The 50-year simulation that we looked at in this paper took about a year in real time to run.”

Another dynamic for which researchers must adjust their model is called forcing, which refers mainly to the natural and anthropogenic drivers that can either stabilize the climate or push it in different directions. The main forcing on the climate system is the sun, which stays relatively constant, notes Jacob. But throughout the 20th century, there have been increases in other external factors, such as CO2 and a variety of aerosols, from sea spray to volcanic emissions.

For this first simulation, the team was not so much probing a specific stretch of time as working on the model’s stability, so they chose a forcing that represents conditions during the 1950s. The date was a compromise between preindustrial conditions used in low-resolution simulations and the onset of the more dramatic anthropogenic greenhouse gas emissions and warming that would come to a head in this century.

Eventually, the model will integrate current forcing values to help scientists further understand how the global climate system will change as those values increase, says Jacob.

“While we have some understanding, we really need more information — as do the public and energy producers — so we can see what’s going to happen at regional scales,” he adds. “And to answer that, you need models that have more resolution.”

One of the overall goals of the project has been to improve performance of the E3SM on DOE supercomputers like the Argonne Leadership Computing Facility’s Theta, which proved the primary workhorse for the project. But as computer architectures change with an eye toward exascale computing, next steps for the project include porting the models to GPUs.

“As the resolution increases using exascale machines, it will become possible to use E3SM to resolve droughts and hurricane trends, which develop over multiple years,” says Mametjanov.

“Weather models can resolve some of these, but at most for about 10 days. So there is still a gap between weather models and climate models and, using E3SM, we are trying to close that gap.”

Credit: 
DOE/Argonne National Laboratory

More than 60% of Myanmar's mangroves have been deforested in the last 20 years: NUS study

image: A pristine mangrove forest dominated by Rhizophora mucronata at Lampi Island in Tanintharyi region, Myanmar.

Image: 
Dr Maung Maung Than

Mangroves account for only 0.7 per cent of the Earth's tropical forest area, but they are among the world's most productive and important ecosystems. They provide a wealth of ecological and socio-economic benefits, such as serving as nursery habitat for fish species, offering protection against coastal surges associated with storms and tsunamis, and storing carbon.

While many countries have established legal protection for mangroves, their value for sustainable ecosystem services faces strong competition from converting the land to other, more lucrative uses, particularly agriculture. In the past decade, studies have shown that mangrove deforestation rates are higher than those of inland terrestrial forests.

New research from the National University of Singapore (NUS) provides additional support for this, with results showing that mangrove deforestation rates in Myanmar, an important country for mangrove extent and biodiversity, greatly exceed previous estimates.

The research, led by Associate Professor Edward Webb and Mr Jose Don De Alban from the Department of Biological Sciences at the NUS Faculty of Science, was published online in the journal Environmental Research Letters on 3 March 2020.

Drastic mangrove deforestation

Using satellite images and multiple analytical tools, the NUS team assessed the extent of mangroves in 1996. The researchers then followed the fate of every 30-metre x 30-metre mangrove image pixel through 2007 and 2016.
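The pixel-tracking idea can be sketched as follows, assuming three co-registered land-cover rasters in which 1 marks mangrove. The toy arrays stand in for the team's classified satellite maps; the NUS pipeline itself is not reproduced here.

```python
import numpy as np

# Toy 30 m land-cover grids for 1996, 2007 and 2016 (1 = mangrove).
rng = np.random.default_rng(0)
lc_1996 = rng.integers(0, 2, size=(1000, 1000))
lc_2007 = lc_1996 & rng.integers(0, 2, size=lc_1996.shape)
lc_2016 = lc_2007 & rng.integers(0, 2, size=lc_1996.shape)

# Follow the fate of every pixel that was mangrove in 1996.
mangrove_1996 = lc_1996 == 1
lost_by_2016 = mangrove_1996 & (lc_2016 == 0)
print(f"1996 mangrove pixels converted by 2016: "
      f"{lost_by_2016.sum() / mangrove_1996.sum():.0%}")
```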

The team's estimates revealed that in 1996, Myanmar had substantially more mangroves than previously estimated. However, over the 20-year period, more than 60 per cent of all mangroves in Myanmar had been permanently or temporarily converted to other uses. These include the growing of rice, oil palm, and rubber, as well as for urbanisation.

"Although fish and prawn farms accounted for only a minor amount of mangrove conversion, this may change in the near future. These competing land cover types are commercially important, but incompatible with mangrove persistence," said Mr De Alban.

Urgent need for mangrove protection in Myanmar

With the loss of nearly two-thirds of its mangroves, there is a need for the Myanmar government to develop holistic strategies to conserve this important habitat. This is particularly important as Myanmar strives to become more integrated into the regional and global markets for agriculture and aquaculture products.

"The fate of mangroves in the country will be tied to the strength of policies and implementation of conservation measures. Through proper long-term planning, management and conservation, this resilient ecosystem can recover and be maintained for the future," shared Assoc Prof Webb.

Credit: 
National University of Singapore

Why is the female wallaby always pregnant?

The swamp wallaby is the only mammal that is permanently pregnant throughout its life, according to new research on the reproductive habits of marsupials.

Unlike humans, kangaroos and wallabies have two uteri. The new embryo formed at the end of pregnancy develops in the second, 'unused' uterus. Then, once the newborn from the first pregnancy begins to suck milk, the new embryo enters a long period of developmental arrest that may last 11 months or more.

When the sucking stimulus from the young in the pouch declines, the dormant embryo starts growing again and the cycle starts anew, with females returning to oestrus in late pregnancy, mating, and forming another embryo.

"Thus, females are permanently pregnant their whole lives," said Dr Brandon Menzies who collaborated on the research with Professor Marilyn Renfree and Professor Thomas Hildebrandt from the Leibniz Institute for Zoo and Wildlife research in Berlin, Germany.

The findings shed light on the record-breaking reproductive systems of marsupials.

Pregnancy in most other mammals is much longer, up to 22 months in elephants, with several stages that require different combinations of hormones.

While most mammals also require a break between pregnancies, either to support new young or during periods of seasonal lack of resources, the female swamp wallaby is the only one that can claim the reproductive feat of being permanently pregnant throughout its life.

Marsupials have the largest sperm, the shortest pregnancies, and exhibit the longest periods of embryonic diapause (developmental arrest of the embryo) among mammals.

Furthermore, kangaroos and wallabies regularly support young at three different stages of development, namely, an embryo in the uterus, an early stage pouch young and a semi-dependent young at foot (still sucking milk).

"Whatever the reason, the swamp wallaby is an incredibly successful and ubiquitous species in Australia, occupying a range that stretches from the Western Victoria/South Australian border all the way up the eastern seaboard to cape York in far north Queensland," said Dr Menzies.

The research is published today as "Unique reproductive strategy in the swamp wallaby (Wallabia bicolor)" in the Proceedings of the National Academy of Sciences USA.

"We used high resolution ultrasound to track pregnancy and mating in 10 female swamp wallabies," said Dr Menzies. "What we found amazed us - the females come into oestrus, mate and form a new embryo 1-2 days before the end of their existing pregnancy. "The swamp wallaby is the only mammal known to be continuously pregnant in this way."

Just one other species of mammal is known to go into oestrus while still pregnant - the European Brown Hare (Lepus europaeus). This species also returns to oestrus in late pregnancy and conceives additional embryos before giving birth. This feat may be all the more remarkable in the hare because the embryos are conceived within common uterine horns already supporting late-stage fetuses.

Credit: 
University of Melbourne

World's sandy beaches under threat from climate change

Half of the world's beaches could disappear by the end of the century due to coastal erosion, according to a new study led by the JRC.

Erosion is a major problem facing sandy beaches that will worsen with the rising sea levels brought about by climate change. According to the study, published today in Nature Climate Change, effective climate action could prevent 40% of that erosion.

Sandy beaches cover more than 30% of the world's coastlines. They are popular recreational spots for people and they provide important habitats for wildlife. They also serve as natural buffer zones that protect the coastline and backshore coastal ecosystems from waves, surges and marine flooding. Their role as shock absorbers will become more important with the rising sea levels and more intense storms expected with climate change.

However, climate change will accelerate erosion and could make more than half of the world's sandy beaches completely vanish by the end of this century. Combined with a growing population and increasing urbanisation along coastlines, this is likely to result in more people's homes and livelihoods being affected by coastal erosion in the decades to come.

The findings come from the first global assessment of future sandy shoreline dynamics. JRC scientists combined 35 years of satellite coastal observations with 82 years of climate and sea level rise projections from several climate models. They also simulated more than 100 million storm events and measured the resulting global coastal erosion.
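To give a feel for how such projections are typically assembled, the sketch below combines an ambient trend from historical observations with retreat driven by sea level rise, using the classic Bruun rule as a first-order model. We are not asserting this is the paper's exact formulation, and the beach parameters are hypothetical.

```python
# Sketch: shoreline change = historical ambient trend + sea-level-rise
# retreat (first-order Bruun rule: retreat = rise / beach slope).
def shoreline_change_m(ambient_m_per_yr, years, slr_m, slope):
    """Negative result means landward retreat (erosion)."""
    return ambient_m_per_yr * years - slr_m / slope

# Hypothetical beach: mild erosion trend, 0.5 m of sea level rise over
# 80 years, 1:100 active profile slope.
print(shoreline_change_m(ambient_m_per_yr=-0.2, years=80,
                         slr_m=0.5, slope=0.01))   # -66.0 m of retreat
```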

They found that reducing greenhouse gas emissions could prevent 40% of the projected erosion.

However, even if global warming is curbed, societies will still need to adapt and better protect sandy beaches from erosion. This is one of the main messages of the Paris Agreement on climate and the EU adaptation strategy.

Human presence makes sandy beaches less resilient

Sandy coastlines are extremely dynamic environments due to changing wave conditions, sea levels and winds, as well as geological factors and human activity. They are naturally resilient to climate variations, as they can accommodate higher seas and marine storms by retreating and adapting their morphology.

However, they face more and more pressure from human development, a situation that could worsen if contemporary coastal urbanisation and population growth continue. As 'backshores' - the area of a coastline above the high-tide level - become increasingly built up, sandy shorelines are losing their natural capacity to accommodate or recover from erosion.

At the same time, river dams and human developments retain sediment upstream that would naturally feed beaches. As a result, a substantial proportion of the world's sandy coastlines are already eroding, a process that could accelerate with rising sea levels.

Climate mitigation and adaptation challenges ahead

The study shows that without climate mitigation and adaptation, almost half of the world's sandy beaches are under threat of near extinction by the end of the century. Apart from the loss of valuable ecosystems, the socioeconomic implications can be severe, especially in poorer, tourism-dependent communities where sandy beaches are the main attraction. Small island nations are among the most vulnerable regions.

In most parts of the world, projected shoreline dynamics are dominated by sea level rise, and moderate greenhouse gas emission mitigation could prevent 40% of shoreline retreat globally. Yet in certain regions the effects of climate change are counteracted by 'accretive ambient shoreline changes' - the build-up of sandy beaches from sediments arriving due to other natural or anthropogenic factors. This is true for areas of the Amazon, East and Southeast Asia and the North Tropical Pacific.

The Commission is already working to address these challenges

The EU is committed to mitigating climate emissions. With the Green Deal for Europe it strives to achieve carbon neutrality by 2050.

The EU Strategy on Adaptation to Climate Change aims to make Europe more resilient and minimise the impact of unavoidable climate change. It stresses that coastal zones are particularly vulnerable to the effects of climate change, which challenges the climate resilience and adaptive capacity of our coastal societies. This requires a strong EU Strategy and preparedness actions by Member States aimed at reducing the vulnerability of their citizens and economies to coastal hazards in order to minimize future climate impacts in Europe.

The EC has published recommendations for Integrated Coastal Management. This policy instrument requires establishing a coastal setback zone extending at least 100 m landward from the highest winter waterline, taking into account, inter alia, the areas directly and negatively affected by climate change and natural risks.

The EC Floods Directive requires Member States to assess if all water courses and coastlines are at risk from flooding, to map the flood extent and assets and humans at risk in these areas and to take adequate and coordinated measures to reduce this flood risk. Maintaining healthy sandy beaches is an effective coastal protection measure, and has environmental benefits.

Several sandy environments are included in the EC Habitats Directive as they are related to protected species and many of the NATURA protected areas include sandy coastlines.

Implementation of these actions should leave rivers better able than they are now to resume their natural function of sediment transport, which, simply put, provides the material for natural processes to restore and retain sandy beaches.

The Sendai Framework for Disaster Risk Reduction 2015-2030, the Paris Agreement on Climate Change and the Sustainable Development Goals have set the agenda for reducing disaster risks through sustainable and equitable economic, social, and environmental development.

In this context, poorer low-lying countries remain particularly vulnerable to coastal hazards and huge investments or changes in these societies may be needed to close the vulnerability gap with richer countries.

Credit: 
European Commission Joint Research Centre

Coordination chemistry and Alzheimer's disease

image: Chemical modifications of the coordination sphere in Cu(II)-amyloid-β utilizing copper-O2 chemistry.

Image: 
KAIST

It has become evident recently that the interactions between copper and amyloid-β neurotoxically impact the brain of patients with Alzheimer's disease. KAIST researchers have reported a new strategy to alter the neurotoxicity in Alzheimer's disease by using a rationally designed chemical reagent.

This strategy, developed by Professor Mi Hee Lim from the Department of Chemistry, can modify the coordination sphere of copper bound to amyloid-β, effectively inhibiting copper's binding to amyloid-β and altering its aggregation and toxicity. Their study was featured in PNAS last month.

The researchers developed a small molecule that is able to directly interact with the coordination sphere of copper-amyloid-β complexes, followed by modifications via either covalent conjugation, oxidation, or both under aerobic conditions. The research team simply utilized copper-dioxygen chemistry to design a chemical reagent.

Answering how peptide modifications by a small molecule occur remains very challenging. The system includes transition metals and amyloidogenic proteins and is quite heterogeneous, since they are continuously being changed. It is critical to carefully check the multiple variables such as the presence of dioxygen and the type of transition metal ions and amyloidogenic proteins in order to identify the underlying mechanisms and target specificity of the chemical reagent.

The research team employed various biophysical and biochemical methods to determine the mechanisms for modifications on the coordination sphere of copper-amyloid-β complexes. Among them, peptide modifications were mainly analyzed using electrospray ionization mass spectrometry.

Mass spectrometry (MS) has been applied to verify such peptide modifications by calculating the shift in exact mass. The research team also performed collision-induced dissociation (CID) of the target ion detected by MS to pinpoint which amino acid residue is specifically modified. The CID fragmentizes the amide bond located between the amino acid residues. This fragmental analysis allows us to identify the specific sites of peptide modifications.
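To make the "shift in exact mass" step concrete, here is a small sketch of the comparison. The +15.9949 Da monoisotopic shift for oxidation is a standard value, but the peptide masses and tolerance below are illustrative assumptions, not the study's measurements.

```python
# Sketch: does a measured mass match "unmodified peptide + modification"?
MOD_MASSES_DA = {"oxidation": 15.9949}   # standard monoisotopic shift

def matches_modification(measured, unmodified, mod, tol=0.01):
    expected = unmodified + MOD_MASSES_DA[mod]
    return abs(measured - expected) <= tol

# Hypothetical peptide masses, in daltons:
print(matches_modification(measured=4345.52, unmodified=4329.53,
                           mod="oxidation"))      # True (within 0.01 Da)
```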

The copper and amyloid-β complexes represent a pathological connection between metal ions and amyloid-β in Alzheimer's disease. Recent findings indicate that copper and amyloid-β can directly contribute toward neurodegeneration by producing toxic amyloid-β oligomers and reactive oxygen species.

Professor Lim said, "This study illustrates the first experimental evidence that the 14th histidine residue in copper-amyloid-β complexes can be specifically modified through either covalent conjugation, oxidation, or both. Considering the neurotoxic implications of the interactions between copper and amyloid-β, such modifications at the coordination sphere of copper in amyloid-β could effectively alter its properties and toxicity."

"This multidisciplinary study with an emphasis on approaches, reactivities, and mechanisms looks forward to opening a new way to develop candidates of anti-neurodegenerative diseases," she added. The National Research Foundation of Korea funded this research.

Credit: 
The Korea Advanced Institute of Science and Technology (KAIST)

New lithium batteries from used cell phones

image: Álvaro Caballero (left) and Fernando Luna working in the laboratory

Image: 
University of Córdoba

Lithium-ion batteries are used around the world, and though they have faced some competition in recent years from alternatives such as sodium and magnesium, they remain indispensable due to their high energy density and capacity. The issue is that lithium has major availability and concentration problems: almost 85% of its reserves are located in what is known as the Lithium Triangle, a geographical area on the borders of Argentina, Bolivia and Chile. In addition, demand is expected to rocket over the next few decades because of the adoption of electric vehicles. Each car battery holds the equivalent of about 7,000 cell phone batteries, so reusing their components has become an issue of utmost importance.
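That last comparison is easy to sanity-check with typical capacities; the figures below are our own assumptions rather than numbers from the study.

```python
# Rough check of the "one car ~ 7,000 cell phone batteries" comparison.
ev_pack_wh = 70_000        # assumed ~70 kWh electric-vehicle pack
phone_battery_wh = 10      # assumed ~2,700 mAh at 3.7 V, i.e. ~10 Wh
print(ev_pack_wh / phone_battery_wh)   # 7000.0
```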

Specifically, a research project at the University of Cordoba (Spain) and San Luis University (Argentina), in which Lucía Barbosa, Fernando Luna, Yarivith González Peña and Álvaro Caballero participated, was able to manufacture new lithium batteries from used cell phones, devices with a low recycling rate, which, if not handled properly, end up adding to the long list of electronic waste produced every year worldwide.

In particular, the project found a way to recycle the graphite in these devices, a material located in the negative terminals of batteries whose function is to store and conduct lithium. As highlighted by one of the heads of the study, Professor Álvaro Caballero, the researchers were able to eliminate the impurities of used graphite, reorganize its structure and reactivate it for new use. Interestingly, this material makes up a quarter of the total weight of a lithium battery, so when it is recycled, "we are recovering 25% of the whole energy storage system, a fact that is all the more relevant considering that this material comes from crude oil", explains Caballero.

Another important aspect of this research is that, on the positive terminal of this new recycled battery, the researchers were able to forgo cobalt, which is widely used in the mobile device industry. As pointed out by one of the lead authors of the study, researcher Fernando Luna, "cobalt is a toxic element that is more expensive than others like manganese and nickel, which were used in this research". What is more, it is one of the so-called blood minerals, whose mining, like that of coltan, is associated with mines in conflict zones.

Promising results

According to the conclusions of the study, "the results are comparable and in some cases better than those obtained from commercial graphite". Some of the tests carried out show that, in the best cases, battery capacity remains stable after a hundred charge cycles, which equates to about a year of performance.

Despite these promising results, and even though the tests were done on complete cells of a real battery, the research was performed on a small scale and in the laboratory, so there is still a long way to go before this manual recycling process can be standardized.

"Currently, more than 90% of lead battery components used in conventional vehicles are reused", explains researcher Álvaro Caballero, so, "if we opt for sustainability and the democratization of electric cars, large-scale recycling of lithium batteries has to come about".

Credit: 
University of Córdoba

Plasma-driven biocatalysis

image: Marco Krewing, Abdulkadir Yayci and Julia Bandow are researching possibilities for environmentally friendly catalysis with enzymes.

Image: 
RUB, Marquard

Compared with traditional chemical methods, enzyme catalysis has numerous advantages. But it also has weaknesses: some enzymes are not very stable, and enzymes that convert hydrogen peroxide are even inactivated by high concentrations of this substrate. A research team at Ruhr-Universität Bochum (RUB), together with international partners, has developed a process in which the starting material, i.e. hydrogen peroxide, is fed to the biocatalysts in a controlled manner using plasma. The enzymes themselves are protected from harmful components of the plasma by a buffer layer. Using two model enzymes, the team showed that the process works, as reported in the journal ChemSusChem on 5 February 2020.

Milder conditions, less energy consumption and waste

In biocatalysis, chemicals are produced by cells or their components, in particular by enzymes. Biocatalysis has many advantages over traditional chemical processes: the reaction conditions are usually much milder, energy consumption is lower and less toxic waste is produced. The high specificity of enzymes also means that fewer side reactions occur. Moreover, some fine chemicals can only be synthesised by biocatalysis.

The weak spot of enzyme biocatalysis is the low stability of some enzymes. "Since the enzyme often has to be replaced in such cases - which is expensive - it is extremely important to increase the stability under production conditions," explains lead author Abdulkadir Yayci from the Chair of Applied Microbiology headed by Professor Julia Bandow.

Hydrogen peroxide: necessary, but harmful

The research team has been studying two similar classes of enzymes: peroxidases and peroxygenases. Both use hydrogen peroxide as a starting material for oxidations. The crucial problem is that hydrogen peroxide is absolutely necessary for activity, but in higher concentrations it leads to a loss of activity of the enzymes. As far as these enzyme classes are concerned, it is therefore vital to supply hydrogen peroxide in precise doses.

To this end, the researchers investigated plasmas as a source of hydrogen peroxide. Plasma describes the fourth state of matter that is created when energy is added to a gas. If liquids are treated with plasmas, a large number of reactive oxygen and nitrogen species are formed, some of which then react to form long-lived hydrogen peroxide, which can be used for biocatalysis.

Biocatalytic reactions with plasma-generated hydrogen peroxide are possible

In an experiment in which horseradish peroxidase served as one of the model enzymes, the team showed that this system works in principle. At the same time, the researchers identified the weak points of plasma treatment: "Plasma treatment also directly attacks and inactivates the enzymes, most likely through the highly reactive, short-lived species in the plasma-treated liquid," outlines Abdulkadir Yayci. The research group improved the reaction conditions by binding the enzyme to an inert carrier material. This creates a buffer zone above the enzyme in which the highly reactive plasma species can react without harming the enzyme.

The researchers then tested their approach using a second enzyme, the unspecific peroxygenase from the fungus Agrocybe aegerita. This peroxygenase has the ability to oxidise a large number of substrates in a highly selective way. "We successfully demonstrated that this specificity is maintained even under plasma treatment and that highly selective biocatalytic reactions are possible using plasma," concludes Julia Bandow.

Credit: 
Ruhr-University Bochum

Swamp wallabies conceive new embryo before birth -- a unique reproductive strategy

image: Swamp wallaby.

Image: 
Geoff Shaw, University of Melbourne

Marsupials such as kangaroos or wallabies are known for their very different reproductive strategy compared to other mammals. They give birth to their young at a very early stage and significant development occurs during a lengthy lactation period in which the offspring spends most of its time in a pouch. Although in some marsupials new ovulation happens only a few hours after giving birth, the regular consecutive stages of ovulation, fertilization, pregnancy and lactation are respected - with one exception: Reproduction specialists from the Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW), Germany, and the University of Melbourne, Australia, recently demonstrated that swamp wallabies ovulate, mate and form a new embryo before the birth of the previous offspring. They thereby continuously support embryos and young at different development stages before and after birth. These findings are published in the Proceedings of the National Academy of Sciences of the United States of America.

Using high-resolution ultrasound to monitor reproduction in swamp wallabies during pregnancy, Prof Thomas Hildebrandt (Leibniz-IZW and University of Melbourne), Dr Brandon Menzies and Prof Marilyn Renfree (both from University of Melbourne) were able to confirm what has been suspected for a long time: swamp wallaby females ovulate, mate and form a new embryo whilst already carrying a full-term fetus that they will soon give birth to. The new embryo enters embryonic diapause until the new-born offspring leaves the pouch nine months later. Thus, when the embryonic diapause is included, females are continuously pregnant throughout their reproductive life, a unique reproductive strategy that completely blurs the normal staged system of reproduction in mammals.

This phenomenon is made possible by two anatomically completely separated uteri and cervices connected to ovaries by their oviducts. "This is true for all marsupials, but the unique overlapping reproductive cycles seem to be a special feature of the swamp wallabies", says Renfree. Normally, ovulation alternates between the two ovaries. "All female macropodid marsupials - essentially kangaroos, wallabies and a few other groups of species - except the swamp wallaby have an oestrous cycle longer than the duration of their pregnancy, so females come into oestrus, ovulate and mate within hours after birth." It has been suspected for some time that swamp wallabies might conceive during an active pregnancy, because the oestrous cycle of the swamp wallaby is shorter than the duration of their pregnancy and there have been reports about mating before the birth of the previous offspring. Such a "superfetation" has previously been only described (by Leibniz-IZW scientists) for the European brown hare where females copulate again three to four days before the birth of the incumbent young, forming new conceptuses during an active pregnancy.

In order to confirm superfetation in swamp wallabies, the scientists removed the pouch young of ten females to reactivate the dormant blastocysts (early stage embryo). They then monitored the development of the blastocyst in four of these ten females using high-resolution ultrasound. All females gave birth at around 30 days after the young had been removed. Parallel to the embryo development in one uterus, the scientists closely examined the opposite ovary. There, follicles started to appear and grow. At day 26 of the pregnancy the ultrasound examination showed that the conceptus had developed into a fetus with the head, limbs and heartbeat clearly visible - and at day 28 and 29 the largest follicle in the opposite (contralateral) ovary had ovulated and a new corpus luteum was evident. The other six females that were not scanned with ultrasound were regularly examined for sperm. Sperm was identified in the urogenital tract one to two days before birth but at no other time. "These results clearly demonstrate that swamp wallabies ovulate and mate one to two days before birth, during an existing pregnancy", says Hildebrandt.

Pregnancies of eutherian mammals (most mammals, i.e. the most taxonomically diverse of the three branches of mammals) greatly exceed the length of the oestrous cycle, so during mammalian evolution, there has been selection pressure to extend the duration of pregnancy. Among marsupials (who form a second taxonomic branch of mammals), gestation in most macropodids encompasses almost the entire duration of the oestrous cycle. The swamp wallaby takes this one step further with its pre-partum oestrus, allowing this marsupial's gestation length to exceed the oestrous cycle length.

Sadly, many of these unique animals have been lost in the current disastrous bushfires in Australia this summer.

Credit: 
Forschungsverbund Berlin

Earliest look at newborns' visual cortex reveals the minds babies start with

image: A baby next to fMRI scanning images of its brain. Understanding how an infant's brain is typically organized may help answer questions when something goes awry.

Image: 
Cory Inman

Within hours of birth, a baby's gaze is drawn to faces. Now, brain scans of newborns reveal the neurobiology underlying this behavior, showing that, at as young as six days old, a baby's brain appears hardwired for the specialized tasks of seeing faces and seeing places.

The Proceedings of the National Academy of Sciences (PNAS) published the findings by psychologists at Emory University. Their work provides the earliest peek yet into the visual cortex of newborns, using harmless functional magnetic resonance imaging (fMRI).

"We're investigating a fundamental question of where knowledge comes from by homing in on 'nature versus nature,'" says Daniel Dilks, associate professor of psychology, and senior author of the study. "What do we come into the world with and what do we gain by experience?"

"We've shown that a baby's brain is more adult-like than many people might assume," adds Frederik Kamps, who led the study as a PhD candidate at Emory. "Much of the scaffolding for the human visual cortex is already in place, along with the patterns of brain activity, although the patterns are not as strong compared to those of adults."

Kamps has since graduated from Emory and is now a post-doctoral fellow at MIT.

Understanding how an infant's brain is typically organized may help answer questions when something goes awry, Dilks says. "For example, if the face network in a newborn's visual cortex was not well-connected, that might be a biomarker for disorders associated with an aversion to eye contact. By diagnosing the problem earlier, we could intervene earlier and take advantage of the incredible malleability of the infant brain."

For decades, scientists have known that the adult visual cortex contains two regions that work in concert to process faces and another two regions that work together to process places. More recent work shows that the visual cortex of young children is differentiated into these face and place networks. And in a 2017 paper, Dilks and colleagues found that this neural differentiation is in place in babies as young as four months.

For the current PNAS paper, the average age of the newborn participants was 27 days. "We needed to get closer to the date of birth in order to better understand if we are born with this differentiation in our brains or if it's molded by experience," Dilks says.

His lab is a leader in adapting fMRI technology to make it baby friendly. The noninvasive technology uses a giant magnet to scan the body and record the magnetic properties in blood. It can measure heightened blood flow to a brain region, indicating that region is more active.

Thirty infants, ranging in age from six days to 57 days, participated in the experiments while sleeping. During scanning, they were wrapped in an inflatable "super swaddler," a papoose-like device that serves as a stabilizer while also making the baby feel secure.

"Getting fMRI data from a newborn is a new frontier in neuroimaging," Kamps says. "The scanner is like a giant camera and you need the participant's head to be still in order to get high quality images. A baby that is asleep is a baby that's willing to lie still."

To serve as controls, 24 adults were scanned in a resting state -- awake but not stimulated by anything in particular.

The scanner captured intrinsic fluctuations of the brain for both the infants and adults.

The results showed the two regions of the visual cortex associated with face processing fired in sync in the infants, as did the two networks associated with places. The infant patterns were similar to those of the adult participants, although not quite as strong. "That finding suggests that there is room for these networks to keep getting fine-tuned as infants mature into adulthood," Kamps says.
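The "fired in sync" measure described here is commonly quantified as the correlation between two regions' fMRI time series. The sketch below uses synthetic signals to illustrate the metric; it is not the Emory group's actual analysis pipeline.

```python
import numpy as np

# Two synthetic region time series sharing a common fluctuation.
rng = np.random.default_rng(1)
shared = rng.normal(size=300)                    # common signal
region_a = shared + 0.7 * rng.normal(size=300)   # e.g. one face region
region_b = shared + 0.7 * rng.normal(size=300)   # e.g. the other

# Functional connectivity as Pearson correlation.
r = np.corrcoef(region_a, region_b)[0, 1]
print(f"functional connectivity r = {r:.2f}")
```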

"We can see that the face networks and the place networks of the brain are hooked up and talking to each other within days of birth," Dilks says. "They are essentially awaiting the relevant information. The next questions to ask are how and when these two functions become fully developed."

Credit: 
Emory Health Sciences