Tech

Fish size affects Snake River salmon returns more than route through dams

image: Biologists examine juvenile salmon as they pass through John Day Dam on the Columbia River.

Image: 
Ritchie Graves/NOAA Fisheries

The survival and eventual return of juvenile Snake River salmon and steelhead to spawning streams as adults depend more on their size than on the way they pass through hydroelectric dams on their migration to the ocean, new research shows.

Bypass systems are designed to carry juvenile salmon and steelhead around dam turbines on the Columbia and Snake rivers. The study found little evidence fish that go through these systems suffer delayed or "latent" mortality once they reach the estuary and ocean. Rather, they survive at about the same rate as fish that go through spillways and turbines.

The finding raises questions about whether spilling additional water past dams to carry more fish through spillways instead of bypass systems will substantially increase their survival in the ocean and the number that return to rivers as adults.

The new research helps address a call from the Northwest Power and Conservation Council's Independent Scientific Advisory Board (ISAB) for further study of latent mortality. One argument for breaching dams on the lower Snake River is that it would improve fish survival by eliminating latent mortality related to bypass systems.

Smaller Salmon Less Likely to Survive in the Ocean

Bypass systems are more likely to draw in smaller fish, the study found, which survive at lower overall rates in the ocean. That could give the impression that the bypass systems cause delayed mortality. This study shows that the size of the fish is the real determining factor, according to the NOAA Fisheries research published in Transactions of the American Fisheries Society.

"Once we accounted for the size of the fish, we saw little impact on their ocean survival," said James Faulkner, a research statistician at NOAA Fisheries' Northwest Fisheries Science Center in Seattle and lead author of the research. "That is not to say that the dams don't impact fish in other ways, and it's important to understand how and where that impact happens."

Fish pass through dams on the Columbia and Snake rivers primarily through three routes: spillways, bypass systems, and turbines. Improvements have reduced the number of fish that go through turbines, which generally have lower survival rates. Instead, they go through spillways or into bypass systems with screens and a complex of large pipes or channels that carry the fish through the dams.

The findings do not mean there is no latent mortality. Fish passing through the dams and reservoirs may still experience stress from other factors, including delayed arrival at the estuary and ocean.

Study Answers Call for Latent Mortality Research

The ISAB noted in a 2007 report that latent mortality had been estimated across a wide range. The ISAB called for more monitoring and research to better determine what influences fish survival beyond the hydrosystem.

In a 2012 review of reports on latent mortality, the ISAB found that "the factors responsible for latent mortality remain poorly understood and inadequately evaluated." The ISAB added that "further research will be needed to resolve this issue."

The ISAB noted that a "key assumption" in estimates of latent mortality "is that fish entering the bypass system are randomly selected, or at least, that the sample of fish entering the bypass system are not disproportionately less fit in terms of their prospects for survival to return as adults. This assumption needs to be tested both with available data and with further experimental investigation."

This study helps address the ISAB's recommendation by testing that assumption. It found that the bypass systems draw in smaller fish, which are less likely to survive in the ocean and return as adults. The research was funded by the Bonneville Power Administration through the Columbia Basin Fish and Wildlife Program, as developed by the Northwest Power and Conservation Council.

Credit: 
NOAA Fisheries West Coast Region

New approach to treating incurable leukemia in children discovered

image: Frequent and often incurable: Child with acute lymphoblastic leukemia (ALL).

Image: 
Gabriela Acklin, University Children's Hospital Zurich

Acute lymphoblastic leukemia (ALL) is a form of blood cancer that primarily affects children and young people. It involves large quantities of malignant progenitor cells building up in a person's blood instead of healthy white blood cells. This is often caused by a change in the genetic material, with two chromosomes fusing together to create new abnormal genes that disrupt the system controlling normal blood development. Such types of leukemia are often extremely resistant to treatment and cannot be cured even with intensive chemotherapy or stem cell transplantation. In search of new ways to tackle this problem, a team of scientists from the University of Zurich and the University Children's Hospital Zurich has been scrutinizing the molecular causes of this disorder.

Abnormal protein activates genes at wrong time

For the purpose of their investigation, the researchers - led by Jean-Pierre Bourquin and Beat Bornhauser - analyzed a protein called TCF3-HLF, which is typically associated with this type of leukemia. This protein does not occur naturally; it is produced through the fusion of two chromosomes and contains elements of what are known as transcription factors, which activate the transcription of certain genes. The analyses revealed that the abnormal protein TCF3-HLF also activates a whole range of genes, but it does so in the wrong context and at the wrong point in the blood development process. This triggers the formation of malignant white blood cells and causes leukemia. "Our research shows that the abnormal protein binds to almost 500 regulatory elements in the genetic material of the human leukemia cells, activating hundreds of genes by mistake," explains Yun Huang, lead author of the study.

Leukemia triggers figured out using "gene cutter"

The researchers also discovered that the abnormal protein does not act alone. In fact, it gathers more than 100 other proteins around it, which help to activate the genes. "We investigated the function of the individual proteins in this genetic machinery and used this to identify key elements that could be targeted through therapy," explains Huang. He and his colleagues used the CRISPR/Cas9 method, sometimes referred to as a "gene cutter", to detach the specific parts they had identified from the machinery. In this way, they identified eleven factors that are crucial to the build-up of the malignant abnormal blood cells behind the leukemia.

New substance kills cancer cells in targeted way

One of the essential components now identified is the protein EP300, a cofactor that boosts gene activation. An experiment with mice indicated that EP300 could be a very promising target for therapy. For this investigation, the researchers used a new kind of substance called A-485, which is known to bind to EP300 and inhibit its activity. When A-485 was administered to mice carrying human leukemia cells, the malignant cells died off. "It is therefore possible, in principle, to stop the fundamental driving force behind this leukemia directly and thus develop a targeted type of therapy," says research group leader Jean-Pierre Bourquin. "The important thing now is to build a fuller picture of what goes wrong so that we can investigate the best possible way to combine specific modes of attack like this." Given that other forms of leukemia are caused by similar mechanisms, it may also be possible to identify a common denominator for developing new drugs to combat cancer.

Credit: 
University of Zurich

Elucidation of the atomic-level mechanism by which pathogenic bacteria take up iron ions

image: Figure 1. Crystal structures of (a) N-terminal domain of HtaA, (b) C-terminal domain of HtaA, and (c) HtaB.

Image: 
NINS/IMS

Overview:
A research group including researchers from the Exploratory Research Center on Life and Living Systems (ExCELLS) and the Institute for Molecular Science (IMS) of the National Institutes of Natural Sciences, together with Hokkaido University, determined the structures of the "heme uptake system" used to take up essential iron ions, and revealed the detailed mechanism of heme uptake in Corynebacteria such as Corynebacterium diphtheriae. The results of this study are expected to provide basic knowledge for the development of new antibiotics against diphtheria. The findings have been published in the online version of Chemical Communications (Royal Society of Chemistry).

Research Background:
Iron is an essential trace element for all organisms, which use various molecular systems to take up iron ions into their cells. Pathogenic bacteria have a molecular system for extracting heme from heme proteins in infected hosts; the extracted heme molecules are transported into the bacterial cells, allowing the bacteria to use host heme proteins as their major iron source. Heme uptake systems from several bacteria had been studied before this work, but detailed study of the heme uptake system of Corynebacteria, including Corynebacterium diphtheriae, had not progressed.

Results:
The heme uptake system of Corynebacteria consists of HtaA and HtaB, which are located on the cell surface and are responsible for heme binding and transport, and HmuT-HmuU-HmuV, which is responsible for heme transport into the cell. HtaA extracts heme from the host heme protein and transfers it to HtaB; HtaB transfers the heme to HmuT; finally, the heme is transported into the cell by the HmuUV heme transporter. Our research group previously reported the crystal structure of HmuT from Corynebacterium glutamicum (Muraki et al., Chem. Lett., 2016; Muraki et al., Int. J. Mol. Sci., 2016). In this study, we succeeded in determining the crystal structures of HtaA and HtaB, and found that they share a novel fold for heme-binding/transport proteins.

HtaA consists of an N-terminal domain and a C-terminal domain, in which the CR (conserved region) domain serves for heme binding and transport; HtaB consists of a single domain. We determined the crystal structures of the N- and C-terminal domains of HtaA and of HtaB at 2.0 Å, 1.3 Å and 1.7 Å resolution, respectively. These structures are homologous to each other and consist of eleven β-strands, which form antiparallel β-sheets, and two short α-helices. Part of the antiparallel β-sheets forms a barrel-like structure (Fig. 1).

We also determined the crystal structures of HtaA/HtaB in the holo-form. One heme molecule is bound in the heme pocket formed by the loop region at the end of the β-sheet and the α-helix (α1). The propionate group of heme forms hydrogen bonds with a serine (Ser54 in HtaA) and a tyrosine (Tyr201 in HtaA). The heme ring is stabilized by a π-π interaction with a phenylalanine (Phe200 in HtaA). The heme iron is five-coordinate, with a tyrosine (Tyr58 in HtaA) as the axial ligand (Fig. 2). This tyrosine forms a hydrogen bond with a histidine (His111 in HtaA). The amino acids involved in heme recognition are highly conserved among the HtaA/HtaB CR domains.

We found that the hydrogen bond between the axial ligand Tyr and this His plays an important role in regulating the heme-binding affinity of HtaA/HtaB. By replacing the His with Ala, we prepared an apo-protein and determined the apo-form structure of the C-terminal domain of HtaA. The apo form of HtaA forms a domain-swapped dimer (Fig. 3), in which the α-helix (α1) of one protomer is located close to the heme pocket of the other protomer. This domain-swapped dimer seems to mimic the reaction intermediate of the heme transfer reaction between HtaA and HtaB.

Social significance of this research:
In this study, we determined the crystal structures of the heme uptake system from Corynebacterium glutamicum, which is a non-pathogenic bacterium; the same heme uptake system is also used by Corynebacterium diphtheriae. The results of this research may therefore contribute to the development of new antibiotics that inhibit the growth of Corynebacterium diphtheriae by blocking the uptake of the iron ions (heme) essential for its growth.

Credit: 
National Institutes of Natural Sciences

Novel tactile display using computer-controlled surface adhesion

image: The area under the index finger becomes sticky.

Image: 
Osaka University

A group of researchers at Osaka University developed a novel two-dimensional (2D) graphical tactile display to which one-dimensional (1D) adhesive information can be added by controlling the adhesion of designated portions of the display surface (Fig. 1).

Their research results were presented at SIGGRAPH ASIA 2019 Emerging Technologies, which was held in Australia from November 18 through November 20, 2019. The research team received the BEST DEMO VOTED BY COMMITTEE AWARD.

Conventional techniques cannot dynamically and interactively change the shape or friction coefficient of an area on the surface of an electronic device, such as a paper-like screen, to enhance its operability. Researchers have therefore sought to present additional information by pairing visual presentation with other sensory (tactile) content.

In the entertainment industry, such as in video games, displays that give players a sense of temperature or shock have been proposed so that they can feel as if they were actually in the scene of a game. In particular, many haptic displays and element technologies that give players tactile feedback have been devised.

This group of researchers developed a display in which a sense of touch, i.e. a 1D "sticky" sensation, can be added to a 2D visual display. Mounted on their display is a temperature-sensitive adhesive sheet, a special polymer sheet whose adhesion (friction) can be changed by controlling the temperature of the display surface with a computer.

To present changes in adhesion within a range that does not cause discomfort to the user, the researchers used an adhesive sheet with a boundary temperature of 40°C. The sheet rapidly becomes sticky when heated above 40°C, showing a maximum adhesion of 2.6 N/25 mm within the temperature range of 30°C to 48°C.

With this display, users can take in both visual and tactile information, something difficult to achieve through ordinary 2D displays. For example, one can feel a folder and learn its capacity by touching it while navigating the folder hierarchy, which can be preset to vary adhesion by folder capacity. It is also possible to impede the operability of a device to prevent users from carelessly swiping through content so that they can focus on sections containing important information, which are set to increased adhesion levels.

In addition, this technology could be applied to touchscreens for people with visual impairments, allowing users who are looking at an image of a sticky object on the screen to feel the displayed object's stickiness as if they were actually touching the object in the image.

Associate Professor Itoh says, "This graphical tactile system allows users to get 'touch and feel' information that would be difficult to perceive on a visual display. We will consider applications to entertainment and digital signage to pursue its commercial viability."

Credit: 
Osaka University

Ultrafast quantum simulations: A new twist to an old approach

New method of studying large numbers of particles at quantum level developed by Universities of Warwick and Oxford

Electrons and ions behave on vastly different timescales, making it prohibitive to simulate both on the same footing

Ultrafast quantum simulation overcomes this limitation and allows for the study of the dynamics of the interactions between electrons and ions

The new approach offers insights into behaviour of matter inside giant planets and in the highly compressed core during laser-driven nuclear fusion

Billions of tiny interactions occur between thousands of particles in every piece of matter in the blink of an eye. Simulating these interactions in their full dynamics had long been considered out of reach, but has now been made possible by new work from researchers at Oxford and Warwick.

In doing so, they have paved the way for new insights into the complex mutual interactions between the particles in extreme environments such as at the heart of large planets or laser nuclear fusion.

Researchers at the University of Warwick and the University of Oxford have developed a new way to simulate quantum systems of many particles that allows for the investigation of the dynamic properties of quantum systems fully coupled to slowly moving ions.

Effectively, they have made the simulation of the quantum electrons so fast that it can run for extremely long times without restrictions, making the effect of the electrons' motion on the movement of the slow ions visible.

Reported in the journal Science Advances, the approach is based on a long-known alternative formulation of quantum mechanics (Bohm dynamics), which the scientists have now extended to allow the study of the dynamics of large quantum systems.

Many quantum phenomena have only been studied for single or just a few interacting particles, as large complex quantum systems overpower scientists' theoretical and computational capabilities to make predictions. This is complicated by the vast difference in the timescales the different particle species act on: ions evolve thousands of times more slowly than electrons due to their larger mass. To overcome this problem, most methods decouple electrons and ions and ignore the dynamics of their interactions - but this severely limits our knowledge of quantum dynamics.

To develop a method that accounts for the full electron-ion interactions, the researchers revived an old alternative formulation of quantum mechanics developed by David Bohm. In quantum mechanics, one needs to know the wave function of a particle, and it turns out that describing it by a mean trajectory and a phase, as Bohm did, is very advantageous. However, it took an additional suite of approximations and many tests to speed up the calculations as dramatically as required. Indeed, the new method demonstrated an increase in speed of more than a factor of 10,000 (four orders of magnitude), yet is still consistent with previous calculations for static properties of quantum systems.

The new approach was then applied to a simulation of warm dense matter, a state between solids and hot plasmas that is known for its inherent coupling of all particle types and its need for a quantum description. In such systems, both the electrons and the ions can have excitations in the form of waves, and the two kinds of waves influence each other. Here, the new approach showed its strength: it determined the influence of the quantum electrons on the waves of the classical ions, while the static properties were shown to agree with previous data.

Many-body quantum systems are at the core of many scientific problems, ranging from the complex biochemistry in our bodies to the behaviour of matter inside large planets, and of technological challenges like high-temperature superconductivity or fusion energy - which demonstrates the possible range of applications of the new approach.

Prof Gianluca Gregori (Oxford), who led the investigation, said: "Bohm quantum mechanics has often been treated with skepticism and controversy. In its original formulation, however, this is just a different reformulation of quantum mechanics. The advantage in employing this formalism is that different approximations become simpler to implement and this can increase the speed and accuracy of simulations involving many-body systems."

Dr Dirk Gericke from the University of Warwick, who assisted the design of the new computer code, said: "With this huge increase of numerical efficiency, it is now possible to follow the full dynamics of fully interacting electron-ion systems. This new approach thus opens new classes of problems for efficient solutions, in particular, where either the system is evolving or where the quantum dynamics of the electrons has a significant effect on the heavier ions or the entire system.

"This new numerical tool will be a great asset when designing and interpreting experiments on warm dense matter. From its results, and especially when combined with designated experiments, we can learn much about matter in large planets and for laser fusion research. However, I believe its true strength lies in its universality and possible applications in quantum chemistry or strongly driven solids."

Credit: 
University of Warwick

Approaching the perception of touch in the brain

image: Larger parts of the cerebral cortex than previously thought process tactile stimuli.

Image: 
MPI CBS

An encouraging pat on the back or a soft sweater on the skin - even things that we do not actively explore with our hands, we perceive through our body's sense of touch. "Which brain areas are responsible for this perception of touch, however, is still largely unknown," says Privatdozent Dr. Burkhard Pleger, neurologist at the Berufsgenossenschaftlichen Kliniken Bergmannsheil in Bochum and co-author of the study. To investigate this question, he and his colleagues from Leipzig examined the brains of 70 patients using structural magnetic resonance imaging (MRI).

As a result of an injury or a stroke, the participants suffered from disturbed body perception, such as hypoesthesia - a condition in which the perception of pressure and touch on the skin is impaired. People who have suffered such damage - called lesions - in the brain are a particularly important group of patients for neuroscientists. By comparing the locations of the damage in the brain with the symptoms of the patients, researchers can draw conclusions about the function of individual brain areas.

In the current study, the scientists identified different areas that were linked to a limited sense of touch. In doing so, they were able to support some findings from a previous lesion study on body awareness and also to find new areas of the brain that had not been associated with the perception of touch before.

The researchers were able to show that not only the somatosensory cortex is involved in the perception of touch, but also parts of the prefrontal cortex and posterior parietal lobe - brain regions that are known to be essential for attention-focusing and body awareness.

"The study shows that the brain network responsible for the perception of skin contact is much more complex than previously thought," says Pleger.

Credit: 
Max Planck Institute for Human Cognitive and Brain Sciences

Effective method for correcting various CNS pathologies developing under oxygen deficiency

image: Glial cell line-derived neurotrophic factor.

Image: 
Lobachevsky University

Hypoxia is a key factor that accompanies most brain pathologies, including ischemia and neurodegenerative diseases. Reduced oxygen concentration results in irreversible changes in nerve cell metabolism that entail cell death and the destruction of intercellular interactions. Since neural networks are responsible for the processing, storage and transmission of information in the brain, the loss of network elements can lead to dysfunction of the central nervous system and, consequently, to neurological deficits and severe disability for the patient.

This is the reason why the world's neurobiological community is currently involved in an active search for compounds that can prevent the death of nerve cells and support their functional activity under stress.

According to Maria Vedunova, Director of the Institute of Biology and Biomedicine at Lobachevsky University (UNN), the Institute's researchers propose to use the body's own potential to combat hypoxia and its consequences.

"Our particular interest is in the glial cell line-derived neurotrophic factor (GDNF). These signal molecules take an active part in the growth and development of nerve cells in the embryonic period, and they are also involved in the implementation of protective mechanisms and adaptation of brain cells when exposed to various stressors in adulthood," Maria Vedunova notes.

By applying advanced techniques for studying the structure and functional activity of brain neural networks, a team of researchers from the Lobachevsky State University of Nizhny Novgorod and the Institute of Cell Biology and Neurobiology at the Charité University Hospital in Berlin has shown that activation of the neurotrophic factor GDNF prevents the death of nerve cells and helps to maintain neural network activity after hypoxic injury. Of particular significance are the data identifying key players in the molecular cascades responsible for GDNF's protective effect, namely the kinases RET, AKT1, Jak1 and Jak2.

"Thanks to the results already obtained, Lobachevsky University scientists have significantly advanced in developing the theoretical basis for a new method for correcting the hypoxic conditions of the central nervous system. The next stage of the work will be focused on studying the possibility of neurotrophic factor GDNF activation in experimental animals in a simulated hypoxic damage," continues Maria Vedunova.

The researchers showed that activation of the glial cell line-derived neurotrophic factor helps protect brain cells from death during hypoxic damage and helps maintain the function of neural networks in the long term after the damaging effects.

A thorough understanding of the principles of operation of neural networks subjected to hypoxic damage, and of the protective mechanisms of the body's own biologically active molecules such as the neurotrophic factor GDNF, can provide the basis for developing an effective method for correcting various CNS pathologies that develop under oxygen deficiency.

The results are fundamental in nature, but they can become an important element of comprehensive research aimed at developing new methods for diagnosing and treating hypoxic conditions of the CNS, which undoubtedly has great commercial potential.

Credit: 
Lobachevsky University

pinMOS: Novel memory device can be written on and read out optically or electrically

image: The pinMOS memory - an organic semiconductor device resulting from the combination of an OLED and a capacitor. It has the characteristics of a memcapacitor, interacts with light, and can be written and erased step by step.

Image: 
(c) Yichu Zheng

The device allows the stored information to be read optically as well as electrically. Moreover, information can be added in portions, so several storage states can be mapped in one device. The results have now been published in the renowned journal "Advanced Functional Materials".

Another novelty relates to the measurements in the test series: they were carried out exclusively using the innovative "SweepMe!" measurement software, which was developed by an IAPP / cfaed start-up of the same name.

This story began back in 2015. Two cfaed scientists, both experts in the field of organic electronics, were on their way to a conference in Brazil, a trip that included a long bus ride to the venue, Porto de Galinhas. Plenty of time to talk. And so it happened that one of the two - Prof. Stefan Mannsfeld (Chair of Organic Devices, cfaed / IAPP) - shared with the other - Dr. Axel Fischer (Chair of Organic Semiconductors, IAPP) - an idea that had kept him busy for a while already:

Combining a conventional organic light-emitting diode (OLED) with an insulator layer should, owing to the specific physical effects of the materials used, result in a storage unit that could be written on and read out using both light and electrical signals: a repurposing of OLED technology, so to speak. As it turned out, the two were a perfect match - Dr. Fischer confirmed that the necessary technologies and experience were already available at the IAPP - and so the investigation of this idea was only a matter of time. Yichu Zheng at Prof. Mannsfeld's chair was a suitable candidate to dedicate her doctoral thesis to the topic.

Storing and reading - with light and electricity

The results of this work are available now and have just been published in the journal Advanced Functional Materials. The scientists describe a new type of programmable organic capacitive memory, a combination of an OLED and a MOS capacitor (MOS = metal oxide semiconductor). The storage unit, called "pinMOS", is a non-volatile memcapacitor with high repeatability and reproducibility. Its special feature is that pinMOS is able to store several states, since charges can be added or removed in controllable amounts. Another attractive feature is that this simple diode-based memory can be written to and read from both electrically and optically. Currently, a lifetime of more than 10⁴ read-write-erase cycles is achieved, and the memory states can be maintained and differentiated over 24 hours. The results show that the pinMOS memory principle is promising as a reliable capacitive storage medium for future applications in electronic and photonic circuits such as neuromorphic computers or visual storage systems. Co-authors from the Weierstrass Institute Berlin (WIAS) contributed to the precise interpretation of the functional mechanism by performing drift-diffusion simulations.

A diode-capacitor memory was first presented in 1952 by Arthur W. Holt at an ACM conference in Canada, but only now is this concept being revived through the use of organic semiconductors, since all functions of a discrete connection of diodes and a capacitor can be integrated into a single memory cell.

Measuring with SweepMe! - innovative approach for the laboratory

All measurements within this study were performed with the new laboratory measurement software "SweepMe!". The software was developed by a start-up of the same name, a spin-off from TU Dresden, founded at the IAPP in 2018 by the physicists Axel Fischer and Felix Kaschura, who received their PhDs from TUD.

This study proved how versatile SweepMe! can be. Whether measuring voltage-dependent and time-dependent capacitances, recording current-voltage characteristics, combining a signal generator with an oscilloscope, or processing images from an industrial camera - everything was implemented with this one software package. Even sophisticated parameter variations, which would normally require considerable programming effort, could be implemented in a very short time. Since October 2019, SweepMe! has been available worldwide free of charge.

Credit: 
Technische Universität Dresden

Mechanized harvesting has not reduced atmospheric pollution in the sugarcane region

The burning of sugarcane, carried out to eliminate dry leaves before harvesting, for years altered the air quality in the central region of the State of São Paulo, Brazil. The particles launched into the atmosphere during the process were visible to the region's inhabitants and were deposited in the streets and on cars.

Atmospheric pollution also caused respiratory problems among the population, impacts on biodiversity and the native vegetation, and contaminated rivers.

Technological advances and pressure from society led to the end of this practice, made official via a state law in 2002. Little by little, sugarcane burning was replaced by more modern techniques, such as the use of mechanical harvesters that remove and separate the sugarcane parts without the need to burn the area that will be planted.

"In 2018, mechanized harvesting was used in 90% of the production. It was hoped, above all, that there would be an improvement in air quality with the end of burning.

"However, concentrations of aerosol particles and ozone remain at the same levels as before. This leads us to believe that, despite the advances in agricultural technologies, there are other sources of greenhouse gas emissions and particulate matter," said Arnaldo Alves Cardoso, a researcher at the Institute of Chemistry of the São Paulo State University (UNESP) in Araraquara, in a lecture given at FAPESP Week France.

Sugarcane macronutrient residues

Brazil is the biggest sugarcane producer in the world. The main producing region is located in the State of São Paulo, which has the highest population density in Brazil and an economy based primarily on agroindustry.

"The State of São Paulo covers 55% of the area planted with sugarcane in Brazil. In the 2017/2018 harvest, 13 billion liters of ethanol were produced, which corresponded to 47% of the Brazilian production," he said.

Cardoso has been analyzing the consequences of the atmospheric pollution in the sugarcane region of São Paulo since the end of the 1990s. His team has collected air samples in the city of Araraquara, in the interior of São Paulo, and measured the changes in the composition of the atmosphere between the harvest and inter-harvest period.

"We have seen, for example, that among the particulate matter there were sugarcane macronutrients. When this material falls on sugarcane plantations, great. However, when it falls on a natural forest, it can modify the soil and cause a loss of biodiversity," he said.

Manual versus mechanized harvesting

Besides the State Law of 2002, an agreement signed between the sugar-alcohol industry and the State of São Paulo government envisaged the elimination of burning by 2017. According to the researcher, in the 2016/2017 harvest, the production harvested manually was 43.6 million tons, or 10% of the total harvest.

"These facts suggest that the sources of emissions have possibly changed in quality, but not in quantity," he said.

The researcher points out that one important change occurring with mechanization in harvests has been the growth in production of electrical energy and second generation (2G) ethanol, which is indicated as a way of increasing bioenergy generation without extending the area under cultivation. The leaves and other sugarcane parts with less energy value, which were burned before, have been used as raw material for producing energy and fuel.

"It seems that we have merely changed activity, but the pollution remains the same. But there are still many questions that I intend to answer with more studies," said Cardoso.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Wastewater leak in West Texas revealed

image: (a) Coverage of the ALOS PALSAR scenes used (white box). The black line shows the boundary of the Ken Regan field. The dark green and light green lines represent the boundaries of the Rustler Aquifer and the Pecos Valley Aquifer in Texas, respectively. The red star represents the epicenter of the earthquake that occurred in May 2018. The blue circle represents the groundwater well for livestock drawing from the Rustler Aquifer in this area. Blue triangles are wells that provide groundwater level records. (b) Vertical deformation (cm/yr), in the red box in Fig. 1a, estimated from InSAR. Green circles with and without arrows indicate, respectively, active injection/disposal wells in the Ken Regan field and oil production wells within 1.5 km of the deformation center during the research period. The purple circle represents the groundwater well that provides groundwater quality records.

Image: 
Source: Zhong Lu

DALLAS (SMU) - Geophysicists at SMU say that evidence of a leak occurring at a West Texas wastewater disposal well between 2007 and 2011 should raise concerns about the current potential for contaminated groundwater and damage to surrounding infrastructure.

SMU geophysicist Zhong Lu and the rest of his team believe the leak happened at a wastewater disposal well in the Ken Regan field in northern Reeves County, which could have leaked toxic chemicals into the Rustler Aquifer. The same team of geophysicists at SMU has revealed that sinkholes are expanding and forming in West Texas at a startling rate.

Wastewater is a byproduct of oil and gas production. Using a process called hydraulic fracturing, or "fracking," companies pump vast quantities of water, sand and chemicals deep into the ground to help extract more natural gas and oil. With that gas and oil, however, come large amounts of wastewater, which is injected deep into the earth through disposal wells.

Federal and state oil and gas regulations require wastewater to be disposed of at depth, typically about 1,000 to 2,000 meters deep in this region, so it does not contaminate groundwater or drinking water. A small number of studies suggest that arsenic, benzene and other toxins potentially found in fracking fluids may pose serious risks to reproductive and developmental health.

Even though the leak is thought to have happened between 2007 and 2011, the finding is still potentially dangerous, said Weiyu Zheng, a Ph.D. student at SMU (Southern Methodist University) who led the research.

"The Rustler Aquifer, within the zone of the effective injection depth, is only used for irrigation and livestock but not drinking water due to high concentrations of dissolved solids. Wastewater leaked into this aquifer may possibly contaminate the freshwater sources," Zheng explained.

"If I lived in this area, I would be a bit worried," said Lu, professor of Shuler-Foscue Chair at SMU's Roy M. Huffington Department of Earth Sciences and the corresponding researcher of the findings.

He also noted that leaking wastewater can do massive damage to surrounding infrastructure. For example, oil and gas pipelines can be fractured or damaged beneath the surface, and the resulting heaving ground can damage roads and put drivers at risk.

SMU geophysicists say satellite radar imagery indicates a leak happened at the disposal well because of changes observed in the nearby Ken Regan field: a large section of ground, five football fields in diameter and about 230 feet from the well, was raised nearly 17 centimeters between 2007 and 2011. In the geology world, this is called an uplift, and it usually happens where parts of the earth have been forced upward by underground pressure.

Lu said the most likely explanation for that uplift is that leakage was happening at the nearby well.

"We suspect that the wastewater was accumulated at a very shallow depth, which is quite dramatically different from what the report data says about that well," he said.

Only one wastewater disposal well is located in close proximity to the uplifted area of the Ken Regan field. The company that owns it reported injecting the wastewater 1,040 meters deep into the disposal well in Ken Regan. That well is no longer active.

But a combination of satellite images and models done by SMU show that water was likely escaping at a shallower level than the well was drilled for.

And the study, which was published on Thursday in the Nature publication Scientific Reports, estimates that about 57 percent of the injected wastewater went to this shallower depth. At that shallower depth, the wastewater - which typically contains salt water and chemicals - could have mixed in with groundwater from the nearby Rustler Aquifer. Drinking water doesn't come from the Rustler Aquifer, which spans seven counties. But the aquifer does eventually flow into the Pecos River, which is a drinking source.

The scientists discovered the leak after analyzing radar satellite images from January 2007 to March 2011. These images were captured by a radar instrument, the Phased Array type L-band Synthetic Aperture Radar (PALSAR), mounted on the Advanced Land Observing Satellite, which was operated by the Japan Aerospace Exploration Agency.

With this technology, called interferometric synthetic aperture radar, or InSAR for short, satellite radar images allow scientists to detect changes that aren't visible to the naked eye and might otherwise go undetected. The satellite technology can capture ground deformation with sub-inch precision or better, at a spatial resolution of a few yards or better, over thousands of miles, the researchers say.

Lu and his team also used data that oil and petroleum companies are required to report to the Railroad Commission of Texas (Texas RRC), along with sophisticated hydrogeological models that mapped out the distribution and movement of water underground and through the rocks of the Earth's crust.

"We utilized InSAR to detect the surface uplift and applied poroelastic finite element models to simulate displacement fields. The results indicate that the effective injection depth is much shallower than reported," Zheng said. "The most reasonable explanation is that the well was experiencing leakage due to casing failures and/or sealing problem(s)."

"One issue is that the steel pipes can degrade as they age and/or wells may be inadequately managed. As a result, wastewater from failed parts can leak out," said Jin-Woo Kim, research scientist with Lu's SMU Radar Laboratory and a co-author of this study.

The combination of InSAR imagery and modeling done by SMU gave the scientists a clear picture of how the uplift area in Regan field developed.

Lu, who is world-renowned for leading scientists in using InSAR applications to detect surface changes, said these types of analysis are critical for the future of oil-producing West Texas.

"Our research that exploits remote sensing data and numerical models provides a clue as to understanding the subsurface hydrogeological process responding to the oil and gas activities. This kind of research can further be regarded as an indirect leakage monitoring method to supplement current infrequent leakage detection," Zheng said.

"It's very important to sustain the economy of the whole nation. But these operations require some checking to guarantee the operations are environmentally-compliant as well," Lu said.

Co-author Dr. Syed Tabrez Ali from AIR-Worldwide in Boston also contributed to this study.

This research was sponsored by the NASA Earth Surface and Interior Program and the Shuler-Foscue endowment at SMU.

Previously, Kim and Lu used satellite radar imaging to find that two giant sinkholes near Wink, Texas - two counties over from the Ken Regan uplift - were likely just the tip of the iceberg of ground movement in West Texas. Indeed, they found evidence that large swaths of the West Texas oil patch were heaving and sinking at alarming rates. Decades of oil production activity in West Texas appear to have destabilized localities in an area of about 4,000 square miles populated by small towns like Wink, roadways, and a vast network of oil and gas pipelines and storage tanks.

Credit: 
Southern Methodist University

Ammonia synthesis made easy with 2D catalyst

image: Microscope images show cobalt-doped molybdenum disulfide as grown on a carbon cloth. The high-resolution transmission electron microscope image at right reveals the doped nanosheets, which facilitate the efficient electrochemical catalysis of ammonia. The process was developed for small-scale use by materials scientists at Rice University.

Image: 
Lou Group/Rice University

HOUSTON - (Nov. 25, 2019) - Rice University researchers have developed an inorganic method to synthesize ammonia that is both environmentally friendly and can produce the valuable chemical on demand under ambient conditions.

The Brown School of Engineering lab of materials scientist Jun Lou manipulated a two-dimensional crystal it understands well -- molybdenum disulfide -- and turned it into a catalyst by removing atoms of sulfur from the latticelike structure and replacing the exposed molybdenum with cobalt.

This allowed the material to mimic the natural organic process bacteria use to turn atmospheric dinitrogen into ammonia in organisms, including in humans, who use ammonia to help liver function.

The inorganic process will allow ammonia to be produced anywhere it's needed as a small-scale adjunct to industry, which produces millions of tons of the chemical each year through the inorganic Haber-Bosch process.

The research is described in the Journal of the American Chemical Society.

"The Haber-Bosch process produces a lot of carbon dioxide and consumes a lot of energy," said co-lead author and Rice graduate student Xiaoyin Tian. "But our process uses electricity to trigger the catalyst. We can get that from solar or wind."

The researchers already knew that molybdenum disulfide had an affinity to bond with dinitrogen, a naturally occurring molecule of two strongly bonded nitrogen atoms that forms about 78% of Earth's atmosphere.

Computational simulations by Mingjie Liu, a research associate at Brookhaven National Laboratory, showed replacing some exposed molybdenum atoms with cobalt would enhance the compound's ability to facilitate dinitrogen's reduction to ammonia.

Lab tests at Rice showed this was so. The researchers assembled samples of the nanoscale material by growing defective molybdenum disulfide crystals on carbon cloth and adding cobalt. (The crystals are technically 2D but appear as a plane of molybdenum atoms with layers of sulfur above and below.) With current applied, the compound yielded more than 10 grams of ammonia per hour using 1 kilogram of catalyst.

"The scale is not comparable to well-developed industrials processes, but it can be an alternative in specific cases," said co-lead author Jing Zhang, a postdoctoral researcher at Rice. "It will allow the production of ammonia where there is no industrial plant, and even in space applications." He said lab experiments used dedicated feeds of dinitrogen, but the platform can as easily pull it from the air.

Lou said other dopants may allow the material to catalyze other chemicals, a topic for future studies. "We thought there was an opportunity here to take something we're very familiar with and try to do what nature has been doing for billions of years," he said. "If we design a reactor the right way, the platform can carry out its function without interruption."

Credit: 
Rice University

Cellular origins of pediatric brain tumors identified

Montreal, November 25, 2019 - A research team led by Dr. Claudia Kleinman, an investigator at the Lady Davis Institute at the Jewish General Hospital, together with Dr. Nada Jabado, of the Research Institute of the McGill University Health Centre (RI-MUHC), and Dr. Michael Taylor, of The Hospital for Sick Children (SickKids), discovered that several types of highly aggressive and, ultimately, fatal pediatric brain tumors originate during brain development. The genetic event that triggers the disease happens in the very earliest phases of cellular development, most likely prenatal. The findings represent a significant advance in understanding these diseases, and are published in Nature Genetics.

"We have determined that stalled development of progenitor cells in the pons and forebrain, where a large proportion of high-grade embryonal and pediatric tumors emerge, is responsible for several childhood brain cancers," said Dr. Kleinman, an Assistant Professor of Human Genetics at McGill University. "Rather than developing normally, the cells' progress is arrested and they transform into malignancies. But they retain many features of the original cells, and we could pinpoint the tumor origins among the hundreds of different cell types present in the brain."

"New technologies allowing us to interrogate tumor cells each one at a time points to stalled development at the root of several high grade brain tumors in children," added Dr. Jabado, who is also an hemato-oncologist at the Montreal Children's Hospital of the MUHC and a professor of Pediatrics and Human genetics at McGill University. "We name this the Peter Pan Syndrome as these cells are stuck in time unable to age and this is what causes these tumors. The challenge is now to identify how best to unlock these cells promoting their differentiation, and allowing for normal processes to take over."

Brain tumors are the leading cause of cancer-related deaths in children. For several of these tumors, there are no effective therapies and survival is often less than two years. Indeed, Dr. Kleinman points out, very limited progress has been made in treating afflicted children.

"The cornerstone to fighting these conditions is to identify the biological process at work, which is what our research has achieved," she said. "Once we understand the underlying mechanisms, the search can begin for the means to unblock the arrested development of the cells. The complexity of the brain is astounding, and we now have narrowed down where to search."

Applying sophisticated single cell sequencing techniques and large-scale data analysis, researchers compiled the first comprehensive profile of the normal prenatal pons, a major structure on the upper part of the brainstem that controls breathing, as well as sensations including hearing, taste, and balance.

While Dr. Jabado and her team in the Child Health and Human Development Program at the RI-MUHC, and Dr. Taylor, Paediatric Neurosurgeon and Senior Scientist in Developmental and Stem Cell Biology at SickKids, undertook much of the clinical research, Dr. Kleinman's team performed the bioinformatics, establishing the molecular identity of cell types in this and other brain regions, as well as the dynamics underlying their differentiation. They created an atlas of more than 65,000 individual cells and defined the developmental dynamics of 191 distinct cell populations. They then mapped patient samples to this atlas and identified the origins of WNT medulloblastomas, embryonal tumors with multilayered rosettes (ETMRs), and high-grade gliomas (HGGs).

This work is the result of extensive international collaborations that include researchers from across Quebec, Canada, the United States, and France. Summarizing their achievement, the authors of the paper, "Stalled developmental programs at the root of pediatric brain tumors," wrote, "Current evidence thus supports a common etiological model for these tumors, where genetic alterations in vulnerable cell types disrupt developmental gene expression programs, ultimately leading to oncogenesis."

"The genesis of the tumors is very early in brain development, which means that there are really no environmental instigators or preventive measures that parents can take," Dr. Kleinman said. "Advancing our understanding of these tumors is important because the effects are so devastating; we want to bring hope to the patients."

Credit: 
McGill University

Rapamycin may slow skin aging, Drexel study reports

The search for youthfulness typically turns to lotions, supplements, serums and diets, but there may soon be a new option joining the fray. Rapamycin, an FDA-approved drug normally used to prevent organ rejection after transplant surgery, may also slow aging in human skin, according to a study from Drexel University College of Medicine researchers published in Geroscience.

Basic science studies have previously used the drug to slow aging in mice, flies and worms, but the current study is the first to show an effect on aging in human tissue, specifically skin. When delivered topically to humans, the drug reduced signs of aging, including wrinkles and sagging, and produced more even skin tone.

"As researchers continue to seek out the elusive 'fountain of youth' and ways to live longer, we're seeing growing potential for use of this drug," said senior author Christian Sell, PhD, an associate professor of Biochemistry and Molecular Biology at the College of Medicine. "So, we said, let's try skin. It's a complex organism with immune, nerve cells, stem cells - you can learn a lot about the biology of a drug and the aging process by looking at skin."

In the current Drexel-led study, 13 participants over age 40 applied rapamycin cream every 1-2 days to one hand and a placebo to the other hand for eight months. The researchers checked on subjects after two, four, six and eight months, including conducting a blood test and a biopsy at the six- or eight-month mark.

After eight months, most of the rapamycin-treated hands showed increased collagen protein and statistically significant reductions in p16 protein, a key marker of skin cell aging. Skin with lower levels of p16 has fewer senescent cells, which are associated with wrinkles. Beyond cosmetic effects, higher levels of p16 can lead to dermal atrophy, a common condition in seniors that is associated with fragile skin that tears easily, slow healing after cuts, and increased risk of infection or complications after an injury.
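
For illustration, the paired, within-subject design described above (treated hand versus placebo hand on the same participant) can be sketched in a few lines; the p16 values below are invented for the example and are not data from the study.

```python
# Hypothetical sketch of a paired within-subject comparison, as in the
# rapamycin-hand vs. placebo-hand design described above. The relative
# p16 expression values are invented for illustration, not study data.
from scipy import stats

placebo   = [1.00, 0.92, 1.10, 0.98, 1.05, 1.12, 0.95,
             1.03, 0.99, 1.08, 1.01, 0.97, 1.06]
rapamycin = [0.71, 0.80, 0.95, 0.77, 0.90, 0.85, 0.79,
             0.88, 0.74, 0.92, 0.83, 0.86, 0.81]

# Paired t-test: each participant serves as their own control
t_stat, p_value = stats.ttest_rel(placebo, rapamycin)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```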

So how does rapamycin work? Rapamycin blocks the appropriately named "target of rapamycin" (TOR), a protein that acts as a mediator in the metabolism, growth and aging of human cells. Rapamycin's potential to improve human health beyond outward appearance becomes clearer on a closer look at p16, which drives a stress response in damaged human cells and also serves as a way of preventing cancer: when a cell carries a mutation that would otherwise create a tumor, this response slows the cell cycle and prevents the tumor from forming. Instead of producing a tumor, the cell contributes to the aging process.

"When cells age, they become detrimental and create inflammation," said Sell. "That's part of aging. These cells that have undergone stress are now pumping out inflammatory markers."

In addition to preventing organ rejection, rapamycin is prescribed (in higher doses than used in the current study) for the rare lung disease lymphangioleiomyomatosis, and as an anti-cancer drug. The Drexel study suggests a second life for the drug in low doses, including new applications for studying whether rapamycin can increase human lifespan or improve human performance.

Rapamycin -- first discovered in the 1970s in bacteria found in the soil of Easter Island -- also reduces stress in the cell by attacking cancer-causing free radicals in the mitochondria.

In previous studies, the team used rapamycin in cell cultures, which reportedly improved cell function and slowed aging.

In 1996, a study in Cell reported that using rapamycin to block TOR proteins in yeast cultures made the yeast cells smaller but increased their lifespan.

"If you ramp the pathway down you get a smaller phenotype," said Sell. "When you slow growth, you seem to extend lifespan and help the body repair itself - at least in mice. This is similar to what is seen in calorie restriction."

The researchers note that, as this is early research, many questions remain about how to harness this drug. Future studies will look at how to apply it in clinical settings and explore applications in other diseases. During the current study, the researchers confirmed that none of the rapamycin was absorbed into the bloodstream of participants.

There are two pending patents on this technology, both of which have been licensed to Boinca Therapeutics LLC, of which Sell and Ibiyonu Lawrence, MD, an associate professor of Internal Medicine in the College of Medicine, are shareholders.

Credit: 
Drexel University

NASA tracks extra-tropical storm Sebastien toward the UK

image: On Nov. 25 at 0400 UTC (Nov. 24 at 11 p.m. EST), the MODIS instrument that flies aboard NASA's Aqua satellite revealed that Extra-tropical cyclone Sebastien's strongest storms had cloud tops as cold as or colder than minus 50 degrees Fahrenheit (minus 45.5 degrees Celsius).

Image: 
NASA/NRL

NASA's Aqua satellite passed over the eastern North Atlantic Ocean and captured an infrared view of what is now Extra-tropical cyclone Sebastien. Sebastien transitioned from a tropical storm to an extra-tropical storm on Nov. 24. It has coupled with a cold front and is now headed for the United Kingdom.

Sebastien's Final Advisory

At 10 p.m. EST on Sunday, Nov. 24 (0300 UTC Nov. 25), NOAA's National Hurricane Center (NHC) issued the final advisory on Sebastien. At that time, the center of Post-Tropical Cyclone Sebastien was located near latitude 41.0 degrees north and longitude 28.9 degrees west, about 230 miles (370 km) north-northwest of the Azores. Sebastien was moving toward the northeast near 40 mph (65 kph), and this motion was expected to continue for the next day or so. Maximum sustained winds were near 60 mph (95 kph) with higher gusts. Post-Tropical Cyclone Sebastien was expected to maintain its intensity before merging with another low during the next day or two. The estimated minimum central pressure was 993 millibars.

NHC's Forecaster Latto noted "Sebastien has lost its tropical characteristics this evening. The deep convection has decoupled from the low-level circulation, and these features are now separated by a distance of 100 nautical miles and increasing. Two [satellite] passes revealed that the surface low is becoming stretched out, and there is some evidence of a boundary, possibly a front, extending northeast from the center. Furthermore, recent observations from the Azores indicated a slight temperature decrease as the center passed by to the north of those islands, suggesting that a weak cold front is associated with the cyclone. Based on all of these data, there is high confidence that Sebastien has transitioned to an extratropical cyclone."

What does Extra-tropical Mean?

When a storm becomes extra-tropical, it means that a tropical cyclone has lost its "tropical" characteristics. The National Hurricane Center defines "extra-tropical" as a transition that implies both poleward displacement (meaning it moves toward the north or south pole) of the cyclone and the conversion of the cyclone's primary energy source from the release of latent heat of condensation to baroclinic (the temperature contrast between warm and cold air masses) processes. It is important to note that cyclones can become extratropical and still retain winds of hurricane or tropical storm force.

NASA's Infrared View of Sebastien

NASA's Aqua satellite used infrared light to analyze the strength of storms in Sebastien. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures. On Nov. 25 at 0400 UTC (Nov. 24 at 11 p.m. EST), Aqua passed over the storm after it had become associated with a cold front. The coldest cloud top temperatures were near minus 50 degrees Fahrenheit or minus 45.5 degrees Celsius. The infrared imagery revealed that the strongest precipitation had become elongated, resembling a frontal system.
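
The Fahrenheit-to-Celsius figures quoted here follow the standard conversion formula, C = (F - 32) x 5/9; a quick sanity check:

```python
# Standard Fahrenheit-to-Celsius conversion, used to check the
# cloud-top temperature figures quoted in this article.
def f_to_c(fahrenheit: float) -> float:
    return (fahrenheit - 32) * 5 / 9

# About -45.6, matching the "minus 45.5 degrees Celsius" cited above
print(round(f_to_c(-50), 1))
```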

Sebastien Headed to the U.K.

The United Kingdom Meteorological Service (Met Office) noted that rains from Sebastien would reach the U.K. by Tuesday. The storm is forecast to affect southern England and then move northeast. The Met Office warned that flood-ravaged areas may see more heavy rain over the next several days, and it issued several yellow weather warnings for southwest and northeast England, where torrential showers are expected from Sebastien's remnants. For updated forecasts, visit: https://www.metoffice.gov.uk/

Hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Credit: 
NASA/Goddard Space Flight Center