Why the dose matters: Study shows levels and anti-tumor effectiveness of a common drug vary widely

image: Above are images of a tumor in the lung of a patient on itraconazole who took part in the study, including a map generated from Dynamic Contrast-Enhanced (DCE) magnetic resonance imaging (right panel). Inset shows the tumor measured for vascular permeability (red arrow).

Image: 
UT Southwestern Medical Center

DALLAS - Sept. 17, 2020 - When used to manage infections, the drug itraconazole is generally given at a single, fixed dose to all patients. But determining the correct dosage of the drug to help treat cancer isn't that simple, new research by UT Southwestern suggests.

When the team of researchers and clinicians measured how much itraconazole ended up in the bloodstreams and tumors of 13 patients treated for lung cancer, they found a sixfold variation in drug levels in tumor samples. Moreover, the levels in the patients' bodies correlated with how effectively the drug shrank their tumors.

"What this means going forward is that, in future studies of itraconazole for the treatment of cancer, it may be important to check each patient's drug level and tailor the dose," says David Gerber, M.D., a professor of internal medicine and population and data sciences at UTSW and first author of the new paper, published online in the journal Clinical Cancer Research. "In this context, there's no one-size-fits-all dose," notes Gerber, also Associate Director of Clinical Research in the Harold C. Simmons Comprehensive Cancer Center.

Itraconazole, sold as Sporanox, Sporaz, or Orungal, has been used for more than 25 years to treat fungal infections. Ten years ago, James Kim, M.D., Ph.D., an associate professor of internal medicine at UTSW and senior author of the study, was part of a team that discovered that the antifungal drug also shuts down pathways used by cancer cells to grow. Further research has shown that itraconazole may help treat lung, prostate, skin, and other cancers by both blocking cellular growth pathways and stopping the formation of new blood vessels initiated by cancers.

"There was growing evidence that itraconazole conveyed a survival advantage to patients," says Kim. "But, in this new work, we wanted to take a step back and look more at the biology and pharmacology of what was going on with this drug in cancer patients."

For infections, itraconazole is typically given in two 100 milligram doses per day. Other cancer trials have used doses of the drug ranging from 200 to 600 milligrams per day. Gerber, Kim, and other colleagues studied the effect of a steady dose of itraconazole - 300 milligrams twice a day with food - in 13 patients with non-small cell lung cancer who were already scheduled for surgery to remove their tumors. After each surgery, the researchers analyzed resected tumor samples to determine how much itraconazole had accumulated in the cancer cells.

Levels of itraconazole detected within patients' tumors ranged from 1,244 ng/g to 7,094 ng/g. This nearly sixfold variation could not be fully explained by factors known to affect drug dosing, including body mass and kidney or liver function. Over the 14-day treatment period, change in tumor size ranged from a 26 percent decrease to 13 percent growth. Patients with the highest blood and tumor levels of itraconazole also had the largest decreases in their tumor volumes.
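
For readers who want to see the arithmetic, the comparison is easy to sketch. The following minimal Python example uses hypothetical values spanning the reported ranges (not the study's patient-level data) to show the fold-variation calculation and the kind of rank correlation that would link drug levels to tumor response:

```python
# Hypothetical values for illustration only -- not the study's patient data.
from scipy.stats import spearmanr

# Itraconazole levels in resected tumor tissue (ng/g), spanning the
# reported range of 1,244 to 7,094 ng/g
tumor_levels_ng_g = [1244, 1900, 2600, 3400, 4300, 5600, 7094]
# Percent change in tumor size (negative = shrinkage), spanning the
# reported range of +13% growth to -26% decrease
tumor_change_pct = [13, 2, 8, -5, -19, -11, -26]

print(f"fold variation: {7094 / 1244:.1f}x")  # ~5.7, i.e. nearly sixfold
rho, p = spearmanr(tumor_levels_ng_g, tumor_change_pct)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")  # higher levels, more shrinkage
```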

Further tissue analysis and imaging studies revealed corresponding changes in the growth of tumor blood vessels and blood flow; the patients with higher levels of itraconazole also had greater reductions in both of these parameters.

"This study highlights the need, when repurposing drugs, to look closely at the dosing," says William Trey Putnam, Ph.D., director of the Clinical Pharmacology Center at Texas Tech University Health Sciences Center, who collaborated with Gerber and Kim on the research. "In different diseases, the dosing can end up needing to be quite different."

The current study was not designed to look for side effects, but the researchers say the dose being used was within the range previously determined to be safe.

Because it's been used as an antifungal drug for decades, itraconazole is significantly cheaper than most other cancer drugs that have similar molecular effects on tumor growth and tumor blood vessels. The researchers say future studies will examine the use of itraconazole in combination with other cancer drugs, and will seek to explain why the drug is processed so differently by different patients.

"We had a small number of patients enrolled in this trial, but we were able to optimize the use of specimens and clinical data to get statistically significant results," says Farjana Fattah, Ph.D., a UTSW assistant professor with the Harold C. Simmons Comprehensive Cancer Center who helped lead the research. "Larger studies might be able to draw even more conclusions."

Credit: 
UT Southwestern Medical Center

0.5°C of additional warming has a huge effect on global aridity

image: Based on models specifically designed to examine the difference between 1.5°C and 2°C of global warming, UTokyo researchers reveal major effects of the additional warming on drought in many regions of the world

Image: 
Institute of Industrial Science, the University of Tokyo

Tokyo, Japan - In a new climate modeling study, researchers from the Institute of Industrial Science, The University of Tokyo have revealed major implications for global drought and aridity when limiting warming to 1.5°C rather than 2°C above pre-industrial levels. Drought has serious negative impacts on both human society and the natural world and is generally projected to increase under global climate change. As a result, assessment of the risk of drought under climate change is a critical area of climate research.

In the 2015 Paris Agreement, the United Nations Framework Convention on Climate Change (UNFCCC) proposed that the increase in global average temperature should be limited to between 1.5°C and 2°C above pre-industrial levels to limit the effects of severe climate change. However, there have been few studies focusing on the relative importance of this 0.5°C of global average temperature rise and what effect it might have on drought and aridity around the world.

"We wanted to contribute to the understanding of how important that 0.5°C could be, but it such a study is not easy to conduct based on previous modeling approaches," explains corresponding author Hyungjun Kim. "This is mainly because most models look at the extreme high levels and you cannot simply take a slice out of the data while the model spins up to this maximum. Therefore, we used data from the specially designed Half a degree Additional warming Prognosis and Projected Impacts (HAPPI) project to assess the impacts on aridity based on estimations of the balance between water and energy at the Earth's surface."

The study revealed that 2°C of warming led to more frequent dry years and more severe aridification in most areas of the world compared with 1.5°C, which emphasizes that efforts should be made to limit warming to 1.5°C above pre-industrial levels.

"There is a really strong message that some parts of the world could have more frequent drought at 2°C than at 1.5°C. This situation could be especially severe in the Mediterranean, western Europe, northern South America, the Sahel region, and southern Africa," says lead author Akira Takeshima. "However, this situation is highly regional. In some parts of the world, like Australia and some of Asia, the opposite situation was simulated, with a wetter climate at 2°C than at 1.5°C."

These findings show the importance of considering the regional impacts of the additional 0.5°C of warming, especially with respect to any future relaxation of the 1.5°C target.

Credit: 
Institute of Industrial Science, The University of Tokyo

Secret of plant dietary fibre structure revealed

image: A scanning electron micrograph captured at x100,000 magnification, showing the random cellulose microfibril network produced by the University of Queensland bacterial cellulose model. Plant primary cell wall cellulose microfibrils (i.e. in fruit and vegetables) are usually thinner and may have different geometries compared to the bacterial cellulose microfibrils, which are 10 times larger. However, bacterially produced cellulose has the same general features of cellulose deposition as plants.

Image: 
(c) Dr Deirdre Mikkelsen, The University of Queensland

The secret of how fibre shapes the structure of plant cell walls has been revealed, with potential applications ranging from nutrition and health to agriculture.

Researchers from The University of Queensland and KTH Royal Institute of Technology in Sweden have uncovered the mechanics of how plant cell walls balance the strength and rigidity provided by cellulose with its ability to stretch and compress.

UQ Director of the Centre for Nutrition and Food Sciences Professor Mike Gidley said the team identified that a family of cell wall polymers - hemicelluloses - played a critical role in balancing the need for rigidity with the flexibility to bend without breaking.

"This discovery is important for understanding dietary fibre properties in nutrition, but also for applications in medicine, agriculture and a range of other industries," Professor Gidley said.

"Plants don't have a skeleton, and their structures can range from soft, floppy grasses to the majestic architecture of a Eucalypt tree, with the key differences lying in their cell wall fibre structures."

The diversity of plant structures results from the three core building blocks of plant fibre - cellulose, hemicellulose and lignins - in the plant cell walls.

"Lignins provide the water-proofing in woody fibre and cellulose is the rigid scaffolding material in almost all plant types, but the mechanical function of hemicellulose was something of a mystery," Professor Gidley said.

Professor Gidley and Dr Deirdre Mikkelsen, in collaboration with Dr Francisco Vilaplana at KTH's Wallenberg Wood Science Centre, experimented with two major components of hemicellulose - with dramatic effect.

"We tested the properties of cellulose when adding different proportions of the two components, and found that 'mannans' improved compression while 'xylans' drastically increase its stretchiness," Dr Mikkelsen said.

"We generated modified cellulose material in the laboratory that could be stretched to twice its resting length - the equivalent to watching a wet sheet of paper being stretched to double its length without tearing."

The team said its discovery had many applications, including in wound care and in the texture of plant foods.

"This information is also of interest for gut microbiome research in understanding more about how plant cells walls, or fibre, break down in the gut," Professor Gidley said.

"Complex plant fibre is already processed for low value applications, but high value materials are usually made from pure (bacterial) cellulose.

"Our work creates the basis for a new cellulose chemistry in which xylans and mannans are added to make composites with useful properties.

"This means new possibilities for developing better, environmentally-sustainable plant-based materials, as well as selecting natural plant fibres with desirable properties in agriculture and food."

Credit: 
University of Queensland

Interim data from early US COVID-19 hotspot show mortality and seriousness of disease were not associated with race/ethnicity

A study of interim data from two hospitals in an early US COVID-19 hotspot, to be presented at the ESCMID Conference on Coronavirus Disease (ECCVID, held online 23-25 September), shows that race and ethnicity were not significantly associated with higher in-hospital COVID-19 mortality, and that rates of moderate, severe, and critical forms of COVID-19 were similar between racial and ethnic groups.

The study, by Dr Daniel Chastain (University of Georgia College of Pharmacy, Albany, GA, USA) and colleagues, included data from adult patients hospitalised between March 10 and May 22 with COVID-19, defined by laboratory-detected severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection, in Southwest Georgia.

The authors compared severity of illness categories on presentation to the hospital between patients from different racial and ethnic groups based on criteria from the US National Institutes of Health (NIH) COVID-19 treatment guidelines. They also studied outcomes including comorbidities, laboratory values, vital signs, and in-hospital mortality.

A total of 164 randomly selected non-consecutive patients were included with a median age of 61.5 years. These consisted of 119 African American patients, 36 Caucasian patients, and 9 Latinx patients. Thus the majority were African American (73%) and 51% were female. Rates of moderate, severe, and critical COVID-19 did not significantly differ between African American (9%, 56%, and 35%), Caucasian (0%, 69%, and 31%), and Latinx patients (0%, 56%, and 44%). In-hospital mortality was not statistically significantly different between groups but was highest among Caucasians (31%) followed by Latinx (22%) and African Americans (16%).
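
The severity comparison above amounts to testing whether a 3x3 table of counts departs from what group sizes alone would predict. A minimal Python sketch, using counts reconstructed approximately from the percentages quoted above (rounded to whole patients, and not the authors' exact analysis), looks like this:

```python
# Counts reconstructed approximately from the quoted percentages;
# illustrative only, not the authors' analysis code.
from scipy.stats import chi2_contingency

#                 moderate  severe  critical
counts = [
    [11, 67, 41],   # African American (n = 119)
    [ 0, 25, 11],   # Caucasian        (n = 36)
    [ 0,  5,  4],   # Latinx           (n = 9)
]

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```

With cells this sparse, an exact test such as Fisher's is often preferred; the sketch only illustrates the shape of the comparison.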

Caucasian patients had significantly higher Charlson comorbidity index scores (meaning more underlying conditions) (4.5) compared to African American (4) and Latinx (2) patients, while median BMI was significantly higher in African Americans (33.7 kg/m2) than in Caucasians (26.9) or Latinx patients (25.9).

Duration of time from symptom onset to admission was similar between groups, whereas median temperature on admission was significantly higher in African Americans (38.3°C) than in Caucasians (37.9) or Latinx patients (37.8).

The authors conclude: "Despite the majority of our cohort being African American, the rates of moderate, severe, and critical forms of COVID-19 were similar between racial and ethnic groups in a major transmission hotspot during the early spread of the pandemic in the Southeastern US. Race and ethnicity were not significantly associated with higher in-hospital mortality...our results were similar to findings from other recent studies from the states of Massachusetts and Louisiana*. However, since this is an interim analysis, there is a possibility that these results were due to chance. We are in the process of conducting additional follow-up studies with a larger sample size."

Credit: 
European Society of Clinical Microbiology and Infectious Diseases

Self-imaging of a molecule by its own electrons

image: Difference between the electron scattering cross-section measured (a) and calculated (b) at R = 3.68 Å (corresponding to the outer turning point of the vibrational motion) and R = 2.78 Å (corresponding to the inner turning point of the vibrational motion) for the case of an I2 vibrational wave packet created by photo-excitation of the B-state using a visible laser with a wavelength of 555 nm. The difference in the scattering cross-sections is shown as a function of the kinetic energy of the returning electron and the angle into which the electron is scattered. Particularly near a rescattering angle of 180 degrees (i.e. for back-scattered electrons) a major difference is seen between the scattering cross-section at the geometry corresponding to the inner and outer turning point of the vibration. In other words, the time-dependent changes in the internuclear distance are recognizable in time-dependent changes of the measured electron scattering cross-sections.

Image: 
MBI Berlin

One of the long-standing goals of research on the light-induced dynamics of molecules is to observe time-dependent changes in the structure of molecules, which result from the absorption of light, as directly and unambiguously as possible. To this end, researchers have developed and applied a plethora of approaches. Of particular promise among these approaches are several methods developed in recent years that rely on diffraction (of light or electrons) as a means of encoding the internuclear spacings between the atoms that together form the molecule.

In a recent paper (Phys. Rev. Lett. 125, 123001, 2020), researchers at the Max Born Institute (MBI) led by Dr. Arnaud Rouzée have shown that high-resolution movies of molecular dynamics can be recorded using electrons ejected from the molecule by an intense laser field. Following strong field ionization, the electrons that are set free are generally accelerated away from the molecule under the influence of the laser electric field. However, due to the oscillating nature of this field, a fraction of the electrons are driven back to their parent molecular ion. This sets the stage for a so-called re-collision process, in which the electron can either be reabsorbed by the molecule (with the absorbed energy released in the form of high-energy photons) or scatter off the molecular ion. Depending on the kinetic energy of the electron, it can be transiently trapped inside a centrifugal potential barrier. This is a well-known process in electron scattering and in single photon ionization experiments, and is referred to as a shape resonance. The smoking gun for the occurrence of a shape resonance is a large increase of the scattering cross-section. As its name implies, the kinetic energy at which the shape resonance occurs is highly sensitive to the shape of the molecular potential, and consequently to the molecular structure. Therefore, shape resonances can be used to make a movie of a molecule that is undergoing ultrafast nuclear rearrangement.

To demonstrate this effect, the team at MBI recorded a movie of the ultrafast vibrational dynamics of photo-excited I2 molecules. A first laser pulse, with a wavelength in the visible part of the spectrum, was used to prepare a vibrational wavepacket in the electronic B-state of the molecule. This laser pulse was followed by a second, very intense, time-delayed laser pulse, with a wavelength in the infrared part of the spectrum. Electron momentum distributions following strong field ionization by the second laser pulse were recorded at various time delays between the two pulses, corresponding to different bond distances between the two iodine atoms. A strong variation of the laser-driven electron rescattering cross-section was observed with delay, which could unambiguously be assigned to a change of the shape resonance energy position (see Fig. 1) induced by the vibrational wavepacket motion. As such, this work introduces new opportunities for investigating photo-induced molecular dynamics with both high temporal and spatial resolution.

Credit: 
Forschungsverbund Berlin

Potential target identified for migraine therapy

image: Representative raw traces of the direct current potential changes during cortical spreading depression (CSD) in control and GLT-1 knockout (KO) mice.

Image: 
Department of Molecular Neuroscience, TMDU

Tokyo, Japan -- Migraines affect millions of people worldwide, often lasting days and severely disrupting lives. More than simply super-intense headaches, some migraines actually result from pathological excitation of neurons in the brain. A new study in mice led by Kohichi Tanaka at Tokyo Medical and Dental University (TMDU) shows that susceptibility to migraines could be related to a molecular transporter that normally works to prevent excessive excitation of neurons.

Neurons in the brain communicate with each other by passing along molecules called neurotransmitters. After a neurotransmitter takes care of business, it is transported away from the synapse--the space between two neurons--so that it cannot be used over and over again. This process is called reuptake, and is one of many ways in which over-excitation of neurons in the brain is prevented. Migraines are related to a condition called cortical spreading depression, in which a large wave of hyperactivity spreads across the brain, followed by a wave of inhibition, or depressed brain activity. Tanaka and his team hypothesized that susceptibility to cortical spreading depression is related to disrupted transport of glutamate, the most common excitatory neurotransmitter.

It turns out that mammals have four molecules that transport glutamate, and three of them are found in the cerebral cortex. To determine which of these, if any, is related to cortical spreading depression, the researchers created three strains of knockout mice, each of which lacked one of the three cortical glutamate-transporter genes. They found that when mice lacked the GLT-1 transporter, cortical spreading depression occurred more frequently and spread more quickly than in control mice or in the other knockout mice.

"We know that 90% of glutamate is transported by GLT-1 back into astrocytes, not neurons," says Tanaka. "Our findings thus highlight another important function of glial cells in the brain as they support neuronal function."

To confirm their findings, the team then measured the amount of glutamate outside of cells using a platinum-iridium electrode coated with glutamate oxidase. When glutamate oxidase interacts with glutamate, it creates a negative current that can be detected by the electrode very quickly, allowing almost real-time measurements of glutamate concentration in the region.

"A fast biosensor is critical," explains Tanaka, "because cortical spreading depression only lasts about 5 minutes, and the changes in glutamate concentration could never be found using conventional methods that take minutes to hours of sampling." When testing the three knockout mice, only the GLT-1 knockout mice produced current that differed from that of the control mice. This means that the greater and faster accumulation of glutamate outside of neurons resulted from impaired uptake by astrocytes.

"Abnormal glutamate reuptake by astrocytes is just one way overexcite neurons," says Tanaka. "Nevertheless, if GLT-1 proves to be disrupted in people who have migraines, drug therapy that acts to increase glial reuptake of glutamate could be a reasonable therapeutic approach."

Credit: 
Tokyo Medical and Dental University

CNIC researchers discover a mechanism allowing immune cells to regulate obesity

image: From left to right: Rebeca Acín-Pérez, Salvador Iborra, José Antonio Enríquez and David Sancho.

Image: 
CNIC

Macrophages are immune system cells. They are essential in the early response to infections, and they also have a key role in the proper functioning of our tissues and the regulation of obesity. Now, researchers at the Centro Nacional de Investigaciones Cardiovasculares (CNIC) have shown how this regulation unfolds in a paper published in Nature Metabolism, which could be useful for designing new treatments for obesity and overweight, and for some associated pathologies, including fatty liver disease and type 2 diabetes.

The study was led by CNIC researchers directed by Dr. José Antonio Enríquez and Dr. David Sancho. It was completed in collaboration with the David Geffen School of Medicine and the Department of Medicine/Division of Cardiology of the University of California, Los Angeles (UCLA), in the US; the University of Eastern Finland and the Kuopio University Hospital (Finland); and the University of Salamanca and the Complutense University of Madrid. It explains how the activation of the mitochondrial metabolism of macrophages in response to oxidative stress due to excess nutrients contributes to fatty tissue inflammation and obesity.

"In recent decades, several studies have verified that fatty tissue macrophages facilitate an anti-inflammatory and reparative environment in normal conditions. This contributes to deactivating any processes altering the normal functioning of these tissues. These are known as anti-inflammatory or 'type M2' macrophages," Dr. Enríquez explains. However, in certain cases, he adds: "the M2 macrophages interpret that there are stress signals, normally arising in response to infection, and they foster inflammation as a defense mechanism."

These macrophage-driven inflammation processes, says Dr. Enríquez, are responsible for the emergence of fatty tissue alterations, and "are the origin of obesity and the metabolic syndrome associated with cardiovascular disorders, fatty liver disease and type 2 diabetes." This means that, in response to the excess nutrients supplied by a high-fat diet, "macrophages change their function and support inflammatory processes, forming 'type M1' proinflammatory macrophages."

Mitochondrial metabolism changes

The research now published has analyzed how macrophage metabolic changes regulate this inflammatory process, which underlies obesity and the metabolic syndrome. The new findings, says Dr. Rebeca Acín-Pérez (currently at UCLA), "reveal how the detection by macrophages of oxidative danger signals, known as reactive oxygen species, leads to mitochondrial metabolism changes in these immune cells that drive their differentiation toward the M1 proinflammatory type. This oxidative stress," she clarifies, "is found in morbidly obese patients, and it seems to be related to a high-fat diet, commonplace in the inadequate Western diet."

One of the contributions of this study, Dr. Sancho says, is that it proves that when this oxidative stress is reduced, "it ameliorates some of the harmful parameters associated with obesity."

In previous studies, CNIC scientists had found that the Fgr protein is decisive in regulating one of the complexes of the mitochondrial electron transport chain (complex II) in response to this oxidative stress, favoring the generation of signals (cytokines and metabolites) that foster immune responses.

Salvador Iborra says that this study "proves that this same molecular mechanism regulates the conversion of an anti-inflammatory macrophage (M2), which supports normal tissue function, into a proinflammatory macrophage (M1), in which lipid droplets accumulate (Figure 1). A balance between both types of M2/M1 macrophages is crucial for the proper functioning of the body."

Although inflammation is a normal body response and is beneficial when facing acute and transitory threats, it is very damaging when it becomes persistent or chronic, even in low-grade inflammation scenarios. The researchers explain that this happens in obesity and the metabolic syndrome, and it leads to increased cardiovascular mortality and diabetes.

The information contained in this new paper proves that, in the absence of the Fgr protein, the liver increases its ability to eliminate fat by generating ketone bodies (chemical compounds produced by ketogenesis, a process using body fats as an energy source), which are eliminated in the urine, and that this in turn ameliorates the glucose metabolism alterations caused by obesity (type 2 diabetes).

The results, found in mice, were corroborated in human cohorts, in which the authors found a strong correlation between Fgr and the negative consequences of obesity.

The researchers conclude that their data suggest the potential of using specific Fgr protein inhibitors to treat patients with obesity and/or the metabolic syndrome. The goal would be to reduce the associated inflammation, thereby improving the parameters associated with these illnesses, like fatty liver and type 2 diabetes, and helping to raise patients' life expectancy and quality of life.

Obesity is a major health problem, and it is involved in the development of heart diseases, cerebrovascular accidents, cancer, fatty liver disease, metabolic syndromes, high blood pressure and some autoimmune diseases. A combination of an excess intake of nutrients, a lack of physical activity and genetic risk factors leads to an imbalance between energy intake and energy expenditure, and this is where obesity starts. In Spain alone, it is expected that within only a decade (by 2030), 27 million adults (80 percent of men and 55 percent of women) will be obese or overweight.

Credit: 
Centro Nacional de Investigaciones Cardiovasculares Carlos III (F.S.P.)

Kang finds keys to control the 'driver of cancer's aggressiveness'

image: A drug-resistant protein named SNAI2 helps cancers metastasize and shields cancer from both the immune system and chemotherapy. But now Princeton University's Yibin Kang (lower right) and his colleagues have found a way to use the cell's recycling system to control SNAI2, providing a new possibility for treatments. Their findings appear in two new papers in Genes and Development. This photo of the Kang lab includes two of the studies' key contributors: Hanqiu Zheng (center front, next to Kang) and Wenyang Li (center back, in a red jacket).

Image: 
Photo by Maša Alečković

"Do not erase." "Recycle me." "Free to a good home." Humans post these signs to indicate whether something has value or not, whether it should be disposed of or not. Inside our cells, a sophisticated recycling system uses its own enzymatic signs to flag certain cells for destruction -- and a different set of enzymes can remove those flags.

Changing the balance between those two groups might provide a way to control a dangerous protein called SNAI2 that helps cancers metastasize, said Yibin Kang, Princeton University's Warner-Lambert/Parke-Davis Professor of Molecular Biology, who has spent his career studying the cells and molecules behind metastatic cancers. His team has a pair of papers coming out in next month's issue of Genes and Development, released online today.

The key is the cell's recycling system. In 2004, the Nobel Prize was awarded to the three scientists who discovered that the body will shred proteins into tiny pieces after they are tagged with a "recycle me" sign by a molecule called "ubiquitin." Some scientists refer to ubiquitin as the "kiss of death," since once a protein has enough ubiquitin tags, that protein is headed on a one-way trip to the shredder -- unless another enzyme comes along to remove its "recycle me" sign.

Scientists call these rival teams ubiquitin ligases and deubiquitinases (DUBs). For simplicity, I'll call them recyclers and dubs: The recyclers run around the body hanging "Recycle me!" signs on any protein that is damaged or has outstayed its welcome, while the dubs pull those signs down.

Unlike New Jersey's single-stream recycling, cellular recyclers and dubs are remarkably specific, with some 600 recyclers and 100 dubs sharing the work of identifying the cell's 20,000 proteins. After years of work, Kang's team succeeded in identifying both the recycler and the dub for SNAI2: the enzymes ASB13 and USP20, respectively.
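
The logic of that balance can be captured in a toy model: a protein's steady-state level is set by how fast it is made versus how fast it is degraded, and degradation in turn depends on the tug-of-war between the recycler and the dub. The Python sketch below is purely illustrative -- the rate constants and the form of the dub term are assumptions, not results from the papers:

```python
# Toy model of recycler/dub balance; all rates are hypothetical.
def steady_state_level(synthesis, recycler_activity, dub_activity,
                       dt=0.01, steps=100_000):
    """Euler integration of dP/dt = synthesis - degradation * P,
    where net degradation falls as dub activity rises."""
    degradation = recycler_activity / (1.0 + dub_activity)
    protein = 0.0
    for _ in range(steps):
        protein += dt * (synthesis - degradation * protein)
    return protein

# More dub (USP20-like) activity -> less degradation -> more SNAI2-like protein
print(steady_state_level(1.0, recycler_activity=0.5, dub_activity=0.1))  # ~2.2
print(steady_state_level(1.0, recycler_activity=0.5, dub_activity=5.0))  # ~12.0
```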

"That specificity gives us another advantage in looking for drug treatments," said Kang. "If you target this specific enzyme, it's unlikely to cause side effects on other proteins."

In both animal models and human breast cancer patients, Kang's team found that in tumors with a high number of ASB13 recyclers, SNAI2 gets flagged for destruction in a timely way. On the other hand, the more USP20 dubs are around, the more SNAI2 is protected -- leaving it to stick around and wreak havoc.

What's so terrible about SNAI2?

SNAI2 weakens the connectors between cell surfaces that stick our cells together, allowing tumor cells to move around the body. In effect, it is a skeleton key, an all-access pass from one organ to another.

SNAI2 is not inherently bad; it plays an important role at key stages of development. But in healthy cells, SNAI2 only turns on for very narrow windows of time, such as during wound repair, when healthy cells need to move in to close the gap. In cancer patients, SNAI2 lingers, allowing cancer cells to use it to metastasize around the body.

In addition to increasing mobility, SNAI2 has two other tricks to help cancer cells: It makes them invisible to the immune system and resistant to chemotherapy.

Most importantly, while SNAI2 is in a family of proteins that are notoriously difficult to target with medications, recyclers and dubs are both vulnerable to drugs.

"This gives us one possibility of attack," said Kang. "We showed that the recycling system in the cell can control this protein, and now we've found the switches in the recycling system that we could utilize to eliminate SNAI2 -- the driver of cancer's aggressiveness -- in potential therapies."

Credit: 
Princeton University

Emissions could add 15 inches to 2100 sea level rise, NASA-led study finds

image: Ice shelves in Antarctica, such as the Getz Ice Shelf seen here, are sensitive to warming ocean temperatures. Ocean and atmospheric conditions are some of the drivers of ice sheet loss that scientists considered in a new study estimating additional global sea level rise by 2100.

Image: 
Jeremy Harbeck/NASA

An international effort that brought together more than 60 ice, ocean and atmosphere scientists from three dozen international institutions has generated new estimates of how much of an impact Earth's melting ice sheets could have on global sea levels by 2100. If greenhouse gas emissions continue apace, Greenland and Antarctica's ice sheets could together contribute more than 15 inches (38 centimeters) of global sea level rise - and that's beyond the amount that has already been set in motion by Earth's warming climate.

The results point to a greater range of possibilities, from ice sheet changes that decrease sea level by 3.1 inches (7.8 centimeters) to ones that increase it by 12 inches (30 centimeters) by 2100, depending on the climate scenario and climate model inputs. The regional projections show the greatest loss in West Antarctica, responsible for up to 7.1 inches (18 centimeters) of sea level rise by 2100 in the warmest conditions, according to the research.

"The Amundsen Sea region in West Antarctica and Wilkes Land in East Antarctica are the two regions most sensitive to warming ocean temperatures and changing currents, and will continue to lose large amounts of ice," said He?le?ne Seroussi, an ice scientist at NASA's Jet Propulsion Laboratory in Southern California. Seroussi led the Antarctic ice sheet modeling in the ISMIP6 effort. "With these new results, we can focus our efforts in the correct direction and know what needs to be worked on to continue improving the projections."

Different groups within the ISMIP6 community are working on various aspects of the ice sheet modeling effort. All are designed to better understand why the ice sheets are changing and to improve estimates of how much ice sheets will contribute to sea level rise. Other recent ISMIP6 studies include:

How historical conditions and warming ocean temperatures that melt floating ice shelves from below play a significant role in Antarctic ice loss (Reese et al., 2020)

How sudden and sustained collapse of the floating ice shelves would impact the Antarctic ice sheet as a whole (Sun et al., 2020)

How to convert large-scale climate output into local conditions that ice sheet models can use (Barthel et al., 2020; Slater et al., 2019, 2020; Nowicki et al., 2020; Jourdain et al., 2020)

"It took over six years of workshops and teleconferences with scientists from around the world working on ice sheet, atmosphere, and ocean modeling to build a community that was able to ultimately improve our sea level rise projections," Nowicki said. "The reason it worked is because the polar community is small, and we're all very keen on getting this problem of future sea level right. We need to know these numbers."

The new results will help inform the Sixth IPCC Assessment Report, scheduled for release in 2022.

Results from this effort are in line with projections in the Intergovernmental Panel on Climate Change's (IPCC) 2019 Special Report on the Ocean and Cryosphere. Meltwater from ice sheets contributes about a third of the total global sea level rise. The IPCC report projected that Greenland would contribute 3.1 to 10.6 inches (8 to 27 cm) to global sea level rise between 2000 and 2100 and that Antarctica could contribute 1.2 to 11 inches (3 to 28 cm).

These new results, published this week in a special issue of the journal The Cryosphere, come from the Ice Sheet Model Intercomparison Project (ISMIP6) led by NASA's Goddard Space Flight Center in Greenbelt, Maryland. The study is one of many efforts scientists are involved in to project the impact of a warming climate on melting ice sheets, understand its causes and track sea level rise.

"One of the biggest uncertainties when it comes to how much sea level will rise in the future is how much the ice sheets will contribute," said project leader and ice scientist Sophie Nowicki, now at the University at Buffalo, and formerly at NASA Goddard. "And how much the ice sheets contribute is really dependent on what the climate will do."

"The strength of ISMIP6 was to bring together most of the ice sheet modeling groups around the world, and then connect with other communities of ocean and atmospheric modelers as well, to better understand what could happen to the ice sheets," said Heiko Goelzer, a scientist from Utrecht University in the Netherlands, now at NORCE Norwegian Research Centre in Norway. Goelzer led the Greenland ice sheet ISMIP6 effort.

With warming air temperatures melting the surface of the ice sheet, and warming ocean temperatures causing ocean-terminating glaciers to retreat, Greenland's ice sheet is a significant contributor to sea level rise. The ISMIP6 team investigated two different scenarios the IPCC has set for future climate to predict sea level rise between 2015 and 2100: one with carbon emissions increasing rapidly and another with lower emissions.

In the high emissions scenario, they found that the Greenland ice sheet would lead to an additional global sea level rise of about 3.5 inches (9 cm) by 2100. In the lower emissions scenario, the loss from the ice sheet would raise global sea level by about 1.3 inches (3 cm). This is beyond what is already destined to be lost from the ice sheet due to warming temperatures between pre-industrial times and now; previous studies have estimated that 'locked in' contribution to global sea level rise by 2100 to be about a quarter-inch (6 millimeters) for the Greenland ice sheet.
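
For a sense of scale, there is a standard back-of-envelope conversion (not specific to this study): roughly 361.8 gigatonnes of melted land ice raise global mean sea level by about 1 millimeter, given an ocean surface area of about 3.6 x 10^8 square kilometers. A short Python sketch applies it to the Greenland figures quoted above:

```python
# Standard conversion for context; the cm values come from the article,
# the implied ice masses are a back-of-envelope illustration.
GT_PER_MM = 361.8  # gigatonnes of ice per mm of global mean sea level rise

def ice_loss_gt(slr_cm):
    """Ice mass loss (Gt) implied by a given sea level rise in cm."""
    return slr_cm * 10.0 * GT_PER_MM

for label, cm in [("high emissions (~9 cm)", 9.0), ("low emissions (~3 cm)", 3.0)]:
    print(f"{label}: ~{ice_loss_gt(cm):,.0f} Gt of ice by 2100")
```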

The ISMIP6 team also analyzed the Antarctic ice sheet to understand how much ice melt from future climate change would add to sea level rise, beyond what recent warming temperatures have already put in motion. Ice loss from the Antarctic ice sheet is more difficult to predict: In the west, warm ocean currents erode the bottom of large floating ice shelves, causing loss; while the vast East Antarctic ice sheet can gain mass, as warmer temperatures cause increased snowfall.

Credit: 
NASA/Goddard Space Flight Center

Study: Europe's old-growth forests at risk

image: UVM forest ecologist Bill Keeton walks a low ridge line in a stand of old-growth beech in the Alps, Kalkalpen National Park, Austria.

Image: 
Hanns Kirchmeir

Like its ancient cathedrals, Europe has a remarkable--but poorly understood--legacy of old-growth forests. These primeval landscapes, scattered on remote hillsides and forested valleys across many countries, are a "living treasure," says University of Vermont scientist Bill Keeton.

A new study, by scientists from 28 institutions including UVM, presents the first comprehensive assessment of the conservation status of these primary forests in Europe--and shows that many of them are not protected and at risk of being destroyed.

Gathering data and mapping for five years, the team's research makes clear that Europe's ancient forests are in a perilous state--and that many of them continue to be logged. The researchers conclude that formal conservation of these forests should be a top priority for countries to meet their climate change and biodiversity goals.

"While many primary forests are in fact well protected, we also found many regions where they are not--particularly where primary forests are still common," says Francesco Sabatini, the study's lead author from the German Centre for Integrative Biodiversity Research and Martin Luther University, Halle-Wittenberg. "And where they are protected, in some cases, the level of protection is inadequate to ensure these forests will be protected in the long-term."

The study also highlights that remaining primary forests are very unevenly distributed across Europe. "Some regions, particularly in Scandinavia and Finland as well as Eastern Europe, still have many primary forests. But often those countries do not realize how unique their forests are at the European scale and how important it is to protect them," says senior author Tobias Kuemmerle from Humboldt University in Berlin. "At the same time, we were shocked to see that there are many natural forests types in Europe without any primary forest remaining at all, particularly in Western Europe."

The European Union has recently put forward a new Biodiversity Strategy for 2030 that highlights the value of old-growth forest; the results of this new study provide valuable information for implementing this strategy, the team notes.

The new research was published on September 16, 2020, in the journal Diversity and Distributions.

WHAT REMAINS?

Earlier research by this same team had shown that many primary forests remain in Europe and modelled where others are likely to occur. "But what we didn't know: are these remaining primary forests representative of the 54 forest types found in Europe? How much of each forest type is protected? And where are opportunities to restore old-growth forest?" says UVM's Bill Keeton, second author on the new study, professor of forest ecology and forestry in the Rubenstein School of Environment and Natural Resources, and fellow in the Gund Institute for Environment. "This research answers these critical questions."

Primary forests are forests without signs of past human use and where ecological processes are not disrupted by human influence. "Primary and old-growth forests have huge value for biodiversity, for carbon and climate mitigation, for flood resilience and other ecological values--and they're important as part of Europe's historical legacy just like their ancient cities and cathedrals," says Keeton. In Europe, where millennia of land use have transformed forested landscapes, very few such forests remain, and these are mostly found in remote and relatively unproductive areas.

The new study found a "substantial bias," the scientists write, in how these remaining primary forests are distributed across forest types. Of the 54 forest types they assessed, they found that six had no remaining old-growth stands at all. And in two-thirds of the forest types, they found that less than one percent was old growth. And only ten forest types had more than half of their old growth strictly protected.

In other words, even though they are scarce and irreplaceable, many of these primary forests are not legally protected and continue to be logged in Europe. However, with swift action, strict conservation protections can be put in place for those that remain, the team says -- plus, old-growth forests, and their many values, can be restored.

RESTORATION

"Notre Dame burned, but it's being restored," says UVM's Keeton. "It won't be exactly the same as the original construction--and there's debate over architectural details and what style to use for its spire--but it will return as an inspiring, ancient place for reflection and worship. The active restoration of old-growth forests is similar. We're not going to create exactly what was there before, but many functions, like habitat and carbon storage, can return." The new study identifies many of the most promising areas for this kind of work.

"Forest restoration to establish primary forests will take a long time, but it is attractive because such forests will not only benefit biodiversity but also store a lot of carbon and therefore help to mitigate climate change," says Tobias Kuemmerle. "The good news is that there are huge opportunities for restoring primary forests even within existing protected areas, which means that restoration efforts would not necessarily require reducing the area of forests used for timber production."

"Now is the time to be ambitious. There is a lot of momentum for forest conservation and restoration in Europe at the moment," says Francesco Sabatini, in part because of the European Union's Biodiversity Strategy for 2030 that explicitly recognizes the irreplaceable value of primary forests. "Our study provides a foundation for putting this strategy into practice," he says.

"Our work shows that all the remaining primary forests in Europe could be protected with a modest expansion of protected areas," says UVM's Bill Keeton, "and I think this study will change the whole dialogue around old forest restoration in Europe, highlighting where that would be most valuable."

Credit: 
University of Vermont

Understanding the movement patterns of free-swimming marine snails

image: During the study, the team of investigators used ZooScan, a flatbed scanning device which allowed them to make high resolution digital images of the zooplankton. This is one of the images that was created of Cuvierina atlantica, a species of thecosome pteropod.

Image: 
Zooplankton Ecology Lab, BIOS

A new study published in the journal Frontiers in Marine Science is changing the way that biological oceanographers view the swimming and sinking behaviors of open ocean, or pelagic, snails. Pteropods and heteropods are small marine snails, most measuring on the order of millimeters to centimeters, that are found throughout the world's ocean from the surface to depths of 3000 feet (1000 meters). Although small in size, these organisms play a vital role in the ocean's food web and biogeochemical cycles, as well as the global carbon cycle.

Led by Ferhat Karakas, a graduate student in mechanical engineering at the University of South Florida (USF), the study was co-authored by Jordan Wingate, a National Science Foundation (NSF) Research Experiences for Undergraduates (REU) intern at the Bermuda Institute of Ocean Sciences (BIOS); Leocadio Blanco-Bercial and Amy Maas, both associate scientists at BIOS; and David Murphy, an assistant professor at USF.

The study looked at the movements, or swimming kinematics, of nine species of warm water pelagic snails found in the waters off Bermuda: seven thecosome pteropods (which may have coiled, elongated, or globular shells), one gymnosome pteropod (which loses its juvenile shell during development), and one heteropod (which has a spiral shell). Pteropods, perhaps the most well-known among the pelagic snails, are often referred to as "sea butterflies," as their snail foot has evolved into a pair of wing-like appendages that appear to "flap" as they move through the water.

Historically, study of these delicate organisms has been difficult, as they cannot be grown and maintained in a laboratory environment. However, the proximity of BIOS to the open ocean allowed living organisms to be collected and transported back to shore in under one hour.

Data collection began immediately upon return and most experiments were completed within one day of collection.

Using a low magnification, high speed 3-D photography system, the research team was able to study the swimming behaviors of the snails, developing detailed models showing their swimming paths (trajectories) through the water column, swimming speeds, "flapping" rates of their appendages, and even the speeds at which they sank and how their shells were oriented as they did so.

"While different large-scale swimming patterns were observed, all species exhibited small-scale sawtooth-shaped swimming trajectories caused by reciprocal appendage flapping," Blanco Bercial said.

The researchers then analyzed zooplankton samples collected from the surface to 3000 feet (1000 meters) with a MOCNESS net system (an array of long, tapered nets and sensors towed behind a research vessel) to determine the abundance and distribution of these organisms off Bermuda. When combined with molecular data and imaging using ZooScan, a device used to make digital images of zooplankton, the team was also able to relate swimming behaviors to night time and day time vertical distributions. Larger species sank down and swam up much faster and could be active at much greater depths, whereas the slower and smaller species were limited to shallower depths. This indicates that size does play a role in the vertical structure of habitat, as well as in predator-prey interactions.

"This project combined the expertise of engineers, molecular biologists, and ecologists, as well as a variety of different technologies, to look at the movement, ecology, and distribution of this beautiful group of organisms," Maas said. "This type of transdisciplinary collaboration doesn't happen very often and it allowed us to learn about an aspect of ocean science that has previously been understudied."

Adding to the uniqueness of this investigation is the role of the study's second author, Jordan Wingate, who was an NSF REU intern at BIOS in 2018 while attending Georgia Military College. During the course of her three-month internship, Wingate worked with Maas on a project that became the basis for this paper, eventually presenting the results of their research at the 2020 Ocean Sciences Meeting in San Diego, California.

"I feel so accomplished to be a published author in a peer-reviewed scientific journal as an undergraduate student," said Wingate, who will graduate from the University of West Florida in the fall of 2021 with a bachelor's degree in marine biology. "I was very fortunate to be able to see this project through from start to finish and I'm grateful to Amy for her mentorship and guidance as I worked through the challenges of learning about pteropods, new computer programming languages, and the data analysis skills required to get this study published."

Credit: 
Bermuda Institute of Ocean Sciences

Wildfire on the rise since 1984 in Northern California's coastal ranges

image: This map overlays the probability of burn severity in California's northern coastal mountains, as forecasted in a UC Davis study, with burn perimeters of wildfires burning in September 2020.

Image: 
UC Davis

High-severity wildfires in northern coastal California have been increasing by about 10 percent per decade since 1984, according to a study from the University of California, Davis, that associates climate trends with wildfire.

The study, published online in Environmental Research Letters, shows that the drought of 2012-2016 nearly quadrupled the area burned severely, compared to the relatively cooler drought of 1987-1992.

"The severity of wildfires has been increasing over the past four decades," said lead author Yuhan Huang, a graduate student researcher at UC Davis. "We found that fires were much bigger and more severe during dry and hot years compared to other climatic conditions."

HEAT WAVE FANS FLAMES

The study area includes coastal foothills and mountains surrounded by Central Valley lowlands to the east and stretching north to the Klamath Mountains. Berryessa Snow Mountain National Monument resides in the southeast portion. It and several areas described in the study have been impacted by wildfire in recent months during a heat wave and the largest wildfire season recorded in California.

"Most of the fires occurring now are exacerbated by this heat wave," said co-leading author Yufang Jin, an associate professor in the UC Davis Department of Land, Air and Water Resources. "Our study shows how prolonged and historic dry conditions lead to extreme behaviors of wildfires, especially when they coincide with warmer temperature."

THE HOT AND DRY DIFFERENCE

The scientists used a machine-learning model that enables near real-time prediction of the likelihood of different levels of fire severity, given ignition. The model shows that during dry years, the northwest and southern parts of the study area are particularly at risk of high-severity fires, although the entire area is susceptible.
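
In outline, such a model maps the conditions at an ignition point to a probability for each severity class. The Python sketch below shows the general pattern with a generic classifier; the features, data, and model choice here are all hypothetical stand-ins, not the study's actual code:

```python
# Generic severity-classification sketch; everything here is hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
n = 500
# Hypothetical predictors per ignition: temperature anomaly,
# precipitation deficit, fuel load
X = rng.normal(size=(n, 3))
# Hypothetical labels: 0 = low, 1 = moderate, 2 = high severity
y = rng.integers(0, 3, size=n)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
new_ignition = [[1.5, 1.2, 0.8]]  # a hot, dry, heavy-fuel ignition point
print(model.predict_proba(new_ignition))  # probability of each severity class
```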

According to the historical data, about 36 percent of all fires between 1984 and 2017 in the mapped area burned at high severity, with dry years experiencing much higher burn severity. During wet years, however, only about 20 percent of burns were considered high-severity fires, while the remainder burned at moderate or low severity. Higher temperature further amplified the severity of wildfires.

The research highlights the importance of careful land-use planning and fuel management in the state's most vulnerable areas to reduce the risk of large, severe fires as the climate becomes drier and warmer.

"Those are things we can control in the short-term," Jin said. "Prioritizing high-risk areas is something more practical to reduce the damages."

Credit: 
University of California - Davis

Scientists sound alarm on plastic pollution

In January 2018, China stopped accepting most plastic recyclables from Western nations. Within days, there was no hiding just how much plastic nations were producing and consuming. Piles of plastic sprang up in Britain, Europe, Canada, the United States, and elsewhere. Other Eastern nations began banning the import of plastic waste. Governments worldwide are now scrambling for solutions to mitigate the growing problem of plastic pollution.

Now a new study shows that despite global commitments to address plastic pollution, growth in plastic waste, or "plastics emissions," continues to outpace reduction. What's more, the study shows that even if governments around the world adhere to their ambitious commitments to curb plastic pollution, annual plastic emissions may increase more than six-fold by 2030.

The study, "Predicted Growth in Plastic Waste Exceeds Efforts to Mitigate Plastic Pollution," published in the Sept. 18 issue of the journal Science, evaluated the level of effort needed to achieve a targeted global reduction in plastic pollution.

This is the first global analysis of both the magnitude of the plastic pollution problem and the relative impact of interventions, like banning plastic bags and straws, said Leah Gerber, professor of conservation science in Arizona State University's School of Life Sciences and co-investigator on the study.

Plastics are slow to degrade, and even when they do, bits of them, known as microplastics, make their way into the aquatic food chain, and eventually into humans. The Great Pacific Garbage Patch, located halfway between California and Hawaii, embodies the growing problem of plastic pollution. The patch is said to cover 1.6 million square kilometers, an area twice the size of Texas.

It's estimated that eight million metric tons of plastic waste enters the world's ocean, lakes and rivers annually. The new study, based on mathematical models, estimates that by 2030, the annual plastic waste of 173 countries may increase to 53 million metric tons.

The study modeled future scenarios to achieve a global reduction target of less than 8 million metric tons by 2030 using existing mitigation strategies: reducing plastic waste, which includes bans on plastic; improving waste management; and environmental clean-up.

To reach that goal, the study found that a 25% to 40% reduction in plastic waste would be required; plastic waste management would have to increase from 6% to 60% in low-income economies; and a clean-up of 40% of annual plastic emissions would be needed.
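
How those three levers might combine is easy to see with a toy calculation. The functional form below is an assumption made for illustration -- the study's actual model is more detailed -- but it shows why all three levers are needed at once to get from 53 million metric tons down under the 8-million-ton target:

```python
# Toy combination of the three levers; the functional form is an assumption.
def residual_emissions(waste_mt, reduction, managed_frac, cleanup_frac):
    """Plastic reaching the environment after source reduction,
    waste management, and post-emission clean-up."""
    generated = waste_mt * (1 - reduction)     # less waste produced
    emitted = generated * (1 - managed_frac)   # unmanaged waste leaks out
    return emitted * (1 - cleanup_frac)        # a share gets cleaned up

projected_2030_mt = 53  # million metric tons per year, from the study
print(residual_emissions(projected_2030_mt, 0.00, 0.06, 0.00))  # ~49.8: little action
print(residual_emissions(projected_2030_mt, 0.40, 0.60, 0.40))  # ~7.6: all levers
```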

The study's findings emphasize that unless growth in plastic production and use is halted (an unlikely scenario), a fundamental transformation of the plastics economy is essential; that is, where end-of-life plastic products are valued rather than discarded as waste.

"There's a lot of popular attention toward clean up, but there hasn't been as much attention to the fact that we're still producing large quantities of plastic," said Gerber. "And where there's not good infrastructure, that plastic is making its way into marine and aquatic habitats."

The study's authors suggest that to achieve a substantial reduction in global plastic emissions requires meaningful policy change. Such changes include reducing or eliminating unnecessary plastics; establishing global limits for new plastics production; creating global standards that ensure plastics are recoverable and recyclable; and developing and scaling plastic processing and recycling technologies.

"In the U.S. we're huge consumers of single-use plastic," said Gerber. "I'm hopeful that our findings will get people to rethink these consumption patterns. Even here in Arizona, the choices we make impact the future of our oceans."

Credit: 
Arizona State University

Reduction in insomnia symptoms associated with non-invasive neurotechnology

WINSTON-SALEM, N.C. -- September 17, 2020 -- For people with chronic insomnia, a good night's sleep is elusive. But what if insomnia symptoms could be alleviated by simply listening to one's own brainwaves?

Researchers at Wake Forest Baptist Health conducted a clinical trial that showed reduced insomnia symptoms and improved autonomic nervous system function using a closed-loop acoustic stimulation neurotechnology. The study is published in the September 17 online edition of the journal Brain and Behavior.

High-resolution, relational, resonance-based electroencephalic mirroring (HIRREM) uses scalp sensors to monitor brainwaves and software algorithms to translate specific frequencies into audible tones of varying pitch in real time.

These tones, linked to the brainwaves, are echoed back instantaneously via earbuds. This gives the brain a chance to listen to itself, in effect looking at itself in an acoustic mirror.
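
HIRREM's actual algorithms are proprietary, so the following is only a conceptual sketch of the general closed-loop idea the article describes: estimate the dominant frequency in a short window of EEG, map it onto an audible pitch, and play that tone back. The sampling rate, window length, and pitch range below are illustrative assumptions, not the device's real parameters.

```python
import numpy as np

# Conceptual sketch of a closed-loop frequency-to-tone mapping.
# HIRREM's real algorithms are proprietary; the sampling rate, window
# length, and pitch mapping here are illustrative assumptions only.

FS = 256         # assumed EEG sampling rate (Hz)
WINDOW_S = 1.0   # assumed analysis window (seconds)

def dominant_frequency(eeg_window: np.ndarray, fs: int = FS) -> float:
    """Return the frequency (Hz) carrying the most spectral power."""
    spectrum = np.abs(np.fft.rfft(eeg_window))
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    return float(freqs[1:][np.argmax(spectrum[1:])])  # skip the DC bin

def frequency_to_pitch(eeg_hz: float) -> float:
    """Map a 1-40 Hz brainwave frequency linearly onto a 200-2000 Hz tone."""
    eeg_hz = float(np.clip(eeg_hz, 1.0, 40.0))
    return 200.0 + (eeg_hz - 1.0) / 39.0 * 1800.0

# One pass of the loop on synthetic data standing in for a scalp sensor:
t = np.arange(int(FS * WINDOW_S)) / FS
fake_eeg = np.sin(2 * np.pi * 10 * t)  # 10 Hz alpha-band test signal
f = dominant_frequency(fake_eeg)
print(f"dominant {f:.1f} Hz -> {frequency_to_pitch(f):.0f} Hz tone")
```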

"Sleep is foundational for optimal health, healing and well-being," said principal investigator Charles H. Tegeler, M.D., chair of neurology at Wake Forest School of Medicine, part of Wake Forest Baptist Health.

"HIRREM is a unique non-drug, noninvasive, acoustic neuromodulation intervention that supports the brain to balance and quiet itself. Our results show durable benefit for both reduced symptoms of insomnia and significantly improved objective measures of autonomic function."

HIRREM technology supports the brain in self-adjusting, resetting what may have become stuck trauma and stress patterns that are believed to contribute to insomnia, Tegeler said. The brain pattern is observed to shift toward improved balance and reduced hyperarousal with no conscious, cognitive activity required.

According to the American Academy of Sleep Medicine, about 30 to 35% of Americans have experienced insomnia, which can reduce life expectancy and increase the risk of cardiovascular events, obesity, diabetes and other illnesses.

The study included 107 adult men and women with moderate to severe insomnia. Approximately half received the HIRREM intervention, and the placebo group received an active intervention of random tones. All participants kept a daily sleep diary, and each received ten 60-minute intervention sessions (either HIRREM or placebo) over a three-week period.

In the study, changes were recorded on the Insomnia Severity Index (ISI), a self-reporting instrument to assess insomnia symptoms. Researchers also recorded heart rate and blood pressure to objectively analyze autonomic cardiovascular regulation.

After completion of the intervention sessions and at follow-up visits up to four months later, subjects in the HIRREM group reported clinically meaningful reductions in insomnia symptoms. Four months following the intervention, 78% of those receiving HIRREM reported no significant insomnia symptoms. They also showed significant, durable improvements in autonomic function across multiple objective measures of heart rate variability (HRV) and baroreflex sensitivity (BRS) compared to those who received random tones. HRV is a powerful biometric that reflects the health of the autonomic nervous system, and BRS measures blood pressure regulation.

In this study, HIRREM participants were five times more likely than those receiving placebo to improve their HRV, measured as rMSSD (the root mean square of successive differences between heartbeats), by more than 50%. They were also twice as likely as placebo recipients to improve their BRS by more than 50%.
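
rMSSD, the root mean square of successive differences between normal heartbeat (RR) intervals, is a standard time-domain HRV metric; higher values generally reflect greater parasympathetic activity. The short sketch below computes it from scratch. The RR intervals are invented illustrative values, not data from the trial.

```python
import numpy as np

# rMSSD: root mean square of successive differences between RR intervals.
# The interval series below are invented for illustration (milliseconds);
# they are not data from the trial.

def rmssd(rr_ms: np.ndarray) -> float:
    diffs = np.diff(rr_ms)                 # beat-to-beat differences
    return float(np.sqrt(np.mean(diffs ** 2)))

before = np.array([812.0, 790.0, 835.0, 801.0, 824.0, 798.0])
after = np.array([830.0, 772.0, 861.0, 795.0, 852.0, 780.0])  # more variable

change = (rmssd(after) - rmssd(before)) / rmssd(before)
print(f"rMSSD before: {rmssd(before):.1f} ms, after: {rmssd(after):.1f} ms "
      f"({change:+.0%})")
```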

These changes may lead to long-term improvement in the cardiovascular health of the participants, Tegeler said. There were no serious adverse events, and less than 6% of study participants dropped out.

"These findings add to the rapidly growing interest in neuromodulation and demonstrate that a brief intervention with closed-loop acoustic stimulation can improve sleep in a meaningful way, while also improving autonomic function," Tegeler said. "It's an important alternative approach for people who suffer from insomnia."

Credit: 
Atrium Health Wake Forest Baptist

NASA finds Tropical Storm Noul packing a punch

image: On Sept. 17 at 10:30 a.m. EDT (1430 UTC) the MODIS instrument that flies aboard NASA's Aqua satellite revealed a large area of the most powerful thunderstorms (yellow) around Noul's center, where cloud top temperatures were as cold as minus 80 degrees Fahrenheit (minus 62.2 degrees Celsius). Those storms were over the central Vietnam coast and extended over the South China Sea. Strong storms (red) with cloud top temperatures as cold as minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius) surrounded the center and were generating large amounts of rain.

Image: 
NASA/NRL

Powerful storms with heavy rainmaking capabilities appeared over the coast of central Vietnam in NASA-provided infrared imagery on Sept. 17.

NASA's Infrared Data Reveals Heavy Rainmakers

Tropical cyclones are made up of hundreds of thunderstorms, and infrared data can show where the strongest storms are located. That is because infrared data provides temperature information, and the strongest thunderstorms that reach highest into the atmosphere have the coldest cloud top temperatures.
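
As an illustration of how such thresholds work in practice, the toy sketch below flags hypothetical cloud-top temperatures using the color scale quoted in this article (yellow for tops colder than minus 80 degrees Fahrenheit, red for tops colder than minus 70 degrees Fahrenheit). It is not NASA's MODIS processing pipeline, and the sample temperatures are invented.

```python
import numpy as np

# Toy thresholding of infrared cloud-top temperatures using the color
# scale quoted in the article. Not NASA's MODIS processing pipeline;
# the sample pixel temperatures are invented.

def f_to_c(temp_f: float) -> float:
    return (temp_f - 32.0) * 5.0 / 9.0

cloud_tops_f = np.array([-85.0, -76.0, -71.0, -60.0, -45.0])  # hypothetical pixels

strongest = cloud_tops_f <= -80.0              # "yellow": heaviest rainmakers
strong = (cloud_tops_f <= -70.0) & ~strongest  # "red": strong storms

labels = np.where(strongest, "strongest",
                  np.where(strong, "strong", "weaker"))
for temp, label in zip(cloud_tops_f, labels):
    print(f"{temp:6.1f} F ({f_to_c(temp):6.1f} C): {label}")
```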

On Sept. 17 at 10:30 a.m. EDT (1430 UTC), the Moderate Resolution Imaging Spectroradiometer, or MODIS, instrument that flies aboard NASA's Aqua satellite revealed a large area of the most powerful thunderstorms (yellow) around Noul's center, where cloud top temperatures were as cold as minus 80 degrees Fahrenheit (minus 62.2 degrees Celsius). Those storms were mostly in the western quadrant of the storm and over the central Vietnam coast, and they extended over the South China Sea. Strong storms with cloud top temperatures as cold as minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius) surrounded the center and were generating large amounts of rain.

Noul was encountering some northeasterly vertical wind shear; that is, outside winds blowing from the northeast were pushing the bulk of clouds and precipitation to the southwest of the center of circulation.

The National Centre for Hydro-Meteorological Forecasting (NCHMF) is a governmental organization belonging to the Vietnam Meteorological and Hydrological Administration (VMHA). NCHMF has the authority to issue forecasts and warnings for weather, climate, hydrology, water resources, and marine weather (i.e., hydrometeorology) and to provide hydrometeorological services. NCHMF has issued coastal warnings for central Vietnam. Those warnings can be found on its website: https://nchmf.gov.vn/KttvsiteE/en-US/2/index.html.

Noul's Status on Sept. 17

At 11 a.m. EDT (1500 UTC), Noul had maximum sustained winds near 45 knots. It was located over the coastline of central Vietnam, near latitude 15.9 degrees north and longitude 110.9 degrees east, about 191 nautical miles east of Da Nang, Vietnam. Noul was moving to the west and was forecast to move across Laos and Thailand over the next couple of days, weakening due to interaction with land.
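
For readers more comfortable with statute or metric units, the short sketch below converts the advisory's figures using the standard definitions (1 knot = 1.852 km/h, roughly 1.151 mph; 1 nautical mile = 1.852 km). The wind speed and distance are taken from the advisory above, not computed here.

```python
# Standard unit conversions for the advisory's figures. The storm values
# are taken from the text above; only the conversions are computed here.

KNOT_TO_MPH = 1.15078  # 1 knot in statute miles per hour
KNOT_TO_KMH = 1.852    # 1 knot in kilometers per hour
NM_TO_KM = 1.852       # 1 nautical mile in kilometers

winds_kt = 45.0      # maximum sustained winds from the advisory
distance_nm = 191.0  # distance east of Da Nang from the advisory

print(f"{winds_kt:.0f} knots = {winds_kt * KNOT_TO_MPH:.0f} mph "
      f"= {winds_kt * KNOT_TO_KMH:.0f} km/h")
print(f"{distance_nm:.0f} nautical miles = {distance_nm * NM_TO_KM:.0f} km")
```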

NASA Researches Earth from Space

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

Credit: 
NASA/Goddard Space Flight Center