Tech

Lockdown study reports surge in health anxieties

New research into people's coping strategies in the face of COVID-19 highlights the mental health toll for those shielding

Coronavirus and the imposition of lockdown this year 'significantly raised' mental health challenges, particularly for the most vulnerable groups, including those shielding, according to the first study to look at people's coping styles in the face of the pandemic.

The new research, published today [Tuesday 4 August 2020] in the journal American Psychologist, draws on survey responses from over 800 people recruited online and via social media who answered questions over a ten-day period when the UK was in full lockdown (17 to 26 April 2020).

The study, from psychologists at the University of Bath, is the first to substantiate extensive media debate suggesting that health anxieties were heightened as a result of the pandemic, and the first to indicate that those in vulnerable groups were clinically more distressed as a result.

Results suggest that a quarter of all participants experienced significantly elevated anxiety and depression, exacerbated by lockdown and isolation. Nearly 15% reached clinical levels of health anxiety, meaning their health-related anxiety had become distressing and was likely causing preoccupation and disrupting normal activities. Health anxiety centres on the fear of having or contracting a serious illness despite medical reassurance.

Lead author, Dr Hannah Rettie from the University of Bath's Department of Psychology explains: "The COVID-19 pandemic has caused global uncertainty which has had a direct, detrimental effect on so many people across the UK and around the world. People have been unsure when they would see relatives again, job security has been rocked, there is an increased threat to many people's health and government guidance is continuously changing, leading to much uncertainty and anxiety.

"What our research focused in on is how some individuals have struggled to tolerate and adapt to these uncertainties - much more so than in normal times. These results have important implications as we move to help people psychologically distressed by these challenging times in the weeks, months and years ahead."

Deeper analysis reveals that those in vulnerable groups - classified according to the UK government's 'vulnerable' categories - report twice the rate of health-related anxiety of the general population. Those who identified themselves in these categories were on average more anxious and depressed, with anxiety and health anxiety in particular significantly higher than in non-vulnerable groups. Those in the vulnerable group are at risk both physically and psychologically.

The average age of participants in the study was 38, and 22% had a pre-existing medical condition. The majority of respondents were female (80% female, 20% male).

The team who led the work hope their findings can help inform clinical practice in dealing with the mental health aftermath caused by these tumultuous past six months. They suggest one of the most important findings concerns those in vulnerable groups who demonstrate significantly higher levels of distress yet are also those most likely to have shielded for longest. This needs to be addressed by policymakers to ensure adequate and appropriately tailored provision of mental health services moving forwards, they say.

The researchers suggest that clinicians could use their findings to target intolerance of uncertainty as part of standard psychological therapies, focussing on developing coping skills to reduce distress. This could also be extended to public resources, drawing out individuals' abilities to manage uncertainty and reduce reliance on less effective coping strategies, for example denial or self-blame.

Research lead Dr Jo Daniels also of the Department of Psychology at Bath, who has written and spoken extensively about health anxiety and how this relates to coronavirus, added: "This is important research which looks at the potential mechanisms in COVID-19 related distress, a recently prioritised area of research. These findings can help us to tailor our existing psychological treatments to help those most in need but may also be useful in considering what coping strategies might be particularly helpful at a new time of uncertainty.

"We are also now better informed as to the likely number of the population that are experiencing clinical levels of health-related anxiety. This may serve to normalise distress at this difficult time and promote the uptake of emerging models of COVID-19 related distress for those who may need support at this time of uncertainty."

"While this research offers important insights into how common distress was during 'lockdown', it is important to stress that anxiety is a normal response to an abnormal situation such as a pandemic. It can be helpful to mobilise precautionary behaviours such as hand-washing and social distancing. Yet for many, as reflected in our findings, anxiety is reaching distressing levels and may continue despite easing of restrictions - it is essential we create service provision to meet this need, which is likely to be ongoing, particularly with current expectations of a second wave. Further longitudinal research is needed to establish how this may change over time."

Credit: 
University of Bath

Epidemic model shows how COVID-19 could spread through firefighting camps

image: Larimer County and Wellington firefighters mop up a spot fire area on the Elk Fire, Oct. 18, 2019.

Image: 
Bill Cotton/Colorado State University Photography

With wildfire season in full swing, a COVID-19 outbreak at a traditional large fire camp is a potential disaster. A transient, high-density workforce of firefighters and volunteers responds to blazes while staying in close quarters with limited hygiene - conditions that could facilitate the spread of a contagious respiratory disease.

To support fire agencies as they continue their mission-critical work, a team that includes Colorado State University experts has developed an epidemiological modeling exercise for the USDA Forest Service and other fire managers that demonstrates the risks COVID-19 could pose for the fire management community under various scenarios. Their model is published in the journal Fire.

The report is co-authored by Jude Bayham, assistant professor in the CSU Department of Agricultural and Resource Economics; and Erin Belval, research scientist in the CSU Department of Forest and Rangeland Stewardship; with first author Matthew P. Thompson, Research Forester at the USDA Forest Service Rocky Mountain Research Station. Bayham and Belval worked with Thompson on the study under a longstanding joint venture agreement with the Forest Service on wildfire-related research, which primarily operates through a partnership with the Warner College of Natural Resources. Thompson serves as the team's liaison to the fire management community.

The researchers developed a simulation model of COVID-19 in the context of a wildfire incident in which the population of firefighters changes over time. The team then analyzed a range of scenarios with different infection transmission rates, percentages of arriving workers who are infected, and fatality rates.

They applied their model to real firefighter population data from three recent wildfires - Highline, Lolo Peak and Tank Hollow - to illustrate potential outbreak dynamics.

During the Highline fire in Idaho, for example, which at its peak had over 1,000 firefighters on site (see Figure 1), a worst-case scenario would have seen close to 500 infections, while a best-case scenario would have seen eight (see Figure 7). The researchers used a variety of infection fatality rates to estimate possible deaths due to COVID-19 on the fires, ranging from a low of 0.1% to an "extreme" of 2%, with a medium, or best-guess, rate of 0.3% (see Table 1).
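The scenario comparisons above can be illustrated with a toy compartmental (SIR-type) simulation. The transmission rate, arrival infection fraction and fatality rates below are hypothetical placeholders, not the study's calibrated values, and the camp population is held fixed rather than changing over time as in the published model.

```python
# Toy SIR-style outbreak sketch for a fire camp.
# All parameters are illustrative stand-ins, not the published model's values.

def simulate_camp(days=60, pop=1000, beta=0.3, gamma=0.1,
                  arrival_infected_frac=0.01):
    """Return cumulative infections over the incident."""
    s = pop * (1 - arrival_infected_frac)   # susceptible firefighters
    i = pop * arrival_infected_frac         # infectious on arrival
    r = 0.0                                 # recovered
    total_infected = i
    for _ in range(days):
        new_cases = beta * s * i / pop      # frequency-dependent transmission
        new_recoveries = gamma * i
        s -= new_cases
        i += new_cases - new_recoveries
        r += new_recoveries
        total_infected += new_cases
    return total_infected

# Compare scenarios, as the paper does, rather than predict absolute numbers.
for beta in (0.1, 0.3, 0.5):
    cases = simulate_camp(beta=beta)
    # Apply a range of infection fatality rates (0.1%, 0.3%, 2%).
    deaths = {ifr: cases * ifr for ifr in (0.001, 0.003, 0.02)}
    print(f"beta={beta}: ~{cases:.0f} cumulative infections")
```

The point of such an exercise is the spread between scenarios: varying the assumed transmission rate changes the outcome by an order of magnitude, which is why the authors stress comparison over prediction.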

Model is not a prediction

Like most modeling exercises, the report is not intended to predict real numbers; rather, it is a tool for comparing different scenarios and analyzing how various interventions could have small or large effects.

"There is a need in the modeling community to better communicate what we can and cannot learn from models," Bayham said. "The model itself is not meant to be predictive in the sense of number of cases or deaths, because there are so many things moving."

Bayham said the model does provide insight into the relative benefits of two risk-mitigation strategies: screening, and implementing social distancing measures at camps.

They found that aggressive screening as soon as firefighters arrive at camp could reduce the spread of infection, but those benefits diminish as a wildfire incident goes on longer. For longer campaigns lasting several months, aggressive social distancing measures, including increased use of remote briefings, dispersed sleeping camps, and operating under the "module as one" concept, would be more effective at reducing infections than screening. "Module as one" is a social distancing adaptation in which a crew operates mostly as normal but isolates from other, similarly isolating crews.

"It all comes down to exposure, which is a basic risk management concept," Thompson said. "Reducing the exposure of susceptible individuals to those who may be infectious is the idea behind screening and social distancing. Our results underscore the importance of deploying these risk mitigation measures and provide insights into how characteristics of a wildfire incident factor into the effectiveness of these mitigations."

Bayham added, "Both interventions are useful, and they both have an effect, but they each have times and places where they are even more effective."

Such findings could help inform the wildland fire management community as it develops guidance for fire response strategies during the pandemic.

Thompson added, "I'm fortunate to have worked with Jude and Erin for several years now, and in my opinion their collective depth and breadth of expertise is uniquely well suited to address this complex issue. We're grateful for the support from the Joint Fire Science Program and more broadly the fire management community to continue this important work."

Extending the work

The team will continue their work with a $74,200 award from the Joint Fire Science Program by way of the USDA Forest Service Rocky Mountain Research Station joint venture agreement. They plan to extend their model and create an interactive dashboard for agencies to provide real-time modeling and risk assessment support as fire season continues.

They are also working on a model that would be better suited to analyze season-long implications of COVID-19 outbreaks, spread across multiple fires and geographic distances.

Credit: 
Colorado State University

Break it down: A new way to address common computing problem

In this era of big data, there are some problems in scientific computing that are so large, so complex and contain so much information that attempting to solve them would be too big a task for most computers.

Now, researchers at the McKelvey School of Engineering at Washington University in St. Louis have developed a new algorithm for solving a common class of problem -- known as linear inverse problems -- by breaking them down into smaller tasks, each of which can be solved in parallel on standard computers.

The research, from the lab of Jr-Shin Li, professor in the Preston M. Green Department of Electrical & Systems Engineering, was published July 30 in the journal Scientific Reports.

In addition to providing a framework for solving this class of problems, the approach, called Parallel Residual Projection (PRP), also delivers enhanced security and mitigates privacy concerns.

Linear inverse problems take observational data and try to find a model that describes it. In their simplest form, they may look familiar: 2x+y = 1, x-y = 3. Many a high school student has solved for x and y without the help of a supercomputer.
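As a concrete illustration, the toy system above can be solved numerically in a few lines; large-scale inverse problems are this same computation scaled up to millions of equations and unknowns. This is a generic sketch, not the paper's code.

```python
import numpy as np

# The toy system from the text: 2x + y = 1, x - y = 3.
A = np.array([[2.0, 1.0],
              [1.0, -1.0]])
b = np.array([1.0, 3.0])

x = np.linalg.solve(A, b)   # exact solve for a small square system
print(x)                    # x = 4/3, y = -5/3

# Real, data-driven systems are usually over-determined and noisy,
# in which case a least-squares solution is computed instead:
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
```

For millions of unknowns, neither the matrix nor a direct solve fits on one machine, which is the situation the new framework addresses.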

And as more researchers in different fields collect increasing amounts of data in order to gain deeper insights, these equations continue to grow in size and complexity.

"We developed a computational framework to solve for the case when there are thousands or millions of such equations and variables," Li said.

The project was conceived while the lab was working on research problems from other fields involving big data. Li's lab had been working with a biologist researching the network of neurons that governs the sleep-wake cycle.

"In the context of network inference, looking at a network of neurons, the inverse problem looks like this," said Vignesh Narayanan, a research associate in Li's lab:

Given the data recorded from a bunch of neurons, what is the 'model' that describes how these neurons are connected with each other?

"In an earlier work from our lab, we showed that this inference problem can be formulated as a linear inverse problem," Narayanan said.

If the system has a few hundred nodes -- in this case, the nodes are the neurons -- the matrix which describes the interaction among neurons could be millions by millions; that's huge.

"Storing this matrix itself exceeds the memory of a common desktop," said Wei Miao, a PhD student in Li's lab.

Add to that the fact that such complex systems are often dynamic, as is our understanding of them. "Say we already have a solution, but now I want to consider interaction of some additional cells," Miao said. Instead of starting a new problem and solving it from scratch, PRP adds flexibility and scalability. "You can manipulate the problem any way you want."

Even if you do happen to have a supercomputer, Miao said, "There is still a chance that by breaking down the big problem, you can solve it faster."

In addition to breaking down a complex problem and solving it in parallel on different machines, the computational framework also, importantly, consolidates the results and computes an accurate solution to the initial problem.
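The decompose-and-consolidate idea can be sketched with a block residual-projection iteration in the spirit of Kaczmarz-type methods. This is a generic illustration with an assumed row partitioning, not the authors' PRP algorithm.

```python
import numpy as np

def block_residual_projection(A, b, n_blocks=8, iters=1000):
    """Solve a consistent system Ax = b by cycling through row-blocks
    and projecting the current iterate onto each block's solution set.

    Each block's projection could run on a separate machine; here they
    are simply looped over. Generic sketch, not the published PRP method.
    """
    m, n = A.shape
    x = np.zeros(n)
    blocks = np.array_split(np.arange(m), n_blocks)
    for _ in range(iters):
        for idx in blocks:
            Ab, bb = A[idx], b[idx]
            r = bb - Ab @ x                            # block residual
            # minimum-norm correction that satisfies this block exactly
            x += np.linalg.lstsq(Ab, r, rcond=None)[0]
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(40, 10))
x_true = rng.normal(size=10)
b = A @ x_true                                         # consistent system
x_hat = block_residual_projection(A, b)
print("error:", np.linalg.norm(x_hat - x_true))        # shrinks toward zero
```

Because each pass touches only one block of rows at a time, no single worker ever needs the full matrix in memory, which also hints at the privacy benefit discussed below: each worker sees only its own slice of the data.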

An unintentional benefit of PRP is enhanced data security and privacy. When credit card companies use algorithms to research fraud, or a hospital wants to analyze its massive database, "No one wants to give all of that access to one individual," Narayanan said.

"This was an extra benefit that we didn't even strive for," Narayanan said.

Credit: 
Washington University in St. Louis

How thoughts could one day control electronic prostheses, wirelessly

image: Photo of a current neural implant that uses wires to transmit information and receive power. New research suggests how to one day cut the wires.

Image: 
Sergey Stavisky

Stanford researchers have been working for years to advance a technology that could one day help people with paralysis regain use of their limbs, and enable amputees to use their thoughts to control prostheses and interact with computers.

The team has been focusing on improving a brain-computer interface, a device implanted beneath the skull on the surface of a patient's brain. This implant connects the human nervous system to an electronic device that might, for instance, help restore some motor control to a person with a spinal cord injury, or someone with a neurological condition like amyotrophic lateral sclerosis, also called Lou Gehrig's disease.

The current generation of these devices records enormous amounts of neural activity, then transmits these brain signals through wires to a computer. But when researchers have tried to create wireless brain-computer interfaces to do this, transmitting the data took so much power that the devices generated too much heat to be safe for the patient.

Now, a team led by electrical engineers and neuroscientists Krishna Shenoy, PhD, and Boris Murmann, PhD, and neurosurgeon and neuroscientist Jaimie Henderson, MD, have shown how it would be possible to create a wireless device capable of gathering and transmitting accurate neural signals while using a tenth of the power required by current wired systems. These wireless devices would look more natural than the wired models and give patients freer range of motion.

Graduate student Nir Even-Chen and postdoctoral fellow Dante Muratore, PhD, describe the team's approach in a Nature Biomedical Engineering paper.

The team's neuroscientists identified the specific neural signals needed to control a prosthetic device, such as a robotic arm or a computer cursor. The team's electrical engineers then designed the circuitry that would enable a future wireless brain-computer interface to process and transmit these carefully identified and isolated signals, using less power and thus making it safe to implant the device on the surface of the brain.

To test their idea, the researchers collected neuronal data from three nonhuman primates and one human participant in a BrainGate clinical trial.

As the subjects performed movement tasks, such as positioning a cursor on a computer screen, the researchers took measurements. The findings validated their hypothesis that a wireless interface could accurately control an individual's motion by recording a subset of action-specific brain signals, rather than acting like the wired device and collecting brain signals in bulk.
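The idea of transmitting only a subset of action-specific signals can be sketched with a toy decoder on synthetic data: rank recording channels by how strongly they correlate with the movement variable, keep only the top few, and decode from those alone. The channel counts, noise model and selection rule below are invented for illustration; this is not the BrainGate pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_channels = 500, 96        # e.g. a 96-channel array (illustrative)
velocity = rng.normal(size=n_samples)  # synthetic 1-D cursor velocity

# In this toy data, only 10 channels actually carry movement information.
weights = np.zeros(n_channels)
weights[:10] = rng.normal(size=10)
signals = (np.outer(velocity, weights)
           + 0.5 * rng.normal(size=(n_samples, n_channels)))

# Rank channels by |correlation| with the behavior and keep the top 10.
corr = np.array([abs(np.corrcoef(signals[:, c], velocity)[0, 1])
                 for c in range(n_channels)])
keep = np.argsort(corr)[-10:]

# Decode from the subset: far less data to transmit, little accuracy lost.
coef, *_ = np.linalg.lstsq(signals[:, keep], velocity, rcond=None)
pred = signals[:, keep] @ coef
r2 = 1 - np.var(velocity - pred) / np.var(velocity)
print(f"decoding R^2 from {len(keep)} of {n_channels} channels: {r2:.2f}")
```

Transmitting 10 channels instead of 96 is the kind of data reduction that, in hardware, translates directly into lower radio power and less heat.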

The next step will be to build an implant based on this new approach and proceed through a series of tests toward the ultimate goal.

Credit: 
Stanford University School of Engineering

Changes in land evaporation shape the climate

image: To produce better climatic predictions, scientists estimate how much water is evaporated from the vegetated land surface

Image: 
Ákos Szabó

Accurate estimation of how much water is evaporated from the vegetated land surface is a challenging task. A physics-based method--such as the complementary relationship (CR) of evaporation, which explicitly accounts for the dynamic feedback mechanisms in the soil-land-atmosphere system and requires minimal data--is advantageous for tracking the ongoing changes in the global hydrological cycle and relating them to historical base values.

Recently developed remote sensing-based approaches, by contrast, cannot serve this purpose, as their data are typically available only for the last couple of decades or so.

An international team of Hungarian, American and Chinese scientists has demonstrated that an existing calibration-free version of the CR method, which inherently tracks aridity changes at each step of the calculation, detects long-term trends in continental-scale land evaporation rates better than a recently developed, globally calibrated version that lacks such dynamic adjustments to aridity.

With ongoing climate change, the global hydrological cycle is being significantly affected. As climate research indicates, wet areas will in general get even wetter, while dry ones will get drier, which is not the best scenario for the vast semi-arid and arid regions of the globe. To produce better climatic predictions, general circulation models need to upgrade their existing evaporation estimation algorithms. A computational method that automatically adjusts its predictions to short- as well as long-term changes in aridity can improve the existing algorithms employed by these climate models.

"By repeatedly demonstrating the superb capabilities of our calibration-free evaporation method in all venues accessible to us, our ultimate goal is to have the climate modeling community take notice and give it a try," explains Dr Jozsef Szilagyi, the lead author of the study. "As it requires only a few surface-measured meteorological input variables, such as air temperature, humidity, wind speed and net surface radiation, without detailed information on the soil moisture status or land-surface properties, it can be readily applied to available historical records of meteorological data to see whether it indeed improves past predictions of the climate."

"Any changes in land use and land cover are inherently accounted for by the CR method via its dynamic aridity term, which does not even require precipitation measurements--one of the most variable and difficult meteorological parameters to predict," he concludes.
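To make the CR idea concrete, here is a minimal sketch using one widely cited polynomial form of the complementary relationship (Brutsaert, 2015), which maps the ratio of wet-environment to potential evaporation into an actual evaporation estimate. The study's calibration-free method adds a dynamic aridity adjustment not reproduced here, and all input values below are invented for illustration.

```python
def cr_evaporation(e_pot, e_wet):
    """Actual land evaporation from a polynomial complementary relationship:
    E = (2 - x) * x^2 * e_pot, with x = e_wet / e_pot.

    e_pot: potential (e.g. Penman) evaporation, mm/day, computed from air
           temperature, humidity, wind speed and net surface radiation.
    e_wet: wet-environment (e.g. Priestley-Taylor) evaporation, mm/day.
    Generic textbook form, not the paper's calibration-free variant.
    """
    x = min(e_wet / e_pot, 1.0)   # x reaches 1 in a fully wet environment
    return (2.0 - x) * x**2 * e_pot

# Hypothetical values: a semi-arid day (high atmospheric demand, low
# wet-environment supply) versus a humid day (demand close to supply).
print(cr_evaporation(e_pot=8.0, e_wet=4.0))  # well below e_wet
print(cr_evaporation(e_pot=4.5, e_wet=4.0))  # close to e_wet
```

The aridity sensitivity is built into the ratio x itself: as the environment dries out, e_pot rises while e_wet falls, and the estimate drops automatically without any precipitation input, which is the behavior the quote above emphasizes.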

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Small trees offer hope for rainforests

image: Drought in the Amazon affects larger trees more severely as they are more likely to die from hydraulic failure

Image: 
David Bartholomew

Small trees that grow up in drought conditions could form the basis of more drought-resistant rainforests, new research suggests.

Severe and long-lasting droughts are becoming more common in the Amazon, often killing large trees that form the forest canopy.

But a new study, led by the University of Exeter, suggests small trees adapt better to droughts and could grow into a new generation to help the rainforest survive.

Using data from a long-running drought experiment in Brazil, the scientists discovered small trees respond positively to the extra light they get when larger trees die, managing to increase their capacity for photosynthesis and their growth despite the lack of water.

"Conditions in the Amazon are shifting due to climate change, and trees will have to adapt if they are to survive," said lead author David Bartholomew, of Exeter's Global Systems Institute.

"Our findings show that small trees are more capable of changing their physiology in response to environmental changes than their larger neighbours.

"Having grown up in drought conditions, these trees might develop traits that will help them cope with future droughts - even once they are fully grown.

"Ultimately, this may allow them to form the next generation of canopy trees, leading to greater overall resilience in the forest."

The study examined trees in a 15-year Amazonian drought experiment, in which clear plastic panels catch 50% of rainfall.

Researchers sampled 66 small trees (1-10 cm diameter at a height of 1.3 m from the ground) and 61 large trees (more than 20 cm in diameter) in the drought experiment area and a nearby control area with no rainfall exclusion.

Small trees in the drought area showed increased capacity for photosynthesis (Jmax 71%, Vcmax 29%), 32% more leaf respiration and 15% more leaf mass per area compared to small trees in the control area.

"This long-running experiment has shown that large trees are quite vulnerable to drought, and probably won't survive if droughts continue to become more common and severe," said Bartholomew, a PhD student on the NERC GW4+ Doctoral Training Partnership.

"However, relatively little is known about the response of small understorey trees which could be vital in determining the future of tropical forests.

"The understorey of an intact rainforest is usually a dark and humid environment.

"Trees found in low-light conditions will typically downregulate their photosynthetic capacity to conserve resources.

"However, if droughts cause larger trees to die, these trees will have to adapt to both decreasing water availability and increased light.

"Our study suggests they have a remarkable ability to do this."

The responses of tree species in the study varied, with some showing a strong ability to adapt and some showing very little.

More research is needed to understand how this might change the makeup of the famously diverse Amazon rainforest in the future.

Credit: 
University of Exeter

Methanol synthesis: Insights into the structure of an enigmatic catalyst

image: Holger Ruland, Daniel Laudenschleger and Martin Muhler (left to right) collaborated for the study.

Image: 
RUB, Marquard

Methanol is one of the most important basic chemicals used, for example, to produce plastics or building materials. To render the production process even more efficient, it would be helpful to know more about the copper/zinc oxide/aluminium oxide catalyst deployed in methanol production. To date, however, it hasn't been possible to analyse the structure of its surface under reaction conditions. A team from Ruhr-Universität Bochum (RUB) and the Max Planck Institute for Chemical Energy Conversion (MPI CEC) has now succeeded in gaining insights into the structure of its active site. The researchers describe their findings in a paper published in the journal Nature Communications on 4 August 2020.

In a first, the team showed that the zinc component of the active site is positively charged and that the catalyst in fact has two copper-based active sites. "The state of the zinc component at the active site has been the subject of controversial discussion since the catalyst was introduced in the 1960s. Based on our findings, we can now derive numerous ideas on how to optimise the catalyst in the future," outlines Professor Martin Muhler, Head of the Department of Industrial Chemistry at RUB and Max Planck Fellow at MPI CEC. For the project, he collaborated with Bochum-based researcher Dr. Daniel Laudenschleger and Mülheim-based researcher Dr. Holger Ruland.

Sustainable methanol production

The study was embedded in the Carbon-2-Chem project, the aim of which is to reduce CO2 emissions by utilising metallurgical gases produced during steel production for the manufacture of chemicals. In combination with electrolytically produced hydrogen, metallurgical gases could also serve as a starting material for sustainable methanol synthesis. As part of the Carbon-2-Chem project, the research team recently examined how impurities in metallurgical gases, such as those produced in coking plants or blast furnaces, affect the catalyst. This research ultimately paved the way for insights into the structure of the active site.

Active site deactivated for analysis

The researchers had identified nitrogen-containing molecules - ammonia and amines - as impurities that act as catalyst poisons. These impurities deactivated the catalyst, but not permanently: once they disappear, the catalyst recovers by itself. Using a unique research apparatus developed in-house, a continuously operated flow apparatus with an integrated high-pressure pulse unit, the researchers passed ammonia and amines over the catalyst surface, temporarily deactivating the active site with a zinc component. Despite this site being deactivated, another reaction still took place on the catalyst: the conversion of ethene to ethane. The researchers thus detected a second active site operating in parallel, one that contains metallic copper but no zinc component.

Since ammonia and the amines are bound to positively charged metal ions on the surface, it was evident that zinc, as part of the active site, carries a positive charge.

Credit: 
Ruhr-University Bochum

Cell diversity in the embryo

image: After 8 days, the developing mouse embryo resembles a seahorse. Without the epigenetic regulator PRC2, it is less complex and looks more like an egg. A closer look into the cells of the embryo reveals the tasks of regulatory factors during development.

Image: 
Abhishek Sampath Kumar / MPI

A research team at the Max Planck Institute for Molecular Genetics in Berlin has explored the role of factors in embryonic development that do not alter the sequence of DNA, but only epigenetically modify its "packaging". In the scientific journal Nature, they describe how regulatory mechanisms contribute to the formation of different tissues and organs in early mouse embryos.

A fertilized egg cell develops into a complete organism with a multitude of different tissues and organs, although the genetic information is exactly the same in every cell. A complex clockwork of molecules regulates which cell in the body fulfills each task and determines the proper time and place to activate each gene.

Epigenetic regulator factors are part of this molecular mechanism and act to modify the "packaging" of the DNA molecule without altering the underlying genetic information. Specifically, they act to bookmark the DNA and control what parts can be accessed in each cell.

Most of these regulators are essential, and embryos lacking them tend to die at the stage of development when organs begin to emerge. Moreover, these regulators may have functions that differ in every cell type, making them difficult to study. This has been a major hindrance, since these proteins are not only relevant for the development of embryos, but are also involved in the formation of cancer.

Detailed examination of embryos

"The same regulator is present in all cells, but can have very different tasks, depending on cell type and time of development," says Stefanie Grosswendt, one of the first authors of a new study in the scientific journal Nature.

Grosswendt and her colleague Helene Kretzmer from Alexander Meissner's lab at the Max Planck Institute for Molecular Genetics (MPIMG) in Berlin together with Zachary Smith from Harvard University, MA, have now succeeded in elucidating the significance of epigenetic regulators for embryonic development with unprecedented precision.

The researchers analyzed ten of the most important epigenetic regulators. Using the CRISPR-Cas9 system, they first specifically removed the genes coding for the regulatory factors in fertilized oocytes and then observed the effects on embryo development days later.

After the embryos had developed for about six to nine days, the team examined the anatomical and molecular changes that resulted from the absence of the respective regulator. They found that the cellular composition of many of the embryos was substantially altered. Cells of certain types existed in excessive numbers, while others were not produced at all.

Analyzing thousands of individual cells

In order to make sense of these changes at the molecular level, the researchers examined hundreds to thousands of individual cells from embryos from which single epigenetic regulators had been systematically removed. They sequenced the RNA molecules of almost 280,000 individual cells to investigate the consequences of the loss of function. RNA relays information encoded in the DNA, allowing researchers to infer the identity and behavior of cells using sequencing technologies.

In their analysis, the scientists focused on a phase of development in which epigenetic regulators are particularly important. When they compared the data of altered and unaltered embryos, they identified genes that were dysregulated and cell types that were abnormally over- or underproduced. From this overall picture, they deduced previously unknown functions of many epigenetic regulators.
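The over- or underproduction of cell types can be illustrated with a simple composition check: compare a cell type's share of profiled cells in knockout versus control embryos. The counts below are invented, and real single-cell studies use more careful compositional statistics; this two-proportion z-score is only a minimal sketch.

```python
import math

def proportion_z(k1, n1, k2, n2):
    """Two-proportion z-score: is a cell type's abundance different
    between a knockout and a control embryo? Toy statistic only."""
    p1, p2 = k1 / n1, k2 / n2
    p = (k1 + k2) / (n1 + n2)                        # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    return (p1 - p2) / se

# Hypothetical counts: germline progenitors among profiled cells
# (the study found PRC2 knockouts overproduce germline cells).
z = proportion_z(k1=300, n1=2000,   # knockout: 15% of cells
                 k2=60,  n2=2000)   # control:   3% of cells
print(f"z = {z:.1f}")               # large |z| indicates a composition shift
```

Repeating such a comparison for every cell type and every knockout is, in spirit, how a table of over- and underproduced cell types per regulator is assembled.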

Complex effects during development

An eight-day-old mouse embryo looks a bit like a seahorse and does not have any organs yet. "From the outer appearance of an early embryo, one can often only guess which structures and organs will form and which will not," say bioinformatician Helene Kretzmer and biologist Zachary Smith, who are also both first authors of the publication. "Our sequencing allows for a much more precise and high resolution view."

The single-cell analysis gave them a highly detailed view of the first nine days of mouse development. Often, switching off a single regulator led to ripple effects throughout the network of interacting genes, with many genes differentially activated or inactivated over the course of development.

Removing the epigenetic regulator Polycomb (PRC2) had a particularly striking impact. "Without PRC2, the embryo looks egg-shaped and very small after eight and a half days, which is very unusual," says Kretzmer. "We see vast changes to how DNA is packaged that happen much earlier, long before the embryo develops morphological abnormalities."

The researchers found that PRC2 is responsible for limiting the amount of germline progenitor cells - the cells that later become sperm and eggs. Without PRC2, the embryo develops an excessive number of these cells, loses its shape, and dies after a short time.

Starting point for further analyses

"With the combination of new technologies we addressed issues that have been up in the air for 25 years," says Alexander Meissner, who headed the study. "We now better understand how epigenetic regulators give rise to the many different types of cells in the body."

The work is only the first step toward even more detailed investigations, says Meissner. "Our method lets us investigate other factors such as transcription factors or growth factors, or even a combination of these. We are now able to observe very early developmental stages in a level of detail that was previously unthinkable."

Credit: 
Max-Planck-Gesellschaft

More carbon in the ocean can lead to smaller fish

As humans continue to send large quantities of carbon into the atmosphere, much of that carbon is absorbed by the ocean, and UConn researchers have found that high CO2 concentrations in water can make fish grow smaller.

Researchers Christopher Murray PhD '19, now at the University of Washington, and UConn Associate Professor of Marine Sciences Hannes Baumann have published their findings in PLOS ONE.

"The ocean takes up quite a bit of CO2. Estimates are that it takes up about one-third to one-half of all CO2 emissions to date," says Murray. "It does a fantastic job of buffering the atmosphere but the consequence is ocean acidification."

Life relies on chemical reactions and even a slight change in pH can impede the normal physiological functions of some marine organisms; therefore, the ocean's buffering effect may be good for land-dwellers, but not so good for ocean inhabitants.

Baumann explains that in the study of ocean acidification (or OA), researchers have tended to assume fish are too mobile and tolerant of heightened CO2 levels to be adversely impacted.

"Fish are really active, robust animals with fantastic acid/base regulatory capacity," says Murray. "So when OA was emerging as a major ocean stressor, the assumption was that fish are going to be OK, [since] they are not like bivalves or sea urchins or some of the other animals showing early sensitivities."

The research needed for drawing such conclusions requires long-term studies that measure potential differences between test conditions. With fish, this is no easy task, says Baumann, largely due to logistical difficulties in rearing fish in laboratory settings.

"For instance, many previous experiments may not have seen the adverse effects on fish growth, because they incidentally have given fish larvae too much food. This is often done to keep these fragile little larvae alive, but the problem is that fish may eat their way out of trouble -- they overcompensate -- so you come away from your experiment thinking that fish growth is no different under future ocean conditions," says Baumann.

In other words, if fish are consuming more calories because their bodies are working harder to cope with stressors like high CO2 levels, a large food ration would mask any growth deficits.

Additionally, previous studies that concluded fish are not impacted by high CO2 levels involved long-lived species of commercial interest. Baumann and Murray overcame this hurdle by using a small, shorter-lived fish called the Atlantic silverside so they could study the fish across its life cycle. They conducted several independent experiments over the course of three years. The fish were reared under controlled conditions from the moment the eggs were fertilized until they were about 4 months old to see if there were cumulative effects of living in higher CO2 conditions.

Murray explains, "We tested two CO2 levels, present-day levels and the maximum level of CO2 we would see in the ocean in 300 years under a worst-case emissions scenario. The caveat to that is that silversides spawn and develop as larvae and early juveniles in coastal systems that are prone to biochemical swings in CO2 and therefore the fish are well-adapted to these swings."

The maximum CO2 level applied in the experiments is one aspect that makes this research novel, says Murray:

"That is another important difference between our study and other studies that focus on long-term effects; almost all studies to date have used a lower CO2 level that corresponds with predictions for the global ocean at the end of this century, while we applied this maximum level. So it is not surprising that other studies that used longer-lived animals during relatively short durations have not really found any effects. We used levels that are relevant for the environment where our experimental species actually occurs."

Baumann and Murray hypothesized that there would be small, yet cumulative, effects to measure. They also expected fish living in sub-ideal temperatures would experience more stress related to the high CO2 concentrations and that female fish would experience the greatest growth deficits.

The researchers also used the opportunity to study if there were sex-determination impacts on the population in the varying CO2 conditions. Sex-determination in Atlantic silversides depends on temperature, but the influence of seawater pH is unknown. In some freshwater fish, low pH conditions produce more males in the population. However, they did not find any evidence of the high CO2 levels impacting sex differentiation in the population. And the growth of males and females appeared to be equally affected by high CO2.

"What we found is a pretty consistent response in that if you rear these fish under ideal conditions and feed them pretty controlled amounts of food, not over-feeding them, high CO2 conditions do reduce their growth in measurable amounts," says Murray.

They found a growth deficit of between five and ten percent, which Murray says amounts to only a few millimeters overall, but the results are consistent. The fish living at less ideal temperatures and more CO2 experienced greater reductions in growth.

Murray concludes that by addressing potential shortcomings of previous studies, the data are clear: "Previous studies have probably underestimated the effects on fish growth. What our paper is demonstrating is that indeed if you expose these fish to high CO2 for a significant part of their life cycle, there is a measurable reduction in their growth. This is the most important finding of the paper."

Credit: 
University of Connecticut

Mount Sinai researchers discover treatment option for rare genetic disorder

Researchers from the Icahn School of Medicine used a novel genetic sequencing technology to identify the genetic cause of--and a treatment for--a previously unknown severe autoinflammatory syndrome affecting an 18-year-old girl since infancy.

The technology, tailored to the patient's own genetic code at a single cell level, helped the researchers characterize an unknown mutation in a gene called JAK1 that caused the patient's immune system to be permanently turned on, resulting in rashes over much of her skin, growth abnormalities, kidney failure, allergic hypersensitivities, and an unusual inflammatory condition throughout the digestive tract.

The study, led by Dusan Bogunovic, PhD, Associate Professor of Microbiology and Pediatrics at the Icahn School of Medicine at Mount Sinai, faculty member of The Mindich Child Health and Development Institute and the Precision Immunology Institute at Mount Sinai, and Director of the Center for Inborn Errors of Immunity, was published in the August 3 issue of the journal Immunity. The discovery points toward new ways to study how genetic diseases manifest and presents a model of personalized diagnosis and treatment for patients with genetic diseases.

Autoinflammatory diseases are caused by abnormal activation of the immune system, leading to recurrent episodes of inflammation that may result in damaged or failed organs. The researchers determined that not all of the patient's cells carried the mutation; her cells had different genetic makeups, or genotypes, a pattern the researchers describe as a mosaic.

"Most genes use both their maternal and paternal copies, called alleles," said Dr. Bogunovic. "Our findings show the JAK1 mutation in this patient used only one copy per cell, known as monoallelic expression. This challenges the textbook principles of genetics and may help explain irregularities that are frequently encountered across genetic diseases."

In the paper, the researchers describe the use of next-generation genomic, molecular, and multi-parametric immunological tools to probe the effects of the patient's JAK1 mutation. By mapping the genotype of JAK1 across the patient's body, the researchers were able to pinpoint precisely when the mutation arose in early embryonic development. It later gave rise to a host of symptoms from early childhood to early adulthood. The hunt then began for a specific therapy that would curb the excessive activity of her mutant JAK1 and potentially cure her inflammatory symptoms.

"We identified one drug, tofacitinib, a JAK inhibitor, that curbed her hyperactive inflammation. When administered the therapy, she rapidly improved within weeks. Her skin lesions cleared, her daily gastrointestinal symptoms resolved, and the clinical signs of inflammation went away, putting the patient in remission for two years until her unfortunate demise from coronavirus-related illness," said Dr. Bogunovic. "This research helps us better understand the basic function of JAK1, which has broad implications for diseases of the immune system and how to treat them. In addition, the genetic discoveries uncovered in this case open up new research avenues into the complexities of how genetic diseases manifest and present a model of the future of personalized medicine. By coupling advanced clinical care with next-generation sequencing and detailed laboratory studies, we successfully diagnosed and treated a life-threatening disease."

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

Dozens of pesticides linked with mammary gland tumors in animal studies

In an analysis of how regulators review pesticides for their potential to cause cancer, researchers at Silent Spring Institute identified more than two dozen registered pesticides that were linked with mammary gland tumors in animal studies. The new findings raise concerns about how the US Environmental Protection Agency (EPA) approves pesticides for use and the role of certain pesticides in the development of breast cancer.

Several years ago, a resident on Cape Cod in Massachusetts contacted researchers at Silent Spring looking for information on an herbicide called triclopyr. Utility companies were looking to spray the chemical below power lines on the Cape to control vegetation.

"We know pesticides like DDT increase breast cancer risk, so we decided to look into it," says co-author Ruthann Rudel, an environmental toxicologist and director of research at Silent Spring. "After examining pesticide registration documents from EPA, we found two separate studies in which rodents developed mammary gland tumors after being exposed to triclopyr, yet for some reason regulators dismissed the information in their decision not to treat it as a carcinogen."

When manufacturers apply to register a pesticide, EPA reviews existing studies and based on those studies assigns the chemical a cancer classification--for instance, how likely or unlikely the chemical is to cause cancer. After reviewing triclopyr, Silent Spring researchers wondered if evidence of mammary tumors was being ignored for other pesticides as well.

Reporting in the journal Molecular and Cellular Endocrinology, Rudel and Silent Spring scientist Bethsaida Cardona reviewed more than 400 EPA pesticide documents summarizing the health effects of each registered pesticide. They found a total of 28 pesticides linked with mammary gland tumors, yet EPA acknowledged only nine of them as causing mammary tumors and dismissed the evidence entirely for the remaining 19.

Rudel and Cardona also found that many of the pesticides in their analysis behaved like endocrine disruptors, for instance, by interfering with estrogen and progesterone. "Breast cancer is highly influenced by reproductive hormones, which stimulate the proliferation of cells within the breast, making it more susceptible to tumors," says Rudel. "So, it's important that regulators consider this kind of evidence. If they don't, they risk exposing people to pesticides that are breast carcinogens."

Traditionally, toxicologists focus on whether a chemical causes DNA damage when determining its potential to cause cancer. But recent findings in cancer biology show there are many ways chemicals can trigger the development of cancer. For example, chemicals can suppress the immune system, cause chronic inflammation, or disrupt the body's system of hormones, all of which can lead to the growth of breast tumors and other types of tumors as well.

"In light of our findings, we hope EPA updates its guidelines for assessing mammary gland tumors by considering evidence that more completely captures the biology of breast cancer, such as the effects of endocrine disruptors," says Cardona.

Rudel and Cardona recommend that EPA re-evaluate five pesticides in particular--IPBC, triclopyr, malathion, atrazine and propylene oxide--due to their widespread use and the evidence uncovered in the new analysis. IPBC is a preservative in cosmetics; triclopyr is an agricultural herbicide that is also used to control vegetation growth along rights-of-way; malathion is a common residential and agricultural pesticide and is used in some lice treatments; atrazine is one of the most commonly used herbicides in agriculture; and propylene oxide is used to preserve food, cosmetics, and pharmaceuticals, and has many similarities with ethylene oxide, a known human carcinogen.

The project is part of Silent Spring Institute's Safer Chemicals Program which is developing new cost-effective ways of screening chemicals for their effects on the breast. Knowledge generated by this effort will help government agencies regulate chemicals more effectively and assist companies in developing safer products.

Credit: 
Silent Spring Institute

NASA infrared imagery shows Hagupit nearing landfall in China

image: On Aug. 3 at 1:41 p.m. EDT (1741 UTC) NASA's Aqua satellite analyzed Typhoon Hagupit less than 2 hours before landfall in China. Using the Atmospheric Infrared Sounder or AIRS instrument, NASA found cloud top temperatures (purple) as cold as or colder than minus 63 degrees Fahrenheit (minus 53 degrees Celsius).

Image: 
NASA JPL/Heidar Thrastarson

NASA's Aqua satellite provided a look at Typhoon Hagupit as it was nearing landfall in southeastern China.

One of the ways NASA researches tropical cyclones is using infrared data that provides temperature information. Cloud top temperatures provide information to forecasters about where the strongest storms are located within a tropical cyclone. Tropical cyclones do not always have uniform strength; some sides are stronger than others. The stronger the storms, the higher they extend into the troposphere, and the colder the cloud temperatures.

On Aug. 3 at 1:41 p.m. EDT (1741 UTC) NASA's Aqua satellite analyzed Hagupit using the Atmospheric Infrared Sounder or AIRS instrument. Aqua passed over Hagupit less than 2 hours before its official landfall.

The infrared data showed the bulk of the storms were southeast of the center because of vertical wind shear or outside winds pushing against the storm from the northwest.

AIRS found the coldest cloud top temperatures were as cold as or colder than minus 63 degrees Fahrenheit (minus 53 degrees Celsius). NASA research has shown that cloud top temperatures that cold indicate strong storms with the capability to create heavy rain.
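As a quick sanity check on the paired units quoted throughout these satellite reports, the standard Fahrenheit-to-Celsius formula reproduces the figures above (a minimal illustration for the reader, not part of NASA's analysis):

```python
def fahrenheit_to_celsius(deg_f):
    """Convert degrees Fahrenheit to degrees Celsius: C = (F - 32) * 5/9."""
    return (deg_f - 32) * 5.0 / 9.0

# The AIRS cloud-top threshold quoted in this article:
print(round(fahrenheit_to_celsius(-63)))  # -53, matching the reported value
```

The same formula confirms the minus 70 F / minus 56.6 C pairing reported for Isaias later in this collection.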

China's National Meteorological Center reported that Hagupit made landfall in China's Zhejiang province on the coastal areas of Yueqing City at around 3:30 a.m. local time on Aug. 4 (3:30 p.m. EDT on Aug 3). At the time of landfall, Hagupit had maximum sustained winds near 85 mph (137 kph), equivalent to a Category 1 hurricane.

At 5 a.m. EDT (0900 UTC) on Aug. 4, the Joint Typhoon Warning Center noted that Tropical Depression Hagupit was centered near latitude 29.8 degrees north and longitude 120.3 degrees east, about 104 nautical miles southwest of Shanghai, China. Hagupit was moving to the north with maximum sustained winds decreasing to 30 knots (35 mph/56 kph).
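The triple-unit wind speed above follows from the standard conversion factors for nautical winds; this small check is illustrative only and assumes conventional rounding to the nearest whole unit:

```python
KNOT_TO_MPH = 1.15078  # 1 knot = 1.15078 statute miles per hour
KNOT_TO_KPH = 1.852    # 1 knot = 1.852 kilometers per hour (exact, by definition)

wind_knots = 30  # Hagupit's maximum sustained winds at 0900 UTC Aug. 4
print(round(wind_knots * KNOT_TO_MPH))  # 35 (mph, as reported)
print(round(wind_knots * KNOT_TO_KPH))  # 56 (kph, as reported)
```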

Hagupit is moving inland over eastern China and the Joint Typhoon Warning Center forecasts that it will reemerge into the Yellow Sea on Aug. 5, but adverse conditions will lead to Hagupit's dissipation.

The AIRS instrument is one of six instruments flying on board NASA's Aqua satellite, launched on May 4, 2002.

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

For updated forecasts, visit: http://www.nhc.noaa.gov

Credit: 
NASA/Goddard Space Flight Center

Studies shed new light on how biodiversity influences plant decay

image: Scientists have provided new insights on the relationship between plant diversity in forests and the diversity of organisms involved in their decay, such as bacteria and fungi.

Image: 
Léa Beaumelle (CC BY 4.0)

Scientists have provided new insights on the relationship between plant diversity in forests and the diversity of organisms involved in their decay, such as bacteria and fungi.

Plant litter decomposition is a major ecosystem function, linking plant biomass to carbon stocks in the soil and atmosphere, and releasing nutrients including nitrogen and phosphorus that influence soil biodiversity. Two new independent studies, published today in eLife, report how plant biodiversity impacts decomposition processes and could help predict how the loss of species might affect forest ecosystems.

For the first study, researchers based in China and France analysed the relationship between the diversity of plant litter and decomposition across 65 field studies in forests around the world. Their results show that plant decomposition is faster when litter is composed of more than one species. This effect was particularly clear in forests with mild temperatures, but was more variable in other forest environments.

"We also found that plant diversity accelerated the release of nitrogen, but not phosphorus, potentially indicating a shift in ecosystem nutrient limitation caused by a change in biodiversity," explains joint first author Liang Kou, Associate Professor at the Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing, China. "This discovery was again clear for temperate forests, but still needs confirmation for boreal, Mediterranean, subtropical and tropical forests, for which data are currently limited."

"Our results suggest that biodiversity loss will modify carbon and nutrient cycling in forest ecosystems," adds joint senior author Huimin Wang, Professor at the Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences. "The potential impact of changes in litter diversity on carbon and nutrient cycling warrants particular attention in future studies, which would ideally integrate responses from decomposers for a better understanding of changes in carbon and nutrient cycling and the mechanisms driving them."

The second study in eLife, from researchers based in Germany and Belgium, similarly highlights the important links between plant litter and decomposer diversity, but it also shows how these links can be influenced by human activity.

"Industrial and agricultural activities can have detrimental effects on decomposer organisms," says first author Léa Beaumelle, a postdoctoral researcher at the German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, University of Leipzig, Germany. "They release chemical stressors such as metals and pesticides, as well as nutrients, into soil and water. Chemical stressors and added nutrients modify decomposer communities by affecting their diversity, abundance and metabolism."

Previous experiments conducted in simplified conditions have shown that biodiversity loss has detrimental effects on ecosystem processes. But how these results apply to real-world scenarios of change in biodiversity remains unclear. The researchers set out to discover if the responses of plant litter decomposition to chemical stressors and added nutrients can be explained by changes in decomposer diversity across ecosystems.

To do this, the team analysed the results of 69 independent studies that reported 660 observations of the effects of chemical stressors or nutrient enrichment on animal and microbial decomposers and on plant litter decomposition. They found that declines in the diversity and abundance of decomposers explained reductions in plant decay rates under the influence of chemical stressors, but not added nutrients. This suggests that human activities decrease decomposer biodiversity, which then leads to significant effects on ecosystem functions.

"These findings could inform the design of suitable strategies to maintain biodiversity and ecosystem functioning," concludes senior author Nico Eisenhauer, Head of Experimental Interaction Ecology at the German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, University of Leipzig. "But they also show that these strategies must take human activities into account and cannot rely on improving biodiversity alone."

Credit: 
eLife

Altered lipid metabolism following childbirth predicts later diabetes risk

Scientists have found that disruptions to the metabolism of lipids occur after childbirth in women with gestational diabetes who go on to develop type 2 diabetes.

Their results, published today in eLife, suggest that specific lipids - molecules that make up the building blocks of living cells, store energy and can serve as cellular messengers - could help to predict the progression from gestational to type 2 diabetes in women who have just given birth. The findings may one day help physicians identify those at risk of developing type 2 diabetes after a pregnancy and ensure they receive preventive care.

Gestational diabetes is a complication that affects between 1% and 14% of pregnant women. It occurs when a pregnant woman is unable to produce enough of a hormone called insulin and, if untreated, can lead to dangerously high blood sugar. While most women recover after childbirth, more than a third will develop type 2 diabetes within 10 years, but scientists do not yet understand why. "It is critical to uncover the underlying metabolic changes that cause women to develop type 2 diabetes so that we can identify and start preventing more of these cases," says lead author Mi Lai, Postdoctoral Research Fellow at the University of Toronto, Canada.

To do this, Lai and her colleagues analysed more than 1,000 different lipids found in fasting blood plasma samples collected from 350 women with gestational diabetes between six and nine weeks after their delivery. Of these women, 171 went on to develop type 2 diabetes within the next eight years and 179 did not. The analysis identified 311 lipids associated with a higher risk of developing diabetes and 70 associated with a lower risk.

The team found that dyslipidemia after childbirth, where abnormal amounts of lipids occur in the blood, is driven by an increase in the metabolism of glycerolipid - which activates lipid storage in the body - and the suppression of two other types of lipid called phospholipids and sphingolipids.

"There appears to be an increase in the formation of fat prior to the onset of type 2 diabetes," explains co-senior author Feihan Dai, a research scientist of physiology at the University of Toronto. "This process shifts metabolites towards triacylglycerol formation and away from phospholipids and sphingolipids, and this might be the driving force of the onset of the disease."

Additionally, the team showed that testing women for changes in a set of 11 lipids could help predict which women would go on to develop type 2 diabetes. Combining this lipid panel with two routine postpartum tests for blood sugar levels was even more effective at identifying which women would develop the condition.

"Our findings lay the groundwork for further research and possible development of new methods to help predict the progression from gestational diabetes to type 2 diabetes in women following childbirth," concludes co-senior author Michael Wheeler, Professor of Physiology and Medicine at the University of Toronto. "Predicting this risk early would allow for the timely intervention and prevention of diabetes in new mothers."

Credit: 
eLife

NASA providing data on Tropical Storm Isaias as it blankets eastern seaboard

image: NASA's Aqua satellite passed over Isaias on Aug. 4 at 2:50 a.m. EDT (0650 UTC). The coldest cloud top temperatures (darker brown) were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius) in some of the bands of thunderstorms north of Isaias's center. These temperatures were also found over parts of Virginia, Pennsylvania, Maryland, Delaware, and over the Atlantic Ocean. Storms with cloud top temperatures that cold have the capability to produce heavy rainfall.

Image: 
NASA/NRL

Tropical Storm Isaias made landfall late on Aug. 3 and by today, Aug. 4, the huge storm stretched from Virginia to Maine. NASA satellites have been providing forecasters with rainfall rates, cloud top temperatures, storm extent and strength as Isaias batters the U.S. East Coast.

Warnings and Watches for Tuesday, August 4, 2020

NOAA's National Hurricane Center (NHC) posted many warnings and watches on Aug. 4. A Storm Surge Warning is in effect for the Pamlico and Albemarle Sounds, and from Ocracoke Inlet, North Carolina to the North Carolina/Virginia border.

A Tropical Storm Warning is in effect from north of Surf City, North Carolina to Eastport, Maine, for the Pamlico and Albemarle Sounds, the Chesapeake Bay, the Tidal Potomac River, Delaware Bay, Long Island and Long Island Sound, NY, Martha's Vineyard, Nantucket, and Block Island.

Infrared Imagery Reveals Isaias' Rain Potential

One of the ways NASA researches tropical cyclones is using infrared data that provides temperature information. Cloud top temperatures provide information to forecasters about where the strongest storms are located within a tropical cyclone. The stronger the storms, the higher they extend into the troposphere, and the colder the cloud temperatures.

On Aug. 3 at 2:47 p.m. EDT (1847 UTC) NASA's Aqua satellite analyzed Isaias using the Atmospheric Infrared Sounder or AIRS instrument. The infrared data showed the bulk of the storms wrapped from west to north to east of the center of circulation. AIRS found the coldest cloud top temperatures were as cold as or colder than minus 63 degrees Fahrenheit (minus 53 degrees Celsius). NASA research has shown that cloud top temperatures that cold indicate strong storms with the capability to create heavy rain. That heavy rain brought flooding to areas of South and North Carolina as the storm approached landfall.

Isaias Landfall

Doppler radar imagery and surface observations indicate that the eye of Hurricane Isaias made landfall in southern North Carolina around 11:10 p.m. EDT (0310 UTC) on Monday, Aug. 3 near Ocean Isle Beach, with maximum sustained winds of 85 mph (140 kph).

Isaias' Water Vapor Content

The Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Aqua satellite provided forecasters at the National Hurricane Center with water vapor imagery in Tropical Storm Isaias. NASA's Aqua satellite passed over Isaias on Aug. 4 at 2:50 a.m. EDT (0650 UTC).

Coldest cloud top temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius) in some of the bands of thunderstorms north of Isaias's center and were over parts of Virginia, Pennsylvania, Maryland, Delaware, and over the Atlantic Ocean. Storms with cloud top temperatures that cold have the capability to produce heavy rainfall.

Water vapor analysis of tropical cyclones tells forecasters how much potential a storm has to develop. Water vapor releases latent heat as it condenses into liquid; that liquid becomes the clouds and thunderstorms that make up a tropical cyclone. Temperature is important when trying to understand how strong storms can be: the higher the cloud tops, the colder they are and the stronger the storms.

That rainfall potential is evident in the NHC forecast for Aug. 4. The following rainfall accumulations are expected along and near the track of Isaias: central and eastern North Carolina into the Mid-Atlantic, 3 to 6 inches with isolated maximum totals of 8 inches; eastern New York into Vermont, 2 to 4 inches with isolated maximum totals of 6 inches; and western Connecticut, western Massachusetts, New Hampshire and western Maine, 1 to 3 inches.

Status of Isaias on Aug. 4

At 8 a.m. EDT (1200 UTC), the center of Tropical Storm Isaias was located over southeastern Virginia near latitude 37.7 degrees north and longitude 76.8 degrees west. That is about 15 miles (20 km) south-southeast of Tappahannock, Virginia.

Isaias is moving toward the north-northeast near 33 mph (54 kph), and this general motion, accompanied by some additional increase in forward speed, is expected through today. Maximum sustained winds are near 70 mph (110 kph) with higher gusts. The estimated minimum central pressure based on surface observations is 993 millibars.

In the 7 a.m. EDT hour, the National Weather Service reported sustained winds of 63 mph (101 kph) and a gust to 77 mph (124 kph) at Third Island, Virginia, at the mouth of the Chesapeake Bay.

NHC noted, "In addition to the storm surge and wind threats, Isaias is expected to produce heavy rainfall along and just west of the I-95 corridor today, and the Weather Prediction Center has placed a portion of this area in a high risk for life-threatening flash flooding. There is also a risk of tornadoes from southeast Virginia to New Jersey through midday. The risk of tornadoes will spread northward into southeastern New York this afternoon and across New England by tonight."

NHC Forecast for Isaias

Only gradual weakening is anticipated while Isaias moves north-northeastward near the mid-Atlantic coast today. A faster rate of weakening is expected to begin tonight, and the system is forecast to become post-tropical tonight or early Wednesday. On the NHC forecast track, the center of Isaias will continue to move near or along the coast of the mid-Atlantic states today, then move across the northeastern United States into southern Canada tonight.

NASA Researches Tropical Cyclones

Hurricanes/tropical cyclones are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

For updated forecasts, visit: http://www.nhc.noaa.gov

Credit: 
NASA/Goddard Space Flight Center