Discovery of 'hidden' outbreak hints that Zika virus can spread silently

image: The Andersen lab at Scripps Research uses infectious disease genomics to investigate how pathogenic viruses such as Zika cause large-scale outbreaks. Pictured here are graduate students Nate Matteson (left), Glenn Oliveira (back) and Karthik Gangavarapu (front), and principal investigator Kristian Andersen, Ph.D. (right); all contributed to the Aug. 22, 2019 study in Cell.

Image: 
Scripps Research

Just when international fears of contracting Zika began to fade in 2017, an undetected outbreak was peaking in Cuba--a mere 300 miles off the coast of Miami. A team of scientists at Scripps Research, working in concert with several other organizations, uncovered the hidden outbreak by overlaying air-travel patterns with genomic sequencing of virus samples from infected travelers. The discovery is featured on the cover of the Aug. 22 issue of Cell.

"Infectious diseases such as Zika are global problems, not local problems, and greater international collaboration and coordination is critical if we are to stay ahead of looming threats," says Kristian Andersen, PhD, associate professor at Scripps Research and director of Infectious Disease Genomics at the Scripps Research Translational Institute. "Through this study, we developed a framework for a more global, more proactive way of understanding how viruses are spreading. The traditional reliance on local testing may not always be sufficient on its own."

Scripps Research partnered on this project with Yale University, Florida Gulf Coast University, the Florida Department of Health, and many other organizations.

Piecing together an outbreak

When the mosquito-borne Zika virus was discovered in Brazil during spring of 2015, it had already been circulating for at least a year, making its way to more than 40 countries, the study notes. Very quickly, Zika ascended from a little-known virus to a source of international panic--with worries accentuated by the fact that it could sometimes cause a severe condition known as microcephaly in babies born to women who contracted Zika during pregnancy.

Coordinated response to Zika relied upon countries accurately detecting cases and reporting them to international health agencies. By the end of 2016, data from these health agencies suggested that the epidemic was nearing its end. The virus subsequently fell off travelers' radar, and the World Health Organization ended its designation of Zika as a "Public Health Emergency of International Concern."

However, Andersen and his collaborators found that an undetected outbreak was reaching its peak in Cuba at that time, off the radar of international health agencies. Surprisingly, the outbreak lagged other Caribbean countries by a year, likely due to an aggressive mosquito-control campaign that delayed the disease's emergence, according to the study. The researchers noted that other infectious diseases spread by Aedes aegypti mosquitoes, including dengue, also were absent in Cuba at the same time.

Andersen's team had no idea it would expose an unknown outbreak when it began investigating travel-associated Zika cases in 2017. The scientists simply wanted to know if the epidemic really was winding down. Instead, they found that a steady number of travelers from the Caribbean were still contracting the virus. With limited access to reliable local case reporting, the team devised a way to estimate local prevalence by obtaining blood samples from infected travelers who had visited Cuba, then using genomic sequencing to reconstruct virus ancestry and outbreak dynamics. The approach is known as "genomic epidemiology."

Turning back time

Karthik Gangavarapu, a Scripps Research graduate student in Andersen's lab and one of three co-first authors of the study, says all of the Zika viruses from the epidemic in the Americas descended from a single ancestor, which allowed the team to create a "family tree" and trace the roots of the virus.

By examining tiny genomic changes in each virus sample, Gangavarapu was able to determine a "clock rate" that reveals the age of each virus. The timeline showed that the outbreak in Cuba was established a year later than other outbreaks in the Caribbean. "We realized there was a whole outbreak that had gone undetected," Gangavarapu says.
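The "clock rate" idea can be sketched with back-of-envelope arithmetic: mutations accumulate at a roughly constant rate, so the number of mutations separating a sampled genome from its inferred ancestor indicates elapsed time. The clock rate and genome length below are illustrative approximations, not figures from the study.

```python
# Back-of-envelope molecular clock dating, as used in genomic epidemiology.
# The constants are rough, illustrative values -- not the study's estimates.

CLOCK_RATE = 1.0e-3     # substitutions per site per year (Zika-like viruses)
GENOME_LENGTH = 10_700  # approximate Zika genome length in nucleotides

def years_since_ancestor(observed_mutations: int) -> float:
    """Estimate elapsed time from the number of mutations separating
    a sampled genome from the inferred common ancestor."""
    expected_mutations_per_year = CLOCK_RATE * GENOME_LENGTH
    return observed_mutations / expected_mutations_per_year

# A genome carrying ~11 extra mutations implies roughly one extra year
# of evolution relative to its relatives.
print(round(years_since_ancestor(11), 2))  # ~1 year
```

In practice, tools such as BEAST fit this rate jointly with the tree rather than assuming it, but the intuition is the same.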

Another major source of data came from travel surveillance. Sharada Saraf, an undergraduate intern in the Andersen Lab and co-first author of the study, analyzed airline travel schedules, flight patterns and cruise ship destinations. Together, the information painted a picture of how many people visited Cuba and other Zika-endemic countries during the time period in question.
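The logic of combining travel volumes with traveler case counts can be sketched as a simple scaling argument: if infected travelers are a rough random sample of visitors, the attack rate among travelers approximates the attack rate in the local population. All numbers below are invented for illustration and are not the study's estimates.

```python
# Hypothetical sketch of travel surveillance as a sentinel for local
# incidence. Treat visitors as a sample of the local exposure environment
# and scale the traveler attack rate up to the resident population.
# Every number here is made up for illustration.

def estimate_local_infections(traveler_cases: int,
                              total_travelers: int,
                              local_population: int) -> float:
    attack_rate = traveler_cases / total_travelers
    return attack_rate * local_population

# e.g. 98 infected travelers among 1,000,000 visitors to a country of 11.3M
est = estimate_local_infections(98, 1_000_000, 11_300_000)
```

Real analyses must additionally correct for trip length, seasonality and reporting biases, which is why the study paired this scaling with genomic data.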

"Given that undetected viral outbreaks have the potential to spread globally, I hope that this study will encourage utilizing both travel surveillance and genomic data--in addition to local reporting--for future surveillance efforts," Saraf says.

Saraf also secured health data from multiple sources and parsed it out for analysis. "This data was public but often not available in formats to analyze," she says. "For example, the data might be presented in bar graphs, requiring a lot of time to extract actual numbers that we could analyze. One of the nice things that has come out of this work is that we have consolidated this data in an easily downloadable and usable format, and made it publicly available on our lab website for anyone to use as soon as it was generated."

Similarly, the lab made its Zika virus sequence data available immediately, highlighting a new framework for how data can be openly shared during public health emergencies, Saraf says.

Preventing the next outbreak

Andersen's team notes that it's still unclear today whether Zika transmission is ongoing, as discrepancies in local reporting continue to hinder detection. While using travelers as sentinels can shed light on outbreaks, as it did in this case, richer public data is necessary to get ahead of threats, Andersen says.

Public health organizations and academic labs must step up their information-sharing practices, he says. That, along with better detection technologies and improved government funding for activities such as mosquito surveillance, could help avert future outbreaks.

"So many serious diseases--not just Zika--are almost perfectly linked to fluctuations in mosquito populations, yet this type of data isn't being collected or made available in most places of the world," Andersen says. "Especially as mosquito populations and other animal reservoirs of infectious diseases increase due to climate change and an expanding human population, it is becoming critically important for governments to prioritize this type of proactive monitoring."

Gangavarapu notes that the implications of the team's method of combining travel surveillance with genomic epidemiology go far beyond this study: "It can be applied to many countries that may not have the capacity to detect diseases or may have a reporting bias."

Credit: 
Scripps Research Institute

Scientists build a synthetic system to improve wound treatment, drug delivery for soldiers

image: An Army research project at UMass Amherst takes advantage of differences in electrical charge to create an "all aqueous," water-in-water construct that achieves compartmentalization in a synthetic biologic system. This research could lead to materials that provide new avenues to deliver medicine, treat wounds, and purify water for Soldiers.

Image: 
UMass Amherst

RESEARCH TRIANGLE PARK, N.C. (August 22, 2019) - For the first time, scientists built a synthetic biologic system with compartments like real cells. This Army project at the University of Massachusetts Amherst could lead to materials that provide new avenues to deliver medicine, treat wounds and purify water for Soldiers.

The Army Research Office, an element of the U.S. Army Combat Capabilities Development Command's Army Research Laboratory, funded the project. The research, published in Chem, takes advantage of differences in electrical charge to create an all-aqueous, water-in-water construct that achieves compartmentalization in a synthetic biologic system.

"This ability to program stable structure and chemical functionality in all-aqueous systems that are environmentally friendly and bio-compatible will potentially provide unprecedented future capabilities for the Army," said Dr. Evan Runnerstrom, program manager in materials design at ARO. "The knowledge generated by this project could be applicable to future technologies for all-liquid batteries, water purification or wound treatment and drug delivery in the field."

Runnerstrom said the research could lead to the development of materials that are stable in water and bio-compatible, which for Soldiers could mean a material available in the field that comes pre-loaded with medicine such as blood-clotting factors. With an engineered stimulus, such as opening the packaging and applying a wound dressing, these water-in-water structures could be made to release the medicine.

Postdoctoral researcher Ganhua Xie at University of Massachusetts Amherst used two aqueous polymer solutions with different electrical charges, one of polyethylene glycol, or PEG, and water, the other of dextran and water; they can be combined but do not mix, like the wax and water in a lava lamp.

Next, Xie used a needle to send a high-velocity jet of the dextran-plus-water solution into the PEG-plus-water solution, something Dr. Thomas Russell, the paper's lead author, calls "3D printing water-in-water." The operation creates a coacervate-membrane-stabilized aqueous, or water-filled, tubule whose path length can be kilometers long. The 3D printing forms a membranous layer of coacervate that separates the two solutions.

"Our results point to new opportunities for manipulating and improving continuous separation and compartmentalized reactions," said Russell, distinguished professor, Polymer Science and Engineering Department at University of Massachusetts Amherst and the Lawrence Berkeley National Laboratory. "I feel we have developed a strategy to mimic the behavior of living cells. I think this is the first time this has been demonstrated."

Another feature of the water tube formed this way is that electrical charge regulates whether and in which direction a material can pass through the coacervate membrane. A negatively charged dye or other molecule can only pass through a negatively charged wall of the asymmetrical membrane, and likewise for positively charged materials.

"It effectively forms a diode, a one-sided gate," Xie said. "We can do a reaction inside this tube or sac that will generate a positively charged molecule that can only diffuse into the positive phase through the coacervate. If we design the system right, we can separate things out easily by charge, so it can be used for separations media in all-aqueous compartmentalized reaction systems. We can also trigger one reaction that will allow a coordinated reaction cascade, just as it happens in our bodies."

Xie explains that 3D printing water-in-water allows them to direct where they put these domains.

"We can build multi-layered structures with positive/negative/positive layers. We can use the sac-shaped ones as reaction chambers," he said.

Advantages of separating functions and materials in cells by compartmentalization include allowing many processes to occur at once, many different chemical environments to coexist and otherwise incompatible components to work side-by-side.

Among other tests and experiments, the researchers report on how they designed an all-aqueous tubular system and attached needles and syringe pumps at each end to allow water to pump through the entire structure without leakage, creating a flow-through coordinated reaction system.

"Once we'd done it, we looked at the biological mimicry," Russell said. "There have been lots of efforts to mimic biological systems, and a biologist might object and say this is too simple. But I do think that even though it involves simple materials, it works. It's treading very close to vasculature, and it mimics any place where chemicals flow through a membrane. Is it in the body? No, but it does mimic a real metabolic process, a compartmental reaction."

"There is a lot of power in the bio-mimetic approach with this research, where certain molecules or ions can be compartmentalized, just like in real cells, without using any organic solvents or oils, just water," Runnerstrom said. "This could give us new avenues to deliver medicine and drugs in a more controlled way for both the military and civilians."

Credit: 
U.S. Army Research Laboratory

'100-year' floods will happen every 1 to 30 years, according to new flood maps

image: Researchers at Princeton University calculated flood risks for 171 counties across four regions: New England (green), mid-Atlantic (orange), southeast Atlantic (blue), and Gulf of Mexico (red). They found that what used to be considered 100-year floods will occur far more often depending on the location.

Image: 
Reza Marsooli et al

A 100-year flood is supposed to be just that: a flood that occurs on average once every 100 years, or, equivalently, a flood that has a one-percent chance of happening in any given year.
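The two phrasings are linked by elementary probability: an event with annual exceedance probability p has an average return period of 1/p years, and the chance of seeing at least one such flood in n years is 1 - (1 - p)^n. A minimal sketch:

```python
# Return-period arithmetic behind the "100-year flood" terminology.

def prob_at_least_one(annual_prob: float, years: int) -> float:
    """Probability of at least one exceedance in the given span,
    assuming independent years."""
    return 1 - (1 - annual_prob) ** years

# A 1%-per-year (100-year) flood is more likely than not over a century:
print(round(prob_at_least_one(0.01, 100), 3))   # ~0.634

# If climate change shrinks the return period to 30 years (p = 1/30),
# the same flood becomes near-certain over that span:
print(round(prob_at_least_one(1 / 30, 100), 3))
```

This is why a shift from 100-year to 1-to-30-year return periods, as the maps predict, is so consequential for coastal planning.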

But Princeton researchers have developed new maps that predict coastal flooding for every county on the Eastern and Gulf Coasts, finding that 100-year floods could become annual occurrences in New England and happen every one to 30 years along the southeast Atlantic and Gulf of Mexico shorelines.

"The historical 100-year floods may change to one-year floods in Northern coastal towns in the U.S.," said Ning Lin, associate professor of civil and environmental engineering at Princeton University.

In a new paper published in the journal Nature Communications, researchers combined storm surge, sea level rise, and predicted increases in the occurrence and strength of tropical storms and hurricanes to create a map of flood hazard along the U.S. East Coast and Gulf of Mexico. Coastlines at northern latitudes, like those in New England, will face higher flood levels primarily because of sea level rise. Those at more southern latitudes, especially along the Gulf of Mexico, will face higher flood levels because of both sea level rise and increasing storms into the late 21st century.

"For the Gulf of Mexico, we found the effect of storm change is compatible with or more significant than the effect of sea level rise for 40% of counties. So, if we neglect the effects of storm climatology change, we would significantly underestimate the impact of climate change for these regions," said Lin.

The study's predictions differ from what is currently available, said Reza Marsooli, assistant professor at the Stevens Institute of Technology, who worked on this study while a research scholar at Princeton, because they combine multiple variables that are typically addressed separately. For example, the new maps use the latest climate science to look at how tropical storms will change in the future, instead of at what they are right now or even looking backward at previous storms, as federal disaster officials do to build their flood maps. These data, in turn, are integrated with sea level analysis.

The researchers hope that creating more accurate maps - especially those that are customized according to local conditions down to the county level - will help coastal municipalities prepare to face the effects of climate change head on.

"Policy makers can compare the spatial risk change, identify hotspots, and prioritize the resource allocation for risk reduction," said Lin. "Coastal counties can use the county-specific estimates in their decision making: Is their risk going to significantly change? Should they apply more specific, higher-resolution data to quantify the risk? Should they apply coastal flood defenses or other planning strategies or policy for reducing the future risk?"

Credit: 
Princeton University, Engineering School

International team discovers unique pathway for treating deadly children's brain cancer

image: Seven-year-old Hollis Doherty passed away Jan. 2, 2017, less than a year after he was diagnosed with DIPG -- Diffuse Intrinsic Pontine Glioma -- an aggressive childhood brain cancer.

Image: 
Courtesy of the Doherty family

PHOENIX, Ariz. -- Aug. 22, 2019 -- An international team of researchers led by Yale University, University of Iowa, and the Translational Genomics Research Institute (TGen), an affiliate of City of Hope, has discovered a new pathway that may improve success against an incurable type of children's brain cancer.

The study results, published today in Nature Communications, suggest that scientists have identified a unique way to disrupt the cellular process that contributes to Diffuse Intrinsic Pontine Gliomas (DIPG).

DIPG is a highly aggressive and inoperable type of tumor that grows in the brain stem. This cancer usually strikes children less than 10 years old, and most patients do not survive more than a year after diagnosis.

Earlier studies identified mutations in a gene called PPM1D -- which is critical for cell growth and cell stress response -- as a contributor to DIPG. Previous efforts to directly attack the PPM1D mutation, however, proved futile in controlling DIPG.

The TGen-Yale-Iowa-led team discovered a vulnerability in the metabolic process for creating NAD, a metabolite that is necessary for all cell life.

"This is really an amazing new way to attack this cancer. We found that the mutated gene PPM1D essentially sets the stage for its own demise," said Michael Berens, Ph.D., a TGen Deputy Director, head of TGen's DIPG research, and one of the study's senior authors.

Researchers found that mutated PPM1D silences a gene called NAPRT, which is key to the production of the NAD metabolite. With NAPRT unavailable, the cell switches to another protein needed to create NAD called NAMPT. By using a drug that inhibits the production of NAMPT, researchers found they could essentially starve to death those cancer cells with the PPM1D mutation.
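The mechanism described above amounts to simple synthetic-lethality logic: a cell needs at least one working route to NAD, and the PPM1D mutation removes one of the two routes before any drug is given. A toy sketch of that logic (not the study's actual model):

```python
# Toy model of the synthetic-lethality logic described above: a cell
# survives only if it retains at least one route to NAD. In PPM1D-mutant
# DIPG cells, NAPRT is silenced, so inhibiting NAMPT removes the last route.

def cell_survives(ppm1d_mutant: bool, nampt_inhibited: bool) -> bool:
    naprt_active = not ppm1d_mutant      # mutant PPM1D silences NAPRT
    nampt_active = not nampt_inhibited   # drug blocks the backup route
    return naprt_active or nampt_active  # NAD from either pathway suffices
```

The therapeutic window follows directly: healthy cells tolerate the NAMPT inhibitor because NAPRT still works, while PPM1D-mutant tumor cells are left with no way to make NAD and starve.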

"It is such a devastating disease, and we have been so stymied in our progress for new DIPG therapies. Many drugs have been tested with no success at all. These findings now offer new hope for children with this truly terrible disease," said another senior author Ranjit Bindra, M.D., Ph.D., Associate Professor of Therapeutic Radiology at the Yale Cancer Center, where he treats children with DIPG.

Researchers had long thought DIPG was a childhood version of adult brain tumors, and so similar treatments for adult gliomas were tested extensively in children, and failed.

Frustration over the lack of an effective therapy for DIPG led the researchers to take a different approach in the search for new drugs to treat this disease. They chose to look at the tumor in terms of its potential vulnerabilities, and thus began a year-long molecular journey to understand what role the PPM1D mutation played in altering cancer metabolism.

"When epigenetic silencing results were analyzed, we were gratified to discover that DIPG cells with the PPM1D mutation had created a vulnerability to a key enzyme for which small molecule inhibitors were already available," said Sen Peng, Ph.D., a bioinformatician in TGen's Cancer & Cell Biology Division, and one of the study's contributing authors.

While the number of patients affected in the U.S. is small -- about 300 annually -- DIPG is recognized as a profoundly tragic illness.

"Our study's potential translational impact should lead to clinical trials and renewed hope for these families who face such a difficult diagnosis for their child," said Charles Brenner, Ph.D., Chairman of Biochemistry at the University of Iowa, and an expert in nicotinamide adenine dinucleotide (NAD) metabolism. Dr. Brenner also was one of the study's senior authors.

Dr. Bindra said this study suggests that other cancers with PPM1D mutations, such as breast and gynecological cancers, could be similarly targeted.

Credit: 
The Translational Genomics Research Institute

Research details impact of energy development on deer habitat use

image: University of Wyoming researcher Samantha Dwinnell releases a doe mule deer in western Wyoming.

Image: 
Benjamin Kraushaar

For every acre of mule deer habitat taken by roads, well pads and other oil and gas development infrastructure in Wyoming's Green River Basin, an average of 4.6 other acres of available forage is lost, according to new research by University of Wyoming scientists.

That's because deer avoid areas close to such human disturbance, even when there's quality forage in those areas, says the research published in the journal Ecological Applications.

"Large herbivores have adapted to the naturally occurring constraints of their foodscape, but certain levels of human disturbance appear to prompt behaviors across multiple scales that, in turn, result in exaggerated losses of forage," the scientists wrote. "Recognizing the cumulative losses of forage is key to providing wildlife managers and industry with realistic expectations of population effects that are likely to ensue on winter ranges where energy development occurs. Such knowledge can guide the evaluation of trade-offs between energy development and the performance and abundance of large herbivore populations."

The new findings help explain why previous research showed a 36 percent decline in the mule deer population during 15 years of energy development on the Pinedale Anticline in western Wyoming's Sublette County. While those previous studies correlated energy development with declining deer numbers, the new research specifically documented changes in the foraging behavior of deer in relation to oil and gas activity.

The research involved measuring production and use of the primary food for deer during winter -- sagebrush -- along with the capture, collaring and monitoring of a total of 146 deer between March 2013 and March 2015 in three sagebrush-covered areas of the Green River Basin. Those are the Pinedale Anticline and adjacent areas southwest of Pinedale; an area of the northern Wyoming Range foothills northwest of LaBarge; and an area of the southern Wyoming Range foothills west of Kemmerer. All provide winter range for components of the Wyoming Range deer herd, historically Wyoming's largest mule deer population.

The scientists found that, above all, deer favor areas where new growth of sagebrush leader shoots is high, but that human disturbance from oil and gas activity negatively influenced their use of such forage.

"Across three winter ranges and different development scenarios, mule deer avoided areas close to disturbance, tended to move away from disturbance and increased vigilant behavior when near disturbance," the researchers wrote. "Mule deer selected for areas with high foraging opportunities, but their use of available forage near energy development was never realized to the same potential as similar forage patches farther from development."

As a result, the indirect loss of deer forage due to human disturbance far exceeded direct habitat loss from roads, well pads and other oil and gas infrastructure. In fact, across all three study areas, human disturbance resulted in a 10.5 percent decrease in use of available forage; direct habitat loss from construction accounted for just 2.3 percent.
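The ~4.6-acres figure quoted in the opening paragraph follows directly from these two percentages, as a quick arithmetic check shows:

```python
# Quick arithmetic check: the ~4.6 acres of indirect forage loss per acre
# of direct habitat loss follows from the two percentages in the study.

indirect_loss_pct = 10.5   # forage use lost to avoidance of disturbance
direct_loss_pct = 2.3      # forage lost directly to roads, well pads, etc.

acres_indirect_per_direct = indirect_loss_pct / direct_loss_pct
print(round(acres_indirect_per_direct, 1))   # ~4.6
```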

The loss of habitat due to human disturbance did vary significantly from area to area, ranging from 19.5 percent on the Pinedale Anticline to 4.3 percent in the area northwest of LaBarge. The scientists believe that's primarily because the area northwest of LaBarge is more rugged, with better sagebrush production and a lower intensity of human disturbance even though energy development is present.

Overall, the scientists hope their findings will help guide decisions on energy development in important wildlife winter ranges.

"To meet global demands for energy resources, oil and gas resources will continue to be extracted from critical wildlife ranges, including winter ranges of migratory, large herbivores," they wrote. "Accordingly, understanding how those disturbances associated with energy development can affect behavior, foraging and, ultimately, population dynamics will help identify ways to minimize the effects."

Credit: 
University of Wyoming

Shocking rate of plant extinctions in South Africa

Over the past 300 years, 79 plants have been confirmed extinct from three of the world's biodiversity hotspots located in South Africa - the Cape Floristic Region, the Succulent Karoo, and the Maputaland-Pondoland-Albany corridor.

According to a study published in the journal Current Biology this week, this represents a shocking 45.4% of all known plant extinctions from 10 of the world's 36 biodiversity hotspots. Biodiversity hotspots are areas that harbour exceptionally high numbers of unique species, but at the same time they are under severe threat from human disturbance.

South Africa is remarkable in that, for a relatively small country, it is home to three of these hotspots.

An international team of researchers, led by Prof Jaco Le Roux and Dr Heidi Hirsch, affiliated with the Centre for Invasion Biology (CIB) at Stellenbosch University (SU), analysed a comprehensive dataset of 291 plant extinctions since 1700 in ten biodiversity hotspots and six coldspots, covering about 15% of the earth's land surface.

The main drivers for extinctions in South Africa were found to be agriculture (49.4%), urbanisation (38%) and invasive species (22%).

Variability in predictions on the rate of plant extinctions

The results of their analysis show that plant extinction rates appear to have settled, since the 1990s, at about 1.26 extinctions per year. At its peak, however, the rate was at least 350 times the historical background rate of pre-human times.

At this rate, they predict that, in the areas they studied, an additional 21 plant species will go extinct by 2030, 47 species by 2050 and 110 species by 2100.

However, these findings stand in sharp contrast to predictions from other studies that as much as half of the earth's estimated 390 000 plant species may disappear within the remainder of this century.

"This would translate into more than 49 000 extinctions in the regions we studied over the next 80 years, which seems unlikely, bar a cataclysmic event such as an asteroid strike!" they argue.

Prof Le Roux says regional datasets provide valuable data for making general inferences about plant extinctions and the drivers underlying them. There are, however, still many regions in the world without a Red List of Plants, or with outdated lists, such as Madagascar and Hawaii. These 'hottest' of hotspots were therefore not included in the analysis.

"A lack of up-to-date lists prevents us from gaining a more complete and precise picture of what we are losing, and at exactly what rate," Dr Hirsch adds.

They believe the only way to better understand the magnitude of the extinction crisis faced by plants, and biodiversity in general, is to urgently initiate regional or at least country-level biodiversity assessments.

"While our study suggests that modern plant extinctions are relatively low, it is important to keep in mind that plants are exceptionally good at 'hanging in there'. Some of them are among the longest living organisms on earth today and many can persist in low densities, even under prolonged periods of unfavourable environmental conditions. A recent report, for example, indicated that 431 plant species, previously thought to be extinct, have been rediscovered," Le Roux explains. This means that many plant species may technically not be extinct, even though they only have one or a few living individuals remaining in the wild.

Claiming extinction rates for plant species therefore remains a particularly challenging exercise.

"We need comprehensive and up-to-date datasets to make informative forecasts about the future and preservation of Earth's flora," they emphasise.

Lost plant species in South Africa's biodiversity hotspots

The first recorded species to be lost to forestry in South Africa in the 1700s was a type of fountainbush that used to grow next to streams in the Tulbagh region - Psoralea cataracta. In 2008 it was listed as extinct on the Red List of South African Plants.

The next species to be confirmed extinct was one of the African daisies, Osteospermum hirsutum, last seen in 1775, followed by the honeybush, Cyclopia laxiflora, last seen around 1800. The reasons for their extinction are listed as agriculture, forestry and urbanisation.

More recently, in 2012, an extremely rare species of vygie, Jordaaniella anemoniflora, was declared extinct in the wild after losing its battle against sprawling urbanisation and coastal developments around Strand, Macassar and Hermanus.

The Succulent Karoo has seen three confirmed plant extinctions - a vygie, Lampranthus vanzijliae (extinct in 1921, due to agriculture and urbanisation), the legume, Leobordea magnifica (extinct in 1947 due to agriculture and grazing) and the 'knopie' Conophytum semivestitum, lost to urbanisation and mining.

For the Maputaland-Pondoland-Albany corridor, 20 species have been confirmed extinct, mainly due to agriculture and utilisation; they include Adenia natalensis (1865), Barleria natalensis (1890) and, more recently, Pleiospilos simulans (2007).

In conclusion

The researchers emphasise that biodiversity loss and climate change together are the biggest threats faced by humanity: "Along with habitat destruction, the effects of climate change are expected to be particularly severe on those plants not capable of dispersing their seeds over long distances," they conclude.

Credit: 
Stellenbosch University

Study: Climate change could pose danger for Muslim pilgrimage

For the world's estimated 1.8 billion Muslims -- roughly one-quarter of the world population -- making a pilgrimage to Mecca is considered a religious duty that must be performed at least once in a lifetime, if health and finances permit. The ritual, known as the Hajj, includes about five days of activities, of which 20 to 30 hours involve being outside in the open air.

According to a new study by researchers at MIT and in California, because of climate change there is an increasing risk that in coming years, conditions of heat and humidity in the areas of Saudi Arabia where the Hajj takes place could worsen, to the point that people face "extreme danger" from harmful health effects.

In a paper in the journal Geophysical Research Letters, MIT professor of civil and environmental engineering Elfatih Eltahir and two others report the new findings, which show risks to Hajj participants could already be serious this year and next year, as well as when the Hajj, whose timing varies, again takes place in the hottest summer months, from 2047 to 2052 and from 2079 to 2086. This will happen even if substantial measures are taken to limit the impact of climate change, the study finds; without those measures, the dangers would be even greater. Planning for countermeasures or restrictions on participation in the pilgrimage may thus be needed.

The timing of the Hajj varies from one year to the next, Eltahir explains, because it is based on the lunar calendar rather than the solar calendar. Each year the Hajj occurs about 11 days earlier, so there are only certain spans of years when it takes place during the hottest summer months. Those are the times that could become dangerous for participants, says Eltahir, who is the Breene M. Kerr Professor at MIT. "When it comes in the summer in Saudi Arabia, conditions become harsh, and a significant fraction of these activities are outdoors," he says.
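The ~11-day annual drift and the roughly 32-year spacing of the two summer windows (2047 and 2079) follow from calendar arithmetic, using standard lengths for the solar year and the 12-month lunar year:

```python
# The Islamic calendar year (~354.37 days) is shorter than the solar year
# (~365.24 days), so the Hajj arrives earlier each solar year and cycles
# through all seasons. Standard calendar-year lengths; arithmetic only.

SOLAR_YEAR = 365.2422   # days
LUNAR_YEAR = 354.367    # days (12 synodic months)

annual_drift = SOLAR_YEAR - LUNAR_YEAR        # ~10.9 days earlier per year
full_cycle_years = SOLAR_YEAR / annual_drift  # ~33 years to loop the seasons

print(round(annual_drift, 1), round(full_cycle_years, 1))
```

One full cycle of about 33 years is consistent with the study's two hottest-summer windows beginning roughly a generation apart.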

There have already been signs of this risk becoming real. Although the details of the events are scant, there have been deadly stampedes during the Hajj in recent decades: one in 1990 that killed 1,426 people, and one in 2015 that left 769 dead and 934 injured. Eltahir says that both of these years coincided with peaks in the combined temperature and humidity in the region, as measured by the "wet bulb temperature," and the stress of elevated temperatures may have contributed to the deadly events.

"If you have crowding in a location," Eltahir says, "the harsher the weather conditions are, the more likely it is that crowding would lead to incidents" such as those.

Wet bulb temperature (abbreviated as TW), which is measured by attaching a wet cloth to the bulb of a thermometer, is a direct indicator of how effectively perspiration can cool off the body. The higher the humidity, the lower the air temperature at which health problems can set in. At anything above a wet bulb temperature of about 77 degrees Fahrenheit, the body can no longer cool itself efficiently, and such temperatures are classified as a “danger” by the U.S. National Weather Service. A TW above about 85 F is classified as “extreme danger,” at which heat stroke, which can damage the brain, heart, kidneys, and muscles and can even lead to death, is “highly likely” after prolonged exposure.

Climate simulations considered by Eltahir and his co-investigators, using both "business as usual" scenarios and scenarios that include significant countermeasures against climate change, show that the likelihood of exceeding these thresholds for extended periods will increase steadily over the course of this century even with the countermeasures, and far more severely without them.

Because evaporation is so crucial to maintaining a safe body temperature, the level of humidity in the air is key. Even an actual temperature of just 90 F, if the humidity rises to 95 percent, is enough to reach the deadly 85 degree TW threshold for “extreme danger.” At a lower humidity of 45 percent, the 85 TW threshold would not be reached until the actual temperature climbed to 104 F or more. (At 100 percent humidity, the wet bulb temperature equals the actual temperature.)
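The interplay of temperature and humidity described above can be illustrated with Stull's (2011) empirical wet-bulb approximation. This is a standard estimation formula, not the calculation used in the study, and it assumes roughly sea-level pressure and relative humidity above about 5 percent:

```python
import math

def wet_bulb_stull(t_c, rh_pct):
    """Approximate wet-bulb temperature (deg C) from air temperature (deg C)
    and relative humidity (%) using Stull's 2011 empirical fit.
    Valid near sea-level pressure for RH above roughly 5%."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

def f_to_c(deg_f):
    return (deg_f - 32) * 5 / 9

EXTREME_DANGER_TW_F = 85.0  # NWS "extreme danger" wet-bulb threshold cited above

# 90 F air at 95% humidity crosses the extreme-danger line;
# the same 90 F air at 45% humidity stays well below it.
print(wet_bulb_stull(f_to_c(90), 95) > f_to_c(EXTREME_DANGER_TW_F))  # True
print(wet_bulb_stull(f_to_c(90), 45) > f_to_c(EXTREME_DANGER_TW_F))  # False
```

Consistent with the article's arithmetic, this formula puts 90 F air at 95 percent humidity above the 85 F wet-bulb threshold, while at 45 percent humidity the air temperature must reach roughly 104 F before the threshold is crossed.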

Climate change will significantly increase the number of days each summer where wet bulb temperatures in the region will exceed the "extreme danger" limit. Even with mitigation measures in place, Eltahir says, "it will still be severe. There will still be problems, but not as bad" as would occur without those measures.

The Hajj is "a very strong part of the culture" in Muslim communities, Eltahir says, so preparing for these potentially unsafe conditions will be important for officials in Saudi Arabia. A variety of protective measures have been in place in recent years, including nozzles that provide a mist of water in some of the outdoor locations to provide some cooling for participants, and widening some of the locations to reduce overcrowding. In the most potentially risky years ahead, Eltahir says, it may become necessary to severely limit the number of participants allowed to take part in the ritual. This new research "should help in informing policy choices, including climate change mitigation policies as well as adaptation plans," he says.

Editor's Note: This release has been updated since it was originally published by request of the submitting institution. Numbers in paragraphs 7 and 9 have been modified.

Credit: 
Massachusetts Institute of Technology

High-precision technique stores cellular 'memory' in DNA

CAMBRIDGE, MA - Using a technique that can precisely edit DNA bases, MIT researchers have created a way to store complex "memories" in the DNA of living cells, including human cells.

The new system, known as DOMINO, can be used to record the intensity, duration, sequence, and timing of many events in the life of a cell, such as exposures to certain chemicals. This memory-storage capacity can act as the foundation of complex circuits in which one event, or series of events, triggers another event, such as the production of a fluorescent protein.

"This platform gives us a way to encode memory and logic operations in cells in a scalable fashion," says Fahim Farzadfard, a Schmidt Science Postdoctoral Fellow at MIT and the lead author of the paper. "Similar to silicon-based computers, in order to create complex forms of logic and computation, we need to have access to vast amounts of memory."

Applications for these types of complex memory circuits include tracking the changes that occur from generation to generation as cells differentiate, or creating sensors that could detect, and possibly treat, diseased cells.

Timothy Lu, an MIT associate professor of electrical engineering and computer science and of biological engineering, is the senior author of the study, which appears in the Aug. 22 issue of Molecular Cell. Other authors of the paper include Harvard University graduate student Nava Gharaei, former MIT researcher Yasutomi Higashikuni, MIT graduate student Giyoung Jung, and MIT postdoc Jicong Cao.

Written in DNA

Several years ago, Lu's lab developed a memory storage system based on enzymes called DNA recombinases, which can "flip" segments of DNA when a specific event occurs. However, this approach is limited in scale: It can only record one or two events, because the DNA sequences that have to be flipped are very large, and each requires a different recombinase.

Lu and Farzadfard then developed a more targeted approach in which they could insert new DNA sequences into predetermined locations in the genome, but that approach only worked in bacterial cells. In 2016, they developed a memory storage system based on CRISPR, a genome-editing system that consists of a DNA-cutting enzyme called Cas9 and a short RNA strand that guides the enzyme to a specific area of the genome.

This CRISPR-based process allowed the researchers to insert mutations at specific DNA locations, but it relied on the cell's own DNA-repair machinery to generate mutations after Cas9 cut the DNA. This meant that the mutational outcomes were not always predictable, thus limiting the amount of information that could be stored.

The new DOMINO system uses a variant of the CRISPR-Cas9 enzyme that makes more well-defined mutations because it directly modifies and stores bits of information in DNA bases instead of cutting DNA and waiting for cells to repair the damage. The researchers showed that they could get this system to work accurately in both human and bacterial cells.

"This paper tries to overcome all the limitations of the previous ones," Lu says. "It gets us much closer to the ultimate vision, which is to have robust, highly scalable, and defined memory systems, similar to how a hard drive would work."

To achieve this higher level of precision, the researchers attached a version of Cas9 to a recently developed "base editor" enzyme, which can convert the nucleotide cytosine to thymine without breaking the double-stranded DNA.

Guide RNA strands, which direct the base editor where to make this switch, are produced only when certain inputs are present in the cell. When one of the target inputs is present, the guide RNA leads the base editor either to a stretch of DNA that the researchers added to the cell's nucleus, or to genes found in the cell's own genome, depending on the application. Measuring the resulting cytosine to thymine mutations allows the researchers to determine what the cell has been exposed to.

"You can design the system so that each combination of the inputs gives you a unique mutational signature, and from that signature you can tell which combination of the inputs has been present," Farzadfard says.

Complex calculations

The researchers used DOMINO to create circuits that perform logic calculations, including AND and OR gates, which can detect the presence of multiple inputs. They also created circuits that can record cascades of events that occur in a certain order, similar to an array of dominos falling.

Most previous versions of cellular memory storage have required stored memories to be read by sequencing the DNA. However, that process destroys the cells, so no further experiments can be done on them. In this study, the researchers designed their circuits so that the final output would activate the gene for green fluorescent protein (GFP). By measuring the level of fluorescence, the researchers could estimate how many mutations had accumulated, without killing the cells. The technology could potentially be used to create mouse immune cells that produce GFP when certain signaling molecules are activated, which researchers could analyze by periodically taking blood samples from the mice.

Another possible application is designing circuits that can detect gene activity linked to cancer, the researchers say. Such circuits could also be programmed to turn on genes that produce cancer-fighting molecules, allowing the system to both detect and treat the disease. "Those are applications that may be further away from real-world use but are certainly enabled by this type of technology," Lu says.

Credit: 
Massachusetts Institute of Technology

Rising summer heat could soon endanger travelers on annual Muslim pilgrimage

image: These are holy sites in and by Mecca where important practices during Hajj take place. Pilgrims in these areas during future summer Hajj events run the risk of extreme heat stress.

Image: 
AGU

WASHINGTON - Over two million Muslim travelers just finished the annual religious pilgrimage to Mecca, Saudi Arabia, traveling during some of the country's hottest weather. New research finds pilgrims in future summers may have to endure heat and humidity extreme enough to endanger their health. The results can help inform policies that would make the trip safer for the several million people who make the pilgrimage each year, according to the study's authors.

Hajj, or the Muslim Pilgrimage, is one of the five pillars of the Muslim faith: an annual pilgrimage to Mecca that exposes participants to Saudi Arabia's hot weather conditions. Muslims are expected to make the pilgrimage at least once in their lifetimes. Islam follows a lunar calendar, so the dates for Hajj change every year. But for five to seven years at a time, the trip falls over summer.

A new study projecting future summer temperatures in the region around Mecca finds that as soon as 2020, summer days in Saudi Arabia could surpass the United States National Weather Service's extreme danger heat-stress threshold, at a wet-bulb temperature of 29.1 degrees C (84.3 degrees Fahrenheit).

Wet-bulb temperature is a measurement combining temperature with the amount of moisture in the air. At the extreme danger threshold defined by the National Weather Service, sweat no longer evaporates efficiently, so the human body cannot cool itself and overheats. Exposure to these conditions for long periods of time, such as during Hajj, could cause heat stroke and possibly death.

"When the Hajj happens in summer, you can imagine with climate change and increasing heat-stress levels conditions could be unfavorable for outdoor activity," said Elfatih Eltahir, a civil and environmental engineer at Massachusetts Institute of Technology and co-author of the new study in the AGU journal Geophysical Research Letters.

"Hajj may be the largest religious tourism event," Eltahir said. "We are trying to bring in the perspective of what climate change could do to such large-scale outdoor activity."

Adapting to rising temperatures

Middle Eastern temperatures are rising because of climate change, and scientists project them to keep rising in the coming decades. In the new study, Eltahir and his colleagues wanted to know how soon and how frequently temperatures during summer Hajj would pass the extreme danger threshold. The researchers examined historical climate data and used model simulations to create a projection for the future.

They found that in the past 30 years, wet-bulb temperature surpassed the danger threshold 58 percent of the time, but never the extreme danger threshold. At the danger threshold, heat exhaustion is likely and heat stroke is a potential threat from extended exposure. Passing the extreme danger threshold for extended periods of time means heat stroke is highly likely.

The researchers then calculated how climate change is likely to impact wet-bulb temperature in Saudi Arabia in the future.
They found that in the coming decades, pilgrims will have to endure extremely dangerous heat and humidity levels in years when Hajj falls over summer. Their projections estimate heat and humidity levels during Hajj will exceed the extreme danger threshold six percent of the time by 2020, 20 percent of the time between 2045 and 2053, and 42 percent of the time between 2079 and 2086.
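The percentages above are exceedance frequencies: the fraction of observations in which wet-bulb temperature passes a threshold. A minimal sketch of that calculation, using the thresholds quoted in these releases and an invented sample series (the actual data the authors analyzed are not reproduced here):

```python
DANGER_TW_C = 25.0          # ~77 F "danger" threshold (per the companion release)
EXTREME_DANGER_TW_C = 29.1  # "extreme danger" threshold cited in the study

def exceedance_fraction(tw_values_c, threshold_c):
    """Fraction of wet-bulb observations (deg C) exceeding a threshold."""
    return sum(tw > threshold_c for tw in tw_values_c) / len(tw_values_c)

# Illustrative daily wet-bulb values (deg C) for a hypothetical Hajj season:
sample_season = [23.4, 26.0, 28.1, 29.5, 30.2, 27.7, 24.9, 29.3]
print(exceedance_fraction(sample_season, DANGER_TW_C))          # 0.75
print(exceedance_fraction(sample_season, EXTREME_DANGER_TW_C))  # 0.375
```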

Climate change mitigation initiatives would make passing the threshold during these years less frequent: the projections drop to one percent by 2020, 15 percent of the time between 2045 and 2053, and 19 percent of the time between 2079 and 2086, according to the study.

The study authors stress that their projections are meant not to cause anxiety among pilgrims but instead to help them adapt, and to help authorities plan for safe Hajj.

"These results are not meant to spread any fears, but they are meant to inform policies about climate change, in relation to both mitigation and adaptation," Eltahir said. "There are ways people could adapt, including structural changes by providing larger facilities to help people perform Hajj as well as nonstructural changes by controlling the number of people who go."

"They've provided a very compelling example of an iconic way that 2 to 3 million people per year can be really vulnerable to what to me is the biggest underrated climate hazard - this combination of high temperature and high humidity," said Radley Horton, a climate scientist at Columbia University's Lamont-Doherty Earth Observatory who was not involved with the study. "I believe as the century progresses, if we don't reduce our greenhouse gases, [this] could become every bit as much an existential threat as sea level rise and coastal flooding."

Credit: 
American Geophysical Union

Moffitt researchers develop model to personalize radiation treatment

TAMPA, Fla. - A personalized approach to cancer treatment has become more common over the last several decades, with numerous targeted drugs approved to treat particular tumor types with specific mutations or patterns. However, this same personalized strategy has not translated to radiation therapy, and a one-size-fits-all approach for most patients is still common practice. Moffitt Cancer Center researchers hope to change this mindset for radiation treatment with the development of a genomically-based model that can optimize and personalize a radiation dose to match an individual patient's needs.

Radiation therapy is part of the standard treatment approach for breast cancer, but the dose administered to most patients is largely the same. Currently, clinical studies are being conducted to determine the benefits and risks of omitting radiation treatment in certain patients with breast cancer who are at a low risk of local disease recurrence. However, according to Javier Torres-Roca, M.D., senior member of Moffitt's Department of Radiation Oncology, "a true genomic approach to personalize radiotherapy dose has not yet been undertaken."

Previously, Torres-Roca and his team developed and validated a radiation sensitivity index (RSI) to predict the radiation sensitivity of tumors based on the expression patterns of 10 genes. To more accurately determine the appropriate radiation dose for individual patients, the research team combined the RSI with a model of the effect of radiation dose on tumor and normal tissues, creating a new dose-determination method called the genomically-adjusted radiation dose (GARD).
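The GARD calculation itself is not spelled out in this release. In the published formulation (Scott et al., The Lancet Oncology, 2017), a patient-specific radiosensitivity parameter is derived from the RSI through the linear-quadratic model and then scaled by the prescribed fractionation. The following is a hedged sketch under those assumptions; the β value and the 2 Gy reference dose come from that paper, not from this study:

```python
import math

BETA = 0.05           # Gy^-2; assumed constant in the published GARD derivation
REFERENCE_DOSE = 2.0  # Gy; RSI is treated as a surviving fraction at 2 Gy

def alpha_from_rsi(rsi):
    """Solve S = exp(-alpha*d - beta*d^2) for alpha, with S = RSI at d = 2 Gy."""
    d = REFERENCE_DOSE
    return (-math.log(rsi) - BETA * d**2) / d

def gard(rsi, n_fractions, dose_per_fraction):
    """Genomically-adjusted radiation dose: n * d * (alpha + beta * d)."""
    alpha = alpha_from_rsi(rsi)
    return n_fractions * dose_per_fraction * (alpha + BETA * dose_per_fraction)

# A radiosensitive tumor (low RSI) accrues a higher GARD than a
# radioresistant one (high RSI) for the same physical dose of 30 x 2 Gy.
print(round(gard(0.2, 30, 2.0), 1))  # 48.3 (sensitive tumor)
print(round(gard(0.8, 30, 2.0), 1))  # 6.7  (resistant tumor)
```

This matches the direction described below: for the same delivered dose, GARD values are lower for radioresistant tumors and higher for radiosensitive ones.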

"GARD is the first opportunity for a genomically-driven personalized approach in radiation oncology, and is a research priority for the field," explained Torres-Roca. "Our research has found that GARD values are lower for those tumors that are resistant to radiation and higher for those tumors that are sensitive to radiation treatment."

In a new study published this month in EBioMedicine, Moffitt researchers validated the use of the GARD model in two separate groups of triple-negative breast cancer patients treated with radiation therapy: one from Europe (N=58) and one from the Total Cancer Care® program at Moffitt (N=55). They demonstrated that GARD values were associated with the risk of breast cancer recurring locally. The researchers also used GARD to calculate an individualized radiation dose for each breast cancer patient in the Moffitt group. They found that the biologically optimal radiation dose in triple-negative breast cancer ranged from 30 to 76 Gy, and that the current standard of delivering 60 Gy to all patients could be overdosing a significant number of them.

The researchers are now planning a clinical trial to be initiated at Moffitt where the radiation dose for breast cancer patients will be selected based on this model. They hope that the continued study of the GARD model and its implementation into practice will benefit patients by allowing a personalized approach to radiation treatment and will minimize the risks of additional radiation exposure.

"Our analyses provide the first proposed range for optimal radiotherapy dose at an individual patient level in triple-negative breast cancer and suggest that a significant number of patients can be treated with lower doses of radiotherapy while still maintaining high levels of local control," said Kamran Ahmed, M.D., assistant member of the Department of Radiation Oncology at Moffitt and lead author of the study.

Credit: 
H. Lee Moffitt Cancer Center & Research Institute

Training teams for timely NICU evacuation

image: Children's National is the nation's No. 1 NICU, and its educators worked with a diverse group within Children's National to design and implement periodic evacuation simulations. From June 2015 to August 2017, 213 members of NICU staff took part in simulated drills, honing their skills by practicing with mannequins with varying levels of acuity.

Image: 
Children's National in Washington, D.C.

In late August 2011, a magnitude 5.8 earthquake - the strongest east of the Mississippi since 1944 - shook Washington, D.C., with such force that it cracked the Washington Monument and damaged the National Cathedral.

On the sixth floor of the neonatal intensive care unit (NICU) at Children's National in Washington, D.C., staff felt the hospital swaying from side to side.

After the shaking stopped, they found the natural disaster exposed another fault: The unit's 200-plus staff members were not all equally knowledgeable or confident regarding the unit's plan for evacuating its 66 newborns or their own specific role during an emergency evacuation.

More than 900 very sick children are transferred to Children's National NICU from across the region each year, and a high percentage rely on machines to do the work that their tiny lungs and hearts are not yet strong enough to do on their own.

Transporting fragile babies down six flights of stairs along with vital equipment that keeps them alive requires planning, teamwork and training.

"Fires, tornadoes and other natural disasters are outside of our team's control. But it is within our team's control to train NICU staff to master this necessary skill," says Lisa Zell, BSN, a clinical educator. Zell is also lead author of a Children's National article featured on the cover of the July/September 2019 edition of The Journal of Perinatal & Neonatal Nursing. "Emergency evacuations trigger safety concerns for patients as well as our own staff. A robust preparedness plan that is continually improved can alleviate such fears," Zell adds.

Children's National is the nation's No. 1 NICU, and its educators worked with a diverse group within Children's National to design and implement periodic evacuation simulations. From June 2015 to August 2017, 213 members of NICU staff took part in simulated drills, honing their skills by practicing with mannequins with varying levels of acuity.

"Each simulation has three objectives. First, the trainee needs to demonstrate knowledge of their own individual role in an evacuation. Second, they need to know the evacuation plan so well they can explain it to someone else. And finally, they need to demonstrate that if they had to evacuate the NICU that day, they could do it safely," says Lamia Soghier, M.D., FAAP, CHSE, NICU medical director and the study's senior author.

The two-hour evacuation simulation training at Children's National begins with a group prebrief. During this meeting, NICU educators discuss the overarching evacuation plan, outline individual roles and give a hands-on demonstration of all of the evacuation equipment.

This equipment includes emergency backpacks, a drip calculation sheet and an emergency phrase card. Emergency supply backpacks are filled with everything that each patient needs post evacuation, from suction catheters, butterfly needles and suture removal kits to flashlights with batteries.

Each room is equipped with an emergency backpack, which is secured in a locked cabinet. Every nurse has a key to access the cabinet at any time.

Vertical evacuation scenarios are designed to give trainees a real-world experience. Mannequins that are intubated are evacuated by tray, allowing the nurse to provide continuous oxygen with the use of a resuscitation bag during the evacuation. Evacuation by sled allows three patients to be transported simultaneously. Patients with uncomplicated conditions can be lifted out of their cribs and swiftly carried to safety.

Teams also learn how to calm the nerves of frazzled parents and enlist their help. "Whatever we need to do, we will do to get these babies out alive," Joan Paribello, a clinical educator, tells 15 staff members assembled for a recent prebriefing session.

An "X" on the door designates rooms already evacuated. A designated charge nurse and another member of the medical team remain in the unit to make a final sweep, leaving only after the last patient has been evacuated.

The simulated training ends with a debrief session during which issues that arose during the evacuation are identified and corrected prior to subsequent simulated trainings, improving the safety and expediency of the exercise.

Indeed, as Children's National NICU staff mastered these evacuation simulations, evacuation times dropped from 21 minutes to as little as 16 minutes. Equally important, post evacuation surveys indicate:

86% of staff report feeling more confident in their ability to safely evacuate the Children's National NICU

94% of NICU staff understand the overall evacuation plan and

97% of NICU staff know their individual role during an evacuation.

"One of the most surprising revelations regarded one of the most basic functions in any NICU," Dr. Soghier adds. "Once intravenous tubing is removed from its pump, the rate at which infusions drip needs to be calculated manually. We created laminated cards with pre-calculated drip rates to enable life-saving fluid delivery to continue without interruption."
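The arithmetic behind such pre-calculated cards is the standard gravity-drip conversion, shown here as a generic illustration; the actual card contents and drip factors used at Children's National are not described in the release:

```python
def drip_rate(ml_per_hr, drip_factor_gtt_per_ml):
    """Manual drip rate (drops/min) for a given pump rate (mL/hr)
    and tubing drip factor (drops per mL)."""
    return ml_per_hr * drip_factor_gtt_per_ml / 60  # 60 minutes per hour

# Example: 10 mL/hr through 60 gtt/mL microdrip tubing -> 10 drops per minute.
print(drip_rate(10, 60))  # 10.0
```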

Credit: 
Children's National Hospital

NASA finds Tropical Depression Bailu forms east of Philippines

image: On Aug. 20, 2019, the MODIS instrument aboard NASA's Terra satellite provided a visible image of Tropical Depression Bailu in the Northwestern Pacific Ocean.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

NASA's Terra satellite passed over the Northwestern Pacific Ocean and captured an image of newly developed Tropical Depression Bailu, east of the Philippines.

On Aug. 20, 2019, the Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard NASA's Terra satellite provided a visible image of Bailu in the Philippine Sea. The storm appeared somewhat elongated.

At 11 a.m. EDT (1500 UTC), the center of Bailu was located near latitude 15.9 degrees north and longitude 130.7 degrees east. Bailu was about 674 nautical miles south-southwest of Kadena Air Base, Okinawa, Japan. Bailu was moving to the northwest and had maximum sustained winds near 30 knots (34.5 mph/55.5 kph).

The Joint Typhoon Warning Center expects Bailu to move northwest and make landfall in Taiwan, then proceed to a second landfall in southeastern China.

Credit: 
NASA/Goddard Space Flight Center

Are we really protecting rivers from pollution? It's hard to say, and that's a problem

image: The impact of agricultural best management practices and urban stormwater control measures is mostly perceived; data is undocumented -- or simply missing.

Image: 
The Academy of Natural Sciences of Drexel University

More public and private resources than ever are being directed to protecting and preserving aquatic ecosystems and watersheds. Whether mandated for land development or farming, or undertaken in response to the growing severity and number of natural disasters, decades of watershed restoration and mitigation projects have taken place. Yet scientists from the Academy of Natural Sciences of Drexel University found that the impact of these projects is mostly perceived: the data are largely undocumented, or simply missing.

In their report, entitled "Large-scale protection and restoration programs aimed at protecting stream ecosystem integrity: the role of science-based goal-setting, monitoring, and data management," published recently online in Freshwater Science, researchers from the Academy and the Stroud Water Research Center attribute the dearth of data to insufficient investment in the planning, goal-setting, monitoring and documentation stages of mitigation programs throughout the watersheds.

Stefanie A. Kroll, PhD, an assistant research professor in Drexel's Department of Biodiversity, Earth & Environmental Science and one of the authors of the report, encountered these challenges firsthand while working on the Delaware River Watershed Initiative (DRWI).

"I was surprised to find that only a very small fraction of the stream restoration projects implementing agricultural best management practices (BMPs) and stormwater control measures (SCMs) over the past few decades had produced, and most importantly documented, measurable change in physicochemical aspects of the streams targeted," said Kroll.

Kroll and her collaborators at the Academy drew on their observations from seven years with the DRWI, and a review of similar projects across the region, to identify the main challenges of applying scientific planning and monitoring for restoration.

The most significant obstacles they found were:

Lack of planning for implementation of a monitoring program

Lack of consideration of geographic region or project scale

Failure to develop specific goals

Limitations on the scope of projects, including long-term monitoring, as a result of expectations from the funding agency

To address these challenges, the authors suggest a combination of setting a more stringent standard for monitoring the programs and partnering with established conservation groups to implement it.

"You don't have to rebuild the wheel to solve this challenge," said Kroll. "One solution is to use water restoration funding to leverage existing scientific and conservation organizations in the region to work to improve water quality and help measure its success."

And when planning these programs, the authors note that it's important to set an appropriate scope, both geographically and temporally, for the monitoring.

"The cumulative effects of small, restored watersheds can show greater results than similar-scale implementations spread out in large catchments," said Kroll. "By choosing the right focus areas, even smaller zones within subwatersheds can have a more critical impact than treating a larger portion of a stream network with more challenging conditions, evoking a true 'less is more' mentality."

The authors suggest several types of monitoring programs that could be scaled to a variety of sites and conditions, and that would produce usable data for making comparative measurements over a time period in which the programs should be showing an effect.

What those effects are will vary from watershed to watershed, they acknowledge, so it's equally important to develop specific mitigation and preservation goals that are realistic and appropriate for that particular watershed. Currently there are few data addressing what ecosystem parameters can or should be expected to change in response to river restoration.

"Defining degradation in context of a desired condition must be tailored to the objectives of a project," said Kroll. "We need data to set realistic goals based on different criteria or examples from nearby restoration successes and potential factors that interfere with signals of recovery, like past land use, changes in farming/water practices and climate change."

For example, Kroll and her team collect data differently from agencies that are checking on whether streams are attaining their designated use, but they want the data to be useful to agencies. They meet regularly with agencies from Pennsylvania and New Jersey, in addition to the Delaware River Basin Commission, to share findings and talk about ways to work together.

But the universal challenge, the study suggests, is that funding for these projects does not align with their scope. As a result, the efforts can end up being truncated or fail to produce results in the time allotted by the funding organization.

"Those who fund restoration activities generally provide resources for small projects or groups of small projects that are rarely combined or integrated as a part of a large long-term and comprehensive restoration plan," said Kroll.

The authors suggest helping funders better understand the scope of a project by reporting to or meeting with them regularly, and partnering with community scientists and conservation groups to share data and best practices, which could help increase the cost-effectiveness and sustainability of monitoring programs.

Looking forward, the researchers are considering these protocols as blueprints for future monitoring programs. By sharing data and collaborating with regional partner organizations, monitoring programs will ideally be more efficient and collect more meaningful data that can be used for the creation of future restoration projects and the continual improvement of water quality across the board.

"There is no 'one-size-fits-all' approach to watershed restoration," said Kroll. "But a framework that enables better planning, monitoring and management will help us better inform restoration practices to make limited funding more targeted and effective - ensuring activities are achieving their intended benefits and ultimately improving water quality and preserving the integrity of our ecosystems."

Credit: 
Drexel University

Satellite sees Eastern Pacific Depression 10E form

image: NOAA's GOES-West satellite provided a visible image of the newly developed depression on August 21, 2019 at 11:30 a.m. EDT (1530 UTC). The system appears more organized and circular on satellite imagery.

Image: 
NOAA/NRL

Tropical Depression 10E has formed in the Eastern Pacific Ocean and the GOES-West satellite caught its formation far from the Baja Peninsula.

NOAA's GOES-West satellite provided a visible image of the newly developed depression on August 21, 2019. The system appears more organized and circular on satellite imagery. In addition, an early-morning overpass of a scatterometer instrument, which measures winds in a storm system, showed a nearly closed surface low pressure area.

At 11 a.m. EDT (1500 UTC), the center of Tropical Depression Ten-E was located near latitude 15.4 degrees north and longitude 107.3 degrees west. That's about 545 miles (875 km) south-southeast of the southern tip of Baja California, Mexico.

NOAA's National Hurricane Center or NHC said the depression is moving toward the west-northwest near 18 mph (30 kph). A turn to the northwest along with a decrease in forward speed is expected by Thursday, Aug. 22. Maximum sustained winds are near 35 mph (55 kph) with higher gusts. The estimated minimum central pressure is 1007 millibars.
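The paired figures in NHC advisories (such as 35 mph/55 kph above) come from winds measured in knots and converted with rounding. As an illustrative sketch only, assuming the advisory values are rounded to the nearest 5 units, the conversions can be reproduced like this:

```python
def kt_to_mph(kt):
    """Knots to miles per hour, rounded to the nearest 5 as in public advisories."""
    return 5 * round(kt * 1.15078 / 5)

def kt_to_kph(kt):
    """Knots to kilometers per hour, rounded to the nearest 5 as in public advisories."""
    return 5 * round(kt * 1.852 / 5)

# 30 knots reproduces the depression's reported 35 mph (55 kph);
# 35 knots reproduces Chantal's reported 40 mph (65 kph).
print(kt_to_mph(30), kt_to_kph(30))  # 35 55
print(kt_to_mph(35), kt_to_kph(35))  # 40 65
```

The underlying measurement in knots is an assumption here; the article itself reports only the converted mph/kph values.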

NHC said, "Steady strengthening is forecast for the next couple of days and the depression is expected to become a tropical storm by tonight, and a hurricane by Friday."

For updated forecasts visit: http://www.nhc.noaa.gov

Credit: 
NASA/Goddard Space Flight Center

NASA sees a lopsided Atlantic Tropical Storm Chantal form

image: On Aug. 21 at 8:20 a.m. EDT (1220 UTC), the MODIS instrument that flies aboard NASA's Aqua satellite showed strongest storms (yellow) in Tropical Storm Chantal were east of center, where cloud top temperatures in those areas were as cold as minus 50 degrees Fahrenheit (minus 45.5 Celsius).

Image: 
NASA/NRL

NASA's Aqua satellite provided a view of newly formed Tropical Storm Chantal in the North Atlantic Ocean. The image revealed that the storm formed despite being battered by outside winds.

The third named storm of the Atlantic Ocean hurricane season formed around 11 p.m. EDT on Aug. 20, far from land and almost 500 miles southeast of Halifax, Nova Scotia, Canada.

NASA's Aqua satellite obtained an infrared view of the storm nine hours later. An instrument aboard Aqua uses infrared light to analyze the strength of storms by providing temperature information about the system's clouds. The strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

On Aug. 21 at 8:20 a.m. EDT (1220 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Aqua satellite gathered infrared data on Chantal. The strongest storms were east of the center of circulation, indicative of vertical wind shear: outside westerly winds pushing against the storm. Storms east of the center had cloud top temperatures as cold as minus 50 degrees Fahrenheit (minus 45.5 Celsius).

In general, wind shear is a measure of how the speed and direction of winds change with altitude. Tropical cyclones are like rotating cylinders of winds. Each level needs to be stacked on top of each other vertically in order for the storm to maintain strength or intensify. Wind shear occurs when winds at different levels of the atmosphere push against the rotating cylinder of winds, weakening the rotation by pushing it apart at different levels.
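The idea above is often quantified as "bulk" or "deep-layer" shear: the vector difference between winds at an upper and a lower level of the atmosphere. A minimal sketch, with illustrative wind values not taken from the article:

```python
import math

def wind_to_uv(speed, direction_deg):
    """Convert a meteorological wind (speed, direction it blows FROM)
    into u (eastward) and v (northward) components."""
    rad = math.radians(direction_deg)
    return -speed * math.sin(rad), -speed * math.cos(rad)

def bulk_shear(upper, lower):
    """Magnitude of the vector difference between two (speed, direction) winds,
    e.g. at roughly 200 hPa (upper) and 850 hPa (lower)."""
    u1, v1 = wind_to_uv(*upper)
    u0, v0 = wind_to_uv(*lower)
    return math.hypot(u1 - u0, v1 - v0)

# Hypothetical example: 40-knot westerlies aloft over 10-knot easterlies
# near the surface give 50 knots of deep-layer shear.
shear = bulk_shear((40, 270), (10, 90))
print(round(shear))  # 50
```

Strong shear of this kind displaces a storm's tallest thunderstorms downwind of the center, which is exactly the lopsided pattern MODIS observed in Chantal.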

At 11 a.m. EDT (1500 UTC), the center of Tropical Storm Chantal was located near latitude 40.2 degrees north and longitude 51.6 degrees west. The center of Chantal is about 455 miles (730 km) south of Cape Race, Newfoundland, Canada.

Chantal is moving toward the east near 20 mph (31 kph). A turn toward the southeast with a decrease in forward speed is expected by Thursday, August 22. Chantal is forecast to slow further and turn southward on Friday. Maximum sustained winds are near 40 mph (65 kph) with higher gusts. The estimated minimum central pressure is 1009 millibars.

NOAA's National Hurricane Center anticipates gradual weakening, and Chantal is forecast to become a tropical depression in a couple of days.

For updated forecasts, visit: http://www.nhc.noaa.gov

Credit: 
NASA/Goddard Space Flight Center