
Skin creams, make-up and shampoos should be free from Pluralibacter

Since the European rapid alert system for consumer products, "Safety Gate" (formerly RAPEX), was introduced in 2005, an increasing number of products posing a microbiological risk have been notified through it. Ten cosmetic products listed in the RAPEX database were affected by confirmed contamination with P. gergoviae.

If products contaminated with P. gergoviae are used, the bacterium can enter the body via open wounds or the mucous membranes. Severe infections may develop in people with pre-existing conditions.

In the opinion of the BfR, externally applied cosmetic products should be free of P. gergoviae in order to avoid a health risk for humans. The health risks associated with the use of such cosmetic products cannot currently be quantified due to the lack of reliable data.

Credit: 
BfR Federal Institute for Risk Assessment

Historical climate fluctuations in Central Europe overestimated due to tree ring analysis

"Was there a warm period in the Middle Ages that at least comes close to today's? Answers to such fundamental questions are largely sought from tree ring data," explains lead author Josef Ludescher of the Potsdam Institute for Climate Impact Research (PIK). "Our study now shows that previous climate analyses from tree ring data significantly overestimate the climate's persistence. A warm year is indeed followed by another warm rather than a cool year, but not as long and strongly as tree rings would initially suggest. If the persistence tendency is correctly taken into account, the current warming of Europe appears even more exceptional than previously assumed."

To examine the quality of temperature series obtained from tree rings, Josef Ludescher and Hans Joachim Schellnhuber (PIK) as well as Armin Bunde (Justus-Liebig-University Giessen) and Ulf Büntgen (Cambridge University) focused on Central Europe. The main reason for this choice was the existence of long observation series, dating back to the middle of the 18th century, that can be compared with the tree ring data. In addition, there are archives that accurately recorded the beginning of grape and grain harvests and even go back to the 14th century. These records, as well as the width of tree rings, allow temperature reconstructions. A warm summer is indicated by a wide tree ring and an early start of the harvest, a cold summer by a narrow tree ring and a late start of the harvest. The trees studied are those from altitudes where temperature has a strong influence on growth and where there is enough water for growth even in warm years.

"Medieval archives confirm modern climate system research"

"It turned out that in the tree ring data the climatic fluctuations are exaggerated. In contrast, the temperatures from the harvest records have the same persistence tendency as observation data and also the computer simulations we do with climate models," says co-author Hans Joachim Schellnhuber of PIK. "Interestingly, medieval archives thus confirm modern climate system research."

To eliminate the inaccuracies of the tree ring data, the scientists used a mathematical method to adjust the strength of the persistence tendency to that of the harvest data and the observation data. "The adjustment does not change the chronological position of the respective cold and warm periods within the tree rings, but their intensity is reduced," explains co-author Armin Bunde from the University of Giessen. "The corrected temperature series corresponds much better with the existing observations and harvest chronicles. Taken together, the data suggest that the medieval climate fluctuations and especially the warm periods were much less pronounced than previously assumed. So the present human-made warming stands out even more."
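
For readers who want to see what such a persistence correction can look like in practice, the sketch below is a minimal, illustrative Python example, not the authors' actual procedure: it rescales the Fourier amplitude spectrum of a reconstruction toward a target persistence strength while keeping the Fourier phases, so the timing of warm and cold periods is preserved and only their intensity changes. All series and target values are made up for illustration.

```python
import numpy as np

def spectral_slope(x):
    """Estimate the power-spectrum slope beta (a persistence measure: larger beta = stronger persistence)."""
    x = np.asarray(x, float) - np.mean(x)
    f = np.fft.rfftfreq(len(x))[1:]            # drop the zero frequency
    p = np.abs(np.fft.rfft(x))[1:] ** 2
    slope, _ = np.polyfit(np.log(f), np.log(p), 1)
    return -slope                               # P(f) ~ f^(-beta)

def adjust_persistence(series, beta_target):
    """Rescale the amplitude spectrum of `series` so its slope matches beta_target.
    Fourier phases are kept, so the timing of warm/cold periods is unchanged;
    only the strength of the slow fluctuations is reduced (or increased)."""
    x = np.asarray(series, float)
    mean = x.mean()
    spec = np.fft.rfft(x - mean)
    f = np.fft.rfftfreq(len(x))
    beta_now = spectral_slope(x)
    scale = np.ones_like(f)
    scale[1:] = f[1:] ** ((beta_now - beta_target) / 2.0)   # amplitude ~ f^(-beta/2)
    return np.fft.irfft(spec * scale, n=len(x)) + mean

# Example: tame an over-persistent toy "tree-ring" series toward a weaker
# persistence (the target value here is purely illustrative).
rng = np.random.default_rng(0)
tree_ring_series = np.cumsum(rng.normal(size=512)) * 0.05
corrected = adjust_persistence(tree_ring_series, beta_target=0.4)
print(f"persistence before: {spectral_slope(tree_ring_series):.2f}, "
      f"after: {spectral_slope(corrected):.2f}")
```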

Credit: 
Potsdam Institute for Climate Impact Research (PIK)

Older and richer: Old grasslands show high biodiversity and conservation value

image: Ski-run grassland in Sugadaira highland.

Image: 
University of Tsukuba

Tsukuba, Japan - "The grass is always greener on the other side," as the saying goes, but in this case, it's more diverse. Researchers from Japan have discovered that old grasslands have higher plant diversity than new ones, and that grassland longevity can be an indicator of high conservation priority.

In a study published online this month in Ecological Research, researchers from the University of Tsukuba have revealed that the longer grasslands have been around, the higher their plant diversity, and the more likely they are to be of high conservation priority.

Grasslands can be classified as natural (existing in natural climatic conditions and disturbance systems) or seminatural (maintained by artificial disturbances such as pasturing, fire or mowing). Seminatural grasslands are ecosystems with rich biodiversity. Unfortunately, both types of grasslands are declining globally.

"There's an urgent need to identify grasslands of high conservation priority," says lead author of the study Taiki Inoue. "The results of a growing number of recent studies show that vegetation history affects current biological communities. The aim of our study was to evaluate whether the uninterrupted continuity of grasslands through time promotes biodiversity, and therefore can be an indicator of conservation priority."

To do this, the researchers investigated plant communities in old (160-1000+ years) and new (52-70 years after deforestation) seminatural grasslands, as well as in forests, in highland areas of central Japan. Geographical information system (GIS) data were constructed using aerial photos and past maps to judge the vegetation history of these ecosystems.
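
As a toy illustration of the history check that such GIS layers enable, the sketch below (hypothetical site names and years, not the study's data) labels each surveyed site as old or new grassland from the earliest year grassland cover is documented there:

```python
# Minimal sketch (hypothetical data): classify survey sites as "old" or "new"
# grassland from the earliest year each site appears as grassland in historical
# maps or aerial photos, mirroring the history check the GIS step performs.

earliest_grassland_record = {   # earliest documented year of grassland cover
    "site_A": 1020,             # appears as grassland in a historical map
    "site_B": 1850,
    "site_C": 1953,             # first record is a post-deforestation aerial photo
    "site_D": 1968,
}

SURVEY_YEAR = 2020
OLD_THRESHOLD_YEARS = 160       # the study's "old" grasslands are 160 to 1000+ years old

def classify(first_year: int, survey_year: int = SURVEY_YEAR) -> str:
    age = survey_year - first_year
    return "old grassland" if age >= OLD_THRESHOLD_YEARS else "new grassland"

for site, year in earliest_grassland_record.items():
    print(site, classify(year))
```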

"Old grasslands had the highest number of plant species, followed by new grasslands and forests," explains Professor Tanaka Kenta, senior author. "This pattern was much clearer in the number of native and endangered species dependent on grasslands, indicating the role of old grasslands as refuges for those species."

Old and new grasslands also differed in species composition, with the composition of new grasslands ranging between that of old grasslands and forests. This suggests that new grasslands continue to be affected by past forestation more than 52 years after deforestation. Old grasslands were found to have eleven indicator species, with none found in new grasslands, revealing that the plant community in old grasslands was unique.

"Our findings indicate that grasslands that have been around for a long time are where conservation effort should be focused," says Inoue.

Future studies investigating the effect of vegetation history on the current biodiversity of grassland plant species will improve understanding of how biological communities are formed, and will be key to allocating conservation priority.

Credit: 
University of Tsukuba

Seeing the eye like never before

While there is no cure for blindness or macular degeneration, scientists have accelerated the search for one by visualizing the inner workings of the eye and its diseases at the cellular level.

In an effort led by UW Medicine, researchers successfully modified the standard process of optical coherence tomography (OCT) to detect minute changes in response to light in individual photoreceptors in the living eye.

The results were published Sept. 9 in Science Advances.

"We have now accelerated the life cycle of vision restoration," said lead author Vimal Prabhu Pandiyan, a ophthalmology researcher at the University of Washington School of Medicine.

The study was funded in part by the National Eye Institute's Audacious Goals Initiative, which embraces bold ideas in helping people to see better.

The OCT modifications outlined in the study will help researchers who want to test therapies such as stem cells or gene therapy to treat retinal disease. They now have the tools to zoom in on the retina to evaluate whether the therapy is working.

Corresponding author Ramkumar Sabesan, a UW assistant research professor of ophthalmology, said that currently the only way to objectively measure the eye is to look at a wide retinal area: researchers can attach electrodes to the cornea, but that captures the combined response of a large area containing around 1 million cells. The new approach instead resolves changes on the scale of nanometers - one billionth of a meter, a small fraction of the size of a cell - an improvement of several orders of magnitude.

"Since photoreceptors are the primary cells affected in retinal generation and the target cells of many treatments, noninvasive visualization of their physiology at high resolution is invaluable," the researchers wrote.

Cone photoreceptors are the building blocks of sight, capturing light and funneling information to the other retinal neurons. They are a key ingredient in how we process images and patterns of light falling on the retina.

Optical coherence tomography has been around since the 1990s. In this study, researchers used OCT with adaptive optics, line-scanning and phase-resolved acquisition to apply Thomas Young's concept of interference to the human eye. With the ability to zoom in on the retina at high speeds, they found that cone photoreceptors deform at the scale of nanometers when they first capture light and begin the process of seeing.
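
The nanometer sensitivity comes from reading out phase rather than intensity. A standard phase-resolved OCT relation, shown below as an illustrative calculation (the wavelength and refractive-index values are assumptions, not the paper's exact parameters), converts a measured phase change into a displacement:

```latex
% Phase-to-displacement conversion in phase-resolved OCT (standard double-pass
% relation; the numbers below are illustrative assumptions).
\[
\Delta z = \frac{\lambda_0\,\Delta\varphi}{4\pi n}
\]
% Example: \lambda_0 = 840~\mathrm{nm}, \; n \approx 1.38, \; \Delta\varphi = 0.1~\mathrm{rad}
% gives \Delta z \approx 840~\mathrm{nm} \times 0.1 / (4\pi \times 1.38) \approx 5~\mathrm{nm}.
```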

As Sabesan explained: "You can imagine a picture that looks visually and structurally normal. But when we interrogate the inner working of the retina at a cellular scale, we may detect a dysfunction sooner than what other modalities can do. A doctor then can prescribe medication to intervene early or follow the time-course of its repair via gene therapy or stem cell therapy in the future."

"We will now have a way to see if these therapies are acting in the way they should," Sabesan said.

Credit: 
University of Washington School of Medicine/UW Medicine

Structure of 'immortality protein' now better understood

image: The solution structure of the HpEst3. (a) The topology of the secondary structure elements of the HpEst3 protein. (b) The stereo view of the ensemble of the final 20 calculated structures. Images were made using PyMOL v. 2.3 (Schrodinger, LLC).

Image: 
Kazan Federal University

A key role in studying the telomerase of Hansenula polymorpha was played by KFU's nuclear magnetic resonance spectrometer.

"The work frequency of our NMR spectrometer with a cryo sensor is 700 MHz. It can glance into the structure of the most complex biochemical objects and detect how they interact with cell membranes," shares Head of NMR Lab, Professor Vladimir Klochkov.

The telomerase was studied by the Department of Medical Physics (Kazan Federal University) and the Laboratory of Magnetic Spectroscopy and Tomography (Moscow State University).

"Telomerase, as many complex enzymes, is not just a protein molecule, but rather a combination of several subunits. A fragment of one of the subunits, the Est2 protein, was studied by us earlier, and now we studied the structure and functions of Est3," explains co-author, Senior Research Associate Sergey Efimov. "We found out that Est3 is important for the stabilization of the whole protein complex."

Spectroscopy helped understand the spatial structure of Est3 molecules and the interaction between them.

"When molecules interact, you can see a general picture through resonance frequencies and other spectral characteristics of magnetic nuclei in their structure - how proteins contact with each other and the DNA strand, which subunits are responsible for attaching the enzyme to the DNA, and which move the complex along the chain and restore telomeres," adds Efimov.

Analogs of the studied yeast proteins can be found in the telomerases of higher organisms, including humans. If scientists can create medications that target particular components of telomerase to suppress its activity in cancer cells, it may become possible to counter the progression of oncological diseases.

"As an organism grows, telomeres become shorter. The purpose is to limit the further division of cells which might have amassed errors in their DNA. Such a limitation does not exist in embryonic and stem cells thanks to the telomerase. This enzyme restores telomeres at the ends of chromosomes. A similar process happens during malignant changes in cells. Telomeres and systems of length control exist in all eukaryotes," concludes Dr. Efimov.

Credit: 
Kazan Federal University

Scientists map freshwater transport in the Arctic Ocean

image: Map of the study region. The colored lines denote the ship tracks of the oceanographic surveys whose data were used for the analysis of freshwater transport in the Arctic Ocean.

Image: 
Alexander Osadchiev et al./Scientific Reports

The Ob, Yenisei, and Lena rivers flow into the Kara and Laptev seas and account for about half of the total freshwater runoff to the Arctic Ocean. The transport and transformation of freshwater discharge in these seas have a large impact on ice formation, biological productivity, and many other processes in the Arctic. Researchers from Shirshov Institute of Oceanology and MIPT have investigated the spreading of large river plumes -- that is, freshened water masses formed as a result of river runoff mixing with ambient saltwater -- in the Russian Arctic seas. The findings were published in Scientific Reports.

The Ob, Yenisei, and Lena rivers provide a huge volume of freshwater discharge to the Kara and Laptev seas. The total annual runoff from these three rivers is estimated at 2,300 cubic kilometers. The majority of this volume is discharged into the sea during the ice-free season, forming the Ob-Yenisei plume and the Lena plume, which are the largest in the Arctic and among the largest in the world ocean.

"River plumes are freshened water masses that form near river mouths and spread at sea as a relatively thin surface layer. River plume dynamics are mostly determined by wind forcing and river discharge rate," explained Alexander Osadchiev, a co-author of the study and a senior researcher at Shirshov Institute of Oceanology.

Previous studies revealed that in the absence of strong wind, the Coriolis force and the density gradient between the plume and the ambient seawater cause alongshore spreading of river plumes. That process induces a large-scale eastward freshwater transport that is observed in the Arctic Ocean along large segments of the Eurasian and North American shores. This feature strongly affects ice conditions in the region.

The study described in this article revealed how the Ob-Yenisei plume spreads from the Kara Sea to the Laptev Sea through the Vilkitsky Strait, which is located between the Severnaya Zemlya archipelago and the Taymyr Peninsula. The paper also addresses the Lena plume and its spreading from the Laptev Sea into the East Siberian Sea through the Laptev and Sannikov straits.

The authors demonstrated that continental runoff from the Ob and Yenisei mostly accumulates in the Kara Sea during the ice-free season. Topographic barriers -- namely, the western coast of the Taymyr Peninsula and the Severnaya Zemlya archipelago -- generally hinder eastward spreading of the Ob-Yenisei plume to the Laptev Sea. This process occurs only as a result of very specific wind forcing conditions.

By contrast, the Lena plume is almost constantly spreading into the western part of the East Siberian Sea as a large-scale water mass, forming a narrow freshened coastal current in the eastern part of this sea. Known as the Siberian Coastal Current, it is intensified by freshwater runoff from the large Indigirka and Kolyma rivers and flows farther eastward to the Chukchi Sea.

"Freshwater from the rivers flowing into the Arctic Ocean very slowly mixes with seawater, therefore the large river plumes are very stable. As we revealed, freshwater can spread eastward across hundreds of kilometers, forced by local winds. The recent findings enable us to assess freshwater transport between the Kara, Laptev, and East Siberian seas during the ice-free season," added Associate Professor Sergey Shchuka, deputy chair of ocean thermohydromechanics at MIPT.

The new data are crucial for understanding ice formation, biological productivity, and many other processes in the Arctic affected by continental runoff.

Credit: 
Moscow Institute of Physics and Technology

How chemical diversity in plants facilitates plant-animal interactions

image: A male Passerini's tanager, Ramphocelus passerinii, eats the fruit of Piper sancti-felicis. Photo by Bernadette Wynter Rigley.

Image: 
Bernadette Wynter Rigley.

We aren't the only beings who enjoy feasting on tasty fruits like apples, berries, peaches, and oranges. Species like bats, monkeys, bears, birds, and even fish consume fruits -- and plants count on them to do so.

Wildlife disperse plants' seeds by eating the fruit and defecating the seeds elsewhere, carrying them farther away and spreading the next generation of that plant. But attracting wildlife might also mean attracting harmful organisms, like some species of fungi.

Plants walk a fine line between attraction and repulsion, and to do this, they evolved to become complex chemical factories. Chemical ecologists at the Whitehead Lab at Virginia Tech are working to uncover why plants have such diverse chemicals and to determine the functions of these chemicals in plant-microbe and plant-animal interactions.

"There is still so much we don't know about the chemical compounds plants use to mediate these complicated interactions. As we continue to lose global biodiversity, we are also losing chemical diversity and the chance for discovery," said Lauren Maynard, a Ph.D. candidate in the Department of Biological Sciences within the College of Science.

Piper sancti-felicis is a neotropical shrub related to Piper nigrum, which produces black peppercorn. Although P. sancti-felicis isn't as economically important as its peppery cousin, it fulfills an important ecological role as one of the first plants to colonize a recently disturbed area. It also serves as an important food source for wildlife, especially bats and birds.

At La Selva Biological Station in Costa Rica, Maynard and a team of international ecologists worked to better understand the evolutionary ecology of P. sancti-felicis. Their findings were recently published in Ecology and serve as a step forward in understanding why plants have such great chemical diversity.

By analyzing the samples, the team discovered 10 previously undocumented alkenylphenol compounds in P. sancti-felicis. Alkenylphenols are rare in the plant kingdom, as they have been reported only in four plant families.

The alkenylphenol compounds were not distributed evenly across the plant, though. Maynard found that fruit pulp had the highest concentrations and diversity of alkenylphenol compounds, while leaves and seeds had only a few compounds at detectable levels. Later, a pattern emerged: Levels of alkenylphenol were highest as flowers developed into unripe pulp, but then decreased as the pulp ripened.

When Maynard and her collaborators tested alkenylphenols with different species of fruit fungi, they found that the alkenylphenols had antifungal properties. But those same compounds also made the fruits less tasty to bats, which are the plant's main seed dispersers.

This is a delicate balance: high levels of alkenylphenols protected the fruit from harmful fungi as it developed, but when it ripened, alkenylphenol levels dwindled so that bats would be interested in eating it.

"Many fungal pathogens attack ripe fruits and can make fruits unattractive to dispersers, or worse, completely destroy the seeds. Our study suggests that these toxins represent a trade-off in fruits: They do deter some potential beneficial partners, but the benefits they provide in terms of protecting seeds outweigh those costs," said Susan Whitehead, an assistant professor in the Department of Biological Sciences.

This study is the first to document an ecological role of alkenylphenols. Chemical interactions in the plant kingdom are not easy to see, but they play a crucial role in balancing trade-offs in various interactions. In the case of P. sancti-felicis, alkenylphenols help the plant walk the fine line between appealing to seed dispersers and repelling harmful fungi.

"Finding the nonlinear pattern of alkenylphenol investment across fruit development was really exciting. It suggests that the main function of the compounds is defense," said Maynard, who is also an Interfaces of Global Change Fellow in the Global Change Center, housed in the Fralin Life Sciences Institute.

This discovery helps researchers understand the nuances of tropical forest ecology and how chemical diversity in plants helps maintain that delicate balance. Plant chemical defenses have mostly been studied in leaves of plants, so this new discovery furthers scientists' understanding of how and why these compounds are crucial in fruits. And because fruits are the vehicle for seed dispersal, these chemicals play a significant ecological role.

"This study advanced our understanding of how tropical forests work by bringing together scientists and expertise from multiple fields of study: plant ecology, animal behavior, chemistry, and microbiology," said Whitehead, who is also an affiliated faculty member of the Global Change Center and the Fralin Life Sciences Institute.

The Whitehead Lab has several ongoing projects focused on plant chemistry and seed dispersal at La Selva Biological Station. Since international travel is not possible at the moment, the team hopes to resume their research when it is safe to do so. 

Credit: 
Virginia Tech

In the line of fire

image: The Slink Fire burning east of Modesto, California, in September, 2020.

Image: 
U.S. Forest Service

People are starting almost all the wildfires that threaten U.S. homes, according to an innovative new analysis combining housing and wildfire data. Through activities like debris burning, equipment use and arson, humans were responsible for igniting 97% of home-threatening wildfires, a University of Colorado Boulder-led team reported this week in the journal Fire.

Moreover, one million homes sat within the boundaries of wildfires in the last 24 years, the team found. That's five times previous estimates, which did not consider the damage done and threatened by small fires. Nearly 59 million more homes in the wildland-urban interface lay within a kilometer of fires.

"We have vastly underestimated the wildfire risk to our homes," said lead author Nathan Mietkiewicz, who led the research as a postdoc in Earth Lab, part of CIRES at the University of Colorado Boulder. "We've been living with wildfire risk that we haven't fully understood."

To better understand wildfire trends in the United States, Mietkiewicz, now an analyst at the National Ecological Observatory Network, and his colleagues dug into 1.6 million government spatial records of wildfire ignition between 1992 and 2015; Earth Lab's own compilation of 120,000 incident reports; and 200 million housing records from a real estate database from Zillow.

Among their findings:

Humans caused 97% of all wildfires in the wildland-urban interface, 85% of all wildfires in "very-low-density housing" areas, and 59% of all wildfires in wildlands between 1992 and 2015.

Human-started wildfires are expensive, eating up about one-third of all firefighting costs.

Overall, about half of fire suppression costs were related to protecting houses in all locations: the wildland-urban interface, low-density housing areas, and elsewhere.

Most human-caused wildfires were relatively small.

The wildland-urban interface, or "WUI," represented only 10% of U.S. land in 2010, but was the site of 32% of all wildfire ignitions.

The WUI is also expanding, increasing our vulnerability: between 1992 and 2015, we built 32 million new homes in the WUI.
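
Counts like those above come from overlaying housing locations on fire perimeters. The sketch below is a minimal, illustrative version of that overlay using the shapely geometry library and a handful of made-up coordinates; the actual study combined 1.6 million fire records with roughly 200 million housing records and more elaborate spatial processing.

```python
from shapely.geometry import Point, Polygon

# Hypothetical fire perimeters and home locations on a projected grid (meters).
fire_perimeters = [
    Polygon([(0, 0), (4000, 0), (4000, 3000), (0, 3000)]),
    Polygon([(9000, 9000), (12000, 9000), (12000, 11000), (9000, 11000)]),
]
homes = [Point(1000, 1000), Point(4500, 500), Point(8000, 8000), Point(20000, 20000)]

# Homes inside a fire perimeter, plus additional homes within 1 km of one.
inside = [h for h in homes if any(h.within(f) for f in fire_perimeters)]
near = [h for h in homes
        if h not in inside and any(h.distance(f) <= 1000 for f in fire_perimeters)]

print(len(inside), "home(s) inside a fire perimeter")   # 1
print(len(near), "additional home(s) within 1 km")      # 1
```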

"Our fire problem is not going away anytime soon," said co-author Jennifer Balch, director of Earth Lab, a CIRES Fellow, and associate professor of geography. It's not just that we're building more homes in the line of fire, she said, but climate change is creating warmer, drier conditions that make communities more vulnerable to wildfire.

The new study, she said, does provide guidance for policy makers. "This provides greater justification that prescribed burns, where safe, can mitigate the risk and threat of future wildfires," Balch said. And we need to construct more fireproof homes in these beautiful, but flammable landscapes, she added. "We essentially need to build better and burn better."

"Smokey Bear needs to move to the suburbs," Mietkiewicz concluded. "If we can reduce the number of human-caused ignitions, we will also reduce the amount of homes threatened by wildfires."

Credit: 
University of Colorado at Boulder

Inexpensive, non-toxic nanofluid could be a game-changer for oil recovery

image: Zhifeng Ren, director of the Texas Center for Superconductivity at UH and corresponding author of a paper describing a nanofluid that allowed for recovery in lab tests of 80% of extra-heavy oil with a viscosity of more than 400,000 centipoise at room temperature.

Image: 
University of Houston

Researchers from the University of Houston have demonstrated that an inexpensive and non-toxic nanofluid can be used to efficiently recover even heavy oil with high viscosity from reservoirs.

The nanofluid, made in a common household blender using commercially available sodium, allowed for recovery in lab tests of 80% of extra-heavy oil with a viscosity of more than 400,000 centipoise at room temperature. Zhifeng Ren, director of the Texas Center for Superconductivity at UH and corresponding author for a paper describing the work, said recovery in the field is expected to be less than the 80% shown in the lab; how much less will depend on oilfield conditions.

The work, published in Materials Today Physics, suggests a breakthrough in the use of nanotechnology to provide cost-effective and environmentally sustainable ways to produce oil.

The researchers note that so-called heavy oil - the result of the molecular structure of the oil - makes up 70% of global oil reserves, suggesting it will be needed to meet increasing energy demands until clean energy sources are fully developed. Current extraction technologies that involve the use of steam are expensive and environmentally damaging.

Ren, who is also M.D. Anderson Chair Professor of Physics at UH, said the nanofluid works to recover oil from the reservoir through at least three mechanisms:

A chemical reaction produced when the sodium nanoparticles come in contact with water in the reservoir generates heat, working much like steam flooding and other heat-based techniques to push oil from the reservoir, without the need for an external - and greenhouse gas-producing - source of heat.

The nanofluid also sparks a reaction producing sodium hydroxide, a chemical commonly used for alkaline flooding in oil fields. Sodium hydroxide can foment motion in the oil and spark a reaction that reduces viscosity.

A third reaction produces hydrogen gas, which can be used for gas flooding, another common oil recovery technique.
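
All three mechanisms trace back to the same well-known exothermic reaction of sodium with water (textbook chemistry, consistent with but not quoted from the paper):

```latex
% Sodium-water reaction underlying the three mechanisms described above:
% the released heat drives the thermal, steam-flood-like effect, the NaOH
% provides the alkaline-flooding effect, and the H2 enables gas flooding.
\[
2\,\mathrm{Na} + 2\,\mathrm{H_2O} \;\longrightarrow\; 2\,\mathrm{NaOH} + \mathrm{H_2}\uparrow,
\qquad \Delta H < 0
\]
```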

The sodium nanomaterials dissipate after the reaction, eliminating concerns about environmental damage. Optimal concentrations will vary based on individual reservoir conditions, Ren said, noting that increasing the concentration of the nanomaterial didn't necessarily lead to higher oil recovery.

He worked with co-author Dan Luo, a postdoctoral researcher at the Texas Center for Superconductivity. "Based on these advantages, we anticipate that the sodium nanofluid could become a game-changing technology for recovery of oil of any viscosity, as well as a milestone in using nanotechnology to solve oil-recovery problems in the petroleum industry," they wrote.

Sodium is highly reactive with water, suggesting that it could be useful in enhancing oil recovery, but that also complicated the preparation - exposing it to water too soon meant it wouldn't deliver the designed benefits. The researchers addressed that by preparing the sodium nanoparticles in a silicone oil, allowing the sodium to disperse throughout the reservoir before it came into contact with water in the reservoir, triggering smaller chemical reactions across a larger area. It is also possible to disperse the sodium nanoparticles in other solvents, including pentane and kerosene, or even to mix them with polymers or surfactants to achieve a higher oil recovery rate.

Sodium is also a light element, allowing the researchers to create sodium nanoparticles within the silicone oil, using a kitchen blender.

While the paper focuses on using the nanofluid to enhance recovery of heavy oil, Ren said it also could be used in the production of light oil, as well as for more general household uses, such as clearing a grease-clogged pipe.

Credit: 
University of Houston

How coronavirus took hold in North America and in Europe

image: This schematic map shows early and apparently 'dead-end' introductions of SARS-CoV-2 to Europe and the US (dashed arrows). Successful dispersals between late January and mid-February are shown with solid arrows: from Hubei Province, China to Northern Italy, from China to Washington State, and later from Europe (as the Italian outbreak spread more widely) to New York City and from China to California.

Image: 
Andrew Rambaut/University of Edinburgh and Jeffrey Joy/University of British Columbia

A new study combines evolutionary genomics from coronavirus samples with computer-simulated epidemics and detailed travel records to reconstruct the spread of coronavirus across the world in unprecedented detail.

Published in the journal Science, the results suggest an extended period of missed opportunity when intensive testing and contact tracing might have prevented SARS-CoV-2 from becoming established in North America and Europe.

The paper also challenges suggestions that linked the earliest known cases of COVID-19 on each continent in January to outbreaks detected weeks later, and provides valuable insights that could inform public health response and help with anticipating and preventing future outbreaks of COVID-19 and other zoonotic diseases.

"Our aspiration was to develop and apply powerful new technology to conduct a definitive analysis of how the pandemic unfolded in space and time, across the globe," said University of Arizona researcher Michael Worobey, who led an interdisciplinary team of scientists from 13 research institutions in the U.S., Belgium, Canada and the U.K. "Before, there were lots of possibilities floating around in a mish-mash of science, social media and an unprecedented number of preprint publications still awaiting peer review."

The team based their analysis on results from viral genome sequencing efforts, which began immediately after the virus was identified. These efforts quickly grew into a worldwide effort unprecedented in scale and pace and have yielded tens of thousands of genome sequences, publicly available in databases.

Contrary to widespread narratives, the first documented arrivals of infected individuals traveling from China to the U.S. and Europe did not snowball into continental outbreaks, the researchers found.

Instead, swift and decisive measures aimed at tracing and containing those initial incursions of the virus were successful and should serve as model responses directing future actions and policies by governments and public health agencies, the study's authors conclude.

How the Virus Arrived in the U.S. and Europe

A Chinese national flying into Seattle from Wuhan, China on Jan. 15 became the first patient in the U.S. shown to be infected with the novel coronavirus and the first to have a SARS-CoV-2 genome sequenced. This patient was designated 'WA1.' It was not until six weeks later that several additional cases were detected in Washington state.

"And while all that time goes past, everyone is in the dark and wondering, 'What's happening?'" Worobey said. "We hope we're OK, we hope there are no other cases, and then it becomes clear, from a remarkable community viral sampling program in Seattle, that there are more cases in Washington and they are genetically very similar to WA1's virus."

Worobey and his collaborators tested the prevailing hypothesis suggesting that patient WA1 had established a transmission cluster that went undetected for six weeks. Although the genomes sampled in February and March share similarities with WA1, they are different enough that the idea of WA1 establishing the ensuing outbreak is very unlikely, they determined. The researchers' findings indicate that the jump from China to the U.S. likely occurred on or around Feb. 1 instead.

The results also put to rest speculation that this outbreak, the earliest substantial transmission cluster in the U.S., may have been initiated indirectly by dispersal of the virus from China to British Columbia, Canada, just north of Washington State, and then spread from Canada to the U.S. Multiple SARS-CoV-2 genomes published by the British Columbia Center for Disease Control appeared to be ancestral to the viral variants sampled in Washington State, strongly suggesting a Canadian origin of the U.S. epidemic. However, the present study revealed sequencing errors in those genomes, thus ruling out this scenario.

Instead, the new study implicates a direct-from-China source of the U.S. outbreak, right around the time the U.S. administration implemented a travel ban for travelers from China in early February. The nationality of the "index case" of the U.S. outbreak cannot be known for certain because tens of thousands of U.S. citizens and visa holders traveled from China to the U.S. even after the ban took effect.

A similar scenario marks the first known introduction of coronavirus into Europe. On Jan. 20, an employee of an automotive supply company in Bavaria, Germany, flew in for a business meeting from Shanghai, China, unknowingly carrying the virus, ultimately leading to infection of 16 co-workers. In that case, too, an impressive response of rapid testing and isolation prevented the outbreak from spreading any further, the study concludes. Contrary to speculation, this German outbreak was not the source of the outbreak in Northern Italy that eventually spread widely across Europe and eventually to New York City and the rest of the U.S.

The authors also show that this China-to-Italy-US dispersal route ignited transmission clusters on the East Coast slightly later in February than the China-to-US movement of the virus that established the Washington State outbreak. The Washington transmission cluster also predated small clusters of community transmission in February in California, making it the earliest anywhere in North America.

Early Containment Works

The authors say intensive interventions, involving testing, contact tracing, isolation measures and a high degree of compliance of infected individuals, who reported their symptoms to health authorities and self-isolated in a timely manner, helped Germany and the Seattle area contain those outbreaks in January.

"We believe that those measures resulted in a situation where the first sparks could successfully be stamped out, preventing further spread into the community," Worobey said. "What this tells us is that the measures taken in those cases are highly effective and should serve as a blueprint for future responses to emerging diseases that have the potential to escalate into worldwide pandemics."

To reconstruct the pandemic's unfolding, the scientists ran computer programs that carefully simulated the epidemiology and evolution of the virus, in other words, how SARS-CoV-2 spread and mutated over time.

"This allowed us to re-run the tape of how the epidemic unfolded, over and over again, and then check the scenarios that emerge in the simulations against the patterns we see in reality," Worobey said.

"In the Washington case, we can ask, 'What if that patient WA1 who arrived in the U.S. on Jan. 15 really did start that outbreak?' Well, if he did, and you re-run that epidemic over and over and over, and then sample infected patients from that epidemic and evolve the virus in that way, do you get a pattern that looks like what we see in reality? And the answer was no," he said.

"If you seed that early Italian outbreak with the one in Germany, do you see the pattern that you get in the evolutionary data? And the answer, again, is no," he said.

"By re-running the introduction of SARS-CoV-2 into the U.S. and Europe through simulations, we showed that it was very unlikely that the first documented viral introductions into these locales led to productive transmission clusters," said co-author Joel Wertheim of the University of California, San Diego. "Molecular epidemiological analyses are incredibly powerful for revealing transmissions patterns of SARS-CoV-2."

Other methods were then combined with the data from the virtual epidemics, yielding exceptionally detailed and quantitative results.

"Fundamental to this work stands our new tool combining detailed travel history information and phylogenetics, which produces a sort of 'family tree' of how the different genomes of virus sampled from infected individuals are related to each other," said co-author Marc Suchard of the University of California, Los Angeles. "The more accurate evolutionary reconstructions from these tools provide a critical step to understand how SARS-CoV-2 spread globally in such a short time."

"We have to keep in mind that we have studied only short-term evolution of this virus, so it hasn't had much time to accumulate many mutations," said co-author Philippe Lemey of the University of Leuven, Belgium. "Add to that the uneven sampling of genomes from different parts of the world, and it becomes clear that there are huge benefits to be gained from integrating various sources of information, combining genomic reconstructions with complementary approaches like flight records and the total number of COVID-19 cases in various global regions in January and February."

"Our research shows that when you do early intervention and detection well, it can have a massive impact, both on preventing pandemics and controlling them once they progress," Worobey said. "While the epidemic eventually slipped through, there were early victories that show us the way forward: Comprehensive testing and case identification are powerful weapons."

Credit: 
University of Arizona

Genome analyses track SARS-CoV-2's early introduction to the US and Europe

SARS-CoV-2 arrived in Washington State somewhere between late January and early February 2020, sparking rapid community transmission of the virus that went undetected for several weeks before this community spread became evident, prompting a change in testing criteria to emphasize individuals with no travel history. That's the scenario proposed by Trevor Bedford and colleagues after their analysis of the genetic sequences of 455 SARS-CoV-2 viruses from the Washington State outbreak collected between January 19 and March 15, 2020. Their results highlight the critical need for widespread surveillance for community transmission of SARS-CoV-2, even after the pandemic is brought under control, say the authors. They note that several factors could have contributed to the delayed detection of presumptive community spread in Washington, including limited testing among non-travelers.

Their analysis of 455 SARS-CoV-2 viruses from Washington State reveals that 84% of the genomes studied fall into a closely related group that the researchers call the Washington State outbreak clade, and they are derived from a variant of the virus from China. This pattern suggests that most of the early SARS-CoV-2 cases in the state came from a single introduction of the virus, probably between January 22 and February 10, Bedford et al. conclude. The first confirmed case of SARS-CoV-2 in the United States, identified in Washington State on January 19 in an individual returning from Wuhan, belongs to the Washington State outbreak clade, but the researchers say the genomic information is too incomplete to know whether this case was the single introduction that led to community spread, or if the introduction might have come from very closely related virus variants sampled in British Columbia.

The first case of community spread was detected on February 28. To better understand the transmission chain that led to it, Bedford and colleagues analyzed more than 10,000 specimens collected as part of the Seattle Flu Study between January 1 and March 15, 2020. They find evidence for SARS-CoV-2 a few days before the first previously reported community case in Washington. Refining the time and geographic origin of the introduction into Washington State will require a combination of earlier samples and samples from other geographic locations, the authors say. They note that other states in the U.S. have shown different genetic histories from Washington State, with a majority of SARS-CoV-2 sequences from New York and Connecticut clustering within European lineages, for example.

Michael Worobey and colleagues analyzed collections of SARS-CoV-2 genomes from around the world to decipher their viral family trees and to determine whether introductions of the virus in early January 2020 in Washington State and in Germany led to major outbreaks in the U.S. and Europe. In the U.S., their reconstruction of events suggests that the first confirmed U.S. case in Washington State in early January prepared the local and state response so that state officials were relatively successful initially in slowing the virus' spread, compared to places like New York City. However, an influx of returning travelers in late January or early February, who were only loosely monitored by public health officials, may have led to multiple introductions of the virus that sparked community spread in Washington State and California, the researchers say.

Worobey et al. also took a closer look at the first confirmed SARS-CoV-2 case in Europe, and whether this late January case in Bavaria, Germany, might have sparked Italy's major outbreak in Lombardy in February. They conclude that the Bavarian virus variant is unlikely to be the cause of the northern Italy outbreak. While genomic data have suggested differences in the timing, spatial origins and transmission dynamics of early SARS-CoV-2 outbreaks, particularly in the U.S., Worobey and colleagues say their findings emphasize that epidemiological linkages inferred from genetically similar SARS-CoV-2 associated with outbreaks in different locations can be "highly tenuous," given low levels of sampled viral genetic diversity and insufficient background data from key locations. They say their findings highlight the potential value of establishing intensive, community-level respiratory virus surveillance architectures, such as the Seattle Flu Study, during a pre-pandemic period.

Credit: 
American Association for the Advancement of Science (AAAS)

NASA finds Tropical Storm Rene less affected by wind shear

image: On Sept. 10, 2020, NASA's Terra satellite provided a visible image of Tropical Storm Rene, and the eastern side of Tropical Storm Paulette (top left) moving through the Atlantic Ocean.

Image: 
Image Courtesy: NASA Worldview, Earth Observing System Data and Information System (EOSDIS).

NASA's Terra satellite obtained visible imagery of Tropical Storm Rene as it continued moving north through the central North Atlantic Ocean. Rene appeared more organized on satellite imagery as wind shear eased.

NASA Satellite View: Rene's Organization

The Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite captured a visible image of Tropical Storm Rene on Sept. 10. Rene appeared slightly more circular. That is because vertical wind shear (outside winds blowing at different levels of the atmosphere) appears to have lessened somewhat over Rene allowing the storm to organize. The image showed Tropical Storm Paulette was located to Rene's northwest.

Satellite imagery was created using NASA's Worldview product at NASA's Goddard Space Flight Center in Greenbelt, Md.

Rene on Sept. 10

At 11 a.m. EDT (1500 UTC) on Sept. 10, the center of Tropical Storm Rene was located near latitude 18.6 degrees north and longitude 35.8 degrees west. Rene was about 800 miles (1,285 km) west-northwest of the Cabo Verde Islands.

Rene is moving toward the west-northwest near 12 mph (19 kph). This general motion is expected to continue for the next couple of days, followed by a turn toward the northwest. Maximum sustained winds have increased to near 50 mph (85 kph) with higher gusts. The estimated minimum central pressure is 1000 millibars.

NOAA's National Hurricane Center (NHC) noted that additional strengthening is forecast for the next couple of days as vertical wind shear is not expected to be strong. Therefore, Rene is expected to become a hurricane by Saturday, Sept. 12. The storm is not expected to affect any land areas in the next five days, according to the NHC forecast.

Credit: 
NASA/Goddard Space Flight Center

High-precision electrochemistry: The new gold standard in fuel cell catalyst development

image: Atomic force microscopy images showing varied coverage of a gold layer (the lighter shade) over the edges of a platinum surface. The gold layer mitigates platinum dissolution during fuel cell operations.

Image: 
Argonne National Laboratory

Vehicles powered by polymer electrolyte membrane fuel cells (PEMFCs) are energy-efficient and eco-friendly, but despite increasing public interest in PEMFC-powered transportation, current performance of materials that are used in fuel cells limits their widespread commercialization.

Scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory led a team to investigate reactions in PEMFCs, and their discoveries informed the creation of technology that could bring fuel cells one step closer to realizing their full market potential.

“We performed these studies — from single crystals, to thin films, to nanoparticles — which showed us how to synthesize platinum catalysts to increase durability.” — Pietro Papa Lopes, scientist in Argonne’s Materials Science division

PEMFCs rely on hydrogen as a fuel, which is oxidized on the cell’s anode side through a hydrogen oxidation reaction, while oxygen from the air is used for an oxygen reduction reaction (ORR) at the cathode. Through these processes, fuel cells produce electricity to power electric motors in vehicles and other applications, emitting water as the only by-product.
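
For reference, the two electrode reactions described above, together with the overall cell reaction, are the textbook PEMFC half-reactions in acidic-membrane form (standard electrochemistry, not specific to this paper):

```latex
% Textbook PEMFC electrode reactions (acidic membrane), as summarized above.
\begin{align*}
\text{Anode (hydrogen oxidation):} \quad & \mathrm{H_2} \longrightarrow 2\,\mathrm{H^+} + 2\,e^- \\
\text{Cathode (oxygen reduction, ORR):} \quad & \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^- \longrightarrow 2\,\mathrm{H_2O} \\
\text{Overall:} \quad & 2\,\mathrm{H_2} + \mathrm{O_2} \longrightarrow 2\,\mathrm{H_2O}
\end{align*}
```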

Platinum-based, nano-sized particles are the most effective materials for promoting reactions in fuel cells, including the ORR in the cathode. However, in addition to their high cost, platinum nanoparticles suffer from gradual degradation, especially in the cathode, which limits catalytic performance and reduces the lifetime of the fuel cell.

The research team, which included DOE’s Oak Ridge National Laboratory and several university partners, used a novel approach to examine dissolution processes of platinum at the atomic and molecular level. The investigation enabled them to identify the degradation mechanism during the cathodic ORR, and the insights guided the design of a nanocatalyst that uses gold to eliminate platinum dissolution.

“The dissolution of platinum occurs at the atomic and molecular scale during exposure to the highly corrosive environment in fuel cells,” said Vojislav Stamenkovic, a senior scientist and group leader for the Energy Conversion and Storage group in Argonne’s Materials Science Division (MSD). “This material degradation affects the fuel cell’s long-term operations, presenting an obstacle for fuel cell implementation in transportation, specifically in heavy duty applications such as long-haul trucks.”

Starting small

The scientists used a range of customized characterization tools to investigate the dissolution of well-defined platinum structures in single-crystal surfaces, thin films and nanoparticles.

“We have developed capabilities to observe processes at the atomic scale to understand the mechanisms responsible for dissolution and to identify the conditions under which it occurs,” said Pietro Papa Lopes, a scientist in Argonne’s MSD and first author on the study. “Then we implemented this knowledge into material design to mitigate dissolution and increase durability.”

The team studied the nature of dissolution at the fundamental level using surface-specific tools, electrochemical methods, inductively coupled plasma mass spectrometry, computational modeling and atomic force, scanning tunneling and high-resolution transmission microscopies.

In addition, the scientists relied on a high-precision synthesis approach to create structures with well-defined physical and chemical properties, ensuring that the relationships between structure and stability discovered from studying 2D surfaces were carried over to the 3D nanoparticles they produced.

“We performed these studies — from single crystals, to thin films, to nanoparticles — which showed us how to synthesize platinum catalysts to increase durability,” said Lopes, “and by looking at these different materials, we also identified strategies for using gold to protect the platinum.”

Going for gold

As the scientists uncovered the fundamental nature of dissolution by observing its occurrence in several testbed scenarios, the team used the knowledge to mitigate dissolution with the addition of gold.

The researchers used transmission electron microscopy capabilities at Argonne’s Center for Nanoscale Materials and at the Center for Nanophase Materials Sciences at Oak Ridge National Laboratory — both DOE Office of Science User Facilities — to image platinum nanoparticles after synthesis and before and after operation. This technique allowed the scientists to compare the stability of the nanoparticles with and without incorporated gold.

The team found that controlled placement of gold in the core promotes the arrangement of platinum in an optimal surface structure that grants high stability. In addition, gold was selectively deposited on the surface to protect specific sites that the team identified as particularly vulnerable for dissolution. This strategy eliminates dissolution of platinum from even the smallest nanoparticles used in this study by keeping platinum atoms attached to the sites where they can still effectively catalyze the ORR.

Atomic-level understanding

Understanding the mechanisms behind dissolution at the atomic level is essential to uncovering the correlation between platinum loss and the surface structure, size and ratio of platinum nanoparticles, and to determining how these relationships affect long-term operation.

“The novel part of this research is resolving the mechanisms and fully mitigating platinum dissolution by material design at different scales, from single crystals and thin films to nanoparticles,” said Stamenkovic. “It’s the insights we gained in conjunction with the design and synthesis of a nanomaterial that addresses durability issues in fuel cells, as well as the ability to delineate and quantify dissolution of platinum catalyst from other processes that contribute to fuel cell performance decay.”

The team is also developing a predictive aging algorithm to assess the long-term durability of the platinum-based nanoparticles and has found a 30-fold improvement in durability compared to nanoparticles without gold.

Credit: 
DOE/Argonne National Laboratory

Brazilian researcher creates an ultra-simple inexpensive method to fabricate optical fiber

A novel process for fabricating special optical fiber that is far simpler, faster and cheaper than the conventional method has been developed by Cristiano Cordeiro, a researcher and professor at the University of Campinas's Physics Institute (IFGW-Unicamp) in the state of São Paulo, Brazil. Cordeiro created the innovation during a research internship at the University of Adelaide in Australia, supported by a scholarship from the São Paulo Research Foundation (FAPESP) and by a partnership with his host, Heike Ebendorff-Heidepriem. An article by the two researchers and a third collaborator is published in Scientific Reports.

"The conventional process requires very large and expensive machinery and takes almost a week. Our process can be completed with bench-mounted equipment that's at least 100 times cheaper and takes less than an hour from feedstock to end-product. It will enable many more researchers and labs to produce their own optical fiber," Cordeiro told.

The procedure roughly resembles the extrusion method used to produce pasta: pressure is brought to bear on a ductile material so as to force it through a die, producing fiber with the appropriate inner structure. "Of course, this is all done with much more rigor and precision," Cordeiro said.

Hundreds of millions of kilometers of optical fiber are installed worldwide, and the amount of data they transport doubles approximately every two years. They are used not only in telecommunications but also for remote sensing to monitor temperature, mechanical stress, hydrostatic pressure, or fluid flow, among many other parameters.

Thanks to their strength and thinness they are effective in hostile environments and barely accessible locations.

These features help explain the importance of innovative fabrication processes. "The conventional process has several stages and requires highly complex equipment, such as a fiber drawing tower," Cordeiro said. "First a preform is produced, a giant version of the fiber with a diameter of between 2 cm and 10 cm. This structure is heated and drawn in a highly controlled manner by the tower. Mass is conserved and diameter decreases as length increases. Our method simplifies the process at an enormously reduced cost. The device we designed carries out a single continuous process starting with polymer pellets and ending with the finished fiber."
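
The "mass is conserved" remark corresponds to the standard draw-down relation: at steady state the volume flux through the tower is constant, so the fiber diameter scales with the square root of the feed-to-draw speed ratio. Below is a minimal Python sketch with illustrative speeds, not values from the article:

```python
import math

# Draw-down relation for conventional fiber drawing: constant volume flux gives
#   d_fiber = d_preform * sqrt(feed_speed / draw_speed).
# The preform diameter and speeds below are illustrative assumptions.

def drawn_diameter(preform_diameter_mm, feed_speed_mm_per_min, draw_speed_m_per_min):
    draw_speed_mm_per_min = draw_speed_m_per_min * 1000.0
    return preform_diameter_mm * math.sqrt(feed_speed_mm_per_min / draw_speed_mm_per_min)

# A 20 mm preform fed at 5 mm/min and drawn at 10 m/min:
d = drawn_diameter(20.0, 5.0, 10.0)
print(f"fiber diameter ≈ {d * 1000:.0f} µm")  # ≈ 447 µm
```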

The procedure can be used to fabricate not only all-solid fiber, in which light is transmitted via a core with a higher refractive index, but also microstructured fiber containing an array of longitudinal holes, which enhances control over optical properties and adds functionality - including the possibility of guiding light with low energy loss through an air channel. To create the microstructures, the researchers used titanium dies with suitable designs.

"To simplify the fabrication of special optical fiber, we deployed equipment and techniques that are becoming more affordable and accessible thanks to the popularization of 3D printing," Cordeiro said. "The only machine required is a compact horizontal extruder similar to the device used to produce filament for 3D printers. It's about the size of a microwave oven and is far less costly than a draw tower. The titanium die with solid parts and holes is coupled to the extruder exit."

Owing to the fiber's intricate inner structure, the researchers produced the dies by additive manufacturing using appropriate 3D printers. Specialist firms can provide additive manufacturing services, so the only item of equipment needed to produce the fiber is the horizontal extruder.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

The web of death

image: The physical structures of cancer cells are disrupted by a web forming inside of the cells - which activates their self destruction mechanism.

Image: 
MPI-P

Cancer is a disease in which cells multiply uncontrollably, which can lead to tumor growth. In addition to radiation therapy, cancer is often combated with chemotherapy: the chemicals administered affect various biochemical processes of the body, especially of the cancer cells, ensuring that a tumor can no longer grow and slowly dies.

Chemotherapy, however, is stressful for the body and it can become ineffective with time: In addition to side effects, the cancer can sometimes adapt to the chemicals, resist their effects and create new ways of growing further. "We have now tried to take a different approach and not to influence the cancer by interfering with the biochemical processes, but to attack its structure directly," says Dr. David Ng, group leader in Prof. Tanja Weil's department at the Max Planck Institute of Polymer Research.

The scientists have synthetically produced a type of molecular Lego brick for this purpose, and these bricks travel into both normal and cancer cells via a special attachment. The Lego brick alone is harmless; however, the unique conditions present in cancer cells set a series of chemical reactions in motion. "In cancer tissue, the environment is much more acidic than in normal tissue," says Ng. "In addition, far more highly reactive oxidative molecules are found within the cancer cells due to the cancer's increased metabolic activity - and we take advantage of that."

If both conditions are met, the individual Lego bricks can connect - and thus form a large web-like network. This web, which grows inside the cancer cells, is extremely stable and deforms the cancer cells from the inside out. Unable to cope with the physical stress, the cancer cell activates its own self-destruction mechanism. "We thus attack the cancer cell in a way it cannot defend itself against," says Ng.
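
Conceptually, the trigger works like an AND gate: the bricks link up only where both the acidic pH and the elevated level of reactive oxygen species are present. A toy Python sketch of that logic (the threshold values are illustrative assumptions, not measurements from the study):

```python
# Toy AND-gate sketch of the self-assembly trigger described above.
# Thresholds are illustrative assumptions, not measured values from the study.
ACIDIC_PH_THRESHOLD = 6.8    # assumed: more acidic than normal tissue
ROS_THRESHOLD = 5.0          # assumed: arbitrary units of reactive oxygen species

def bricks_assemble(ph: float, ros_level: float) -> bool:
    """The molecular building blocks crosslink into a web only if BOTH conditions hold."""
    return ph < ACIDIC_PH_THRESHOLD and ros_level > ROS_THRESHOLD

print(bricks_assemble(ph=7.4, ros_level=1.0))   # normal cell -> False (bricks stay harmless)
print(bricks_assemble(ph=6.5, ros_level=8.0))   # cancer cell -> True  (web forms, cell deforms)
```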

The researchers have so far investigated the method on cancer cells in laboratory culture and were able to show that the cells die within a very short time, approximately four hours. In the future, their method could possibly offer an alternative approach to cancer treatment, and further studies are ongoing.

Looking ahead, Ng, Weil and colleagues will continue to work on increasing the precision of the deformation and on the biodegradation of the web after the cancer cells have died.

Credit: 
Max Planck Institute for Polymer Research