Tech

There's no place like home: Cleaning toxic tobacco smoke residue in our homes

San Diego, CA - The COVID-19 pandemic has created a number of challenges, including that many are spending more time at home than ever before. This is a significant problem for those with neighbors who smoke. Smoking continues to be a problem in multi-unit housing, and while stay-at-home orders have helped to reduce transmission of COVID-19, they have also increased exposure to secondhand smoke from neighbors.

But the problem is more than just secondhand smoke: long after secondhand smoke has cleared, the harmful chemicals in tobacco smoke and e-cigarette vapors remain as thirdhand smoke. The chemicals in thirdhand smoke stick to dust and household surfaces, and can build up over time, creating significant reservoirs of thirdhand smoke.

Like other household hazards such as lead and allergy-causing dust mites, thirdhand smoke can be incredibly difficult and expensive to completely remove. It can be particularly dangerous for children as they are closer to surfaces where dust gathers such as the floor, they are more likely to put objects or hands in their mouths, and they have weaker immune systems than adults. However, while thirdhand smoke may be difficult to eliminate completely, it is possible to reduce the risk.

Potential thirdhand smoke cleaning methods were examined by a group of researchers led by Georg Matt of San Diego State University. The study, "Remediating Thirdhand Smoke Pollution in Multiunit Housing: Temporary Reductions and the Challenges of Persistent Reservoirs," will be published Sept. 15 in the journal Nicotine and Tobacco Research.

The researchers looked specifically at nicotine on surfaces and in dust -- a clear indicator of tobacco smoke residue and a chemical listed by the State of California as hazardous. The researchers tested homes of strict non-smokers in multi-unit apartment buildings to confirm that the levels detected were not caused by the residents or their guests, but by secondhand smoke from neighbors or previous residents who smoked.

Matt described the study, explaining "We wanted to see if there were any solutions available for people who live in homes that are polluted with thirdhand smoke. We explored three options for cleaning, and tested apartments for nicotine contamination before, after, and three months after each cleaning."

Households were split into three groups: the first group received dry/damp cleaning (thorough vacuuming and dusting), followed by wet cleaning (professional carpet/furniture steaming and cleaning of all household linens) a month later. The second group received the opposite, and the third group received both cleaning types on the same day.

Nicotine contamination was immediately reduced in all three groups following cleaning. Regardless of cleaning method, nicotine contamination in all homes increased again during the three months following cleaning, showing that the positive effects of cleaning, while significant in some cases, are temporary.

Matt continued, "We would like to be able to tell residents that there is a simple way to remove this contamination permanently, but that is not what we found. What we can say, as a result of this study, is that there are two important steps you can take to reduce thirdhand smoke contamination and make your home safer."

First, keep household dust as low as possible. Frequent vacuuming of all soft furnishings and floors, and dusting/mopping of all hard surfaces, is the easiest, best way to reduce thirdhand smoke exposure through household dust. Second, keep surfaces that you touch often as clean as possible. Frequently wiping table tops, doors, cabinets, and chairs, and washing pillow cases, blankets, and drapery will help keep our families safe from exposure to thirdhand smoke while we are all staying at home more than ever before.

Credit: 
San Diego State University

Ultrahigh energy density transition-metal-free cathodes designed by band structure engineering

image: Schematic of the open-circuit voltage (Voc) of a battery. The energy separation of the lowest-unoccupied-molecular-orbital (LUMO) and the highest-occupied-molecular-orbital (HOMO) is the electrolyte window. Electrochemical potential vs. capacity is presented for both the graphite anode and the cathodes. The cathodes are commonly transition-metal (TM) compounds with layered, spinel, or olivine crystal structures.

Image: 
©Science China Press

The growing demand for ultrahigh energy density batteries in electronic devices, electric vehicles, and large-scale energy storage has inspired a wide search for novel electrode materials, especially high-capacity cathode materials for Li-ion batteries (LIBs). The traditional design paradigm for transition-metal oxide (TMO) based cathodes, e.g., LiCoO2, LiMn2O4, and LiFePO4, is to use the TM as the sole source of electrochemical activity, so their theoretical capacity is limited by the number of electrons the TM can exchange per unit mass. Moreover, the use of transition metals raises concerns about raw-material cost and environmental contamination. Although Li-rich cathodes deliver somewhat higher capacity by activating redox reactions at oxygen sites, they inevitably suffer from poor structural stability and potential safety issues.

Replacing TMO cathodes with a high-capacity, lightweight carbonaceous analog could deliver major advantages in cathode design. However, the low electrochemical potential of carbonaceous materials (below about 3.0 V) has so far stood in the way of using them as cathodes.

Recently, Prof. Chuying Ouyang from Jiangxi Normal University and Prof. Siqi Shi from Shanghai University proposed an effective strategy to achieve a breakthrough in the design of carbonaceous materials as cathodes for rechargeable LIBs/SIBs. Using a p-type doping strategy, they showed that the new transition-metal-free Li(Na)BCF2/Li(Na)B2C2F2 cathodes have a high potential of 2.7-3.7 V that would lead to a record-breaking energy density (> 1000 Wh kg-1). Meanwhile, these materials also show great potential for Na-ion storage.

In this work, the researchers uncover the tuning effect of the p-type doping strategy on the Li+ intercalation potentials of carbonaceous electrodes. They emphasize that the high Fermi level (-4.31 eV) of graphite results in a low Li+ intercalation potential of ~0.2 V, while for fluorinated graphite (CF) the hybridization of carbon atoms changes from sp2 to sp3, which raises the potential to 2.29 V. However, the electronic saturation of CF at the Fermi level causes irreversible structural change during lithiation. To address this issue, the authors proposed substituting some carbon atoms with boron atoms to create unoccupied states close to the valence bands (p-type doping), which efficiently shifts the Fermi level of BCF2 down to -8.36 eV and consequently increases the Li+ intercalation potential to 3.49 V, comparable to those of common TMO cathodes for LIBs. In addition, the p-type doped B-C-F system achieves not only good electronic structural stability during lithiation but also improved structural stability owing to the strong Coulomb interactions between boron and fluorine. Notably, this is the first time a carbonaceous anode material has been transformed into a cathode material for rechargeable batteries through a p-type doping strategy.
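As a rough, back-of-the-envelope illustration of how a roughly 3.5 V potential can translate into an energy density above 1000 Wh kg-1, the short Python sketch below applies Faraday's law to a hypothetical LiBCF2 formula unit. The assumption of one electron exchanged per formula unit and the choice of the lithiated mass as the reference are illustrative assumptions, not figures taken from the paper.

```python
# Back-of-the-envelope sketch (not the paper's calculation): theoretical
# capacity from Faraday's law and gravimetric energy density for a
# hypothetical LiBCF2 formula unit at the quoted ~3.5 V potential.
F = 96485.0   # Faraday constant, C/mol

# Molar masses (g/mol); assume one Li+/e- exchanged per BCF2 host unit
m_Li, m_B, m_C, m_F = 6.94, 10.81, 12.01, 19.00
M_lithiated = m_Li + m_B + m_C + 2 * m_F   # LiBCF2, ~67.8 g/mol

n_electrons = 1
capacity_mAh_g = n_electrons * F / (3.6 * M_lithiated)   # ~395 mAh/g
voltage = 3.49                                           # V, reported BCF2 value

energy_Wh_kg = capacity_mAh_g * voltage    # mAh/g * V == Wh/kg
print(f"capacity ~{capacity_mAh_g:.0f} mAh/g, energy density ~{energy_Wh_kg:.0f} Wh/kg")
```

With these assumptions the estimate lands around 1400 Wh/kg, consistent with the "> 1000 Wh kg-1" figure reported above.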

This p-type doping strategy greatly enhances the Li+/Na+ intercalation potentials and structural stability of carbonaceous materials by building visual links between electrochemical potential and band-structure properties. It can be further applied to other charge-transfer-dominated ion intercalation/deintercalation systems, providing an avenue for developing next-generation cathode materials with ultrahigh energy density. More importantly, this work presents a new paradigm for systematically evaluating the performance of electrochemical energy storage materials from multiple perspectives, including crystal structure searching, phase diagram calculation, phonon spectra, electronic structure tuning, voltage prediction, and ion diffusion barriers.

This work has been published in National Science Review (2020, DOI: 10.1093/nsr/nwaa174) under the title "Efficient potential-tuning strategy through p-type doping for designing cathodes with ultrahigh energy-density". Subsequently, Prof. Michel Armand, one of the founders of the Li-ion battery field, published a research highlight in National Science Review (2020, DOI: 10.1093/nsr/nwaa185) praising the work: "...achieved a breakthrough in the design of carbonaceous materials as cathodes for rechargeable LIBs/SIBs. This is a new paradigm for battery design, which is helpful in addressing issues related to the battery energy-density limit as well as the transition-metal cost and shortages...In a broader sense...can help guide a rational design of these compounds in the future and inform prospective theoretical and experimental researches in this field."

Credit: 
Science China Press

New on/off functionality for fast, sensitive, ultra-small technologies

image: Bright-field microscopy image of a VO2 chevron-type planar actuator. Superposition in false color of the tip of the shuttle at low and high temperature. Bar, 1μm.

Image: 
Osaka University

Osaka, Japan - How do you turn on and off an ultra-small component in advanced technologies? You need an actuator, a device that transmits an input such as electricity into physical motion. However, actuators in small-scale technologies to date have critical limitations. For example, if it's difficult to integrate the actuator into semiconductor electronics, real-world applications of the technology will be limited. An actuator design that operates quickly, has precise on/off control, and is compatible with modern electronics would be immensely useful.

In a study recently published in Nano Letters, a team including researchers from Osaka University has developed such an actuator. Its sensitivity, fast on/off response, and nanometer-scale precision are unparalleled.

The researchers' actuator is based on vanadium oxide crystals. Many current technologies use a property of vanadium oxide known as the phase transition to cause out-of-plane bending motions within small-scale devices. For example, such actuators are useful in ultra-small mirrors. Using the phase transition to cause in-plane bending is far more difficult, but would be useful, for example, in ultra-small grippers in medicine.

"At 68°C, vanadium oxide undergoes a sharp monoclinic to rutile phase transition that's useful in microscale technologies," explains co-author Teruo Kanki. "We used a chevron-type (sawtooth) device geometry to amplify in-plane bending of the crystal, and open up new applications."

Using a two-step protocol, the researchers fabricated a fifteen-micrometer-long vanadium oxide crystal attached by a series of ten-micrometer arms to a fixed frame. By means of a phase transition caused by a readily attainable stimulus—a 10°C temperature change—the crystal moves 225 nanometers in-plane. The expansion behavior is highly reproducible, over thousands of cycles and several months.
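As a quick arithmetic read of the reported numbers (an illustration only, not an analysis from the paper), the 225-nanometer motion of the 15-micrometer crystal corresponds to an effective device-level displacement of about 1.5% of the crystal length, or roughly 22 nanometers per degree of temperature change. Because the chevron geometry amplifies the intrinsic transition strain, this is a device-level figure rather than the crystal lattice strain.

```python
# Quick arithmetic on the reported numbers (illustration, not from the paper):
# effective in-plane motion of the chevron actuator.
crystal_length_m = 15e-6     # 15-micrometer crystal
displacement_m = 225e-9      # 225 nm in-plane motion
delta_T = 10.0               # degrees C across the phase transition

effective_fraction = displacement_m / crystal_length_m   # ~1.5% of crystal length
responsivity = displacement_m / delta_T                   # ~22.5 nm per degree C
print(f"motion ~{effective_fraction:.1%} of length, ~{responsivity*1e9:.1f} nm per degree C")
```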

"We also moved the actuator in-plane in response to a laser beam," says Nicola Manca and Luca Pelligrino, co-authors. "The on/off response time was a fraction of a millisecond near the phase transition temperature, with little change at other temperatures, which makes our actuators the most advanced in the world."

Small-scale technologies such as advanced implanted drug delivery devices wouldn't work without the ability to rapidly turn them on and off. The underlying principle of the researchers' actuator—a reversible phase transition for on/off, in-plane motion—will dramatically expand the utility of many modern technologies. The researchers expect that the accuracy and speed of their actuator will be especially useful to micro-robotics.

Credit: 
Osaka University

Single photons from a silicon chip

image: Schematic representation of a single defect in a silicon wafer created by the implantation of carbon atoms, which emits single photons in the telecom O-band (wavelength range: 1260 to 1360 nanometers) coupled to an optical fiber.

Image: 
HZDR/Juniks

Quantum technology holds great promise: Just a few years from now, quantum computers are expected to revolutionize database searches, AI systems, and computational simulations. Today already, quantum cryptography can guarantee absolutely secure data transfer, albeit with limitations. The greatest possible compatibility with our current silicon-based electronics will be a key advantage. And that is precisely where physicists from the Helmholtz-Zentrum Dresden-Rossendorf (HZDR) and TU Dresden have made remarkable progress: The team has designed a silicon-based light source to generate single photons that propagate well in glass fibers (DOI: 10.1364/OE.397377).

Quantum technology relies on the ability to control the behavior of quantum particles as precisely as possible, for example by locking individual atoms in magnetic traps or by sending individual light particles - called photons - through glass fibers. The latter is the basis of quantum cryptography, a communication method that is, in principle, tap-proof: Any would-be data thief intercepting the photons unavoidably destroys their quantum properties. The senders and receivers of the message will notice that and can stop the compromised transmission in time.

This requires light sources that deliver single photons. Such systems already exist, especially based on diamonds, but they have one flaw: "These diamond sources can only generate photons at frequencies that are not suitable for fiber optic transmission," explains HZDR physicist Dr. Georgy Astakhov. "Which is a significant limitation for practical use." So Astakhov and his team decided to use a different material - the tried and tested electronic base material silicon.

100,000 single photons per second

To make the material generate the infrared photons required for fiber optic communication, the experts subjected it to a special treatment, selectively shooting carbon into the silicon with an accelerator at the HZDR Ion Beam Center. This created what are known as G-centers in the material - two adjacent carbon atoms coupled to a silicon atom, forming a sort of artificial atom.

When irradiated with red laser light, this artificial atom emits the desired infrared photons at a wavelength of 1.3 micrometers, which is excellently suited for fiber optic transmission. "Our prototype can produce 100,000 single photons per second," Astakhov reports. "And it is stable. Even after several days of continuous operation, we haven't observed any deterioration." However, the system only works in extremely cold conditions - the physicists use liquid helium to cool it down to a temperature of minus 268 degrees Celsius.
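For a rough sense of scale (numbers derived from the quoted values, not taken from the paper), the 1.3-micrometer wavelength and 100,000-photon-per-second rate correspond to a frequency of about 230 THz, a photon energy near 0.95 eV, and an emitted optical power on the order of 10^-14 watts:

```python
# Quick unit check on the quoted figures (illustration, not from the paper).
h = 6.626e-34        # Planck constant, J*s
c = 2.998e8          # speed of light, m/s
wavelength = 1.3e-6  # m (telecom O-band)
rate = 1.0e5         # photons per second

freq = c / wavelength              # ~2.3e14 Hz (about 230 THz)
energy_J = h * freq                # ~1.5e-19 J per photon
energy_eV = energy_J / 1.602e-19   # ~0.95 eV

print(f"frequency: {freq/1e12:.0f} THz")
print(f"photon energy: {energy_eV:.2f} eV")
print(f"emitted power: {energy_J * rate:.2e} W")
```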

"We were able to show for the first time that a silicon-based single-photon source is possible," Astakhov's colleague Dr. Yonder Berencén is happy to report. "This basically makes it possible to integrate such sources with other optical components on a chip." Among other things, it would be of interest to couple the new light source with a resonator to solve the problem that infrared photons largely emerge from the source randomly. For use in quantum communication, however, it would be necessary to generate photons on demand.

Light source on a chip

This resonator could be tuned to exactly hit the wavelength of the light source, which would make it possible to increase the number of generated photons to the point that they are available at any given time. "It has already been proven that such resonators can be built in silicon," reports Berencén. "The missing link was a silicon-based source for single photons. And that's exactly what we've now been able to create."

But before they can consider practical applications, the HZDR researchers still have to solve some problems - such as a more systematic production of the new telecom single-photon sources. "We will try to implant the carbon into silicon with greater precision," explains Georgy Astakhov. "HZDR with its Ion Beam Center provides an ideal infrastructure for realizing ideas like this."

Credit: 
Helmholtz-Zentrum Dresden-Rossendorf

The Josep Carreras Institute identifies a marker of poor evolution in Hodgkin's lymphoma

image: Light microscope image of Hodgkin lymphoma, with its characteristic rounded formation called the Reed-Sternberg cell right in the center of the photograph.

Image: 
@JosepCarrerasInstitute

Hodgkin lymphoma is one of the most common hematological cancers and an example of how medical research has changed the prognosis of a disease. A few decades ago it was a tumor with a very poor clinical course in almost all patients, but the introduction of new drugs has made it curable in 85% of cases. However, 15% of Hodgkin lymphomas do not respond to therapy and continue to carry reduced survival.

Today, an article published in the journal Blood, the official journal of the American Society of Hematology (ASH), by the group of Dr. Manel Esteller, Director of the Josep Carreras Leukemia Research Institute (IJC), ICREA Research Professor and Professor of Genetics at the University of Barcelona, describes a marker that can predict which patients with Hodgkin's lymphoma will follow an aggressive clinical course and will therefore be cases of special risk.

Credit: 
Josep Carreras Leukaemia Research Institute

NASA-NOAA satellite's "night vision" finds wind shear battering Tropical Storm Vicky

image: NASA-NOAA's Suomi NPP satellite passed the eastern north Atlantic Ocean overnight on Sept. 15 at 0335 UTC (Sept. 14 at 11:35 p.m. EDT) and captured a night-time image of Tropical Storm Vicky that revealed wind shear was pushing the bulk of its clouds to the northeastern quadrant of the storm.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

Infrared imagery is like having night vision, and NASA-NOAA's Suomi NPP satellite provided a nighttime view of Tropical Storm Vicky that revealed outside winds are weakening the storm.

About Wind Shear 

The shape of a tropical cyclone provides forecasters with an idea of its organization and strength. When outside winds batter a storm, it can change the storm's shape. Winds can push most of the associated clouds and rain to one side of a storm. Outside winds from the southwest are pushing against Tropical Storm Vicky.

In general, wind shear is a measure of how the speed and direction of winds change with altitude. Tropical cyclones are like rotating cylinders of winds. Each level needs to be stacked vertically on top of the one below in order for the storm to maintain strength or intensify. Wind shear occurs when winds at different levels of the atmosphere push against the rotating cylinder of winds, weakening the rotation by pushing it apart at different levels.

NASA's Night-Time View of Vicky's Wind Shear

The Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard Suomi NPP provided a nighttime image of Vicky on Sept. 15 at 0335 UTC (Sept. 14 at 11:35 p.m. EDT) using infrared imagery. That imagery revealed strong southwesterly wind shear was pushing the bulk of its clouds to the northeastern quadrant of the storm.

In the NHC Discussion at 5 a.m. EDT on Sept. 15, Eric Blake, senior hurricane specialist at NOAA's National Hurricane Center in Miami, Fla. noted, "Vicky remains sheared this morning with strong upper-level winds causing any deep convection to be located northeast of the center. The low-level circulation has also become distorted as well, with new bursts of convection causing the mean circulation to re-form to the north."

The image was created using the NASA Worldview application at NASA's Goddard Space Flight Center in Greenbelt, Md.

Vicky's Status on Sept. 15

NOAA's National Hurricane Center (NHC) noted at 5 a.m. EDT (0900 UTC), the center of Tropical Storm Vicky was located near latitude 20.3 degrees north and longitude 30.1 degrees west. Vicky is 500 miles (800 km) northwest of the Cabo Verde Islands. Vicky is moving toward the northwest near 9 mph (15 kph), and a turn toward the west-northwest is expected within the next day or so, followed by a turn toward the west. Maximum sustained winds are near 50 mph (85 kph) with higher gusts. The estimated minimum central pressure is 1004 millibars.

Vicky's Fading Forecast

Weakening is forecast due to strong upper-level winds during the next 48 hours, and Vicky is likely to degenerate into a remnant low by Wednesday, Sept. 16.

About NASA's EOSDIS Worldview

NASA's Earth Observing System Data and Information System (EOSDIS) Worldview application provides the capability to interactively browse over 700 global, full-resolution satellite imagery layers and then download the underlying data. Many of the available imagery layers are updated within three hours of observation, essentially showing the entire Earth as it looks "right now."

NASA Researches Earth from Space

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

For updated forecasts, visit: http://www.nhc.noaa.gov

By Rob Gutro 
NASA's Goddard Space Flight Center

Credit: 
NASA/Goddard Space Flight Center

Tracking hammerhead sharks reveals conservation targets to protect a nearly endangered species

video: B-roll of smooth hammerhead sharks

Image: 
Kyle McBurnie

FORT LAUDERDALE/DAVIE, Fla. - They are some of the most iconic and unique-looking creatures in our oceans. While some may think they look a bit "odd," one thing researchers agree on is that little is known about hammerhead sharks. Many of the 10 hammerhead shark species are severely overfished worldwide for their fins and in need of urgent protection to prevent their extinction.

To learn more about a declining hammerhead species that is data poor but in need of conservation efforts, a team of researchers from Nova Southeastern University's (NSU) Save Our Seas Foundation Shark Research Center (SOSF SRC) and Guy Harvey Research Institute (GHRI), Fisher Finder Adventures, the University of Rhode Island and the University of Oxford (UK) embarked on a study to determine the migration patterns of smooth hammerhead sharks (Sphyrna zygaena) in the western Atlantic Ocean. This shark, which can grow up to 14 feet (400 cm) long, remains one of the least understood of the large hammerhead species because of the difficulty of reliably finding smooth hammerheads for scientific study.

To learn about smooth hammerhead behavior, the research team satellite tagged juvenile hammerhead sharks off the US Mid-Atlantic coast and then tracked the sharks for up to 15 months. The sharks were fitted with fin-mounted satellite tags that reported the sharks' movements in near real time via a satellite link to the researchers.

"Getting long-term tracks was instrumental in identifying not only clear seasonal travel patterns, but importantly, also the times and areas where the sharks were resident in between their migrations," said Ryan Logan, Ph.D. student at NSU's GHRI and SOSF SRC, and first author of the newly published research. "This study provides the first high resolution, long term view of the movement behaviors and habitats used by smooth hammerhead sharks - key information for targeting specific areas and times for management action to help build back this depleted species."

The researchers found that the sharks acted like snowbirds, migrating between two seasonally resident areas - coastal waters off New York in the summer and off North Carolina in the winter. Their residency times in these two locations coincided with two environmental factors: warmer surface water temperatures and areas of high productivity - indicative of food-rich areas.

"The high resolution movements data showed these focused wintering and summering habitats off North Carolina and New York, respectively, to be prime ocean "real estate" for these sharks and therefore important areas to protect for the survival of these near endangered animals," said Mahmood Shivji, Ph.D., director of NSU's GHRI and SOSF SRC, who oversaw the study.

Identifying such areas of high residency provides targets for designation as "Essential Fish Habitat" - an official title established by the US Government - which, if formally adopted, allows such areas to be subject to special limitations on fishing or development to protect declining species.

The tracking data also revealed a second target for conservation. The hammerheads spent much of their winter residency in a management zone known as the Mid-Atlantic Shark Area (MASA) - a zone already federally closed for seven months per year (January 1 to July 31) to commercial bottom longline fishing to protect another endangered species, the dusky shark. However, the tracking data showed that the smooth hammerheads arrived in the MASA earlier, in December, while the zone is still open to fishing.

"Extending the closure of the MASA zone by just one month, starting on December 1 each year, could reduce the fishing mortality of juvenile smooth hammerheads even more", said Shivji. "It's particularly gratifying to see such basic research not only improving our understanding of animal behavior in nature but also illuminating pathways for recovery of species and populations that have been overexploited so we can try and get back to a balanced ocean ecosystem".

The tracks of the smooth hammerheads (and other shark species) can be found here:
http://www.ghritracking.org.

The team's complete research paper can be found online (https://www.frontiersin.org/articles/10.3389/fmars.2020.566364/full).

Credit: 
Nova Southeastern University

NASA Aqua satellite casts three eyes on Sally and finds heavy rain potential

image: On Sept. 15 at 3:25 a.m. EDT (0725 UTC) Aqua's MODIS instrument also gathered water vapor content and temperature information on Sally. The MODIS image showed highest concentrations of water vapor (dark brown) and coldest cloud top temperatures were around the center of circulation and in the northeastern quadrant of the storm. The water vapor data showed coldest cloud top temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius) in those storms.

Image: 
Credits: NASA/NRL

NASA's Aqua satellite analyzed the cloud top temperatures and water vapor content in Hurricane Sally as it crawls toward landfall, and found the potential for large amounts of rainfall, which, coupled with slow movement, can lead to catastrophic flooding. Two instruments provided three views of Sally's temperatures and water vapor that revealed the soaking capability of the slow-moving hurricane.

At 8 a.m. EDT on Sept. 15, NOAA's National Hurricane Center (NHC) cautioned, "Historic flooding is possible from Sally with extreme life-threatening flash flooding likely through Wednesday [Sept. 16] along portions of the northern Gulf Coast." Forecasters are using NASA's infrared and water vapor data to analyze the rainfall potential from Sally.

Warnings and Watches in Effect on Sept. 15

NHC issued a Storm Surge Warning for the mouth of the Mississippi River to the Okaloosa/Walton County Line, Florida and for Mobile Bay. A Hurricane Warning is in effect for east of the Mouth of the Pearl River to Navarre, Florida. A Tropical Storm Warning is in effect for east of Navarre, Florida to Indian Pass, Florida and from the mouth of the Pearl River westward to Grand Isle, Louisiana, including Lake Pontchartrain and Lake Maurepas and metropolitan New Orleans.

NASA's Infrared Data Reveals Heavy Rainmakers

Tropical cyclones consist of hundreds of thunderstorms, and infrared data can show where the strongest storms are located. That is because infrared data provides temperature information, and the strongest thunderstorms that reach highest into the atmosphere have the coldest cloud top temperatures. Two instruments aboard NASA's Aqua satellite provided infrared data on Sally's cloud tops.

On Sept. 15 at 3:25 a.m. EDT (0725 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Aqua satellite revealed the most powerful thunderstorms were around Hurricane Sally's center and in the northeastern quadrant. In those areas, cloud top temperatures were as cold as or colder than minus 80 degrees Fahrenheit (minus 62.2 degrees Celsius). They were located over the Gulf of Mexico and just offshore from coastal Alabama, Mississippi and the Florida Panhandle. Strong storms with cloud top temperatures as cold as minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius) surrounded both areas and were generating large amounts of rain.

Four minutes later, another instrument aboard NASA's Aqua satellite analyzed Sally's cloud top temperatures to verify the data from MODIS. The Atmospheric Infrared Sounder (AIRS) instrument showed the strongest storms were offshore in the northern Gulf of Mexico, where cloud tops were as cold as or colder than 210 Kelvin (minus 81 degrees Fahrenheit, or minus 63.1 degrees Celsius). NASA research has shown that cloud top temperatures that cold indicate strong storms with the capability to create heavy rain.
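The unit conversions quoted above can be checked with a couple of lines of arithmetic (a quick sanity check, not NASA analysis code):

```python
# Convert the quoted 210 Kelvin cloud-top temperature to Celsius and Fahrenheit.
kelvin = 210.0
celsius = kelvin - 273.15            # about -63.2 degrees C
fahrenheit = celsius * 9 / 5 + 32    # about -81.7 degrees F
print(f"{kelvin} K = {celsius:.1f} C = {fahrenheit:.1f} F")
```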

Water Vapor Content

Water vapor analysis of tropical cyclones tells forecasters how much potential a storm has to develop. Water vapor releases latent heat as it condenses into liquid. That liquid becomes clouds and thunderstorms that make up a tropical cyclone. Temperature is important when trying to understand how strong storms can be. The higher the cloud tops, the colder and stronger the storms.

On Sept. 15 at 3:25 a.m. EDT (0725 UTC) Aqua's MODIS instrument also gathered water vapor content and temperature information on Sally. The MODIS image showed the highest concentrations of water vapor and coldest cloud top temperatures were around the center of circulation and in the northeastern quadrant of the storm. The water vapor data showed the coldest cloud top temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius) in those storms. Storms with cloud top temperatures that cold have the capability to produce heavy rainfall.

Sally's Rainfall Forecast from NHC

The NHC warns, "Sally is expected to be a slow moving system as it approaches land producing 10 to 20 inches of rainfall with isolated amounts of 30 inches along and just inland of the central Gulf Coast from the western Florida Panhandle to far southeastern Mississippi. Historic flooding is possible with extreme life-threatening flash flooding likely through Wednesday. In addition, this rainfall will lead to widespread moderate to major flooding on area rivers.

Sally is forecast to move inland early Wednesday and move across the Southeast producing rainfall of 4 to 8 inches, with isolated maximum amounts of 12 inches, across portions of southeastern Mississippi, southern and central Alabama, northern Georgia, and the western Carolinas. Significant flash and urban flooding is likely, as well as widespread minor to moderate flooding on some rivers."

Sally's Status on Sept. 15

At 8 a.m. EDT (1200 UTC) on Sept. 15, the National Hurricane Center noted the center of Hurricane Sally was located near latitude 29.1 degrees north and longitude 88.0 degrees west. That is about 65 miles (110 km) east of the mouth of the Mississippi River.

Sally is moving toward the northwest near 2 mph (4 kph), and this general motion is expected to continue this morning. A northward turn is expected this afternoon, followed by a slow north-northeastward to northeastward motion tonight and continuing through Wednesday night. Maximum sustained winds are near 85 mph (140 kph) with higher gusts. The estimated minimum central pressure based on data from the Air Force Hurricane Hunter aircraft is 982 millibars.

Sally's Forecast Track

NHC said, "Although little change in strength is forecast until landfall occurs, Sally is still expected to be a dangerous hurricane when it moves onshore along the north-central Gulf coast. On the forecast track, the center of Sally will pass near the coast of southeastern Louisiana today, and make landfall in the hurricane warning area tonight or Wednesday morning."

NASA Researches Tropical Cyclones

Hurricanes are the most powerful weather event on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

NASA's Aqua satellite is one in a fleet of NASA satellites that provide data for hurricane research.

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

By Rob Gutro
NASA's Goddard Space Flight Center

Credit: 
NASA/Goddard Space Flight Center

Study shows difficulty in finding evidence of life on Mars

ITHACA, N.Y. - In a little more than a decade, samples of rover-scooped Martian soil will rocket to Earth.

While scientists are eager to study the red planet's soils for signs of life, researchers must ponder a considerable new challenge: Acidic fluids - which once flowed on the Martian surface - may have destroyed biological evidence hidden within Mars' iron-rich clays, according to researchers at Cornell University and at Spain's Centro de Astrobiología.

The researchers conducted simulations involving clay and amino acids to draw conclusions regarding the likely degradation of biological material on Mars. Their paper, "Constraining the Preservation of Organic Compounds in Mars Analog Nontronites After Exposure to Acid and Alkaline Fluids," published Sept. 15 in Nature Scientific Reports.

Alberto G. Fairén, a visiting scientist in the Department of Astronomy in the College of Arts and Sciences at Cornell, is a corresponding author.

NASA's Perseverance rover, launched July 30, will land at Mars' Jezero Crater next February; the European Space Agency's Rosalind Franklin rover will launch in late 2022. The Perseverance mission will collect Martian soil samples and send them to Earth by the 2030s. The Rosalind Franklin rover will drill into the Martian surface, collect soil samples and analyze them in situ.

In the search for life on Mars, the red planet's clay surface soils are a preferred collection target since the clay protects the molecular organic material inside. However, the past presence of acid on the surface may have compromised the clay's ability to protect evidence of previous life.

"We know that acidic fluids have flowed on the surface of Mars in the past, altering the clays and its capacity to protect organics," Fairén said.

He said the internal structure of clay is organized into layers, where the evidence of biological life - such as lipids, nucleic acids, peptides and other biopolymers - can become trapped and well preserved.

In the laboratory, the researchers simulated Martian surface conditions by aiming to preserve an amino acid called glycine in clay, which had been previously exposed to acidic fluids. "We used glycine because it could rapidly degrade under the planet's environmental conditions," he said. "It's the perfect informer to tell us what was going on inside our experiments."

After a long exposure to Mars-like ultraviolet radiation, the experiments showed photodegradation of the glycine molecules embedded in the clay. Exposure to acidic fluids erases the interlayer space, turning it into a gel-like silica.

"When clays are exposed to acidic fluids, the layers collapse and the organic matter can't be preserved. They are destroyed," Fairén said. "Our results in this paper explain why searching for organic compounds on Mars is so sorely difficult."

Credit: 
Cornell University

NASA satellite imagery shows Teddy consolidating

image: On Sept. 15, 2020 at 12:15 a.m. EDT (0415 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard NASA's Aqua satellite gathered temperature information about Teddy's cloud tops. Some cloud top temperatures were as cold as or colder than minus 70 degrees Fahrenheit (red; minus 56.6 degrees Celsius), with building thunderstorms around the storm's core.

Image: 
NASA/NRL

When a tropical cyclone consolidates, it means that it is getting more organized and its circulation is improving. An improved circulation helps make for a stronger storm. Infrared imagery from NASA's Aqua satellite showed that Teddy was consolidating in the Central North Atlantic Ocean.

Infrared Data Reveals Teddy is organizing

On Sept. 15 at 12:15 a.m. EDT (0415 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard NASA's Aqua satellite gathered temperature information about Teddy's cloud tops. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

MODIS found some cloud top temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 Celsius). Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall. The infrared data showed that Teddy's structure is slowly improving. The infrared imagery and visible imagery revealed that the building of thunderstorms around the storm's core has increased, despite the continued presence of dry slots.

Teddy's Status on Sept. 15

At 11 a.m. EDT (1500 UTC), the center of Tropical Storm Teddy was located near latitude 14.0 degrees north and longitude 47.0 degrees west. Teddy is located about 960 miles (1,545 km) east of the Lesser Antilles. Teddy is moving toward the west-northwest near 13 mph (20 kph). A steady northwest motion at 10 to 15 mph is expected through the end of the week.

Maximum sustained winds have increased to near 65 mph (100 kph) with higher gusts. Additional strengthening is forecast for the next several days. Teddy will likely become a hurricane later today or tonight and could reach major hurricane strength in a few days. The estimated minimum central pressure is 999 millibars.

Teddy's Forecast

"Teddy's low [wind] shear and warm sea surface temperature environment should be conducive for further strengthening, and the NHC intensity forecast is largely unchanged," noted David Zelinsky, Hurricane Model Diagnostician and Meteorologist at the NHC. "Some dry air in the environment could restrict Teddy's intensification rate, but is not expected to prevent Teddy from becoming a hurricane later today or tonight. Continued strengthening is expected thereafter and Teddy is forecast to become a major hurricane within the next few days."

Credit: 
NASA/Goddard Space Flight Center

Researchers identify key role of immune cells in brain infection

image: Graphical Abstract for 'Astrocyte- and Neuron-Derived CXCL1 Drives Neutrophil Transmigration and Blood-Brain Barrier Permeability in Viral Encephalitis'

Image: 
Michael BD, et al. Cell Rep 2020

A new study has detailed the damaging role played by the immune system in a severe brain condition most commonly caused by the cold sore virus.

Researchers have identified the specific type of immune cell that induces brain inflammation in herpes simplex virus (HSV) encephalitis. Crucially, they have also determined the signalling protein that calls this immune cell into the brain from the bloodstream.

The findings, published in Cell Reports, could aid the development of targeted treatments for the brain infection, which is the most common cause of viral encephalitis worldwide.

HSV encephalitis takes hold quickly and, despite rapid anti-viral drug treatment, many patients die. Most survivors are left with brain injury due to the inflammation and damage caused by the virus and immune cells gaining access to the brain, breaking down the blood-brain barrier.

"Determining the roles of specific immune cells and the factors that allow them to cross the protective blood-brain barrier is critical to develop targeted immune-therapies," explains Dr Benedict Michael, a Senior Clinician Scientist Fellow at the University of Liverpool, who led the research.

Using a mouse model, the researchers showed that neutrophils (a type of immune cell) made the blood-brain barrier more permeable and contributed to the brain damage associated with HSV encephalitis. They also found that these neutrophils were not needed to control the virus.

Meanwhile, monocyte immune cells were found to play a protective role and were needed to control the virus and prevent brain damage.

The researchers also identified the exact signalling protein, called CXCL1, that drove the migration of these damaging neutrophils into the brain during HSV infection. Blocking the CXCL1 protein prevented neutrophils from crossing the blood-brain barrier and causing inflammation, which resulted in less severe disease.

The findings make the CXCL1 protein an attractive target for new therapies that can stop the influx of damaging white blood cells without limiting the roles of protective ones.

Dr Michael said: "There is currently no licenced treatment for the severe brain swelling which occurs despite antiviral therapy in HSV encephalitis. Sometimes steroids are given, but as these suppress the immune system in a very broad way, there is a risk of uncontrolled viral infection.

"There is an urgent need for targeted treatment that prevents damaging immune cells from entering the brain without limiting the immune cells needed to control the virus."

Now Dr Michael and colleagues are planning to examine the impact of the CXCL1 signalling protein in patients who have already had steroids as part of a clinical trial led by Professor Tom Solomon at the University of Liverpool.

Credit: 
University of Liverpool

Real neurons are noisy. Can neural implants figure that out?

image: This image from a confocal microscope shows four retinal ganglion cells, the cells that send a signal from the retina to the brain.

Image: 
Marija Rudzite, Duke University

DURHAM, N.C. - If human eyes came in a package, it would have to be labeled "Natural product. Some variation may occur." Because the million-plus retinal ganglion cells that send signals to the human brain for interpretation don't all perform exactly the same way.

They are what an engineer would call 'noisy' -- there is variance between cells and from one moment to the next. And yet, when we see a photograph of a beautiful flower, it looks sharp and colorful and we know what it is.

The brain's visual centers must be adept at filtering out the noise from the retinal cells to get to the true signal, and those filters have to constantly adapt to light conditions to keep the signal clear. Prosthetic retinas and neural implants are going to need this same kind of adaptive noise-filtering to succeed, new research suggests.

"Neurons in the brain are noisy -- meaning that when the same stimulus is presented, the neurons do not produce the same response each time," said Greg Field, an assistant professor of neurobiology at Duke University, who has coauthored a new study in Nature Communications with a Canadian colleague on how the brain compensates for visual noise.

"If brain-machine interfaces do not account for noise correlations among neurons, they are likely to perform poorly," Field said.

Working in a special darkroom inside Field's laboratory, Duke graduate student Kiersten Ruda exposed small squares of living rat retina to patterns and videos under varying light conditions while an array of more than 500 tiny electrodes beneath the retinal cells recorded the signals that are normally sent down the optic nerve to the brain.

"All of this is done in total darkness with the aid of night vision goggles, so that we can preserve the peak sensitivity of the retinas," Field said.

The researchers ran those nerve impulses through software instead of a brain to see how variable and noisy the signals were, and to experiment with what kind of filtering the brain would need to achieve a clear signal under different light conditions, like moonlight versus sunlight.

Sensory systems like the eyes, nose and ears work on populations of sensors, because each individual sensing cell is noisy. However, this noise is shared or 'correlated' across the cells, which presents a challenge for the brain to understand the original signal.

At high light levels, the computer could improve decoding by about 20 percent by using the noise correlations, as opposed to just assuming each neuron was noisy in its own way. But at low light levels, this value increased to 100%.

Indeed, earlier research by other groups has found that assuming uncorrelated noise in the cortex of the brain can make decoding 30 percent worse. Field's research in the retina shows these assumptions lead to even larger losses of information.
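The general idea can be illustrated with a toy simulation (made-up numbers, not the study's retinal data or analysis): decode which of two stimuli produced a noisy population response, once using the full noise covariance and once assuming each neuron's noise is independent. The correlation-aware decoder is typically noticeably more accurate.

```python
# Toy illustration of decoding with vs. without a correlated-noise model.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_trials = 20, 2000

# Mean responses to stimulus A and B (arbitrary units); B differs slightly from A
mu_a = rng.normal(5.0, 1.0, n_neurons)
mu_b = mu_a + rng.normal(0.0, 0.4, n_neurons)

def responses(mu):
    """Trials with shared (correlated) plus private (independent) noise."""
    shared = rng.normal(0.0, 1.5, (n_trials, 1))           # common to all neurons
    private = rng.normal(0.0, 1.0, (n_trials, n_neurons))   # independent per neuron
    return mu + shared + private

ra, rb = responses(mu_a), responses(mu_b)
x = np.vstack([ra, rb])
y = np.concatenate([np.zeros(n_trials), np.ones(n_trials)])

def accuracy(cov):
    """Gaussian (LDA-style) decoder: w = cov^-1 (mu_b - mu_a)."""
    w = np.linalg.solve(cov, mu_b - mu_a)
    thresh = w @ (mu_a + mu_b) / 2.0
    pred = (x @ w > thresh).astype(float)
    return (pred == y).mean()

emp_cov = np.cov(np.vstack([ra - mu_a, rb - mu_b]).T)
full = accuracy(emp_cov)                      # decoder that models correlations
indep = accuracy(np.diag(np.diag(emp_cov)))   # decoder that ignores them

print(f"accuracy with correlated-noise model: {full:.3f}")
print(f"accuracy assuming independent noise:  {indep:.3f}")
```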

"It helps to understand this correlated noise if you think of an orchestra," Fields said in an email. "All the members of the orchestra are playing a little out of tune, that's the 'noise,' but how out-of-tune they are depends on their neighbors, that's the correlation. For example, all the violins are playing a little sharp, while the flutes are a little flat, and the cellos are very flat. A major question in neuroscience has been the extent to which this correlated noise corrupts the ability of the brain to figure out what song is being played."

"We showed that to utilize the benefit of having lots of sensory cells, the brain must know how to filter out this correlated noise," Field said. But the problem is even more complex because the amount of correlated noise depends on the amount of light, with more noise at lower light levels like moonlight.

Aside from being difficult and fascinating observations, Field said the study's findings point to the challenges ahead for engineers who would hope to replicate the retina in a prosthetic or in the sort of neural implant Elon Musk has announced.

"To make an ideal retinal prosthetic (a bionic eye), it would probably need to incorporate these noise correlations for the brain to correctly interpret the signals it receives from the prosthetic," Field said. Similarly, computers that readout brain activity from neural implants will probably need to have a model of noise correlations among neurons.

But it won't be easy. Field said the researchers have yet to understand the structure of these noise correlations across the brain.

"If the brain were to assume that the noise is independent across neurons, instead of having an accurate model of how it is correlated, we show the brain would suffer from a catastrophic loss of information about the stimulus," Field said.

Credit: 
Duke University

Mayo scientists develop mathematical index to distinguish healthy microbiome from diseased

ROCHESTER, Minn. -- What causes some people to develop chronic diseases such as rheumatoid arthritis, cancer and metabolic syndrome while others stay healthy? A major clue could be found in their gut microbiome -- the trillions of microbes living inside the digestive system that regulate various bodily functions.

To utilize the huge population of tiny organisms as a proxy for people's well-being, Mayo Clinic researchers have developed a Gut Microbiome Health Index. This index distinguishes a healthy microbiome from one that is diseased.

In a new study published in the Sept. 15 issue of Nature Communications, the researchers reveal how their index, composed of a biologically-interpretable mathematical formula, can take a gut microbiome profile from a person's stool sample to reveal the likelihood of having a disease independent of the clinical diagnosis.

"This discovery advances our understanding of the composition of a healthy gut microbiome that has been long sought after," says Jaeyun Sung, Ph.D., the corresponding author. "Our index predicts how closely a gut microbiome sample resembles healthy or unhealthy conditions." Dr. Sung is an assistant professor of surgery, Mayo Clinic College of Medicine and Science, and researcher within the Mayo Clinic Center for Individualized Medicine Microbiome Program.

Gut microbiome's effects on health

Dr. Sung says the effects of the highly complex microbiome on human health are profound, but the science around how to properly detect whether anything could be wrong, and how to apply the gut microbiome as an indicator of general health is relatively new. What's known is that the ecosystem of microbes is tied to a host of health benefits, including helping to digest food, regulating metabolism and playing a role in immunity. Still, many questions remain. Recent studies link alterations in the gut microbiome to major chronic illnesses. Dr. Sung says a lack of analytical tests or algorithm-driven biomarkers hinders the detection of early signs of disease prior to the occurrence of specific, diagnosable symptoms.

For the new study, Dr. Sung and his team analyzed 4,347 publicly available human stool shotgun metagenomes, which allow researchers to extensively sequence all genes in all known organisms present in a stool sample. The samples were pooled across 34 published studies spanning healthy conditions and 12 non-healthy disease conditions. Nearly 1,700 of the gut microbiome samples were from non-healthy people, that is, those with a clinically diagnosed disease or abnormal body weight based on BMI. Nearly 2,600 samples were from people reported as healthy, that is, with no overt disease or adverse symptoms.

"We pooled together the non-healthy samples into one group and the healthy samples into another," Dr. Sung explains. "Then we did a comparison of the frequencies of the microbes that were observed in both groups. We found some microbes are much more frequently observed in the healthy group, compared to the non-healthy group and vice versa.

His analysis led to a microbiome signature of the healthy human gut composed of 50 microbial species.

"But the real challenge then was to apply this information to design an indicator of health," Dr. Sung says.

Mathematical formula advances microbiome discovery

The discovery of a healthy gut microbiome signature led Dr. Sung and his team to develop a mathematical formula that predicts how closely a gut microbiome sample resembles healthy or non-healthy conditions. They created a ratio between the health-abundant species and the health-scarce species. The higher the number, the higher the chances are that the corresponding microbiome sample comes from a healthy person.
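A minimal sketch of what a ratio-style index can look like is shown below. The species lists and abundances are hypothetical, and the published Gut Microbiome Health Index uses a more detailed formulation than this simple log-ratio; the sketch only illustrates the "health-abundant over health-scarce" idea described above.

```python
# Minimal sketch of a ratio-style gut health index (illustration only; the
# published index uses a more detailed formulation). Species names and
# abundances below are hypothetical examples.
import math

# Relative abundances (fractions) from one stool metagenome profile
sample = {
    "Faecalibacterium prausnitzii": 0.08,
    "Roseburia intestinalis": 0.03,
    "Bacteroides fragilis": 0.01,
    "Clostridioides difficile": 0.0005,
}

# Hypothetical signature lists (the study identified 50 species in total)
health_prevalent = {"Faecalibacterium prausnitzii", "Roseburia intestinalis"}
health_scarce = {"Bacteroides fragilis", "Clostridioides difficile"}

def gut_index(profile, eps=1e-6):
    """Log ratio of summed abundances of health-prevalent vs. health-scarce
    species; higher values lean 'healthy', lower values lean 'non-healthy'."""
    good = sum(profile.get(s, 0.0) for s in health_prevalent)
    bad = sum(profile.get(s, 0.0) for s in health_scarce)
    return math.log10((good + eps) / (bad + eps))

print(f"index = {gut_index(sample):+.2f}")   # positive here: resembles healthy profiles
```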

"So a higher number is going to tell you: 'Oh, you look very healthy. Your microbiome resembles that of a healthy population,'" Dr. Sung says. "But a low number reveals: 'Oh, we can't tell yet exactly which disease you may have, but we can tell that something looks off. Your microbiome resembles very close to what a microbiome would be in a disease population.' And that's what we call the Gut Microbiome Health Index. You can view it as a 'credit score for your gut.'"

On an independent validation cohort of close to 700 human subjects, healthy samples were distinguished from non-healthy samples 74% of the time.

Cardiovascular connection to gut microbiome

During the study, Dr. Sung made another discovery: a moderate correlation between the Gut Microbiome Health Index and high-density lipoprotein (HDL), known as "good cholesterol," in the blood.

"The higher the Gut Microbiome Health Index, the higher your HDL level is," he explains. "That we were able to find this correlation with a marker of cardiovascular health is really exciting, as now we're connecting gut microbiome information with clinical data. One area of research in my group is to identify how the gut microbiome talks to various tissues in the body through chemical signals. Currently, we're far from being able to conclude on specific mechanisms, but we have some promising leads we'd like to further pursue."

He says the discovery highlights the potential of the index to be a powerful and consistent predictor of health. In the not-too-distant future, Dr. Sung imagines a time when providing a stool sample to assess the microbiome during regular visits with a health care provider would become routine. He hopes to further develop the Gut Microbiome Health Index so that it can one day contribute toward comprehensive medical and preventive health screening programs.

"Our proposed quantitative stool metric for indexing microbiome health is a conceptual and technical innovation, and has the potential to inform treatments for maintaining or restoring health through gut microbiome modulation," Dr. Sung says. "Our index provides a destination point of what you want your microbiome to resemble, especially after a massive perturbation, such as food poisoning or antibiotics. Moreover, this work demonstrates the power of integrating existing samples across various sources and health conditions to identify truly robust insight that benefits human health."

Credit: 
Mayo Clinic

Research news tip sheet: Story ideas from Johns Hopkins Medicine

image: Research from Johns Hopkins Medicine

Image: 
Johns Hopkins Medicine

STUDY SAYS BLOOD PRESSURE READINGS ACCURATE WITH LESS THAN THREE MINUTES REST

Media Contact: Michael E. Newman, mnewma25@jhmi.edu

Physicians, nurses and other health care providers have traditionally been taught to let patients rest three to five minutes between blood pressure measurements. Now, researchers at Johns Hopkins Medicine have shown that this established time span may not be as medically necessary as previously believed, and that shorter rest periods give readings that are just as accurate as long as the patient's blood pressure is not elevated (a systolic over diastolic reading of 140/90 or greater).

The findings were presented virtually on Sept. 10, 2020, during the American Heart Association's Hypertension 2020 Scientific Sessions, and appear in the October 2020 issue of the journal Hypertension.

"Reducing the rest period prior to taking readings to less than three minutes could enable more patients to be screened in a shorter timeframe, which could be beneficial to medical facilities in high demand or that are understaffed because of economic reasons," says study lead researcher Tammy Brady, M.D., Ph.D., medical director of the pediatric hypertension program at Johns Hopkins's Children's Center and an associate professor of pediatrics at the Johns Hopkins University School of Medicine.

In a simulated clinical setting, the Johns Hopkins Medicine team studied 113 adults ages 18 to 80 -- with both normal and high blood pressure -- who each received four sets of three blood pressure measurements after different rest times: zero, two and five minutes. To estimate the repeatability of these readings, the blood pressure of each participant was recorded once more at five minutes after the last measurement.

The researchers then compared the difference in average blood pressure measured after the first and second five-minute rest periods with the difference in the readings following the first five-minute rest period and the two-minute one. They also compared the difference between the measurements after the first five-minute rest period and the zero rest time.

"We were expecting to see significant differences in these average blood pressure and overall readings," says Brady. "Instead, we discovered only marginal changes -- overall, no more than a plus or minus two-point difference between the systolic blood pressures obtained after five-minute resting periods and the readings taken with rest periods of less than a minute and two minutes."

Based on their findings, the researchers believe that a reasonable approach is measuring blood pressure after minimal to no rest, and then repeating the measurement after a five-minute rest only if the patient has an initial systolic reading of greater than or equal to 140.

DEFINITION OF PAIN UPDATED FOR FIRST TIME IN FOUR DECADES TO MAKE IT MORE INCLUSIVE

Media Contact: Rachel Butch, rbutch1@jhmi.edu

Johns Hopkins Medicine pain experts have joined the International Association for the Study of Pain (IASP) and collaborators worldwide to make a subtle but important update to the definition of "pain" for the first time in 40 years. With this change, the experts aim to make pain diagnosis and management more inclusive of all people who experience it.

The revised definition is published in the September 2020 issue of the journal PAIN.

"Pain is not just a sensation or symptom. It is a much more complex condition that is important to acknowledge properly to guide basic science research, patient care and public policy," says Srinivasa Raja, M.B.B.S., professor of anesthesiology and critical care medicine at the Johns Hopkins University School of Medicine and chair of the IASP task force that created the revised description.

According to the researchers, the new pain definition -- the first update since 1979 -- features a key phrase (shown in boldface): "An unpleasant sensory and emotional experience associated with, or resembling that associated with, actual or potential tissue damage."

The phrase is important, say the researchers, because it includes types of pain not well understood 40 years ago, such as nociplastic pain -- in which pain receptors are stimulated without clear evidence of the cause, as in fibromyalgia -- and neuropathic pain, caused by a sensitized or maladaptive nervous system and commonly associated with chronic pain conditions such as persistent pain after surgery, nerve injuries and limb amputations (the latter commonly called "phantom limb pain").

The revised definition also is more inclusive of people who cannot express their pain in words. According to Raja, the previous definition used the word "described," which excludes non-verbal behaviors to indicate pain in animals and certain vulnerable individuals.

"Changing this language enables physicians to adequately account for pain in disempowered and neglected populations such as children, the elderly and people with disabilities," he says.

The updated definition, if adopted by insurance providers, could ultimately have an impact on access to health care. A 2016 survey by the U.S. Centers for Disease Control and Prevention estimated that 20.4% of U.S. adults live with chronic pain, and that approximately 20 million of them have pain that limits their ability to accomplish tasks in their work and daily life.

"When pain affects peoples' function, psyche and their social well-being, it needs to be recognized by insurance carriers that dictate who gets care and what aspects of their multidisciplinary care are reimbursed," says Raja.

Overall, the IASP task force says the revised definition has provided a starting point to integrate more evidence-based and holistic pain management into medical and mental health care.

"Pain cannot be simplified to a 0 to 10 scale. Assessments instead need to look at the whole person, accounting for the cognitive, psychological and social factors that are critical to treating pain," says Raja.

SHORTER TREATMENT TIME NEEDED TO PREVENT HEPATITIS C IN KIDNEY TRANSPLANT RECIPIENTS

Media Contact: Michael E. Newman, mnewma25@jhmi.edu

In a 2018 clinical trial, researchers at Johns Hopkins Medicine demonstrated that a 12-week prophylactic course of direct-acting antivirals (DAAs) -- drugs that destroy the hepatitis C virus (HCV) by targeting specific steps in its life cycle -- could keep HCV-negative patients virus free after they received kidneys from deceased donors who had HCV. In a new follow-up study, the researchers improved on their past success and showed that the same protection can be achieved in a third of the time without losing effectiveness or putting the recipient at greater risk of HCV infection.

The latest findings were posted online on Sept. 8, 2020, in the Annals of Internal Medicine.

According to statistics from the U.S. Department of Health and Human Services, nearly 100,000 Americans with kidney failure, also known as end-stage renal disease, are awaiting donor organs. Unfortunately, as reported by the U.S. Centers for Disease Control and Prevention, nearly 9,000 of these patients drop off the waiting list each year, succumbing to death or deteriorating in health so much that transplantation is no longer possible.

Making matters worse, says the National Institute of Diabetes and Digestive and Kidney Diseases, the need for donor kidneys is rising 8% per year. However, their availability has not grown to match.

"Kidneys from deceased donors with HCV are increasingly available, yet hundreds are discarded annually because they have traditionally only been given to the small number of recipients who also had the virus," says study senior author Niraj Desai, M.D., surgical director of the Johns Hopkins kidney and pancreas transplant program, and assistant professor of surgery at the Johns Hopkins University School of Medicine. "Our previous trial showed it's possible to make these donor organs acceptable for HCV-negative recipients, but the timing and duration of the DAA therapy wasn't clear until now."

In their clinical trial known as Renal Transplants in Hepatitis C Negative Recipients with RNA Positive Donors (REHANNA), Desai and his colleagues gave HCV-negative kidney transplant candidates a four-week course of two DAAs, glecaprevir and pibrentasvir. In the 2018 trial, 10 transplant candidates received two other DAAs, grazoprevir and elbasvir (with or without a third drug, sofosbuvir), to protect them against HCV from the donated kidneys. However, a 12-week course of treatment was needed.

Throughout most of the four-week therapy period in the new study -- and more importantly, at the 12-week mark after the prophylactic treatment ended -- all 10 patients were HCV free. All have remained so since receiving their kidneys from deceased HCV-positive donors, and none has shown related adverse events such as organ rejection or liver problems.

"Although our study population was small, the trial does provide a strong proof-of-concept that transplantation of HCV-positive deceased donor kidneys to HCV-negative recipients can be successful with just four weeks of DAA prophylaxis," says study lead author Christine Durand, M.D., associate professor of medicine at the Johns Hopkins University School of Medicine. "These results support adding kidneys with HCV into the overall donor pool to shorten the transplant waiting time for seriously ill patients who would not previously have had them as an option."

MYSTERY MOLECULE UNMASKED: NEWLY IDENTIFIED PROTEIN SAVES THE AIRWAY FROM DISTRESS

Media Contact: Ayanna Tucker, atucke25@jhmi.edu

In the respiratory system, when the airway becomes irritated by small particles, infections or other invaders, the body's immune system is signaled to attack. Once the immune system is primed, it sends white blood cells called eosinophils and mast cells to the affected area to immediately begin to fight.

The weapons of choice for these immune cells are chemical compounds, such as histamine and cytokines, that cause the airway to become inflamed. Past research showed that eosinophils and mast cells possess a special type of protein on their surface, known as Siglec-8, which uniquely fits and locks onto a second, and previously unknown, glycoprotein. When this mysterious molecule and Siglec-8 are bonded, the inflammatory response dies down.

Johns Hopkins Medicine researchers, led by Ronald Schnaar, Ph.D., professor of pharmacology at the Johns Hopkins University School of Medicine, found the mystery glycoprotein using mass spectrometry. Called DMBT1S8, the molecule is an isoform, or nearly identical twin, of a known protein, DMBT1, that is believed to play a role in defending the respiratory tract's mucosal cells. DMBT1S8 remained hidden until now because of its close likeness to DMBT1.

The discovery was announced in a study published on August 10, 2020, in the Journal of Allergy and Clinical Immunology.

Newly formed DMBT1 is processed in a pancake-like structure called a Golgi body within a human cell. The Golgi body, which acts as the cell's postal system, delivers proteins and lipids to various locations. Like Clark Kent turning into Superman as it goes through the Golgi body, DMBT1 transforms into DMBT1S8 before leaving the submucosal gland to meet eosinophils and mast cells in the respiratory tract. There, the binding of DMBT1S8 to Siglec-8 halts inflammation and saves the airway from distress.

Based on their study results, the researchers suggest that targeting the newly identified molecule may one day lead to treatments for inflammatory respiratory conditions such as asthma.

STUDY TIES CHRONIC ITCH TO SLEEP LOSS AND POSSIBLE SIGNAL OF INCREASED HEART DISEASE RISK

Media Contact: Sheree-Monet Wisdom, swisdom1@jhmi.edu

Many people suffer from a skin disorder known as chronic pruritic dermatosis, commonly referred to as "chronic itch." Recent research has suggested that chronic itch may be at the root of other related medical conditions, including sleep disturbances. Now, Johns Hopkins Medicine researchers have gone a step further, providing more evidence that the connection between chronic itch and sleep problems exists, and that these patients may be at greater risk for cardiac disease as indicated by elevated levels of a circulating protein sometimes used to predict heart problems.

The researchers presented their findings in the Aug. 19, 2020, issue of the Journal of the American Academy of Dermatology.

Chronic itch has been associated in previous studies with multiple sleep disturbances, including repeated nighttime and early morning awakenings. The resulting loss of quality slumber may lead to fatigue, anxiety and even depression, all of which have lasting and negative overall health impacts.

In its investigation, the Johns Hopkins Medicine team studied 5,560 U.S. adults. Background data on the participants were obtained from the 2005-2006 edition of the National Health and Nutrition Examination Survey (NHANES), a federal database tracking the health and nutritional status of adults and children in the United States over long periods of time. NHANES findings are used to determine the prevalence of major diseases and risk factors for those illnesses.

Participants were surveyed for their current social and demographic status, as well as updates on their medical history and health behaviors. Physical exams were conducted and laboratory specimens, including blood and urine, were taken.

The study results confirmed the link between sleep disturbances and chronic itch, showing that pruritic dermatosis was associated with trouble falling asleep one to five times per month, waking during the night or too early in the morning, leg jerks and cramps while sleeping, and the impacts of fatigue (such as feeling overly sleepy during the day and having difficulty with memory).

The Johns Hopkins Medicine experts also found that these disturbances were more likely in those with elevated blood levels of C-reactive protein (CRP). Produced by the liver, CRP is sent into the bloodstream in response to inflammation. It has been used as a blood test to predict cardiovascular disease when other biomarkers, primarily low-density lipoprotein (LDL) cholesterol, are at normal levels.

"CRP levels were 52.8% higher among chronic pruritic dermatosis patients reporting trouble sleeping compared to those who did not," says Shawn Kwatra, M.D., assistant professor of dermatology at the Johns Hopkins University School of Medicine. "This suggests that along with the reduction in quality of life brought on by chronic itch, these patients also may have heightened cardiometabolic risk."

Kwatra adds "that while chronic pain is well recognized in the medical community, many patients with chronic itch quietly suffer because there are no approved therapies for the condition."

Credit: 
Johns Hopkins Medicine

Did our early ancestors boil their food in hot springs?

Some of the oldest remains of early human ancestors have been unearthed in Olduvai Gorge, a rift valley setting in northern Tanzania where anthropologists have discovered fossils of hominids that existed 1.8 million years ago. The region has preserved many fossils and stone tools, indicating that early humans settled and hunted there.

Now a team led by researchers at MIT and the University of Alcalá in Spain has discovered evidence that hot springs may have existed in Olduvai Gorge around that time, near early human archaeological sites. The proximity of these hydrothermal features raises the possibility that early humans could have used hot springs as a cooking resource, for instance to boil fresh kills, long before humans are thought to have used fire as a controlled source for cooking.

"As far as we can tell, this is the first time researchers have put forth concrete evidence for the possibility that people were using hydrothermal environments as a resource, where animals would've been gathering, and where the potential to cook was available," says Roger Summons, the Schlumberger Professor of Geobiology in MIT's Department of Earth, Atmospheric, and Planetary Sciences (EAPS).

Summons and his colleagues have published their findings today in the Proceedings of the National Academy of Sciences. The study's lead author is Ainara Sistiaga, a Marie Sklodowska-Curie fellow based at MIT and the University of Copenhagen. The team includes Fatima Husain, a graduate student in EAPS, along with archaeologists, geologists, and geochemists from the University of Alcalá and the University of Valladolid, in Spain; the University of Dar es Salaam, in Tanzania; and Pennsylvania State University.

An unexpected reconstruction

In 2016, Sistiaga joined an archaeological expedition to Olduvai Gorge, where researchers with the Olduvai Paleoanthropology and Paleoecology Project were collecting sediments from a 3-kilometer-long layer of exposed rock that was deposited around 1.7 million years ago. This geologic layer was striking because its sandy composition was markedly different from the dark clay layer just below, which was deposited 1.8 million years ago.

"Something was changing in the environment, so we wanted to understand what happened and how that impacted humans," says Sistiaga, who had originally planned to analyze the sediments to see how the landscape changed in response to climate and how these changes may have affected the way early humans lived in the region.

It's thought that around 1.7 million years ago, East Africa underwent a gradual aridification, moving from a wetter, tree-populated climate to drier, grassier terrain. Sistiaga brought back sandy rocks collected from the Olduvai Gorge layer and began analyzing them in Summons' lab for certain lipids that can contain residues of leaf waxes, offering clues to the kind of vegetation present at the time.

"You can reconstruct something about the plants that were there by the carbon numbers and the isotopes, and that's what our lab specializes in, and why Ainara was doing it in our lab," Summons says. "But then she discovered other classes of compounds that were totally unexpected."

An unambiguous sign

Within the sediments she brought back, Sistiaga came across lipids that looked completely different from the plant-derived lipids she knew. She took the data to Summons, who realized that they were a close match with lipids produced not by plants, but by specific groups of bacteria that he and his colleagues had reported on, in a completely different context, nearly 20 years ago.

The lipids that Sistiaga extracted from sediments deposited 1.7 million years ago in Tanzania were the same lipids produced by modern bacteria that Summons and his colleagues had previously studied in the United States, in the hot springs of Yellowstone National Park.

One specific bacterium, Thermocrinis ruber, is a hyperthermophilic organism that will only thrive in very hot waters, such as those found in the outflow channels of boiling hot springs.

"They won't even grow unless the temperature is above 80 degrees Celsius [176 degrees Fahrenheit]," Summons says. "Some of the samples Ainara brought back from this sandy layer in Olduvai Gorge had these same assemblages of bacterial lipids that we think are unambiguously indicative of high-temperature water."

That is, it appears that heat-loving bacteria similar to those Summons had worked on more than 20 years ago in Yellowstone may also have lived in Olduvai Gorge 1.7 million years ago. By extension, the team proposes, high-temperature features such as hot springs and hydrothermal waters could also have been present.

"It's not a crazy idea that, with all this tectonic activity in the middle of the rift system, there could have been extrusion of hydrothermal fluids," notes Sistiaga, who says that Olduvai Gorge is a geologically active tectonic region that has upheaved volcanoes over millions of years -- activity that could also have boiled up groundwater to form hot springs at the surface.

The region where the team collected the sediments is adjacent to sites of early human habitation featuring stone tools, along with animal bones. It is possible, then, that nearby hot springs may have enabled hominins to cook food such as meat and certain tough tubers and roots.

"Why wouldn't you eat it?"

Exactly how early humans may have cooked with hot springs is still an open question. They could have butchered animals and dipped the meat in hot springs to make it more palatable. In a similar way, they could have boiled roots and tubers, much as one cooks raw potatoes, to make them more easily digestible. Animals could also have met their demise by falling into the hydrothermal waters, where early humans could have fished them out as a precooked meal.

"If there was a wildebeest that fell into the water and was cooked, why wouldn't you eat it?" Sistiaga poses.

While there is currently no sure-fire way to establish whether early humans indeed used hot springs to cook, the team plans to look for similar lipids, and signs of hydrothermal reservoirs, in other layers and locations throughout Olduvai Gorge, as well as near other sites in the world where human settlements have been found.

"We can prove in other sites that maybe hot springs were present, but we would still lack evidence of how humans interacted with them. That's a question of behavior, and understanding the behavior of extinct species almost 2 million years ago is very difficult, Sistiaga says. "I hope we can find other evidence that supports at least the presence of this resource in other important sites for human evolution."

Credit: 
Massachusetts Institute of Technology