Tech

Mouthwashes could reduce the risk of coronavirus transmission

SARS-CoV-2 viruses can be inactivated using certain commercially available mouthwashes. This was demonstrated in cell culture experiments by virologists from Ruhr-Universität Bochum together with colleagues from Jena, Ulm, Duisburg-Essen, Nuremberg and Bremen. High viral loads can be detected in the oral cavity and throat of some Covid-19 patients. The use of mouthwashes that are effective against SARS-CoV-2 could thus help to reduce the viral load and possibly the risk of coronavirus transmission over the short term. This could be useful, for example, prior to dental treatments. However, mouth rinses are not suitable for treating Covid-19 infections or protecting yourself against catching the virus.

The results of the study are described by the team headed by Toni Meister, Professor Stephanie Pfänder and Professor Eike Steinmann from the Bochum-based Molecular and Medical Virology research group in the Journal of Infectious Diseases, published online on 29 July 2020. Clinical trials to verify the laboratory results are still pending.

Eight mouthwashes in a cell culture test

The researchers tested eight mouthwashes with different ingredients that are available in pharmacies or drugstores in Germany. They mixed each mouthwash with virus particles and an interfering substance, which was intended to recreate the effect of saliva in the mouth. The mixture was then shaken for 30 seconds to simulate the effect of gargling. They then used Vero E6 cells, which are particularly receptive to SARS-CoV-2, to determine the virus titer. In order to assess the efficacy of the mouthwashes, the researchers also treated the virus suspensions with cell culture medium instead of the mouthwash before adding them to the cell culture.

All of the tested preparations reduced the initial virus titer. Three mouthwashes reduced it to such an extent that no virus could be detected after an exposure time of 30 seconds. Whether this effect is confirmed in clinical practice and how long it lasts must be investigated in further studies.

The authors point out that mouthwashes are not suitable for treating Covid-19. "Gargling with a mouthwash cannot inhibit the production of viruses in the cells," explains Toni Meister, "but could reduce the viral load in the short term where the greatest potential for infection comes from, namely in the oral cavity and throat - and this could be useful in certain situations, such as at the dentist or during the medical care of Covid-19 patients."

Clinical studies in progress

The Bochum group is examining the possibilities of a clinical study on the efficacy of mouthwashes against SARS-CoV-2 viruses, during which the scientists want to test whether the effect can also be detected in patients and how long it lasts. Similar studies are already underway in San Francisco; the Bochum team is in contact with the American researchers.

Credit: 
Ruhr-University Bochum

From nanocellulose to gold

image: Nanocellulose decorated with metal nanoparticles.

Image: 
Magnus Johansson

When nanocellulose is combined with various types of metal nanoparticles, materials are formed with many new and exciting properties. They may be antibacterial, change colour under pressure, or convert light to heat.

"To put it simply, we make gold from nanocellulose", says Daniel Aili, associate professor in the Division of Biophysics and Bioengineering at the Department of Physics, Chemistry and Biology at Linköping University.

The research group, led by Daniel Aili, has used a biosynthetic nanocellulose produced by bacteria and originally developed for wound care. The scientists have subsequently decorated the cellulose with metal nanoparticles, principally silver and gold. The particles, no larger than a few billionths of a metre, are first tailored to give them the properties desired, and then combined with the nanocellulose.

"Nanocellulose consists of thin threads of cellulose, with a diameter approximately one thousandth of the diameter of a human hair. The threads act as a three-dimensional scaffold for the metal particles. When the particles attach themselves to the cellulose, a material that consists of a network of particles and cellulose forms", Daniel Aili explains.

The researchers can determine with high precision how many particles will attach to the cellulose, and which metals they consist of. They can also mix particles of different metals and with different shapes - spherical, elliptical and triangular.

In the first part of a scientific article published in Advanced Functional Materials, the group describes the process and explains why it works as it does. The second part focusses on several areas of application.

One exciting phenomenon is the way in which the properties of the material change when pressure is applied. Optical phenomena arise when the particles approach each other and interact, and the material changes colour. As the pressure increases, the material eventually takes on a golden appearance.

"We saw that the material changed colour when we picked it up in tweezers, and at first we couldn't understand why", says Daniel Aili.

The scientists have named the phenomenon "the mechanoplasmonic effect", and it has turned out to be very useful. A closely related application is in sensors, since it is possible to read the sensor with the naked eye. An example: If a protein sticks to the material, it no longer changes colour when placed under pressure. If the protein is a marker for a particular disease, the failure to change colour can be used in diagnosis. If the material changes colour, the marker protein is not present.

Another interesting phenomenon is displayed by a variant of the material that absorbs light from a much broader spectrum than visible light alone and generates heat. This property can be used both in energy-based applications and in medicine.

"Our method makes it possible to manufacture composites of nanocellulose and metal nanoparticles that are soft and biocompatible materials for optical, catalytic, electrical and biomedical applications. Since the material is self-assembling, we can produce complex materials with completely new well-defined properties," Daniel Aili concludes.

Credit: 
Linköping University

Grasshopper jumping on Bloch sphere finds new quantum insights

New research at the University of Warwick has (pardon the pun) put a new spin on a mathematical analogy involving a jumping grasshopper and its ideal lawn shape. This work could help us understand the spin states of quantum-entangled particles.

The grasshopper problem was devised by physicists Olga Goulko (then at UMass Amherst), Adrian Kent and Damián Pitalúa-García (Cambridge). They asked for the ideal lawn shape that would maximize the chance that a grasshopper, starting from a random position on the lawn and jumping a fixed distance in a random direction, lands back on the lawn. Intuitively one might expect the answer to be a circular lawn, at least for small jumps. But Goulko and Kent actually proved otherwise: various shapes from a cogwheel pattern to some disconnected patches of lawn performed better for different jump sizes.
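The setup is easy to explore numerically. The sketch below is a simple Monte Carlo estimate (not the method used in the paper, which treats the problem analytically and numerically) of the grasshopper's landing probability on a circular lawn of unit area, illustrating why a disc does well for short jumps but poorly for long ones:

```python
import math
import random

def circular_lawn_probability(jump, n_trials=100_000, seed=1):
    """Monte Carlo estimate of the chance that a grasshopper, starting at a
    uniformly random point on a circular lawn of unit area and jumping a
    fixed distance `jump` in a uniformly random direction, lands back on
    the lawn. Illustrative sketch only."""
    rng = random.Random(seed)
    radius = 1 / math.sqrt(math.pi)  # circle of unit area
    hits = 0
    for _ in range(n_trials):
        # Area-uniform start point inside the disc (sqrt trick for radius).
        r = radius * math.sqrt(rng.random())
        phi = rng.uniform(0.0, 2.0 * math.pi)
        x, y = r * math.cos(phi), r * math.sin(phi)
        # Fixed-length jump in a uniformly random direction.
        theta = rng.uniform(0.0, 2.0 * math.pi)
        if math.hypot(x + jump * math.cos(theta),
                      y + jump * math.sin(theta)) <= radius:
            hits += 1
    return hits / n_trials

# Short jumps nearly always return to the disc; longer jumps often miss.
print(circular_lawn_probability(0.05))
print(circular_lawn_probability(0.50))
```

Finding the shape that beats the disc for a given jump size is the hard part of the problem; the simulation above only evaluates one candidate lawn.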

Beyond surprises about lawn shapes and grasshoppers, the research provided useful insight into Bell-type inequalities relating probabilities of the spin states of two separated quantum-entangled particles. The Bell inequality, proved by physicist John Stewart Bell in 1964 and later generalised in many ways, demonstrated that no combination of classical theories with Einstein's special relativity is able to explain the predictions (and later actual experimental observations) of quantum theory.

The next step was to test the grasshopper problem on a sphere. The Bloch sphere is a geometrical representation of the state space of a single quantum bit. A great circle on the Bloch sphere defines linear polarization measurements, which are easily implemented and commonly used in Bell and other cryptographic tests. Because of the antipodal symmetry of the Bloch sphere, a lawn covers half the total surface area, and the natural hypothesis would be that the ideal lawn is hemispherical. Researchers in the Department of Computer Science at the University of Warwick, in collaboration with Goulko and Kent, investigated this problem and found that it too requires non-intuitive lawn patterns. The main result is that the hemisphere is never optimal, except in the special case when the grasshopper needs exactly an even number of jumps to go around the equator. This research shows that there are previously unknown types of Bell inequalities.

One of the paper's authors - Dmitry Chistikov from the Centre for Discrete Mathematics and its Applications (DIMAP) and the Department of Computer Science, at the University of Warwick, commented:

"Geometry on the sphere is fascinating. The sine rule, for instance, looks nicer for the sphere than the plane, but this didn't make our job easy."

The other author from Warwick, Professor Mike Paterson FRS, said:

"Spherical geometry makes the analysis of the grasshopper problem more complicated. Dmitry, being from the younger generation, used a 1948 textbook and pen-and-paper calculations, whereas I resorted to my good old Mathematica methods."

The paper, entitled 'Globe-hopping', is published in the Proceedings of the Royal Society A. It is interdisciplinary work involving mathematics and theoretical physics, with applications to quantum information theory.

The research team of Dmitry Chistikov and Mike Paterson (both from the University of Warwick), Olga Goulko (Boise State University, USA) and Adrian Kent (Cambridge) says that the next steps towards even more insight into quantum spin-state probabilities are to look for the most grasshopper-friendly lawns on the sphere, or even to let the grasshopper boldly go jumping in three or more dimensions.

Credit: 
University of Warwick

Past evidence supports complete loss of Arctic sea-ice by 2035

A new study, published this week in the journal Nature Climate Change, supports predictions that the Arctic could be free of sea ice by 2035.

High temperatures in the Arctic during the last interglacial - the warm period around 127,000 years ago - have puzzled scientists for decades. Now the UK Met Office's Hadley Centre climate model has enabled an international team of researchers to compare Arctic sea ice conditions during the last interglacial with present day. Their findings are important for improving predictions of future sea ice change.

During spring and early summer, shallow pools of water form on the surface of Arctic sea-ice. These 'melt ponds' are important for how much sunlight is absorbed by the ice and how much is reflected back into space. The new Hadley Centre model, the UK's most advanced physical representation of the Earth's climate and a critical tool for climate research, incorporates sea-ice and melt ponds.

Using the model to look at Arctic sea ice during the last interglacial, the team concludes that the impact of intense springtime sunshine created many melt ponds, which played a crucial role in sea-ice melt. A simulation of the future using the same model indicates that the Arctic may become sea ice-free by 2035.

Joint lead author Dr Maria Vittoria Guarino, Earth System Modeller at British Antarctic Survey (BAS), says:

"High temperatures in the Arctic have puzzled scientists for decades. Unravelling this mystery was technically and scientifically challenging. For the first time, we can begin to see how the Arctic became sea ice-free during the last interglacial. The advances made in climate modelling mean that we can create a more accurate simulation of the Earth's past climate, which, in turn, gives us greater confidence in model predictions for the future."

Dr Louise Sime, the group head of the Palaeoclimate group and joint lead author at BAS, says:

"We know the Arctic is undergoing significant changes as our planet warms. By understanding what happened during Earth's last warm period we are in a better position to understand what will happen in the future. The prospect of loss of sea-ice by 2035 should really be focussing all our minds on achieving a low-carbon world as soon as humanly feasible."

Dr David Schroeder and Prof Danny Feltham from the University of Reading, who developed and co-led the implementation of the melt pond scheme in the climate model, say:

"This shows just how important sea-ice processes like melt ponds are in the Arctic, and why it is crucial that they are incorporated into climate models."

Credit: 
British Antarctic Survey

The brains of nonpartisans are different from those who register to vote with a party

The brains of people with no political allegiance are different from those who strongly support one party, major new research shows.

The largest functional neuroimaging study of its kind to date shows nonpartisan voters process risk-related information differently than partisans.

The findings show nonpartisan voters are a distinct group, not just people reluctant to divulge their political preferences.

Experts found functional brain processing differences between partisans and nonpartisans in parts of the brain which help people to socialize and engage with others - the right medial temporal pole, orbitofrontal/medial prefrontal cortex, and right ventrolateral prefrontal cortex. As people completed a simple risk-related decision-making task there were differences in the blood flow to these regions of the brain between the two groups.

Dr Darren Schreiber, from the University of Exeter, who led the study, said: "There is skepticism about the existence of nonpartisan voters, that they are just people who don't want to state their preferences. But we have shown their brain activity is different, even aside from politics. We think this has important implications for political campaigning - nonpartisans need to be considered a third voter group.

"In the USA 40 percent of people are thought to be nonpartisan voters. Previous research shows negative campaigning deters them from voting. This exploratory study suggests US politicians need to treat swing voters differently, and positive campaigning may be important in winning their support. While heated rhetoric may appeal to a party's base, it can drive nonpartisans away from politics altogether."

The study, published in the Journal of Elections, Public Opinion, and Parties, was conducted by Dr Schreiber, Gregory A. Fonzo from the University of Texas, Alan N. Simmons and Taru Flagan from the University of California San Diego, Christopher T. Dawes from New York University, and Martin P. Paulus from the Laureate Institute for Brain Research. The team of political scientists, neuroscientists, and psychiatrists scanned the brains of 110 participants in the USA with magnetic resonance imaging (MRI) while they completed the task. Some were registered with one of the two main parties and others were not. The differences in brain activity came when people had to choose whether to make a safe or risky decision, suggesting nonpartisan voters engage differently with nonpolitical tasks.

The experts now hope to carry out more research to discover what the differences in brain activity show about the personalities and social traits of nonpartisan voters.

During the brain scanning the participants, who lived in San Diego County, had to decide between options which would have provided a guaranteed payoff or those that provided a chance for either losses or gains.

After the experiment the researchers matched participants with publicly available voting records to see if they were registered as Republicans or Democrats, or with no party preference. In total 73 were partisan - 56 Democrats and 17 Republicans - and 37 were nonpartisan.

The right medial temporal pole, orbitofrontal/medial prefrontal cortex, and right ventrolateral prefrontal cortex have been shown to be important for human social connections in hundreds of brain imaging studies. They help people to connect to their social groups, understand the thoughts of others, and regulate the reactions we have to others.

Credit: 
University of Exeter

Discovery of massless electrons in phase-change materials provides next step for future electronics

image: (Left) Crystal structure of the intermixed crystalline phase of the phase-change compound GeSb2Te4. (Middle) Angle-resolved photoemission spectrum of crystalline GeSb2Te4, showing the linearly dispersive band crossing the Fermi level. (Right) Schematic band structure of crystalline GeSb2Te4 based on this study.

Image: 
Akio Kimura, Hiroshima University

Researchers have found electrons that behave as if they have no mass, called Dirac electrons, in a compound used in rewritable discs, such as CDs and DVDs. The discovery of "massless" electrons in this phase-change material could lead to faster electronic devices.

The international team published their results on July 6 in ACS Nano, a journal of the American Chemical Society.

The compound, GeSb2Te4, is a phase-change material, meaning its atomic structure shifts from amorphous to crystalline under heat. Each structure has individual properties and is reversible, making the compound an ideal material to use in electronic devices where information can be written and rewritten several times.

"Phase-change materials have attracted a great deal of attention owing to the sharp contrast in optical and electrical properties between their two phases," said paper author Akio Kimura, professor in the Department of Physical Sciences in the Graduate School of Science and the Graduate School of Advanced Science and Engineering at Hiroshima University. "The electronic structure in the amorphous phase has already been addressed, but the experimental study of the electronic structure in the crystalline phase had not yet been investigated."

The researchers found that the crystalline phase of GeSb2Te4 has Dirac electrons, meaning it behaves similarly to graphene, a conducting material that consists of a single layer of carbon atoms. They also found that the surface of the crystalline structure shares characteristics with a topological insulator, where the internal structure remains static while the surface conducts electrical activity.

"The amorphous phase shows a semiconducting behavior with a large electrical resistivity while the crystalline phase behaves like a metal with a much lower electrical resistivity," said Munisa Nurmamat, paper author and assistant professor in the Department of Physical Sciences in the Graduate School of Science and the Graduate School of Advanced Science and Engineering at Hiroshima University. "The crystalline phase of GeSb2Te4 can be viewed as a 3D analogue of graphene."

Graphene is already considered by researchers to be a high-speed conducting material, according to Nurmamat and Kimura, but its inherently low on- and off-current ratio limits how it is applied in electronic devices. As a 3D version of graphene, GeSb2Te4 combines speed with flexibility to engineer the next generation of electrical switching devices.

Credit: 
Hiroshima University

Agriculture replaces fossil fuels as largest human source of sulfur in the environment

image: An agricultural field is sprayed with fertilizer or pesticides.

Image: 
John Lambeth / Pexels

A new paper out today in Nature Geoscience identifies fertilizer and pesticide applications to croplands as the largest source of sulfur in the environment--up to 10 times higher than the peak sulfur load seen in the second half of the 20th century, during the days of acid rain.

As a result, University of Colorado Boulder researchers recommend greatly expanded monitoring of sulfur and examination of the possible negative impacts of this increase, including rising levels of mercury in wetlands, soil degradation and a higher risk of asthma for populations in agricultural areas.

"Sulfur in agriculture is used in many different forms, and we haven't studied broadly how those different forms react in the soil," said Eve-Lyn Hinckley, lead author of the study, assistant professor of environmental studies and fellow at the Institute of Arctic and Alpine Research (INSTAAR) at the University of Colorado Boulder. "No one has looked comprehensively at the environmental and human health consequences of these [agricultural] additions."

Sulfur is a naturally occurring element and an important plant nutrient, helping with the uptake of nitrogen. It is mined from underground, both through fossil fuel extraction and for the creation of fertilizers and pesticides. But sulfur is also highly reactive, meaning it quickly undergoes chemical transformations once its stable form reaches the surface - affecting the health of ecosystems and interacting with heavy metals to form compounds that pose a danger to wildlife and people.

Historically, coal-fired power plants were the largest source of reactive sulfur to the biosphere--leading to acid rain in the 1960s and 1970s, and the degradation of forest and aquatic ecosystems across the northeastern U.S. and Europe. Research on this issue prompted the Clean Air Act and its amendments, which regulated air pollution and drove sulfur levels from atmospheric sources down to pre-industrial levels.

"This is a very different problem than the acid rain days," said Hinckley. "We've gone from widespread atmospheric deposition over remote forests to targeted additions of reactive sulfur to regional croplands. These amounts are much higher than what we saw at the peak of acid rain."

Unknown risks

A majority of the research that examines excess nutrient use in agriculture has been in respect to nitrogen and phosphorus. Scientists have known for a long time that these two chemicals can cause detrimental effects on the environment, including increased greenhouse gas emissions and algae blooms in downstream waters.

Sulfur has long been applied to agricultural lands to improve the production and health of crops, serving as both a fertilizer and pesticide.

"We're moving it through our environment and ecosystems at a much faster rate than it would otherwise," said Hinckley.

Some agricultural industries around the world have even been putting more sulfur directly on their fields. So far, only isolated studies have given scientists a glimpse into the effects of excess sulfur on soil health and surrounding waters.

In the Florida Everglades, long-term research by the U.S. Geological Survey linked large applications of sulfur to sugarcane to the production of methyl mercury in the Everglades--a potent neurotoxin that accumulates as it moves up the food chain, affecting each predator more than the prey it consumes. This threatens a variety of local wildlife that eat fish, as well as humans.

So the researchers examined trends in sulfur applications across multiple important crops in the U.S.: corn in the Midwest; sugarcane in Florida; and wine grapes in California. Their models of sulfur in surface waters showed that in areas that are recovering from the impacts of acid rain, the amount of sulfur is again increasing.

The researchers predict that increasing levels of sulfur will continue in many croplands around the world, including places like China and India that are still working to regulate fossil fuel emissions.

Hinckley emphasized that simply documenting the impacts of increased sulfur on the environment and human health isn't enough--increased monitoring and research should include farmers, regulatory agencies and land managers to increase collaboration and collective action on the issue.

"We have an imperative to understand the impact that we're having on the environment," said Hinckley. "And then we need to work together towards solutions to mitigate those effects."

Credit: 
University of Colorado at Boulder

Nepal lockdown halved health facility births and increased stillbirths and newborn deaths

COVID-19 response has resulted in major reductions in health facility births in Nepal and widened inequalities, with significantly increased institutional stillbirth and neonatal mortality, according to a new study in The Lancet Global Health.

The research was led by Dr Ashish KC and colleagues in Nepal, together with Uppsala University, Sweden, and the London School of Hygiene & Tropical Medicine. It is the first published study with primary data on the impact of a COVID-19 lockdown on hospital births, measuring both stillbirths and newborn deaths.

Compared to before lockdown, the number of births in the country's health institutions fell by approximately 49.9%, with increased inequality by ethnicity. The stillbirth rate in the hospitals increased by 50%, from 14 per 1,000 total births before lockdown to 21 per 1,000 total births during it.

Professor Joy Lawn, co-senior author from the London School of Hygiene & Tropical Medicine, said: "The COVID-19 outbreak has brought unprecedented disruptions to health services, with the risk being highest in resource-limited countries, and to the most vulnerable. Babies can die in minutes if there are delays for safe care. This study provided the first published primary data on the extent of this risk during the COVID-19 lockdowns. So far we have only had snapshots from surveys and modelled estimates."

Although prioritised as an essential core health service, some surveys indicate that maternal and newborn health services are being affected due to COVID-19 restrictions in low-income and middle-income countries. Both access and quality of care might be deteriorating, risking deaths and reversals of hard-won gains over the past two decades.

Nepal is one of a small number of low-income countries believed to be on track for Sustainable Development Goal targets for maternal and newborn and child health by 2030. Over the last three decades Nepal has reduced maternal mortality by 76%, and newborn mortality by 62%. Future progress is now threatened, and each day lives are at risk.

The first case of COVID-19 was detected in Nepal on January 23, 2020. A countrywide lockdown was announced on March 21, 2020, with directives to frontline health-care providers to prepare for cases, and disruptions in the health system and more widely, for example to transport systems.

This study involved around 22,000 births in Nepal in nine hospitals across all seven provinces, including 11% of all births nationally, and covered 12.5 weeks before the national lockdown and 9.5 weeks during the lockdown. Very detailed data, including observations, were being collected as part of a national study on improving quality of care at birth.

As well as a halving of the numbers of institutional births, the research teams found the risk of neonatal death increased more than 3-fold, from 13 per 1,000 livebirths to 40 per 1,000 livebirths during lockdown.
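The rate changes quoted above can be checked with simple arithmetic on the published figures (a sketch using only the numbers reported here, not the study's underlying data):

```python
# Mortality rates reported in the Nepal study, as quoted above (per 1,000).
stillbirths_before, stillbirths_during = 14, 21   # per 1,000 total births
neonatal_before, neonatal_during = 13, 40         # per 1,000 livebirths

def percent_increase(before, after):
    """Relative increase, as a percentage of the pre-lockdown rate."""
    return (after - before) / before * 100.0

def fold_change(before, after):
    """Ratio of the during-lockdown rate to the pre-lockdown rate."""
    return after / before

print(f"Stillbirths: +{percent_increase(stillbirths_before, stillbirths_during):.0f}%")
print(f"Neonatal deaths: {fold_change(neonatal_before, neonatal_during):.1f}-fold")
```

The stillbirth figures give exactly the 50% increase stated earlier, and 40/13 is just over 3, matching the "more than 3-fold" rise in neonatal deaths.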

Joy Lawn said: "The findings suggest that the national lockdown in Nepal has had a major impact on women and babies through travel restrictions, fear of going to hospitals due to COVID-19, with more complex cases in facilities, delays and reduced quality of care.

"Preterm birth and caesarean section rates rose, and quality of care also fell, notably intrapartum fetal heart rate monitoring and breastfeeding within one hour of birth. One positive finding from our study was that we did see improvements in hand hygiene practices of health workers during childbirth."

"Undoubtedly countries face very tough choices on how to combat COVID-19. However, our findings raise questions on policies regarding strict lockdowns in low-income and middle-income countries during outbreaks. Collateral effects seem to be much more severe than the actual direct effects of SARS-CoV-2 infection, especially so for the most vulnerable in our society, pregnant women and babies. More data are needed, but even more importantly, more action now to protect these services."

The authors acknowledge limitations of the study, including that they did not explore the prevalence or the direct impact of COVID-19 on health outcomes. None of the women admitted to the hospitals were tested for COVID-19, but the prevalence of COVID-19 among the study population at the time was likely to be very low.

Credit: 
London School of Hygiene & Tropical Medicine

NASA infrared data confirms depression became Tropical Storm Elida

image: On Aug. 9 at 4:55 a.m. EDT (0855 UTC) NASA's Aqua satellite gathered temperature information about Elida's cloud tops. Aqua found the most powerful thunderstorms (yellow) were around the center of circulation, where temperatures were as cold as or colder than minus 80 degrees Fahrenheit (minus 62.2 Celsius). That area was surrounded by thunderstorms (red) slightly less high in the atmosphere, but still powerful rainmakers with cloud top temperatures as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 Celsius).

Image: 
NASA/NRL

After Tropical Depression 09E formed near the coast of southwestern Mexico, infrared data from NASA's Aqua satellite helped confirm its transition to a tropical storm.

On Aug. 9 at 11 p.m. EDT (Aug. 10 at 0300 UTC), Tropical Depression 09E formed near latitude 14.7 degrees north and longitude 102.6 degrees west, about 315 miles (510 km) south-southeast of Manzanillo, Mexico. By 5 a.m. EDT (0900 UTC), infrared imagery helped confirm that 09E strengthened into a tropical storm and was renamed Elida.

The infrared imagery gathered on Aug. 9 at 4:55 a.m. EDT (0855 UTC) that helped make that confirmation included data from the Moderate Resolution Imaging Spectroradiometer or MODIS instrument. MODIS flies aboard NASA’s Aqua satellite. MODIS gathered temperature information about Elida’s cloud tops. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

MODIS found the most powerful thunderstorms were around the center of circulation, where temperatures were as cold as or colder than minus 80 degrees Fahrenheit (minus 62.2 Celsius). That area was surrounded by thunderstorms slightly less high in the atmosphere, but still powerful rainmakers with cloud top temperatures as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 Celsius). Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.

At 5 a.m. EDT (0900 UTC) on Aug. 10, the National Hurricane Center (NHC) said the center of Tropical Storm Elida was located near latitude 18.3 degrees north and longitude 108.8 degrees west. That is about 145 miles (235 km) east-southeast of Socorro Island, Mexico. Elida is moving toward the west-northwest near 15 mph (24 kph), and this general motion is expected to continue through Wednesday. The estimated minimum central pressure is 995 millibars.

Maximum sustained winds are near 65 mph (100 kph) with higher gusts. Tropical-storm-force winds extend outward up to 60 miles (95 km) from the center. Strengthening is forecast during the next day or two, and Elida is expected to become a hurricane later today.
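The paired US and metric figures in these advisories follow from standard unit conversions. A quick check (a sketch for the values quoted above; the advisory itself rounds to convenient values such as 100 kph and 235 km):

```python
def f_to_c(deg_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (deg_f - 32.0) * 5.0 / 9.0

def mph_to_kph(mph):
    """Statute miles per hour to kilometres per hour."""
    return mph * 1.609344

def miles_to_km(miles):
    """Statute miles to kilometres."""
    return miles * 1.609344

# Cloud-top temperature thresholds from the infrared imagery.
print(round(f_to_c(-80), 1))   # strongest thunderstorms
print(round(f_to_c(-70), 1))   # surrounding, slightly lower storm tops
# Advisory wind speed and distance before rounding.
print(round(mph_to_kph(65), 1))
print(round(miles_to_km(145), 1))
```

For instance, -80 degrees Fahrenheit converts to -62.2 degrees Celsius, exactly as stated in the imagery caption.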

NHC cautioned, “Swells generated by Elida are expected to affect portions of the coast of west-central Mexico and the southern Baja California peninsula during the next couple of days. These swells are likely to cause life-threatening surf and rip current conditions.”

Tropical cyclones/hurricanes are the most powerful weather events on Earth. NASA’s expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

For updated forecasts, visit: www.nhc.noaa.gov

Credit: 
NASA/Goddard Space Flight Center

NIST's SAMURAI measures 5G communications channels precisely

image: Rodney Leonhardt, Alec Weiss and Jeanne Quimby with NIST's SAMURAI, a portable measurement system to support design and repeatable laboratory testing of 5G wireless communications devices with unprecedented accuracy.

Image: 
Hammer/NIST

Engineers at the National Institute of Standards and Technology (NIST) have developed a flexible, portable measurement system to support design and repeatable laboratory testing of fifth-generation (5G) wireless communications devices with unprecedented accuracy across a wide range of signal frequencies and scenarios.

The system is called SAMURAI, short for Synthetic Aperture Measurements of Uncertainty in Angle of Incidence. The system is the first to offer 5G wireless measurements with accuracy that can be traced to fundamental physical standards -- a key feature because even tiny errors can produce misleading results. SAMURAI is also small enough to be transported to field tests.

Mobile devices such as cellphones, consumer Wi-Fi devices and public-safety radios now mostly operate at electromagnetic frequencies below 3 gigahertz (GHz) with antennas that radiate equally in all directions. Experts predict 5G technologies could boost data rates a thousandfold by using higher, "millimeter-wave" frequencies above 24 GHz and highly directional, actively changing antenna patterns. Such active antenna arrays help to overcome losses of these higher-frequency signals during transmission. 5G systems also send signals over multiple paths simultaneously -- so-called spatial channels -- to increase speed and overcome interference.

Many instruments can measure some aspects of directional 5G device and channel performance. But most focus on collecting quick snapshots over a limited frequency range to provide a general overview of a channel, whereas SAMURAI provides a detailed portrait. In addition, many instruments are so physically large that they can distort millimeter-wave signal transmissions and reception.

Described at a conference on Aug. 7, SAMURAI is expected to help resolve many unanswered questions surrounding 5G's use of active antennas, such as what happens when high data rates are transmitted across multiple channels at once. The system will help improve theory, hardware and analysis techniques to provide accurate channel models and efficient networks.

"SAMURAI provides a cost-effective way to study many millimeter-wave measurement issues, so the technique will be accessible to academic labs as well as instrumentation metrology labs," NIST electronics engineer Kate Remley said. "Because of its traceability to standards, users can have confidence in the measurements. The technique will allow better antenna design and performance verification, and support network design."

SAMURAI measures signals across a wide frequency range, currently up to 50 GHz, extending to 75 GHz in the coming year. The system got its name because it measures received signals at many points over a grid or virtual "synthetic aperture." This allows reconstruction of incoming energy in three dimensions -- including the angles of the arriving signals -- which is affected by many factors, such as how the signal's electric field reflects off of objects in the transmission path.
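The synthetic-aperture idea can be illustrated in a few lines: sample an incoming plane wave at many grid points, then scan candidate angles for the best match. This is a minimal sketch of conventional beamforming, not NIST's actual SAMURAI processing chain; the carrier frequency, aperture size, and arrival angle are made-up values chosen for illustration.

```python
import numpy as np

c = 3e8                      # speed of light, m/s
freq = 28e9                  # an illustrative millimeter-wave carrier, Hz
lam = c / freq               # wavelength, m

# Receive positions along a 1-D "synthetic aperture": 16 points, half a wavelength apart.
x = np.arange(16) * lam / 2

true_angle = np.deg2rad(25)  # direction the plane wave arrives from (assumed)

# Complex samples of the incoming plane wave at each grid point.
samples = np.exp(1j * 2 * np.pi * x * np.sin(true_angle) / lam)

# Scan candidate angles and correlate against ideal steering responses.
angles = np.deg2rad(np.linspace(-90, 90, 1801))
steering = np.exp(1j * 2 * np.pi * np.outer(np.sin(angles), x) / lam)
power = np.abs(steering.conj() @ samples)

estimate = np.rad2deg(angles[np.argmax(power)])
print(f"estimated angle of arrival: {estimate:.1f} degrees")  # ~25.0
```

A real system measures both magnitude and phase of the field at each robot-positioned grid point, so the same correlation idea extends to 3D apertures and multiple simultaneous arrival paths.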

SAMURAI can be applied to a variety of tasks from verifying the performance of wireless devices with active antennas to measuring reflective channels in environments where metallic objects scatter signals. NIST researchers are currently using SAMURAI to develop methods for testing industrial Internet of Things devices at millimeter-wave frequencies.

The basic components are two antennas to transmit and receive signals, instrumentation with precise timing synchronization to generate radio transmissions and analyze reception, and a six-axis robotic arm that positions the receive antenna to the grid points that form the synthetic aperture. The robot ensures accurate and repeatable antenna positions and traces out a variety of reception patterns in 3D space, such as cylindrical and hemispherical shapes. A variety of small metallic objects such as flat plates and cylinders can be placed in the test setup to represent buildings and other real-world impediments to signal transmission. To improve positional accuracy, a system of 10 cameras is also used to track the antennas and measure the locations of objects in the channel that scatter signals.

The system is typically attached to an optical table measuring 5 feet by 14 feet (1.5 meters by 4.3 meters). But the equipment is portable enough to be used in mobile field tests and moved to other laboratory settings. Wireless communications research requires a mix of lab tests -- which are well controlled to help isolate specific effects and verify system performance -- and field tests, which capture the range of realistic conditions.

Measurements can require hours to complete, so all aspects of the (stationary) channel are recorded for later analysis. These values include environmental factors such as temperature and humidity, location of scattering objects, and drift in accuracy of the measurement system.

The NIST team developed SAMURAI with collaborators from the Colorado School of Mines in Golden, Colorado. Researchers have verified the basic operation and are now incorporating uncertainty due to unwanted reflections from the robotic arm, position error and antenna patterns into the measurements.

Credit: 
National Institute of Standards and Technology (NIST)

NASA finds strong storms in developing Tropical Storm Mekkhala

image: On Aug. 9 at 10:20 a.m. EDT (1420 UTC), NASA's Terra satellite found the most powerful thunderstorms (yellow) were around Mekkhala's center of circulation and in several other areas, where temperatures were as cold as or colder than minus 80 degrees Fahrenheit (minus 62.2 Celsius). Those areas were surrounded by thunderstorms (red) slightly less high in the atmosphere, but still powerful rainmakers with cloud top temperatures as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 Celsius).

Image: 
NASA/NRL

After Tropical Depression 07W formed close to the western Philippines, it moved away and strengthened into a tropical storm in the South China Sea. NASA's Terra satellite provided a look at the strength of the storms that make up the tropical cyclone.

On Aug. 9 at 11 p.m. EDT (Aug. 10 at 0300 UTC), Tropical Depression 07W formed near latitude 16.8 degrees north and longitude 118.3 degrees east, about 204 nautical miles northwest of Manila, Philippines.

When it formed, it was close enough to the Philippines to generate warnings. Tropical cyclone wind signal #1 was posted for the western portions of Ilocos Norte and Ilocos Sur, La Union, western Pangasinan, and northern Zambales. Those warnings were dropped by Aug. 10 as the system moved west, away from the Philippines.

Infrared imagery gathered on Aug. 9 at 10:20 a.m. EDT (1420 UTC) was from the Moderate Resolution Imaging Spectroradiometer or MODIS instrument. MODIS flies aboard NASA's Terra satellite. MODIS gathered temperature information about 07W's cloud tops. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

MODIS found the most powerful thunderstorms were around the center of circulation and in several other areas nearby, where temperatures were as cold as or colder than minus 80 degrees Fahrenheit (minus 62.2 Celsius). Those areas were surrounded by thunderstorms slightly less high in the atmosphere, but still powerful rainmakers, with cloud top temperatures as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 Celsius). Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.

Tropical Depression 07W became a tropical storm at 5 a.m. EDT (0900 UTC) on Aug. 10 and was renamed Tropical Storm Mekkhala. Mekkhala is known locally in the Philippines as Ferdie. The storm's maximum sustained winds had increased to 35 knots (40 mph). It was centered near latitude 20.2 degrees north and longitude 118.7 degrees east, about 332 nautical miles south-southwest of Taipei, Taiwan. Mekkhala was moving to the north.

The Joint Typhoon Warning Center expects that Mekkhala will continue to move north before moving ashore in southern China.

Tropical cyclones/hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

By Rob Gutro
NASA's Goddard Space Flight Center

Credit: 
NASA/Goddard Space Flight Center

Stronger rains in warmer climate could lessen heat damage to crops, says study

image: A new study finds that when rain comes down mainly as drizzle, yields of major crops are depressed; when downpours are heavier, yields rise, up to a point. Yields go down severely with the most extreme rainfalls, but these are quite rare. Length of bars represents impact on crops per hour. Heavy rainfall is projected to increase more than extremes in the future, giving a boost to crops.

Image: 
Graphic by Corey Lesk.

Intensified rainstorms predicted for many parts of the United States as a result of warming climate may have a modest silver lining: they could more efficiently water some major crops, and this would at least partially offset the far larger projected yield declines caused by the rising heat itself. The conclusion, which goes against some accepted wisdom, is contained in a new study published this week in the journal Nature Climate Change.

Numerous studies have projected that rising growing-season temperatures will drastically decrease yields of some major U.S. crops, absent adaptive measures. The damage will come both from steadily heightened evaporation of soil moisture due to higher background temperatures, and from sudden desiccation of crops during heat waves. Some studies say that corn, which currently yields about 13 billion bushels a year and plays a major role in the U.S. economy, could nosedive 10 to 30 percent by the mid- to late century. Soy, of which the United States is the world's leading producer, could decline as much as 15 percent.

Since warmer air can hold more moisture, rainfall is also projected to come more often in big bursts rather than as gentler, steadier rain, a phenomenon that is already being observed in many areas. Many scientists have assumed that more extreme rains might further batter crops, but the new study found that this will probably not be the case. The reason: most of the projected heavier downpours will fall within a range that benefits crops, rather than passing the threshold at which they hurt them.

"People have been talking about how more extreme rain will damage crops," said lead author Corey Lesk, a PhD. student at Columbia University's Lamont-Doherty Earth Obsevatory. "The striking thing we found was, the overall effect of heavier rains is not negative. It turns out to be good for crops."

That said, the effects will probably be modest, according to the study. It estimates that corn yields could be driven back up 1 or 2 percent by the heavier rains, and soy by 1.3 to 2.5 percent. These increases are dwarfed by the potential losses due to heat, but even a few percent adds up when dealing with such huge quantities of crops. And, the researchers say, "Our findings may help identify new opportunities for climate-adaptive crop management and improved modeling."

The team reached their conclusions by studying hour-by-hour rainfall patterns recorded by hundreds of weather stations in the agricultural regions of the U.S. West, South and Northeast each year from 2002 to 2017. They then compared the rainfall patterns to crop yields. They found that years with rains of up to about 20 millimeters an hour (roughly the heaviest downpour of the year on average) resulted in higher yields. It was only when rains reached an extreme 50 millimeters an hour or more that crops suffered damage. (Twenty millimeters an hour is about three-quarters of an inch; 50 is about 2 inches.) Moreover, years in which rain came mainly as mere drizzle actually depressed yields.
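The intensity bands described above can be sketched as a simple per-hour classification. The 50 millimeters-an-hour "extreme" cut-off comes from the study as reported here; the 2.5 millimeters-an-hour "drizzle" boundary is an illustrative assumption (a conventional meteorological limit for light rain, not a figure from the paper).

```python
DRIZZLE_MAX = 2.5   # mm/h; assumed boundary for drizzle, not from the study
EXTREME_MIN = 50.0  # mm/h; damaging-extreme threshold, as reported in the article

def classify_rain_hour(rate_mm_per_h):
    """Sort one hour of rainfall into the bands discussed in the study."""
    if rate_mm_per_h < DRIZZLE_MAX:
        return "drizzle"      # inefficient; associated with lower yields
    elif rate_mm_per_h < EXTREME_MIN:
        return "beneficial"   # soaks into soil, carries in fertilizer
    else:
        return "extreme"      # batters plants, waterlogs soil

hours = [1.0, 8.0, 22.0, 55.0]
print([classify_rain_hour(h) for h in hours])
# ['drizzle', 'beneficial', 'beneficial', 'extreme']
```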

The researchers outlined several possible reasons for the differences. For one, drizzle may be too inefficient to do much good. In hot weather, it can mostly evaporate back into the air before reaching the subsurface root zones where it is needed; in cooler weather, it might remain on leaves long enough to encourage the growth of damaging fungi. "There are only so many hours of rain you can get in a season," said Lesk. "If too many of them are taken up by useless drizzle, they're wasted."

Heavier storms, on the other hand, are better, at least up to a point. These allow water to soak thoroughly into the soil, carrying in both moisture and artificial fertilizer spread on the surface. It is only the most extreme events that hurt crops, say the researchers: these can batter plants directly, wash fertilizer off fields, and saturate soils so thoroughly that roots cannot get enough oxygen.

To study the effects of future potential rainfall patterns, the researchers used basic physical models to estimate how much heavier rains might become under different levels of warming. They found that in most cases, more rain would, as expected, come in bigger downpours-but these heavier rains would fall within the fairly wide range where they are beneficial. The most extreme, damaging rains would also increase-but would still be rare enough that the greater number of beneficial rainfalls would outweigh their effects.

Because the study averaged out statistics over vast areas, and many other factors can affect crop yields, it would be hard to say exactly what the effects of future rainfall will be in any one area, said Lesk. "No single farmer would use a study like this to make decisions on what to plant or how," he said. But, as the paper concludes, the results "suggest that beyond extreme events, the crop yield response to more common rainfall intensities merits further attention."

Credit: 
Columbia Climate School

Electronic components join forces to take up 10 times less space on computer chips

image: Mark Kraman, right, professor Xiuling Li, center, and Mike Yang led a study that integrates the multiple elements needed for the electronic filters found inside wireless devices into a single, self-assembling and space-saving computer chip component.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- Electronic filters are essential to the inner workings of our phones and other wireless devices: they eliminate or enhance specific input signals to achieve the desired output signals. But they take up space on chips that researchers are on a constant quest to make smaller. A new study demonstrates the successful integration of the individual elements that make up an electronic filter onto a single component, significantly reducing the amount of space taken up by the device.

Researchers at the University of Illinois, Urbana-Champaign have ditched the conventional 2D on-chip lumped or distributed filter network design - composed of separate inductors and capacitors - for a single, space-saving 3D rolled membrane that contains both independently designed elements.

The results of the study, led by electrical and computer engineering professor Xiuling Li, are published in the journal Advanced Functional Materials.

"With the success that our team has had on rolled inductors and capacitors, it makes sense to take advantage of the 2D to 3D self-assembly nature of this fabrication process to integrate these different components onto a single self-rolling and space-saving device," Li said.

In the lab, the team uses a specialized etching and lithography process to pattern 2D circuitry onto very thin membranes. In the circuit, they join the capacitors and inductors together and with ground or signal lines, all in a single plane. The multilayer membrane can then be rolled into a thin tube and placed onto a chip, the researchers said.

"The patterns, or masks, we use to form the circuitry on the 2D membrane layers can be tuned to achieve whatever kind of electrical interactions we need for a particular device," said graduate student and co-author Mark Kraman. "Experimenting with different filter designs is relatively simple using this technique because we only need to modify that mask structure when we want to make changes."

The team tested the performance of the rolled components and found that under the current design, the filters are suitable for applications in the 1-10 gigahertz frequency range. While the designs are targeted for use in radio frequency communications systems, the team posits that other frequencies, including the megahertz range, are also possible based on their success with high-power inductors in past research.

"We worked with several simple filter designs, but theoretically we can make any filter network combination using the same process steps," said graduate student and lead author Mike Yang. "We took what was already out there to provide a new, easier platform to lump these components together closer than ever."

"Our way of integrating inductors and capacitors monolithically could bring passive electronic circuit integration to a whole new level," Li said. "There is practically no limit to the complexity or configuration of circuits that can be made in this manner, all with one mask set."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

New machine learning tool predicts devastating intestinal disease in premature infants

image: A 34-week premature baby in an isolette incubator with oxygen.

Image: 
Sharon McCutcheon/Unsplash

New York, NY--August 7, 2020--Necrotizing enterocolitis (NEC) is a life-threatening intestinal disease of prematurity. Characterized by sudden and progressive intestinal inflammation and tissue death, it affects up to 11,000 premature infants in the United States annually, and 15-30% of affected babies die from NEC. Survivors often face long-term intestinal and neurodevelopmental complications.

Researchers from Columbia Engineering and the University of Pittsburgh have developed a sensitive and specific early warning system for predicting NEC in premature infants before the disease occurs. The prototype predicts NEC accurately and early, using stool microbiome features combined with clinical and demographic information. The pilot study was presented virtually on July 23 at ACM CHIL 2020.

"It's amazing how we may be able to use machine learning to stop this from happening to babies," said the study's co-author, Ansaf Salleb-Aouissi, a senior lecturer in discipline from the computer science department at Columbia Engineering and a specialist in artificial intelligence and its applications to medical informatics. "We looked at the data and developed a tool that can truly be useful, even life-saving."

"If doctors could accurately predict NEC before the baby actually becomes sick, there are some very simple steps they could take--treatment could include stopping feeds, giving IV fluids, and starting antibiotics to prevent the worst outcomes such as long-term disability or death," said the study's lead author, Thomas A. Hooven, who began his collaboration with Salleb-Aouissi when he was an assistant professor of pediatrics in the Division of Neonatology-Perinatology at Columbia University Medical Center. He is now assistant professor of pediatrics in the Division of Newborn Medicine at the University of Pittsburgh School of Medicine.

Currently, there is no tool to predict which preterm babies will get the disease, and often NEC is not recognized until it is too late to effectively intervene. NEC is the most common intestinal emergency among preterm infants. It is characterized by rapidly progressive intestinal necrosis, bacteremia, acidosis, and high rates of morbidity and mortality.

Causes of NEC are not well-understood, but several studies have focused on shifts in the intestinal microbiome, the bacteria in the intestine whose composition can be determined from DNA sequencing from small stool samples. The researchers hypothesized that a machine learning approach to modeling clinical, demographic, and microbiome data from preterm patients might allow discrimination of patients at high risk for NEC long before clinical disease onset, which would permit early intervention and mitigation of serious complications.

Hooven, Salleb-Aouissi, and Lin used data from a 2016 NIH clinical study of premature infants whose stool was collected in several American neonatal ICUs between 2009 and 2013. The team examined 2,895 stool samples from 161 preterm infants, 45 of whom developed NEC. Given the complexity of the microbiome data, the researchers performed several data preprocessing steps to reduce its dimensionality and to address the compositional and hierarchical nature of the data, making it amenable to machine learning.
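The article does not name the team's exact preprocessing steps, but a standard way to handle compositional sequencing counts before machine learning is the centered log-ratio (CLR) transform, sketched below with toy numbers:

```python
import numpy as np

def clr_transform(counts, pseudocount=0.5):
    """Centered log-ratio transform for compositional data.

    `counts` is a (samples x taxa) array of sequencing counts. A small
    pseudocount avoids log(0). Each row of the result sums to zero, which
    removes the arbitrary per-sample sequencing depth.
    """
    x = np.asarray(counts, dtype=float) + pseudocount
    logx = np.log(x)
    return logx - logx.mean(axis=1, keepdims=True)

# Two stool samples with counts for four taxa (illustrative values).
counts = np.array([[120, 30, 0, 50],
                   [10, 200, 5, 85]])
z = clr_transform(counts)
print(np.allclose(z.sum(axis=1), 0))  # True: rows are centered
```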

"NEC represents an excellent application from a machine learning perspective," said Salleb-Aouissi. "The lessons we've learned from our new technique could well translate to other genetic or proteomic datasets and inspire new machine learning algorithms for healthcare datasets."

The team evaluated several machine learning methods to determine the best strategy for predicting NEC from microbiome data. They found optimal performance from a gated attention-based multiple instance learning (MIL) approach.

Because the human microbiome changes over time, the MIL methods address the sequential aspect of the problem. In the first 20 days after an infant is born, for example, the infant's microbiome goes through a drastic change. Many studies have shown that infants with a more diverse microbiome are typically healthier.
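Microbiome diversity in such studies is commonly summarized with the Shannon index (the article does not specify which measure the team used); a minimal computation:

```python
import math

def shannon_diversity(counts):
    """Shannon diversity index: H = -sum(p_i * ln p_i) over taxa with nonzero counts."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

# A community dominated by one taxon versus an evenly mixed one.
dominated = [97, 1, 1, 1]
even = [25, 25, 25, 25]
print(shannon_diversity(dominated) < shannon_diversity(even))  # True
```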

"This led us to think that changes in microbiome diversity can help to explain why some infants are more likely to be sick from NEC," said Adam (Yun Chao) Lin, a computer science MS student and co-author of the study whose work on this project prompted him to now pursue a PhD.

Instead of viewing the microbiome samples from an infant as independent, the team represented each patient as a collection, or "bag," of samples and applied attention mechanisms to learn the complex relationships among them. The machine learning algorithm "looks" at each bag and tries to guess from its contents whether or not the baby is affected.
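Gated attention-based MIL pooling, in the formulation commonly attributed to Ilse and colleagues (the paper's exact architecture is not given here), scores each sample in a patient's bag and pools them into one representation. A NumPy sketch with made-up dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)

def gated_attention_pool(H, V, U, w):
    """Gated attention MIL pooling.

    H: (n_instances, d) instance embeddings for one patient's bag.
    V, U: (d, k) projection matrices; w: (k,) scoring vector.
    Returns the pooled bag embedding and the attention weight per instance.
    """
    gate = np.tanh(H @ V) * (1 / (1 + np.exp(-(H @ U))))  # tanh branch gated by sigmoid branch
    scores = gate @ w                                      # one raw score per instance
    a = np.exp(scores - scores.max())
    a /= a.sum()                                           # softmax: weights sum to 1
    return a @ H, a                                        # attention-weighted sum of instances

# A bag of 5 stool-sample embeddings of dimension 8 (toy numbers).
H = rng.standard_normal((5, 8))
V, U = rng.standard_normal((8, 4)), rng.standard_normal((8, 4))
w = rng.standard_normal(4)
bag_embedding, attention = gated_attention_pool(H, V, U, w)
print(bag_embedding.shape, round(attention.sum(), 6))  # (8,) 1.0
```

A classifier on top of `bag_embedding` then makes the per-patient prediction, and the attention weights indicate which samples drove it.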

In repeated trials, the ability of the model to distinguish affected from non-affected infants had a good balance of sensitivity and specificity. "The Area Under the ROC Curve (AUC) is about 0.9, which demonstrates how good our models are at distinguishing between affected and unaffected patients," Salleb-Aouissi noted. "Ours is the first effective system for a clinically applicable machine learning model that combines microbiome, demographic, and clinical data that can be collected and monitored in real-time in a neonatal ICU. We are excited about extending its applicability to a new area of predictive monitoring in medicine."
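The AUC cited above has a concrete interpretation: the probability that a randomly chosen affected infant receives a higher risk score than a randomly chosen unaffected one. A small rank-based computation, with toy scores rather than the study's data:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the fraction of (positive, negative) pairs ranked correctly; ties count half."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Toy model scores: higher means "predicted NEC".
affected   = [0.9, 0.8, 0.75, 0.4]
unaffected = [0.7, 0.5, 0.3, 0.2, 0.1]

print(roc_auc(affected, unaffected))  # 0.9
```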

The researchers are now developing a noninvasive standalone testing platform for accurate identification of infants at high risk for NEC before clinical onset, to prevent the worst outcomes. Once the platform is ready, they will conduct a randomized clinical trial to validate their technique's predictions in a real-time neonatal ICU cohort.

"For the first time I can envision a future where parents of preterm infants, and their medical teams, no longer live in constant fear of NEC," said Hooven.

Credit: 
Columbia University School of Engineering and Applied Science

Does physician burnout, depression, career satisfaction differ by race/ethnicity?

What The Study Did: This survey study of U.S. physicians examined whether there were differences by race/ethnicity in burnout, symptoms of depression, career satisfaction and work-life balance.

Authors: Magali Fassiotto, Ph.D., of the Stanford University School of Medicine in Stanford, California, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2020.12762)

Editor's Note: The article includes conflicts of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network