Tech

Molecule-plasmon coupling strength tunes surface-enhanced infrared absorption spectral lineshapes

image: (A) Coupling-strength-dependent spectral evolution. Under the zero-detuning condition (the plasmon resonance energy equals the molecular vibrational energy), as the coupling strength increases, the SEIRA lineshape evolves from a symmetric Lorentzian shape (i) to an asymmetric Fano shape (ii), an anti-absorption Fano dip (iii), and a dip with broadened spectral linewidth (iv). Further increasing the coupling strength can lead to the emergence of a new absorption band P' (v), which originates from plasmon-mediated coherent intermolecular coupling, as shown in the schematic (B). (B) Molecules located inside the hotspot and outside the hotspot couple to plasmons with different coupling strengths (Vi and Vo). The molecules at the two locations are indirectly coupled through the plasmonic field, with a phenomenological interaction strength Vint.

Image: 
©Science China Press

Plasmon-enhanced molecular spectroscopies have attracted tremendous attention as powerful detection tools with ultrahigh sensitivity, down to the single-molecule level. The optical response of molecules in the vicinity of nanostructures with plasmon resonances is dramatically enhanced through interactions with plasmons. However, beyond amplifying the signal, the molecule-plasmon interaction also inevitably induces strong modifications of the spectral lineshapes and distorts the chemical information they carry. A typical example is surface-enhanced infrared absorption (SEIRA) spectroscopy: because the molecule-plasmon coupling dominates, molecular absorption spectra exhibit complicated asymmetric Fano lineshapes instead of the symmetric Lorentzian lineshapes of probe molecules in the gas or solution phase.

Many pioneering studies focused on lineshape effects that depend on the energy detuning (the difference between the plasmon resonance energy and the molecular vibrational energy) and on the damping (radiative loss versus intrinsic ohmic loss). How molecule-plasmon near-field interactions directly control the evolution of SEIRA spectral lineshapes has rarely been explored. Furthermore, beyond the two-body interaction picture, how the interactions of molecules with distinct coupling strengths collectively control the evolution of the spectral lineshapes is also unclear. Recently, Jun Yi, En-Ming You, Song-Yuan Ding, and Zhong-Qun Tian from Xiamen University made exciting progress and theoretically revealed how the molecule-plasmon coupling strength controls the spectral evolution in SEIRA. The results show that even when the same molecules couple to the same plasmonic structure, the spectral lineshapes depend on the coupling distance, the molecular density, and the intrinsic loss of the plasmon under the zero-detuning condition, i.e., when the plasmon resonance energy equals the molecular vibrational energy.

The authors first showed that the spectral lineshape evolves from an anti-absorption dip to an asymmetric Fano profile as the coupling strength between molecules and plasmons is gradually reduced by increasing the distance between the molecules and the plasmonic structure. The results were also reproduced by an analytical model that takes the molecule-plasmon coupling strength as its input parameter, which further revealed a dominant dipole-dipole interaction between molecules and plasmons.
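The qualitative lineshape evolution can be sketched with the textbook coupled-oscillator model of Fano interference, with the coupling strength g as the only knob. This is a generic illustration, not the authors' exact analytical model; all parameter values are arbitrary:

```python
import numpy as np

# Two coupled, damped oscillators at zero detuning: a broad "bright" plasmon
# driven by the field and a narrow "dark" molecular vibration (a textbook
# sketch, not the authors' exact model; all values are illustrative).
w0 = 1.0           # shared resonance energy (zero detuning), arbitrary units
gamma_p = 0.20     # plasmon damping (broad)
gamma_m = 0.005    # molecular vibrational damping (narrow)
w = np.linspace(0.9, 1.1, 2001)

def absorption(g):
    """Power dissipated by the driven plasmon mode for coupling strength g."""
    dp = w0**2 - w**2 - 1j * gamma_p * w   # plasmon denominator
    dm = w0**2 - w**2 - 1j * gamma_m * w   # molecule denominator
    x_p = dm / (dp * dm - g**2)            # plasmon response amplitude
    return w * x_p.imag

# Sweeping g at exactly zero detuning takes the profile from a near-Lorentzian
# peak through a growing anti-absorption dip to a broadened, split lineshape;
# in this toy model the asymmetric Fano shape appears once a small detuning
# between the two resonance energies is added.
for g in (0.0, 0.01, 0.03, 0.08):
    s = absorption(g)
    print(f"g = {g:.2f}: value at w0 = {s[s.size // 2]:.3f}, max = {s.max():.3f}")
```

Because the effective coupling grows with the square root of the molecular density, sweeping g in this toy model also mimics the density dependence discussed below.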

The authors further found that molecular density also plays a crucial role in determining the spectral lineshapes, since the coupling strength scales with the square root of the molecular density. Interestingly, a new spectral mode was predicted to appear once the density exceeds a threshold, and to red-shift to lower energy as the density increases. The authors clarified that the new mode originates from plasmon-mediated coherent intermolecular interactions, specifically between molecules located inside and outside the plasmonic hotspots. Detailed studies showed that the energy shift of the new mode depends strongly on the intermolecular coupling strength and can therefore be applied to investigate coherent intermolecular interactions at the nanoscale. The studies unveil how the molecule-plasmon coupling strength shapes the spectral profiles, and shed light on further studies of plasmon-dressed molecular electronic or vibrational states in various coupling-strength regimes.

Credit: 
Science China Press

New technique separates industrial noise from natural seismic signals

image: Map of detected industrial noise across the contiguous United States.

Image: 
Los Alamos National Laboratory

LOS ALAMOS, N.M., May 19, 2020--For the first time, seismologists can characterize signals generated by industrial activity on a continent-wide scale, using cloud computing. In two recently published papers in Seismological Research Letters, scientists from Los Alamos National Laboratory demonstrate how what was previously characterized as "noise" can now be treated as a specific signal across a large geographical area, thanks to an innovative approach to seismic data analysis.

"In the past, human-caused seismic signals as a result of industrial activities were viewed as 'noise' that polluted a dataset, resulting in otherwise useful data being dismissed," said Omar Marcillo, a seismologist at Los Alamos National Laboratory and lead author of the study. "For the first time, we were able to identify this noise from some large machines as a distinct signal and pull it from the dataset, allowing us to separate natural signals from anthropogenic ones."

The study used a year's worth of data from more than 1,700 seismic stations in the contiguous United States. Marcillo detected approximately 1.5 million industrial noise sequences, which corresponds on average to around 2.4 detections per day at each station.
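Those figures are mutually consistent, as a quick back-of-envelope check shows (using the numbers quoted above; station count and uptime are taken at face value):

```python
# Consistency check of the reported detection rate (figures from the article;
# assumes all stations operated for the full year).
detections = 1_500_000   # industrial noise sequences detected
stations = 1_700         # seismic stations, contiguous United States
days = 365               # one year of data

rate = detections / (stations * days)
print(f"{rate:.1f} detections per station per day")  # ~2.4, as reported
```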

"This shows us just how ubiquitous industrial noise is," said Marcillo. "It's important that we're able to characterize it and separate it from the other seismic signals so we can understand exactly what we're looking at when we analyze seismic activity."

This data was accessed and processed using cloud computing--a novel approach that allows for greater scalability and flexibility in seismological research. The approach is detailed in a companion paper, which demonstrated how cloud computing services can be used to do large-scale seismic analysis ten times faster than traditional computing, which requires data to be downloaded, stored, and processed. Using Amazon Web Services' cloud computing, researchers were able to acquire and process 5.6 terabytes of compressed seismic data in just 80 hours. To do this using traditional computing methods would have taken several weeks.
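The quoted numbers imply a sustained processing rate along these lines (a rough sketch; the actual pipeline is described in the companion paper):

```python
# Back-of-envelope throughput implied by the cloud-computing figures.
terabytes = 5.6     # compressed seismic data acquired and processed
hours = 80.0        # wall-clock time on AWS

gb_per_hour = terabytes * 1000 / hours
mb_per_second = terabytes * 1e6 / (hours * 3600)
print(f"{gb_per_hour:.0f} GB/hour  (~{mb_per_second:.1f} MB/s sustained)")
# A tenfold slowdown puts the same job at ~800 hours, i.e. several weeks,
# consistent with the article's estimate for traditional computing.
```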

Marcillo said that his work to characterize industrial noise across the country would not have been possible without this new cloud-computing approach. "My colleagues and I had figured out how to separate the industrial noise signal from the rest of the seismic signal, but we couldn't scale it," he said. So Marcillo collaborated with Jonathan MacCarthy to find a way to expand it to cover a large geographical area; cloud computing was the answer. It is also flexible enough to adapt to the evolving needs of many research applications, including processing speed, memory requirements, and different processing architectures.

"Seismology is a data-rich field," said MacCarthy, lead author of the paper on the cloud-based approach. "Previously, seismic data would have to be downloaded and processed by each individual researcher. Cloud computing allows all of that data to be stored in one place, and for researchers to easily access and work with it in a community-based way. It's a huge development and has the potential to totally transform the way seismological research on large datasets is done."

Credit: 
DOE/Los Alamos National Laboratory

Thousands of lives could be lost to delays in cancer surgery during COVID-19 pandemic

Delays to cancer surgery and other treatment caused by the Covid-19 crisis could result in thousands of additional deaths linked to the pandemic in England, a major new study reports.

New modelling has revealed the extent of the impact that disruption to the cancer care and diagnosis pathway could have on the survival of cancer patients.

Many cancer patients may end up experiencing delays of several months to their cancer treatment in the context of the pandemic - including in operations to remove tumours. Those patients whose cancer will have progressed during the delay and who might otherwise have been effectively cured by surgery could now be at risk of their cancer coming back and shortening their lives.

Scientists at The Institute of Cancer Research, London, analysed existing Public Health England data on the impact of delays to cancer surgery on patients' five-year survival rates, estimating the effect of three-month and six-month delays respectively.

Their modelling, which factored in the risk of hospital-acquired Covid-19-infection, showed dramatic differences in the impact of delay on cancer survival depending on patients' age, their cancer type and whether it was earlier- or later-stage cancer.

The team found that a delay of three months across all 94,912 patients who would have had surgery to remove their cancer over the course of a year would lead to an additional 4,755 deaths. Taking into account the length of time that patients are expected to live after their surgery, the delay would amount to 92,214 years of life lost.

They estimated that surgery for cancer affords on average 18.1 life years per patient, of which on average 1.0 years are lost for a three-month delay or 2.2 years are lost with a six-month delay. Considering healthcare resource more broadly, they compared this with hospital treatment for Covid-19, from which on average 5.1 life years were currently gained per patient.
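The headline numbers hang together arithmetically, as a simple check shows (a sketch only; the published model stratifies by age, cancer type and stage, which this ignores):

```python
# Back-of-envelope check of the reported figures for a three-month delay.
patients = 94_912            # annual cancer surgeries in England
extra_deaths = 4_755         # additional deaths from a 3-month delay
life_years_lost = 92_214     # total years of life lost

print(f"life-years lost per patient: {life_years_lost / patients:.2f}")   # ~1.0
print(f"life-years lost per attributable death: "
      f"{life_years_lost / extra_deaths:.1f}")                            # ~19
```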

The new study was published in Annals of Oncology today (Wednesday), and was funded by The Institute of Cancer Research (ICR) itself, with support from Cancer Research UK.

Study leader Professor Clare Turnbull, Professor of Cancer Genomics at The Institute of Cancer Research, London, said:

"The Covid-19 crisis has put enormous pressure on the NHS at every stage of the cancer pathway, from diagnosis right across to surgery and other forms of treatment. Our study shows the impact that delay to cancer treatment will have on patients, with England, and the UK more widely, potentially set for many thousands of attributable cancer deaths as a result of the pandemic.

"Our findings should help policymakers and clinicians make evidence-based decisions as we continue deal with the effects of the pandemic on other areas of medicine. We have to ensure that both patients with Covid-19 and also those with cancer get the best possible care. That means finding ways for the NHS to get back to normal service on cancer diagnostics and surgery as soon as possible, prioritising certain cancer types in particular."

Professor Paul Workman, Chief Executive of The Institute of Cancer Research, London, said:

"The Covid-19 pandemic has already devastated the lives of many people directly. Now, these new findings show the potential for the pandemic to also have a terrible indirect impact on the lives of cancer patients.

"It's positive that the NHS is now beginning to adapt to the new normal, and to think about how cancer services such as surgery can be restored as soon as possible. I also strongly welcome moves to treat patients with targeted cancer drugs, or shorter, more intense courses of radiotherapy, as ways of preserving survival rates while minimising the time they have to spend in hospital."

Credit: 
Institute of Cancer Research

Electrons break rotational symmetry in exotic low-temp superconductor

image: Scientists patterned thin films of strontium ruthenate -- a metallic superconductor containing strontium, ruthenium, and oxygen -- into the 'sunbeam' configuration seen above. They arranged a total of 36 lines radially in 10-degree increments to cover the entire range from 0 to 360 degrees. On each bar, electrical current flows from I+ to I-. They measured the voltages vertically along the lines (between gold contacts 1-3, 2-4, 3-5, and 4-6) and horizontally across them (1-2, 3-4, 5-6). Their measurements revealed that electrons in strontium ruthenate flow in a preferred direction unexpected from the crystal lattice structure.

Image: 
Brookhaven National Laboratory

UPTON, NY--Scientists have discovered that the transport of electronic charge in a metallic superconductor containing strontium, ruthenium, and oxygen breaks the rotational symmetry of the underlying crystal lattice. The strontium ruthenate crystal has fourfold rotational symmetry like a square, meaning that it looks identical when turned by 90 degrees (four times to equal a complete 360-degree rotation). However, the electrical resistivity has twofold (180-degree) rotational symmetry like a rectangle.

This "electronic nematicity"--the discovery of which is reported in a paper published on May 4 in the Proceedings of the National Academy of Sciences--may promote the material's "unconventional" superconductivity. For unconventional superconductors, standard theories of metallic conduction are inadequate to explain how upon cooling they can conduct electricity without resistance (i.e., losing energy to heat). If scientists can come up with an appropriate theory, they may be able to design superconductors that don't require expensive cooling to achieve their near-perfect energy efficiency.

"We imagine a metal as a solid framework of atoms, through which electrons flow like a gas or liquid," said corresponding author Ivan Bozovic, a senior scientist and the leader of the Oxide Molecular Beam Epitaxy Group in the Condensed Matter Physics and Materials Science (CMPMS) Division at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory and an adjunct professor in the Department of Chemistry at Yale. "Gases and liquids are isotropic, meaning their properties are uniform in all directions. The same is true for electron gases or liquids in ordinary metals like copper or aluminum. But in the last decade, we have learned that this isotropy doesn't seem to hold in some more exotic metals."

Scientists have previously observed symmetry-breaking electronic nematicity in other unconventional superconductors. In 2017, Bozovic and his team detected the phenomenon in a metallic compound containing lanthanum, strontium, copper, and oxygen (LSCO), which becomes superconducting at relatively higher (but still ultracold) temperatures compared to low-temperature counterparts like strontium ruthenate. The LSCO crystal lattice also has square symmetry, with two equal periodicities, or arrangements of atoms, in the vertical and horizontal directions. But the electrons do not obey this symmetry; the electrical resistivity is higher along one direction, and that direction is not aligned with the crystal axes.

"We see this kind of behavior in liquid crystals, which polarize light in TVs and other displays," said Bozovic. "Liquid crystals flow like liquids but orient in a preferred direction like solids because the molecules have an elongated rod-like shape. This shape constrains rotation by the molecules when packed close together. Liquids are typically symmetric with respect to any rotation, but liquid crystals break such rotational symmetry, with their properties different in the parallel and perpendicular directions. This is what we saw in LSCO--the electrons behave like an electronic liquid crystal."

With this surprising discovery, the scientists wondered whether electronic nematicity existed in other unconventional superconductors. To begin addressing this question, they decided to focus on strontium ruthenate, which has the same crystal structure as LSCO and strongly interacting electrons.

At the Kavli Institute at Cornell for Nanoscale Science, Darrell Schlom, Kyle Shen, and their collaborators grew single-crystal thin films of strontium ruthenate one atomic layer at a time on square substrates and rectangular ones, which elongated the films in one direction. These films have to be extremely uniform in thickness and composition--having on the order of one impurity per trillion atoms--to become superconducting.

To verify that the crystal periodicity of the films was the same as that of the underlying substrates, the Brookhaven Lab scientists performed high-resolution x-ray diffraction experiments.

"X-ray diffraction allows us to precisely measure the lattice periodicity of both the films and the substrates in different directions," said coauthor and CMPMS Division X-ray Scattering Group Leader Ian Robinson, who made the measurements. "In order to determine whether the lattice distortion plays a role in nematicity, we first needed to know if there is any distortion and how much."

Bozovic's group then patterned the millimeter-sized films into a "sunbeam" configuration with 36 lines arranged radially in 10-degree increments. They passed electrical current through these lines--each of which contained three pairs of voltage contacts--and measured the voltages vertically along the lines (longitudinal direction) and horizontally across them (transverse direction). These measurements were collected over a range of temperatures, generating thousands of data files per thin film.

Compared to the longitudinal voltage, the transverse voltage is 100 times more sensitive to nematicity. If the current flows with no preferred direction, the transverse voltage should be zero at every angle. That wasn't the case, indicating that strontium ruthenate is electronically nematic--10 times more so than LSCO. Even more surprising was that the films grown on both square and rectangular substrates had the same magnitude of nematicity--the relative difference in resistivity between two directions--despite the lattice distortion caused by the rectangular substrate. Stretching the lattice only affected the nematicity orientation, with the direction of highest conductivity running along the shorter side of the rectangle. Nematicity is already present in both films at room temperature and significantly increases as the films are cooled down to the superconducting state.
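The logic of the sunbeam measurement can be illustrated with a short fit: project the angle-dependent transverse voltage onto its twofold harmonic to extract a nematic amplitude and axis. The sketch below uses synthetic numbers; the paper's actual analysis is more involved:

```python
import numpy as np

# Sunbeam geometry sketch: 36 radial lines at 10-degree increments. For an
# anisotropic (nematic) resistivity tensor the transverse voltage varies as
# sin(2*(theta - theta0)); it vanishes identically if transport is isotropic.
theta = np.radians(np.arange(0, 360, 10))

# Simulate noisy measurements with a twofold-symmetric component
# (amplitude and axis below are hypothetical).
rng = np.random.default_rng(0)
true_amp, theta0 = 0.05, np.radians(20.0)
v_t = true_amp * np.sin(2 * (theta - theta0)) \
      + 0.002 * rng.standard_normal(theta.size)

# Least-squares projection onto the twofold harmonics recovers both.
a = 2 * np.mean(v_t * np.sin(2 * theta))
b = 2 * np.mean(v_t * np.cos(2 * theta))
amp = np.hypot(a, b)
axis = 0.5 * np.degrees(np.arctan2(-b, a))
print(f"nematic amplitude {amp:.3f}, axis {axis:.1f} degrees")  # ~0.05, ~20
```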

"Our observations point to a purely electronic origin of nematicity," said Bozovic. "Here, interactions between electrons bumping into each other appear to have a much stronger contribution to electrical resistivity than electrons interacting with the crystal lattice, as they do in conventional metals."

Going forward, the team will continue to test their hypothesis that electronic nematicity exists in all unconventional superconductors.

"The synergy between the two CMPMS Division groups at Brookhaven was critical to this research," said Bozovic. "We will apply our complementary expertise, techniques, and equipment in future studies looking for signatures of electronic nematicity in other materials with strongly interacting electrons."

Credit: 
DOE/Brookhaven National Laboratory

RNA molecules in maternal blood may predict pregnancies at risk for preeclampsia

Small non-coding RNA molecules, called microRNAs (miRNAs), found and measured in the blood plasma of asymptomatic pregnant women may predict development of preeclampsia, a condition characterized by high blood pressure and abnormal kidney function that affects roughly 5 to 8 percent of all pregnancies. Preeclampsia is responsible for a significant proportion of maternal and neonatal deaths and of low birth weight, and is a primary cause of premature birth.

The findings are reported in the May 19, 2020 issue of Cell Reports Medicine by researchers at University of California San Diego School of Medicine and Sera Prognostics, Inc., a Salt Lake City-based company that makes diagnostic tests for predicting the risk of premature birth.

"The ability to identify pregnancies at high risk for developing preeclampsia would be of great value to patients and their doctors to better personalize prenatal care," said senior author Louise Laurent, MD, PhD, professor in the Department of Obstetrics, Gynecology and Reproductive Sciences at UC San Diego School of Medicine. "This would enable prompt detection and optimal management of pregnancies that develop preeclampsia. And the information could be used to better identify participants for research studies testing preventive therapies."

Preeclampsia is a common and serious complication of pregnancy. It is estimated to be the cause of 15 percent of preterm births and 14 percent of maternal deaths worldwide. Symptoms of preeclampsia -- most notably hypertension, but also sudden weight gain, swelling, severe headaches, abdominal pain and nausea -- appear during the second half of pregnancy, though Laurent said studies suggest the disorder is caused by problems with placental development early in pregnancy. Delayed diagnosis and suboptimal management of preeclampsia typically results in poorer outcomes for mother and child.

The new study involved 141 subjects (49 cases, 92 controls) in the discovery cohort and 71 subjects (24 cases, 47 controls) in a separate verification cohort. Researchers found that two single-miRNA biomarkers (univariate) and 29 two-miRNA (bivariate) biomarkers measured in the serum of asymptomatic pregnant women between 17 and 28 weeks of pregnancy were able to predict later onset of preeclampsia.
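In spirit, a bivariate miRNA biomarker is a two-feature classifier. A minimal sketch with synthetic data follows; the miRNA identities, effect sizes, and model choice here are all made up for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Illustrative two-miRNA (bivariate) biomarker, echoing the study's cohort
# sizes (synthetic log-expression values; not the study's data or model).
rng = np.random.default_rng(1)
n_cases, n_controls = 49, 92                      # discovery cohort sizes
cases = rng.normal([1.0, -0.8], 1.0, (n_cases, 2))       # shifted in cases
controls = rng.normal([0.0, 0.0], 1.0, (n_controls, 2))

X = np.vstack([cases, controls])
y = np.r_[np.ones(n_cases), np.zeros(n_controls)]

clf = LogisticRegression()
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
```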

Laurent said the next step will be to validate these miRNA biomarkers in a large independent pregnancy cohort, with the ultimate goal of developing a clinical test for screening women early in pregnancy for increased risk of preeclampsia.

"We look forward to the clinical validation of these novel miRNA biomarkers of preeclampsia through our continued collaboration with Dr. Laurent and UC San Diego," said Jay Boniface, PhD, chief scientific officer at Sera and a study co-author. "Innovative bioinformatics approaches have enabled their discovery and the prospect of creating predictors for individualized risk of pregnancy complications."

Credit: 
University of California - San Diego

What if we could design powerful drugs without unwanted side effects?

Psychedelics such as LSD and magic mushrooms have proven highly effective in treating depression and post-traumatic stress disorders, but medical use of these drugs is limited by the hallucinations they cause.

"What if we could redesign drugs to keep their benefits while eliminating their unwanted side effects?" asks Ron Dror, an associate professor of computer science at Stanford. Dror's lab is developing computer simulations to help researchers do just that.

In an article published in Science, Dror's team describes discoveries that could be used to minimize or eliminate side effects in a broad class of drugs that target G protein-coupled receptors, or GPCRs. GPCRs are proteins found in all human cells. LSD and other psychedelics are molecules that attach to GPCRs, as are about a third of all prescription drugs, including antihistamines, beta blockers and opioids. So important is this molecular mechanism that Stanford professor Brian Kobilka shared the 2012 Nobel Prize in Chemistry for his role in discovering how GPCRs work.

When a drug molecule attaches to a GPCR, it causes multiple simultaneous changes in the cell, some beneficial and some dangerous.

By comparing simulations of a GPCR with different molecules attached, Dror's team was able to pinpoint how a drug molecule can alter the GPCR's shape to deliver beneficial effects while avoiding side effects, something that had remained mysterious until now. Based on these results, the researchers designed new molecules that indeed caused beneficial changes in cells, without unwanted changes. Although these designed molecules are not yet suitable for use as drugs in humans, they represent a crucial first step toward developing side-effect-free drugs.
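One way to picture "comparing simulations with different molecules attached" is to compare the distribution of a conformational coordinate between two simulated ensembles. The sketch below uses synthetic numbers and a hypothetical helix-helix distance, not real trajectories:

```python
import numpy as np

# Compare a conformational coordinate (e.g., a hypothetical helix-helix
# distance) between two simulation conditions. Real analyses run on full
# MD trajectories; these numbers are synthetic.
rng = np.random.default_rng(2)
dist_ligand_a = rng.normal(12.0, 0.6, 5000)   # Angstroms, ligand A ensemble
dist_ligand_b = rng.normal(13.5, 0.7, 5000)   # Angstroms, ligand B ensemble

shift = dist_ligand_b.mean() - dist_ligand_a.mean()
pooled = np.sqrt((dist_ligand_a.var() + dist_ligand_b.var()) / 2)
print(f"mean shift {shift:.2f} A (effect size {shift / pooled:.1f} sigma)")
```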

Today, researchers typically test millions of drug candidates - first in test tubes, then in animals and finally in humans - hoping to find the magic molecule that is both effective and safe, meaning that any side effects are tolerable. This massive undertaking typically takes many years and costs billions of dollars, and the resulting drug still often has some frustrating side effects.

The discoveries by Dror's team promise to allow researchers to bypass a lot of that trial and error work, so that they can bring promising drug candidates into animal and human trials faster, and with a greater likelihood that these potential drugs will prove very safe and effective.

Postdoctoral scholar Carl-Mikael Suomivuori and former graduate student Naomi Latorraca led an 11-member team that included Robert Lefkowitz of Duke University, with whom Kobilka shared the Nobel Prize, and Andrew Kruse of Harvard University, Kobilka's former student.

"In addition to revealing how a drug molecule could cause a GPCR to trigger only beneficial effects," Dror said, "we've used these findings to design molecules with desired physiological properties, which is something that many labs have been trying to do for a long time."

"Armed with our results, researchers can begin to imagine new and better ways to design drugs that retain their effectiveness while posing fewer dangers," Dror said. He hopes that such research will eventually eliminate dangerous side effects of drugs used to treat a wide variety of diseases, including heart conditions, psychiatric disorders and chronic pain.

Credit: 
Stanford University School of Engineering

NASA examines tropical storm Arthur's rainfall as it transitions

image: The GPM core satellite passed over Tropical Storm Arthur in the western North Atlantic Ocean on May 19 at 2:51 a.m. EDT (0651 UTC) and found the heaviest rainfall (red) on the northeastern side of the storm falling at a rate of over 25 mm (about 1 inch) per hour.

Image: 
NASA/JAXA/NRL

When the Global Precipitation Measurement mission or GPM core satellite passed over the western North Atlantic Ocean, it captured rainfall data on Tropical Storm Arthur as the storm was transitioning into an extra-tropical storm.

The GPM core satellite passed over Arthur on May 19 at 2:51 a.m. EDT (0651 UTC) and found the heaviest rainfall on the northeastern side of the storm falling at a rate of over 25 mm (about 1 inch) per hour. Lighter rainfall rates were measured throughout the rest of the storm. Forecasters at NOAA's National Hurricane Center (NHC) incorporate the rainfall data into their forecasts.

NHC forecasters noted at 5 a.m. EDT (0900 UTC) on May 19, "Arthur's cloud pattern has continued to take on a generally post-tropical appearance, though a recent convective burst near its center suggests that it isn't quite post-tropical yet. Satellite imagery and earlier scatterometer data also indicate the presence of a developing warm front near the cyclone's center, and this could be contributing to the development of the aforementioned convective burst."

NHC said the center of Tropical Storm Arthur was located near latitude 37.0 north, longitude 70.6 west, about 300 miles (485 km) east-northeast of Cape Hatteras, North Carolina. Arthur was moving toward the east-northeast near 15 mph (24 kph). Maximum sustained winds are near 60 mph (95 kph) with higher gusts. The estimated minimum central pressure is 991 millibars.
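The quoted distance checks out against the coordinates (a quick sketch; the Cape Hatteras coordinates below are approximate):

```python
from math import radians, sin, cos, asin, sqrt

# Sanity check of the advisory's storm position against Cape Hatteras.
def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2)**2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2)**2
    return 2 * 6371 * asin(sqrt(a))

storm = (37.0, -70.6)        # Arthur's center per the NHC advisory
hatteras = (35.25, -75.52)   # Cape Hatteras, NC (approximate)
print(f"{haversine_km(*hatteras, *storm):.0f} km")  # close to the ~485 km reported
```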

The NHC cautions that swells generated by Arthur are expected to affect portions of the mid-Atlantic and southeast U.S. coasts during the next day or two. These swells could cause life-threatening surf and rip current conditions.

Arthur is forecast to become post-tropical later today, then to slow down and turn toward the south in another day or so. Arthur is expected to dissipate by the end of the week.

A post-tropical storm is a generic term for a former tropical cyclone that no longer possesses sufficient tropical characteristics to be considered a tropical cyclone. Extratropical cyclones, subtropical cyclones, and remnant lows are the three classes of post-tropical cyclones. Post-tropical cyclones can nonetheless continue to carry heavy rains and produce high winds.

Credit: 
NASA/Goddard Space Flight Center

NASA-NOAA satellite sees Amphan's eye obscured

image: NASA-NOAA's Suomi NPP satellite passed over Tropical Cyclone Amphan on May 18 at 4:28 p.m. EDT (2028 UTC) and the VIIRS instrument aboard captured cloud top temperatures using infrared light. Cloud top temperatures were as cold as minus 80 degrees Fahrenheit (minus 62.2 Celsius), indicating powerful storms.

Image: 
NASA/NOAA/UWM-CIMISS, William Straka III

Early on May 18, 2020, Tropical Cyclone Amphan was a Category 5 storm in the Northern Indian Ocean. On May 19, satellite data from NASA-NOAA's Suomi NPP satellite revealed that the storm had weakened and that its eye was covered by high clouds.

When NASA-NOAA's Suomi NPP satellite passed over Tropical Cyclone Amphan on May 18 at 4:28 p.m. EDT (2028 UTC), infrared imagery revealed very cold cloud top temperatures and an obscured eye. The higher the cloud top, the colder it is, and the stronger the storm. The VIIRS instrument found several areas within the storm where cloud top temperatures were as cold as minus 80 degrees Fahrenheit (minus 62.2 Celsius), indicating powerful storms. Storms with cloud tops that cold have been found to generate heavy rainfall.
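The quoted temperature conversion is easy to verify (a trivial check):

```python
# Unit-conversion check for the quoted cloud-top temperature.
def fahrenheit_to_celsius(f):
    return (f - 32) * 5 / 9

print(f"{fahrenheit_to_celsius(-80):.1f} C")  # -62.2 C, as quoted
```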

The Joint Typhoon Warning Center noted, "Satellite imagery also revealed that the eyewall is open on the eastern side of the eye, indicative of the easterly vertical shear and mid-level dry air moving into the tropical cyclone."

On May 19 at 5 a.m. EDT (0900 UTC), Tropical Cyclone Amphan was located near latitude 16.5 degrees north and longitude 86.8 degrees east, about 377 nautical miles south-southwest of Kolkata, India. Amphan was moving to the north-northeast and had maximum sustained winds near 110 knots.

Amphan is weakening as it moves north-northeast. The storm is forecast to make landfall near Kolkata on May 20 soon after 2 a.m. EDT (0600 UTC), according to the Joint Typhoon Warning Center.

Credit: 
NASA/Goddard Space Flight Center

But it's a dry heat: Climate change and the aridification of North America

Discussions of drought often center on the lack of precipitation. But among climate scientists, the focus is shifting to include the growing role that warming temperatures are playing as potent drivers of greater aridity and drought intensification.

Increasing aridity is already a clear trend across the western United States, where anthropogenic climate warming is contributing to declining river flows, drier soils, widespread tree death, stressed agricultural crops, catastrophic wildfires and protracted droughts, according to the authors of a Commentary article published online May 19 in Proceedings of the National Academy of Sciences.

At the same time, human-caused warming is also driving increased aridity eastward across North America, with no end in sight, according to climate scientists Jonathan Overpeck of the University of Michigan and Bradley Udall of Colorado State University.

"The impact of warming on the West's river flows, soils, and forests is now unequivocal," write Overpeck, dean of the U-M School for Environment and Sustainability, and Udall, senior water and climate scientist at Colorado State. "There is a clear longer-term trend toward greater aridification, a trend that only climate action can stop."

The Commentary article responds to a PNAS paper, published May 11 by Justin Martin of the U.S. Geological Survey and his colleagues, that showed how warming is causing streamflow declines in the northern Rocky Mountains, including the nation's largest river basin, the Missouri.

The Martin et al. study used tree-ring records to analyze the 2000-2010 Upper Missouri River Basin drought and concluded that "recent warming aligns with increasing drought severities that rival or exceed any estimated over the last 12 centuries."

The study details the mechanisms of temperature-driven streamflow declines, and it "places more focus on how anthropogenic climate warming is progressively increasing the risk of hot drought and more arid conditions across an expanding swath of the United States," according to Overpeck and Udall.

The Martin et al. study also highlights the way temperature-driven aridity in the West is typically framed in terms of episodic drought. Many water and land managers, as well as the general public, implicitly assume that when returning rains and snowfall break a long drought, arid conditions will also fade away.

But that's a faulty assumption, one that ignores mounting evidence all around us, according to Overpeck and Udall.

"Anthropogenic climate change calls this assumption into question because we now know with high confidence that continued emissions of greenhouse gases into the atmosphere guarantees continued warming, and that this continued warming makes more widespread, prolonged and severe dry spells and droughts almost a sure bet," they write. "Greater aridity is redefining the West in many ways, and the costs to human and natural systems will only increase as we let the warming continue."

Anticipated impacts in the Upper Missouri River Basin mirror changes already occurring in the Southwest, where the trend toward warming-driven aridification is clearest.

Rivers in the Southwest provide the only large, sustainable water supply to more than 40 million people, yet flows have declined significantly since the late 20th century. Declining flows in the region's two most important rivers, the Colorado and the Rio Grande, have been attributed in part to increasing temperatures caused by human activities, most notably the burning of fossil fuels.

Multiple processes tied to warming are likely implicated in the observed aridification of the West, according to Overpeck and Udall. For starters, warmer air can hold more water vapor, and this thirsty air draws moisture from water bodies and land surfaces through evaporation and evapotranspiration--further drying soils, stressing plants and reducing streamflow.
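This "thirstier air" effect has a standard quantitative backbone: by the Clausius-Clapeyron relation, saturation vapor pressure rises roughly 7 percent per degree Celsius of warming. The sketch below uses the Magnus approximation with standard coefficients, not numbers from the Commentary:

```python
import numpy as np

# Clausius-Clapeyron sketch via the Magnus approximation: the atmosphere's
# capacity to hold water vapor grows ~7% per degree C of warming.
def e_sat_hpa(t_c):
    """Saturation vapor pressure (hPa) over water, Magnus formula."""
    return 6.112 * np.exp(17.62 * t_c / (243.12 + t_c))

for t in (15.0, 16.0, 17.0):
    print(f"{t:.0f} C: {e_sat_hpa(t):.2f} hPa")
print(f"increase per degree near 15 C: "
      f"{100 * (e_sat_hpa(16) / e_sat_hpa(15) - 1):.1f}%")  # ~6-7%
```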

But the atmosphere's increased capacity to hold water vapor also boosts the potential for precipitation; rain and snow amounts are, in fact, rising in many regions of the United States outside the Southwest. However, the frequency and intensity of dry spells and droughts are expected to increase across much of the continent in coming decades, even if average annual precipitation levels rise, according to Overpeck and Udall.

"Perhaps most troubling is the growing co-occurrence of hot and dry summer conditions, and the likely expansion, absent climate change action, of these hot-dry extremes all the way to the East Coast of North America, north deep into Canada, and south into Mexico," they write.

"Other parts of North America likely won't see the widespread aridification and decadal to multi-decadal droughts of the West, but will nonetheless continue to see more frequent and severe arid events--extreme dry spells, flash droughts and interannual droughts will become part of the new normal," according to Overpeck and Udall.

"Unfortunately, climate change and this aridification are likely irreversible on human time scales, so the sooner emissions of greenhouse gases to the atmosphere are halted, the sooner the aridification of North America will stop getting worse."

Credit: 
University of Michigan

Scientists use light to accelerate supercurrents, access forbidden light, quantum world

image: This illustration shows light wave acceleration of supercurrents, which gives researchers access to a new class of quantum phenomena. That access could chart a path forward for practical quantum computing, sensing and communicating applications.

Image: 
Image courtesy of Jigang Wang/Iowa State University

AMES, Iowa - Scientists are using light waves to accelerate supercurrents and access the unique properties of the quantum world, including forbidden light emissions that one day could be applied to high-speed, quantum computers, communications and other technologies.

The scientists have seen unexpected things in supercurrents - electricity that moves through materials without resistance, usually at super cold temperatures - that break symmetry and are supposed to be forbidden by the conventional laws of physics, said Jigang Wang, a professor of physics and astronomy at Iowa State University, a senior scientist at the U.S. Department of Energy's Ames Laboratory and the leader of the project.

Wang's lab has pioneered the use of light pulses at terahertz frequencies - trillions of cycles per second - to accelerate electron pairs, known as Cooper pairs, within supercurrents. In this case, the researchers tracked light emitted by the accelerated electron pairs. What they found were "second harmonic light emissions," or light at twice the frequency of the incoming light used to accelerate the electrons.

That, Wang said, is analogous to color shifting from the red spectrum to the deep blue.
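The signature itself is simple to state: drive at frequency f and look for emission at 2f. A minimal numerical sketch with a generic quadratic nonlinearity follows; it is purely illustrative and does not model the superconductor's Anderson pseudo-spin dynamics:

```python
import numpy as np

# Second-harmonic detection sketch: a quadratic term in the response
# radiates at twice the drive frequency (and at DC).
fs = 64.0                      # sampling rate, THz
t = np.arange(0, 64, 1 / fs)   # time, ps
f0 = 1.0                       # drive frequency, THz

drive = np.cos(2 * np.pi * f0 * t)
emitted = drive + 0.2 * drive**2   # generic, illustrative nonlinearity

spectrum = np.abs(np.fft.rfft(emitted))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
for f in (f0, 2 * f0):
    idx = np.argmin(np.abs(freqs - f))
    print(f"amplitude at {f:.0f} THz: {spectrum[idx]:.1f}")  # clear 2f peak
```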

"These second harmonic terahertz emissions are supposed to be forbidden in superconductors," he said. "This is against the conventional wisdom."

Wang and his collaborators - including Ilias Perakis, professor and chair of physics at the University of Alabama at Birmingham and Chang-beom Eom, the Raymond R. Holton Chair for Engineering and Theodore H. Geballe Professor at the University of Wisconsin-Madison - report their discovery in a research paper just published online by the scientific journal Physical Review Letters. (See sidebar for a list of the other co-authors.)

"The forbidden light gives us access to an exotic class of quantum phenomena - that's the energy and particles at the small scale of atoms - called forbidden Anderson pseudo-spin precessions," Perakis said.

(The phenomena are named after the late Philip W. Anderson, co-winner of the 1977 Nobel Prize in Physics who conducted theoretical studies of electron movements within disordered materials such as glass that lack a regular structure.)

Wang's recent studies have been made possible by a tool called quantum terahertz spectroscopy that can visualize and steer electrons. It uses terahertz laser flashes as a control knob to accelerate supercurrents and access new and potentially useful quantum states of matter. The National Science Foundation has supported development of the instrument as well as the current study of forbidden light.

The scientists say access to this and other quantum phenomena could help drive major innovations:

"Just like today's gigahertz transistors and 5G wireless routers replaced megahertz vacuum tubes or thermionic valves over half a century ago, scientists are searching for a leap forward in design principles and novel devices in order to achieve quantum computing and communication capabilities," said Perakis, with Alabama at Birmingham. "Finding ways to control, access and manipulate the special characteristics of the quantum world and connect them to real-world problems is a major scientific push these days. The National Science Foundation has included quantum studies in its '10 Big Ideas' for future research and development critical to our nation."

Wang said, "The determination and understanding of symmetry breaking in superconducting states is a new frontier in both fundamental quantum matter discovery and practical quantum information science. Second harmonic generation is a fundamental symmetry probe. This will be useful in the development of future quantum computing strategies and electronics with high speeds and low energy consumption."

Before they can get there, though, researchers need to do more exploring of the quantum world. And this forbidden second harmonic light emission in superconductors, Wang said, represents "a fundamental discovery of quantum matter."

Credit: 
Iowa State University

Landmark recommendations on development of artificial intelligence and the future of global health

May 19, 2020 - A landmark review of the role of artificial intelligence (AI) in the future of global health published in The Lancet calls on the global health community to establish guidelines for development and deployment of new technologies and to develop a human-centered research agenda to facilitate equitable and ethical use of AI. The review and recommendations were developed by Nina Schwalbe, MPH, adjunct professor in the Heilbrunn Department of Population and Family Health at the Columbia University Mailman School of Public Health, and Principal Visiting Fellow at United Nations University - International Institute for Global Health, and Brian Wahl, PhD, assistant scientist in the Department of International Health at the Johns Hopkins Bloomberg School of Public Health.

Advances in information technology infrastructure and mobile computing power in many low and middle-income countries (LMICs) have raised hopes that AI could help to address challenges which are unique to the field of global health and accelerate the achievement of the health-related Sustainable Development Goals (SDGs) and Universal Health Coverage (UHC). However, the deployment of AI-enabled interventions must be exercised with care and caution for individuals and societies to benefit equally, especially in the current context of the digital tools and systems being rapidly deployed in response to the novel coronavirus disease 2019 (COVID-19).

"Especially during the COVID-19 emergency, we cannot ignore what we know about the importance of human-centered design and gender bias of algorithms," said Schwalbe. "Thinking through how AI interventions will be adapted within the context of the health systems in which they are deployed must be part of every study."

"This review marks an important point in our rapidly developing digital age at which to reflect on the impressive opportunities that AI may hold, but also consider what we are urgently missing to protect those most at risk - exciting developments but many are being rolled out without adequate evidence or appropriate safeguards" said Dr. Naomi Lee, Senior Executive Editor at The Lancet.

According to Wahl and Schwalbe, artificial intelligence is already being used in high-resource settings to address COVID-19 response activities, including patient risk assessment and managing patient flow. They point out, however, that while artificial intelligence could support the COVID-19 response in resource-limited settings, there are currently few mechanisms to ensure its appropriate use in such settings.

As the field of AI is rapidly evolving in global health, and in light of the COVID-19 response, the review highlights the following recommendations:

Incorporate aspects of human-centered design into the development process, including starting from a needs-based rather than a tool-based approach;

Ensure rapid and equitable access to representative datasets;

Establish global systems for assessing and reporting efficacy and effectiveness of AI-driven interventions in global health;

Develop a research agenda that includes implementation and system-related questions on the deployment of new AI-driven interventions;

Develop and implement global regulatory, economic, and ethical standards and guidelines that safeguard the interests of LMICs.

Schwalbe and Wahl developed these recommendations through an extensive review of the peer-reviewed literature, to help ensure that AI improves health in LMICs and contributes to the achievement of the SDGs and UHC, as well as to the COVID-19 response.

"In the eye of the COVID-19 storm, now more than ever we must be vigilant to apply regulatory, ethical, and data protection standards. We hold ourselves to ethical standards around proving interventions work before we roll them out at scale. Without this, we risk undermining the vulnerable populations we are best trying to support" said Schwalbe.

The review was supported by Fondation Botnar, a Swiss-based foundation that champions the use of AI and digital technology to improve the health and wellbeing of children and young people in growing urban environments.

"We are proud to have supported this critical and timely review", said Stefan Germann, CEO of Fondation Botnar. "In anticipation of the adoption of the new WHO Global Strategy on Digital Health later this year, and the rapid deployment of technologies in response to COVID-19, we need to raise the discussions on the human rights issues and necessary governance structures around data use and sharing, and the role of institutions such as the WHO in providing leadership."

Credit: 
Columbia University's Mailman School of Public Health

Fishing rod 'selfie stick' and scientific sleuthing turn up clues to extinct sea reptile

image: An artistic life reconstruction of Nannopterygius.

Image: 
Andrey Atuchin

A Russian paleontologist visiting the Natural History Museum in London desperately wanted a good look at the skeleton of an extinct aquatic reptile, but its glass case was too far up the wall. So he attached his digital camera to a fishing rod and -- with several clicks -- snagged a big one, scientifically speaking.

Images from the "selfie stick" revealed that the creature, whose bones were unearthed more than a century ago on a coast in southern England, seemed very similar to a genus of ichthyosaurs he recognized from Russian collections.

He emailed the photos of the dolphin-like ichthyosaur to fellow paleontologist Megan L. Jacobs, a Baylor University doctoral candidate in geosciences. She quickly realized that the animal's skeletal structure matched not only some ichthyosaurs she was studying in a fossil museum on the English Channel coast, but also some elsewhere in the United Kingdom.

Jacobs and paleontologist Nikolay G. Zverkov of the Russian Academy of Sciences -- who "fished" for the ichthyosaur -- merged their research, studying their collective photos and other materials and ultimately determining that the Russian and English ichthyosaurs were of the same genus and far more common and widespread than scientists believed.

Their study is published in the Zoological Journal of the Linnean Society.

"Ichthyosaurs swam the seas of our planet for about 76 million years," Jacobs said. "But this 5-foot ichthyosaur from some 150 million years ago was the least known and believed to be among the rarest ichthyosaurs. The skeleton in the case, thought to be the only example of the genus, has been on display in the Natural History Museum in London since 1922.

"Nikolay's excellent detailed photos significantly expand knowledge of Nannopterygius enthekiodon," she said. "Now, after finding examples from museum collections across the United Kingdom, Russia and the Arctic -- as well as several other Nannopterygius species -- we can say Nannopterygius is one of the most widespread genera of ichthyosaurs in the Northern Hemisphere."

Additionally, the study described a new species, Nannopterygius borealis, dating from about 145 million years ago in a Russian archipelago in the Arctic. The new species is the northernmost and youngest representative of its kind, Jacobs said.

Previously, for the Middle and Late Jurassic epochs, the only abundant and most commonly found ichthyosaur was Ophthalmosaurus, which had huge eyes and was about 20 feet long. It was known from hundreds of specimens, including well-preserved skeletons from the Middle Jurassic Oxford Clay Formation of England, Jacobs said.

"For decades, the scientific community thought that Nannopterygius was the rarest and most poorly known ichthyosaur of England," Zverkov said. "Finally, we can say that we know nearly every skeletal detail of these small ichthyosaurs and that these animals were widespread. The answer was very close; what was needed was just a fishing rod."

Credit: 
Baylor University

Researchers find potential drug treatment targets for alcohol-related liver disease

Alcohol-related liver disease (ALD) is a deadly condition affecting more than 150 million people worldwide, with no treatment available besides transplant.

But now, a team led by researchers from Massachusetts General Hospital (MGH) has uncovered key molecular stepping stones in ALD that may provide targets for drug therapy development. Their work was recently published in Proceedings of the National Academy of Sciences (PNAS).

This groundbreaking research was achieved by combining RNA sequence analysis of patient liver samples with transgenic mouse studies.

The team identified two promising potential molecular targets for ALD drug development - cGAS and Cx32. The lead author on the study is Jay Luther, MD, gastroenterologist and Director of the Mass General Alcohol Liver Center. The senior author is Suraj J. Patel, MD, PhD, a research fellow in the Department of Medicine.

"Now that we know the key players in this pathway, we finally have drug targets for treatment development," says Luther. "Until now, we haven't had any successful leads. Meanwhile, the number of patients has grown."

Researchers already knew that liver cell (hepatocyte) death in ALD is driven by interferon regulatory factor 3 (IRF3). That process also fuels a strong secondary inflammatory response that affects nearby cells and can eventually lead to liver failure.

But scientists have been puzzling over how alcohol activates IRF3 and which pathways amplify the inflammatory signals that make the disease spread throughout the liver. Beyond a certain point in that process, ALD attacks the liver regardless of whether the patient is still drinking alcohol.

"Until now, now we had only few clues about why alcohol-related liver disease progresses the way it does, but this research fills in key pieces of the puzzle," says Patel.

In their studies, the team used RNA sequencing to analyze liver cells from patients with a wide range of disease severity. Their analysis showed that the level of cGAS expression tracked the degree of disease.
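The kind of association tested can be sketched as a rank correlation between expression and an ordinal severity grade. The numbers below are synthetic; the study's RNA-seq analysis is far richer:

```python
import numpy as np
from scipy.stats import spearmanr

# Does cGAS expression track disease severity across patient samples?
# (Synthetic values; severity grades and effect size are illustrative.)
rng = np.random.default_rng(3)
severity = np.repeat([0, 1, 2, 3], 10)                    # ordinal grade
cgas_expr = 5 + 0.8 * severity + rng.normal(0, 0.7, severity.size)

rho, p = spearmanr(severity, cgas_expr)
print(f"Spearman rho = {rho:.2f}, p = {p:.1e}")
```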

In a study of alcohol-fed mice, they found those animals also had higher expression of the cGAS-IRF3 pathway in liver cells. Meanwhile, mice that were genetically engineered to have lower levels of cGAS and IRF3 were less susceptible to ALD.

The Mass General-led team determined that the cytoplasmic sensor cyclic GMP-AMP synthase (cGAS) drives IRF3 activation both in liver cells directly injured by alcohol and in neighboring cells. They also pinpointed connexin 32 (Cx32), a key regulator of cGAS-driven IRF3 activation, as a possible new drug target.

"It is very encouraging to see that we now have evidence-based targets for drug development for ALD," says Luther.

Credit: 
Massachusetts General Hospital

New biomarker could flag tumors that are sensitive to common diabetes drug

image: This is Russell Jones, Ph.D.

Image: 
Courtesy of Van Andel Institute

GRAND RAPIDS, Mich. (May 19, 2020) -- A newly identified biomarker could help scientists pinpoint which cancers are vulnerable to treatment with biguanides, a common class of medications used to control blood sugar in Type 2 diabetes.

Biguanides, particularly a medication called metformin, have long been of interest to cancer researchers because of their ability to target cellular metabolism, which fuels the growth and spread of malignant cells. To date, the success of biguanides as potential cancer therapeutics has been mixed, largely due to the difficulty in getting enough of the agent into cancer cells to be effective and the lack of a way to determine which cancers will respond to treatment.

"Cancers vary widely in how they react to different therapies -- what works for one cancer type may not work for another -- but regardless, they are all reliant on metabolism for energy production," said Van Andel Institute Professor Russell Jones, Ph.D., the study's senior author and leader of VAI's Metabolic and Nutritional Programming group. "Our results establish two important things: First, they give us a way to objectively determine which types of cancer are sensitive to biguanide treatment and, second, they illuminate how and why some patients may respond better to biguanides than other patients."

Published today in Cell Reports Medicine, the findings identify a microRNA regulated by the gene MYC as a biomarker for cancers that are sensitized to biguanide treatment. MYC is a well-known cancer-related gene whose activity is increased in as many as 70% of lymphomas. MYC works in part by turning down the activity of other genes that suppress tumor growth while heightening metabolic activity in cancer cells, a combination that allows the cells to flourish.

But there's a trade-off. While MYC helps fuel cancer cells' voracious appetites, it also turns off the cells' ability to respond to a stressful biological environment, limiting the flexibility of their metabolism. Treatment with biguanides cuts off this energy supply, causing stress the cells cannot cope with and leading them to die. In Type 2 diabetes, biguanides are used to lower blood sugar, but in certain cancer cells, such as lymphomas with high MYC expression, the added metabolic stress kills the cancer cells.

"Biguanides have great potential as cancer treatments, particularly for blood cancers," Jones said. "Biomarkers such as what we have found here are vital tools for determining which cancers will respond to biguanides and which will not, which is important for patient care as well as designing more effective clinical trials."

As part of the study, Jones and his colleagues also characterized an experimental biguanide called IM156, which is more potent than existing biguanides. IM156 was developed by ImmunoMet Therapeutics, a clinical-stage biotechnology company that develops anti-tumor and anti-fibrotic therapies. ImmunoMet recently completed a Phase 1 safety study in which IM156 was well tolerated. Jones serves as a member of ImmunoMet's Board of Scientific Advisors.

Credit: 
Van Andel Research Institute

Algorithmic autos

image: UD doctoral student A M Ishtiaque Mahbub is the first author of two new technical papers published by SAE -- formerly known as the Society of Automotive Engineers -- describing how UD engineers optimized vehicle dynamics and powertrain operation using connectivity and automation as well as how they developed and tested a control framework that reduced travel time and energy use in a connected and automated vehicle.

Image: 
Photo courtesy of Ishti Mahbub

Imagine merging into busy traffic without ever looking over your shoulder, and without accelerating or braking so hard that you irritate the driver in the next lane. Connected and automated vehicles that communicate to coordinate optimal traffic patterns could enable this pleasant driving scenario sooner than you think.

At the University of Delaware, a research group of students is developing algorithms for connected and automated vehicles that reduce energy consumption and travel delays. The Information and Decision Science Lab is led by Andreas Malikopoulos, Terri Connor Kelly and John Kelly Career Development Associate Professor.

Connected and automated vehicles use technology such as sensors, cameras and advanced control algorithms to adjust their operation to changing conditions with little or no input from drivers.

For doctoral student A M Ishtiaque Mahbub, the project has offered unprecedented opportunities. He is the first author of two new technical papers published by SAE -- formerly known as the Society of Automotive Engineers -- describing how UD engineers optimized vehicle dynamics and powertrain operation using connectivity and automation as well as how they developed and tested a control framework that reduced travel time and energy use in a connected and automated vehicle.

The team is optimizing an Audi A3 e-tron, a plug-in hybrid electric vehicle. First, the team members developed control architectures to reduce stop-and-go driving and travel time while ensuring energy efficiency. Next, the team tested the algorithms using driving simulators in UD's Spencer Laboratory.
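A classic building block behind such architectures is the minimum-effort trajectory of a double-integrator vehicle that must reach a scheduled merge slot, for which the optimal acceleration is linear in time. The sketch below uses toy numbers and is not UD's exact control framework:

```python
import numpy as np

# Minimum-effort (integral of u^2) double-integrator trajectory to a merge
# slot: minimizing control effort gives u(t) = a*t + b, with a and b fixed
# by the boundary conditions (toy numbers, not the UD controllers).
x0, v0 = 0.0, 20.0              # current position (m) and speed (m/s)
xf, vf, T = 260.0, 15.0, 15.0   # merge point (m), slot speed (m/s), time (s)

# Position: x0 + v0*T + a*T**3/6 + b*T**2/2 = xf
# Velocity: v0 + a*T**2/2 + b*T = vf
A = np.array([[T**3 / 6, T**2 / 2],
              [T**2 / 2, T]])
rhs = np.array([xf - x0 - v0 * T, vf - v0])
a, b = np.linalg.solve(A, rhs)

u = a * np.linspace(0, T, 6) + b
print("acceleration samples (m/s^2):", np.round(u, 2))  # gentle, no hard braking
```

The same boundary-condition logic extends to coordinating many vehicles: each is assigned a merge time, and each solves for the smoothest trajectory that honors it, which is what eliminates stop-and-go behavior.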

Then, in October 2019, they put their work to the test in the University of Michigan's MCity, a testing ground for cutting-edge vehicles. The software developed at UD went into the Audi A3 e-tron.

On test day, Mahbub stepped into the test car with two other engineers from Bosch. Each was equipped with a laptop to take data as they drove along a track that included a roundabout, merging zone, intersection and other challenges. The connected and automated vehicle is designed to take over and navigate these situations for you.

"This alleviates stress, and by eliminating stop-and-go driving behavior where you're constantly braking and accelerating, braking and accelerating, or even yielding, it also has a smooth margin in those cases, which also as a byproduct increases the fuel efficiency," said Mahbub.

Virtual reality was used to simulate challenges for the car to navigate around, such as other cars and pedestrians.

With months of preparation behind him, Mahbub was excited for the test, but nervous, too. "There is a certain level of uncertainty that plays on your mind, that, OK: The theory and control algorithms worked in simulation, but how about in the real world?" he said. "How might the real-world uncertainties and unknown variables affect the system?"

The test was a success, with a 30 percent increase in energy efficiency, more than the simulation even predicted.

The real-world scenario helped Mahbub put his analysis in context, gain an even greater understanding of the vehicle's control architecture, and collect data that could be used to realize and quantify even greater gains in energy efficiency.

"At one point in the field test I was feeling a bit nauseous because the centrifugal force was a little too much," he said. "I'm thinking right now going forward if we plan to visit MCity, I will definitely put that in my algorithm so that the passengers will have a more comfortable drive."

Credit: 
University of Delaware