Tech

In your face: a compact RGB scanning projector for wearable displays and smart glasses

image: Diagram of the proposed device and the actual implementation, which is compact and thus suitable for wearable devices

Image: 
University of Fukui

Thanks to the remarkable progress that has been made in miniaturization over the last decade, wearable electronic devices that seemed far-fetched in the 2000s are now within reach. Prime examples of such technology are smart glasses and similar head-mounted displays, which leverage recent advances in virtual and augmented reality to create an immersive experience or provide hands-free utilities.

However, existing prototypes of smart glasses and wearable displays suffer from limitations that hinder their commercialization, such as bulkiness, short battery life, and sub-par image quality. At the University of Fukui, Japan, scientists have been making steady progress towards an incredibly small yet powerful image projector for eyewear displays. In fact, in collaboration with SEIREN KST Corp., a Japanese silicon manufacturer, they're getting ready to commercialize their latest innovations within a year.

In their latest work, which was presented at the 27th International Display Workshops (IDW '20), they succeeded in creating an optical engine by integrating a compact RGB laser module measuring only 8×4×3 mm with a microelectromechanical systems (MEMS) mirror. The direction in which the MEMS mirror reflects light from the laser module can be controlled electronically, making it possible to project high-quality 2D images by scanning the laser over the projected area.

One challenging aspect of making the laser module was combining the light beams from three independent laser sources to obtain a single RGB output. To achieve this, the scientists used a waveguide-type combiner, in which each of the three waveguides receives light of one of the primary colors. Although only the center green waveguide is connected to the actual optical output, the blue and red light travelling along the adjacent dead-end waveguides is passed to the center waveguide through strategically placed couplers. "The measured efficiency of the combiner was as high as 97%, which represents a loss of only 0.13 dB," remarks Asst. Prof. Akira Nakao from the University of Fukui, lead author of the study. "The outputs from the individual RGB lasers end up perfectly aligned thanks to the nature of the waveguide-type combiner," he adds. Moreover, by using an achromatic lens, an excellent circular focused beam is achieved, with the option of using other lenses to produce collimated beams with larger diameters.
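As a quick back-of-the-envelope check (not part of the study itself), the quoted 97% efficiency and 0.13 dB loss are two expressions of the same quantity; the conversion can be verified in a couple of lines of Python:

```python
import math

# The combiner's 97% optical efficiency expressed as an insertion loss in decibels:
# loss_dB = -10 * log10(P_out / P_in)
efficiency = 0.97
loss_db = -10 * math.log10(efficiency)
print(f"Insertion loss: {loss_db:.2f} dB")  # prints ~0.13 dB, matching the quoted figure
```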

The laser beam scanning module can project 1280×720 color video by tuning the MEMS driving frequency. These characteristics, together with its small size and potentially low power consumption, make the proposed laser scanning projector a promising device for wearable displays. Further tuning will be required before images can be safely projected directly onto the retina of the human eye, but the scientists have been working on this and the foundations of the new technology have been laid.
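As a rough illustration of what "tuning the MEMS driving frequency" entails, the sketch below uses the standard relation for bidirectional resonant raster scanning; the 60 frames-per-second rate is an assumption for illustration, not a figure from the study:

```python
# Rough estimate only: assumes bidirectional resonant raster scanning and a 60 fps frame rate
# (the frame rate is an assumption for illustration, not a figure from the study).
vertical_lines = 720          # rows in a 1280x720 image
frame_rate_hz = 60            # assumed frames per second
lines_per_second = vertical_lines * frame_rate_hz
fast_axis_freq_hz = lines_per_second / 2   # two lines are drawn per mirror oscillation
print(f"Fast-axis mirror frequency: {fast_axis_freq_hz / 1e3:.1f} kHz")  # ~21.6 kHz
```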

Excited about the results, Asst. Prof. Nakao comments: "At the University of Fukui, we are trying to stir things up in the wearable display industry by developing smart glasses with the optical engine, driver circuit, projector, and battery all integrated in a single device." The applications of wearable displays go beyond virtual and augmented reality for entertainment and could enable better conferencing, surveillance, and even remote-assisted surgery. Expanding on these possible applications, Asst. Prof. Nakao says: "For now, our unit can be used in laser microscopes, sensors, projectors, and head-up displays, particularly those for novel automobile systems with intelligent driving technology, which are all set to reshape our future."

Let's hope further progress in this field soon takes us to this projected future!

Credit: 
University of Fukui

Low oxygen levels in lakes and reservoirs may accelerate global change

image: Beaverdam Reservoir in Vinton, Virginia. Photo courtesy of Alexandria Hounshell.

Image: 
Alexandria Hounshell

Because of land use and climate change, lakes and reservoirs globally are seeing large decreases in oxygen concentrations in their bottom waters. It is well-documented that low oxygen levels have detrimental effects on fish and water quality, but little is known about how these conditions will affect the concentration of carbon dioxide and methane in freshwaters.

Carbon dioxide and methane are the primary forms of carbon found in the Earth's atmosphere. Both of these gases are partially responsible for the greenhouse effect, a process that increases global air temperatures. Methane is 34 times more potent a greenhouse gas than carbon dioxide, so knowing how low oxygen levels within lakes and reservoirs affect both carbon dioxide and methane could have important implications for global warming.

Until now, researchers did not have any empirical data from the whole-ecosystem scale to definitively say how changing oxygen can affect these two greenhouse gases.

"We found that low oxygen levels increased methane concentrations by 15 to 800 times at the whole-ecosystem scale," said Alexandria Hounshell, a postdoctoral researcher in the Department of Biological Sciences in the College of Science. "Our work shows that low oxygen levels in the bottom waters of lakes and reservoirs will likely increase the global warming potential of these ecosystems by about an order of magnitude."

Virginia Tech researchers just published these findings in a high-impact paper in Limnology and Oceanography Letters.

To determine a correlation between oxygen and methane concentrations, researchers homed in on two reservoirs outside of Roanoke. In collaboration with the Western Virginia Water Authority, the research team operated an oxygenation system in Falling Creek Reservoir, which pumps oxygen into the bottom waters and allows researchers to study oxygen concentrations on a whole-ecosystem scale. By also monitoring Beaverdam Reservoir, an upstream reservoir without an oxygenation system, they were able to compare greenhouse gas concentrations in the bottom waters of both reservoirs. They ran the experiment over three years to see how consistent their findings were over time.

"Methane levels were much higher when there was no oxygen in the bottom waters of these reservoirs; whereas the carbon dioxide levels were the same, regardless of oxygen levels," said Cayelan Carey, associate professor of biological sciences and affiliated faculty member of the Global Change Center. "With low oxygen levels, our work shows that you'll get higher production of methane, which leads to more global warming in the future."

This study was one of the first to experimentally test at the whole-ecosystem scale how different oxygen levels affect greenhouse gases. Logistically, it is extremely challenging to manipulate entire ecosystems due to their complexity and many moving parts. Even though scientists can use computer modelling and lab experiments, nothing is as definitive as the real thing.

"We were able to do a substitution of space for time because we have these two reservoirs that we can manipulate and contrast with one another to see what the future may look like, as lower bottom water oxygen levels become more common. We can say with high certainty that we are going to see these lakes become bigger methane emitters as oxygen levels decrease," said Carey.

According to Hounshell, the strength of their results lies in the study's span across multiple years. Despite a range of meteorological conditions over the three years, the study affirmed that much higher methane concentrations in low oxygen conditions occur consistently every year, no matter the air temperature.

Ultimately, this study is crucial for how researchers, and the general public, think about how freshwater ecosystems produce greenhouse gases in the future. With low oxygen concentrations increasing in lakes and reservoirs across the world, these ecosystems will produce higher concentrations of methane in the future, leading to more global warming.

Of course, these ecological changes are not just happening in the Roanoke region. Around the globe, a number of studies have pointed to changing carbon cycling in terrestrial and marine ecosystems. However, this study is one of the few to address this phenomenon in lakes and reservoirs, which are often neglected in carbon budgets. This study will fill in these knowledge gaps and shine a spotlight on what we can do as citizens to solve this problem.

This study suggests that keeping lakes from experiencing low oxygen concentrations in the first place could further prevent them from hitting the tipping point, when they start to become large methane producers. Small decisions can add up. For example, decreasing runoff into lakes and reservoirs can prevent the depletion of oxygen in their bottom waters. "Don't put a ton of fertilizer on your lawn, and be really strategic about how much fertilizer you use and how you use it," said Hounshell.

And greenhouse gases are just a small part of the bigger picture of how reservoirs function in the global carbon cycle. Currently, the research team is conducting follow-up oxygen manipulation studies to elucidate other components that contribute to ecosystem change. They will continue to monitor oxygen manipulations in the two Roanoke reservoirs to see how they affect these ecosystems over the long haul.

Credit: 
Virginia Tech

Essential oral healthcare during the COVID-19 pandemic

Alexandria, Va., USA -- The COVID-19 pandemic has revealed the need for consensus on the definition of essential oral healthcare. The article "Pandemic considerations on essential oral healthcare" provides a layered model of essential oral healthcare, integrating urgent and basic oral healthcare, as well as advanced and specialist oral healthcare.

Essential oral healthcare covers the most prevalent oral health problems but, by default, does not include the full spectrum of possible interventions that contemporary dentistry can provide. A layered approach to the definition of essential oral healthcare allows for categorization and prioritization with available resources and needs in mind. This model also includes a definition of basic oral healthcare and calls for oral healthcare to be recognized as an integral component of a healthcare system's essential services.

"There is a significant need for evidence-based criteria to define which dental interventions are to be included in each category of essential oral healthcare. A lack of clearly defined essential oral healthcare services leaves people at risk for physical, mental and social harm," said JDR Editor-in-Chief Nicholas Jakubovics. "All stakeholders, including the research, academic and clinical communities, need to work together to respond to this call for a consensus."

Credit: 
International Association for Dental, Oral, and Craniofacial Research

Researchers estimate nearly one-third of oaks are threatened with extinction

image: Acorns of Quercus bambusifolia, an endangered oak found in China and Vietnam.

Image: 
The Morton Arboretum

Lisle, Ill. (Dec. 10) -- An estimated 31% of the world's oak species are threatened with extinction according to data compiled in a new report by The Morton Arboretum and the International Union for Conservation of Nature (IUCN) Global Tree Specialist Group, The Red List of Oaks 2020. The report details for the first time the distributions, population trends and threats facing the world's estimated 430 oak species, and will serve as a roadmap for conservation action.

According to Arboretum researchers, an estimated 41% of the world's 430 oak species are of conservation concern, and nearly one-third (31%) are considered threatened with extinction. This proportion of threatened species is higher than the threat levels for mammals (26%) and birds (14%). The report indicates that the countries with the highest numbers of threatened oak species are China (36 species), Mexico (32), Vietnam (20), and the United States (16).

"As we were evaluating the extinction risk of these hundreds of species over the past several years, it became clear how dire the situation is for oaks," said Murphy Westwood, Ph.D., Director of Global Tree Conservation at the Arboretum. "We finally have a complete picture of the state of the world's oaks, so conservationists worldwide can take informed action to save oaks from extinction," Westwood stressed.

The analysis of the global patterns of threat to oaks reveals that invasive pests, diseases and climate change are the key threats to oaks in the United States, whereas deforestation for agriculture and urbanization are the biggest drivers of change in Southeast Asia. The report calls for concerted conservation efforts and capacity building in the global centers of diversity for oaks in Mexico and Southeast Asia.

Toward that end, the Arboretum and Botanic Gardens Conservation International (BGCI) also established the Global Conservation Consortium for Oak (GCCO), partnering with botanical gardens, arboreta, universities and government agencies around the world to create a network of experts and institutions to protect threatened oaks globally. The Consortium will use the results to prioritize and guide conservation efforts in the wild and in living botanical collections, such as the conservation groves of threatened oaks being established at the Arboretum.

"More than 2,300 species of bird, mosses, fungi, insects, lichens and mammals are recorded as using native oaks for food and shelter in the U.K., and the same will be true for the 113 species of oak now threatened with extinction," said BGCI Secretary General Paul Smith. "The loss of just one of these tree species has catastrophic consequences for hundreds of other species," Smith warned.

The Red List of Oaks 2020 is the culmination of five years of research and consultation with more than 100 experts from around the globe to assess the extinction risk of the world's oak species. This initiative represents The Morton Arboretum and its collaborators' contribution to achieving one of the 16 targets of the Global Strategy for Plant Conservation: to have an assessment of the conservation status of all known plant species by the end of 2020.

"The comprehensive assessment of all oak species is a major achievement," said Craig Hilton-Taylor, Head of the IUCN Red List Unit. "Having this information about such an ecologically and economically important group of trees is vital for informing conservation efforts."

Credit: 
The Morton Arboretum

Police investigators of online child abuse at risk of mental harm

Police who investigate online crimes against children, and protect wider society from seeing images of violence against young people, are themselves at risk of moral injury and other psychological harms.

Researchers at the University of Portsmouth and Solent University explored moral injury amongst child exploitation investigators, interviewing police officers from two constabularies during a year-long study. The project, funded by CREST (the Centre for Research and Evidence on Security Threats), asked questions relating to motivations for taking on the role, any personality changes, prior trauma, difficulties relating to their current role, coping mechanisms, moral decision making and use of professional support.

Professor Peter Lee, Director of Security and Risk Research and Innovation at the University of Portsmouth, said: "We found that law enforcement professionals who investigate child exploitation can be continually exposed to traumatising visual images in their jobs for years on end. This makes them particularly vulnerable to moral injury, PTSD, anxiety, depression and secondary trauma."

Speaking ahead of the launch of the report Professor Lee added: "Investigating online child sex crime is an extreme example of regularly and repeatedly witnessing acts that transgress the moral frameworks of those involved. This study is important to find out more about the causes and consequences of moral injury amongst these police investigators to enhance support in the future."

Researchers found that some individuals were distressed by their current role, a consequence of repeatedly experiencing intrusive thoughts and images combined with overwhelming workloads and timeframes. The study suggests ways to improve the management of the psychological distress caused by viewing indecent images by enhancing current professional support. Police officers rely heavily on family and peer support to help process their distress, rather than using the professional help currently available to them.

Dr Mark Doyle, Solent University said: "We are enormously grateful to CREST for funding this vitally important research and to those police officers that have taken part. We aim to build on this current research and support it with a further in-depth exploration of the issues raised, by collaborating with relevant government organisations, police and industry authorities where traumatising imagery is regularly encountered.

"We also aim to improve the recognition, training and professional support for those who conduct this difficult role. We have identified a number of areas for follow-on research and engagement with police online investigators to enhance existing support schemes and develop new specialised training to promote the psychological wellbeing of those on the front line."

The evidence and insights from the research will help to enhance good mental health practice, not just for these police investigators but for other professionals whose work can be mentally traumatic. The research team hopes to go on to establish greater understanding of the prevalence and severity of mental health difficulties, personality change, mental health stigma, behavioural and other factors that specifically influence wellbeing.

Credit: 
University of Portsmouth

The greening of the earth is approaching its limit

When plants absorb carbon dioxide (CO2) to grow, they remove it from the atmosphere and sequester it in their branches, trunks or roots. An article published today in Science shows that this fertilizing effect of CO2 is decreasing worldwide. The study was co-directed by Professor Josep Peñuelas of the CSIC at CREAF and Professor Yongguan Zhang of the University of Nanjing, with the participation of CREAF researchers Jordi Sardans and Marcos Fernández. Carried out by an international team, it concludes that the effect has declined progressively since 1982, by as much as 50%, due primarily to two key factors: the availability of water and nutrients. "There is no mystery about the formula: plants need CO2, water and nutrients in order to grow. However much the CO2 increases, if the nutrients and water do not increase in parallel, the plants will not be able to take advantage of the increase in this gas," explains Professor Josep Peñuelas. In fact, three years ago Prof. Peñuelas already warned in an article in Nature Ecology and Evolution that the fertilizing effect of CO2 would not last forever and that plants cannot grow indefinitely, because other factors limit them.

If the fertilizing capacity of CO2 decreases, there will be strong consequences for the carbon cycle and therefore for the climate. Forests have received a veritable CO2 bonus for decades, which enabled them to do more photosynthesis and grow more, sequestering tons of carbon dioxide in the process. This increased sequestration has helped offset some of the CO2 accumulating in the air, but that bonus is now coming to an end. "These unprecedented results indicate that the absorption of carbon by vegetation is beginning to become saturated. This has very important climate implications that must be taken into account in possible climate change mitigation strategies and policies at the global level. Nature's capacity to sequester carbon is decreasing and with it society's dependence on future strategies to curb greenhouse gas emissions is increasing," warns Josep Peñuelas.

The study published in Science was carried out using satellite, atmospheric, ecosystem and modelling data. It highlights the use of near-infrared and fluorescence sensors capable of measuring vegetation growth activity.

Less water and nutrients

According to the results, the lack of water and nutrients are the two factors that reduce the capacity of CO2 to improve plant growth. To reach this conclusion, the team drew on data from hundreds of forests studied over the last 40 years. "These data show that concentrations of essential nutrients in the leaves, such as nitrogen and phosphorus, have also progressively decreased since 1990," explains researcher Songhan Wang, the first author of the article.

The team has also found that water availability and temporal changes in water supply play a significant role in this phenomenon. "We have found that plants slow down their growth, not only in times of drought, but also when there are changes in the seasonality of rainfall, which is increasingly happening with climate change," explains researcher Yongguan Zhang.

Credit: 
Spanish National Research Council (CSIC)

Bacteria release climate-damaging carbon from thawing permafrost

image: Permafrost areas thaw out and become marshland.

Image: 
Monique Patzner

Around a quarter of the ground in the northern hemisphere is permanently frozen. These areas are estimated to contain about twice as much carbon as the world's current atmosphere. New research says that these permafrost soils are not only increasingly thawing out as the Earth becomes warmer, but also releasing that carbon, which accelerates the thawing.

An international research team that includes Thomas Borch, Colorado State University professor in the Department of Soil and Crop Sciences, and Monique Patzner, a Ph.D. student at the University of Tübingen's Center for Applied Geoscience in Germany, has investigated the way this development affects the microorganisms in the soil. Their results have been published in Nature Communications. Borch, who co-advises Patzner, also holds appointments in the CSU Department of Civil and Environmental Engineering and Department of Chemistry. Patzner was lead author on the paper.

The work was led by Andreas Kappler of the University of Tübingen and Casey Bryce at the University of Bristol in the UK.

The team worked on the assumption that thawing increases the availability of organic carbon for microorganisms to process, in turn releasing vast amounts of carbon dioxide and methane. These gases accelerate the greenhouse effect, leading to further permafrost thawing in a vicious cycle.

Rising temperatures lead to collapse of intact permafrost soils, resulting in landslides and the widespread formation of wetlands. In this latest study, the team investigated what happens to the carbon trapped in the soil when the permafrost thaws out.

"The organic material naturally present in the samples accumulated as peat over thousands of years. With permafrost thaw, microbes become active and are able to decompose the peat," Kappler said. "We also know that iron minerals preserve organic carbon from biodegradation in various environments - and thus they could be a carbon sink even after the permafrost has thawed."

The reactive iron is present as a kind of rust and might be expected to trap the organic material in what the scientists call a "rusty carbon sink."

The team investigated the storage potential of the rusty carbon sink at a permafrost peatland at Stordalen mire, Abisko, Sweden. There, samples of the soil porewater and drill cores of the active layer were taken along a permafrost thaw gradient. The research team examined how much organic material was bound to reactive iron minerals, how stable these iron-carbon associations are with permafrost thaw, and whether the microorganisms present could use the material as a source of food and energy. The team also carried out experiments in the laboratory in Tübingen.

Borch's team was responsible for characterizing the iron mineralogy along the permafrost thaw gradient using synchrotron radiation at the Stanford Synchrotron Radiation Lightsource. They observed a loss of poorly crystalline iron and a decrease in organic matter-chelated iron, but an increase in iron-bearing clays and iron sulfides along the thaw gradient.

"This clearly indicated that important iron phases were dissolving due to permafrost thaw-induced anaerobic conditions," Borch said.

The team found that microorganisms are apparently able to use the iron as a food source, thereby releasing the bound organic carbon into the water in the soil.

"That means the rusty carbon sink cannot prevent the organic carbon from escaping from the thawing permafrost," Kappler said. "Based on data available from elsewhere in the northern hemisphere, we expect that our findings are applicable for permafrost environments worldwide," added Bryce

Patzner explained that the rusty carbon sink is only found in intact permafrost soils and is lost during permafrost thaw. Now the researchers are seeking to find out how this facilitates greenhouse gas emissions and thus global warming.

"It appears that the previously iron-bound carbon is highly bioavailable and, therefore, bacteria could immediately metabolize it into greenhouse gas emissions," Patzner said. "This is a process which is currently absent from climate-change prediction models and must be factored in."

The Borch lab is now using Fourier transform ion cyclotron resonance mass spectrometry at the National High Magnetic Field Laboratory to elucidate the chemical nature and fate of the organic matter released. These further inquiries should help improve understanding of carbon cycling in these sensitive ecosystems, Borch said.

Credit: 
Colorado State University

Atom-thin transistor uses half the voltage of common semiconductors, boosts current density

BUFFALO, N.Y. -- University at Buffalo researchers are reporting a new, two-dimensional transistor made of graphene and the compound molybdenum disulfide that could help usher in a new era of computing.

As described in a paper accepted at the 2020 IEEE International Electron Devices Meeting, which is taking place virtually next week, the transistor requires half the voltage of current semiconductors. It also has a current density greater than similar transistors under development.

This ability to operate with less voltage and handle more current is key to meeting the demand for new power-hungry nanoelectronic devices, including quantum computers.

"New technologies are needed to extend the performance of electronic systems in terms of power, speed, and density. This next-generation transistor can rapidly switch while consuming low amounts of energy," says the paper's lead author, Huamin Li, Ph.D., assistant professor of electrical engineering in the UB School of Engineering and Applied Sciences (SEAS).

The transistor is composed of a single layer of graphene and a single layer of molybdenum disulfide, or MoS2, which is part of a group of compounds known as transition metal dichalcogenides. The graphene and MoS2 are stacked together, and the overall thickness of the device is roughly 1 nanometer -- for comparison, a sheet of paper is about 100,000 nanometers thick.

While most transistors require at least 60 millivolts for a decade of change in current, this new device operates at just 29 millivolts per decade.
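To make the "millivolts per decade" figure concrete, here is a minimal, illustrative sketch (not the authors' device model) of how subthreshold swing relates gate voltage to drain current; 60 mV/decade is the room-temperature limit of conventional transistors, while the reported device reaches 29 mV/decade:

```python
# Illustrative only: an idealized subthreshold current model, I = I0 * 10**(Vg / SS),
# where SS is the subthreshold swing in volts per decade of current change.
def subthreshold_current(vg_volts, ss_volts_per_decade, i0_amps=1e-12):
    return i0_amps * 10 ** (vg_volts / ss_volts_per_decade)

conventional = subthreshold_current(0.029, 0.060)  # 29 mV applied to a 60 mV/decade device
dirac_source = subthreshold_current(0.029, 0.029)  # 29 mV applied to a 29 mV/decade device
print(f"60 mV/decade device: {conventional / 1e-12:.1f}x current increase")  # ~3x
print(f"29 mV/decade device: {dirac_source / 1e-12:.1f}x current increase")  # 10x, a full decade
```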

It's able to do this because the unique physical properties of graphene keep electrons "cold" as they are injected from the graphene into the MoS2 channel. This process is called Dirac-source injection. The electrons are considered "cold" because they require much less voltage input and, thus, reduced power consumption to operate the transistor.

An even more important characteristic of the transistor, Li says, is its ability to handle a greater current density compared to conventional transistor technologies based on 2D or 3D channel materials. As described in the study, the transistor can handle 4 microamps per micrometer.

"The transistor illustrates the enormous potential 2D semiconductors and their ability to usher in energy-efficient nanoelectronic devices. This could ultimately lead to advancements in quantum research and development, and help extend Moore's Law," says co-lead author Fei Yao, PhD, assistant professor in the Department of Materials Design and Innovation, a joint program of SEAS and UB's College of Arts of Sciences.

Credit: 
University at Buffalo

NYUAD researchers shed new light on mysteries behind the light emission of fireflies

image: A female beetle emitting green-yellow light

Image: 
NYU Abu Dhabi

Fast facts:

Bioluminescence is an energy-conserving process of natural production of cold light that many lower organisms use for communication, capturing prey, or mating.

This wondrous phenomenon has long fascinated scientists and the public, but many details of the chemical reactions used to produce light remain unclear. For example, it remains uncertain why various beetle species can emit different colors of light, despite using very similar light-producing enzymes.

Understanding the chemical reactions responsible for bioluminescence could lead to the development of new bioanalytical tools, such as those for early detection of cancer and diagnosis of other diseases.

Abu Dhabi, UAE, December 10, 2020: A team of researchers from NYU Abu Dhabi's (NYUAD) Smart Materials Lab (SML), led by Professor of Chemistry Panče Naumov, has conducted a thorough review of the scientific literature surrounding the natural production of light, called bioluminescence, and developed conclusions that will help others in the field direct their research to uncover the mysteries behind this fascinating natural phenomenon.

In the new study The Elusive Relationship Between Structure and Color Emission in Beetle Luciferases, which is featured on the cover of the journal Nature Reviews Chemistry, Naumov and colleagues provide the most comprehensive critical overview of the field of the bioluminescence of beetles, including fireflies, to date.

The NYUAD researchers, including the Naumov group's post-doctoral associates César Carrasco-López and Stefan Schramm, and undergraduate student Nathan M. Lui, identify the intricate structural factors that govern what color light is emitted by wild-type and mutant luciferases, the enzymes that generate light. They also demonstrate that it is possible to build a library of bioluminescent enzymes in the future, which will enable researchers to control the color and intensity of light emission by engineering luciferases at will.

"Learning from nature will provide us with tools to engineer luciferases that can emit colors within a large range of energies," said Naumov. "This will eventually help us expand the range of application of these and similar enzymes for some exciting applications in biology and medicine, including early diagnosis and prevention of diseases."

Throughout human history, bioluminescence has been an inspiration to scientists, artists, and laypersons. Glowing fungi or ostracods have been used by tribes and soldiers as lanterns to guide their way through jungles without the need for electricity, and fireflies were used by miners as safety lights.

The Nobel Prize in Chemistry in 2008 was awarded for the discovery of the green fluorescent protein, found in the bioluminescent jellyfish Aequorea victoria. Today, bioluminescence is the basis for a great number of bioanalytical methods, such as cell imaging, cancer research, and control of food contamination, as well as a way to efficiently convert the energy stored in chemical bonds into light that can be easily detected. For example, the bioluminescence of certain bacterial strains is used to monitor water toxicity and contamination, and fluorescent proteins are genetically inserted into cells and animals to analyze important aspects of the dynamics of some diseases.

The latest research from NYUAD's Naumov team is poised to solve some of the mysteries surrounding the chemistry of bioluminescence and to bring this research closer to applications.

Credit: 
New York University

Princeton Chem reports role of quantum vibrations in electron transfer

image: Vibrational wavepackets mapped to the electron transfer trajectory.

Image: 
Image courtesy of Bo Fu, Princeton Chemistry.

Princeton Chemistry's Scholes Group is reporting evidence that quantum vibrations participate in electron transfer, establishing with ultrafast laser spectroscopy that the vibrations provide channels through which the reaction takes place.

Seeking to establish an experimental proof for a highly contested topic - the role of vibrations in processes fundamental to solar energy conversion - Princeton researchers set out to map the progress of a photoinduced electron transfer (ET) reaction.

The short laser pulses in ultrafast spectroscopy helped to lock all the light-absorbing entities in step. Researchers were then able to watch the electron transfer dynamics and the vibrational dynamics simultaneously through beats created by the vibrational coherences. They found that the photoinduced ET reaction occurs in ~30 femtoseconds, which contrasts with conventional Marcus theory, and concluded that the unexpectedly rapid pace of the reaction revealed some unknown mechanisms at play.

"What we found is a unique cascade of quantum mechanical events occurring succinctly with the electron transfer reaction," said Shahnawaz Rafiq, a former postdoc in the Scholes Group and lead author of the paper. "These events appear sequentially in the form of loss of phase coherence along high-frequency vibrations, followed by impulsive appearance of a phase coherence along a low-frequency vibration.

"These two events of quantum nature occur because of the role these vibrations play in enabling this ET reaction," said Rafiq. "That's a major part of what we're reporting: how we're able to pinpoint certain places in spectral data that tell us, oh, this is the point of importance. It's a needle in a haystack."

In addition, researchers found an extra vibrational wavepacket in the product state, which was not there in the reactant state.

"It is as if the ET reaction itself created that wavepacket," said Rafiq. "The ultimate revelation is that there is an order to the structural changes associated with a reaction that is decided by the frequencies of the vibrational modes."

The paper, "Interplay of vibrational wavepackets during an ultrafast electron transfer reaction," was published this week online in Nature Chemistry. It marks the culmination of two years of work.

The challenge researchers set themselves in this investigation involved parsing out vibrational coherences relevant to the ET reaction from the vast number of coherences generated by the laser excitation, most of which are spectators.

In their data, researchers discovered the abrupt loss of phase coherence along some high-frequency vibrational coordinates. This rapid loss of phase coherence originates from the random phase interference of ET reaction pathways provided by the vibrational ladder. The observation steps beyond the conventional Marcus theory and directly reports on the vibrationally driven reaction trajectory from the reactant state to the transition state.

"We create wavepackets on the reactant state by using laser pulses, and these wavepackets start dephasing irreversibly from then on," said Rafiq. "So, we do not anticipate seeing any extra wavepacket in the product state. We can see some of them dephase abruptly because they participate in the reaction, but then, seeing a new wavepacket appearing on the product state was tantalizing."

Bo Fu, a postdoc in the Scholes Group and co-author of the paper, added, "Researchers always think that the wavepacket can only be generated by a photon pulse. But here we observe a wavepacket that did not seem to be generated by the photon pulse. Seeing it on the product state indicates a different mechanism of its generation. Quantum dynamics simulations helped us establish that this wavepacket was actually generated by the ET reaction."

Researchers likened the wavepacket generation by ET to stretching a vibrating spring to a more stable position, with the added property that the spring vibrates with a significantly larger amplitude about its new mean position. This spring-like response of the synchronized beating of the molecular structure to the ET provides a sink that inhibits coherent recurrence of the ET, which might otherwise be expected for a process that occurs vectorially rather than stochastically.

"What I like about this work is that it shows how the structure of a molecular complex distorts during a reaction," said Gregory Scholes, the William S. Tod Professor of Chemistry and a co-author on the paper. "And this distortion happens as a logical sequence of events--just like the molecules were made of springs. The stiff springs respond first, the soft springs last."

The Scholes Group is interested in ultrafast processes in chemistry, seeking to answer questions about energy transfer, excited state processes, and what happens after light is absorbed by molecules. These questions are addressed both theoretically and experimentally.

Credit: 
Princeton University

Bosses need appreciation, too

image: Ambrose holds a doctorate from the University of Illinois at Urbana-Champaign and has served on the faculties at the University of Iowa and University of Colorado at Boulder, where she was director of research for the College of Business. She joined UCF's College of Business in 1999. Her research interests include organizational fairness, ethics and workplace deviance.

Image: 
University of Central Florida

'Tis the season to be grateful, even for your boss, according to a recent University of Central Florida study that suggests when supervisors feel appreciated, it gives them a boost of energy and optimism. In the end, that's good for employees and the organization's bottom line.

"Based on theory, we knew feeling appreciated by another person sends a strong signal that you are positively regarded, and feelings of positive regard evoke a sense of vigor--or high energy," said Maureen Ambrose, the Gordon J. Barnett Professor of Business Ethics and a UCF Pegasus Professor. "This is important because research indicates when people possess higher levels of resources, in this case, energy, they are better able to maintain a positive outlook and engage in positive behaviors at work. We know when supervisors have feelings of depletion--or low energy--negative things happen. For example, when bosses have low energy, they engage in more abusive supervision, creating worse workplaces for their employees,"

Ambrose teamed up with Clemson professor and UCF alumna Susan Sheridan to examine feelings of appreciation and emotional expressions in the workplace. Typically, research in this area has focused solely on the downward influence of supervisors on their employees.

"Our study also found that feeling appreciated by employees was positively related, via energy, to supervisors' psychological well-being. Psychological well-being can buffer individuals from the negative effects of job stress," Ambrose said.

Lessening job stress on employees can have a significant impact on a business's bottom line. The American Institute of Stress estimates that job stress costs U.S. industry more than $300 billion a year in absenteeism, turnover, diminished productivity, and medical, legal and insurance costs.

The study asked supervisors to respond to surveys twice a day for 10 consecutive workdays. Each day participants recorded how much they felt appreciated by their subordinates, how energetic they felt and how it affected them personally (sense of optimism and life satisfaction) and professionally (job satisfaction).

"On days supervisors felt more appreciated, they had more energy, and this translated into higher levels of optimism, life satisfaction, job satisfaction and helping," says Sheridan, who earned her doctorate at UCF and is now an assistant professor of leadership at Clemson. "This was interesting because our field hasn't connected feeling appreciated to higher energy, and we typically look at how supervisors can boost the resources of subordinates--not the other way around."

The study found that the external validation from feeling appreciated is especially powerful for those supervisors who lack a strong sense of validation from within.

Ambrose and Sheridan say they hope this research sparks a deeper examination into the role of gratitude and appreciation in the workplace and how employees influence supervisors.

"Anyone who has managed people knows how influential the relationships with subordinates can be," Ambrose said. "Taking this upwards perspective may help us better understand supervisors' lived experiences at work and why they do the things they do."

Credit: 
University of Central Florida

Diet modifications - including more wine and cheese - may help reduce cognitive decline

AMES, Iowa - The foods we eat may have a direct impact on our cognitive acuity in our later years. This is the key finding of an Iowa State University research study spotlighted in an article published in the November 2020 issue of the Journal of Alzheimer's Disease.

The study was spearheaded by principal investigator Auriel Willette, an assistant professor in Food Science and Human Nutrition, and Brandon Klinedinst, a Neuroscience PhD candidate working in the Food Science and Human Nutrition department at Iowa State. The study is a first-of-its-kind, large-scale analysis that connects specific foods to later-in-life cognitive acuity.

Willette, Klinedinst and their team analyzed data collected from 1,787 aging adults (from 46 to 77 years of age, at the completion of the study) in the United Kingdom through the UK Biobank, a large-scale biomedical database and research resource containing in-depth genetic and health information from half-a-million UK participants. The database is globally accessible to approved researchers undertaking vital research into the world's most common and life-threatening diseases.

Participants completed a Fluid Intelligence Test (FIT) as part of a touchscreen questionnaire at baseline (compiled between 2006 and 2010) and then in two follow-up assessments (conducted from 2012 through 2013 and again between 2015 and 2016). The FIT analysis provides an in-time snapshot of an individual's ability to "think on the fly."

Participants also answered questions about their food and alcohol consumption at baseline and through two follow-up assessments. The Food Frequency Questionnaire asked participants about their intake of fresh fruit, dried fruit, raw vegetables and salad, cooked vegetables, oily fish, lean fish, processed meat, poultry, beef, lamb, pork, cheese, bread, cereal, tea and coffee, beer and cider, red wine, white wine and champagne, and liquor.

Here are four of the most significant findings from the study:

Cheese, by far, was shown to be the most protective food against age-related cognitive problems, even late into life;

The daily consumption of alcohol, particularly red wine, was related to improvements in cognitive function;

Weekly consumption of lamb, but not other red meats, was shown to improve long-term cognitive prowess; and

Excessive consumption of salt is bad, but only individuals already at risk for Alzheimer's Disease may need to watch their intake to avoid cognitive problems over time.

"I was pleasantly surprised that our results suggest that responsibly eating cheese and drinking red wine daily are not just good for helping us cope with our current COVID-19 pandemic, but perhaps also dealing with an increasingly complex world that never seems to slow down," Willette said. "While we took into account whether this was just due to what well-off people eat and drink, randomized clinical trials are needed to determine if making easy changes in our diet could help our brains in significant ways."

Klinedinst added, "Depending on the genetic factors you carry, some individuals seem to be more protected from the effects of Alzheimers, while other seem to be at greater risk. That said, I believe the right food choices can prevent the disease and cognitive decline altogether. Perhaps the silver bullet we're looking for is upgrading how we eat. Knowing what that entails contributes to a better understanding of Alzheimer's and putting this disease in a reverse trajectory."

Credit: 
Iowa State University

Self-collected saliva samples prove effective for diagnosing COVID-19

image: Instructions given to individuals when collecting their saliva sample.

Image: 
Memorial Sloan Kettering Cancer Center

Philadelphia, December 10, 2020 - Researchers at Memorial Sloan Kettering Cancer Center (MSK) have found that SARS-CoV-2 genetic material can be reliably detected in self-collected saliva samples at a rate similar to that of nasopharyngeal and oropharyngeal swabs. The rate of detection using saliva samples was similar across different testing platforms, and saliva samples remained stable for up to 24 hours when stored with ice packs or at room temperature, according to a new study in The Journal of Molecular Diagnostics, published by Elsevier. Oral rinses, which have been suggested as another alternative to nasal swab collection, did not reliably diagnose COVID-19.

"The current pandemic has placed a significant strain on the supply chain, from swabs to the personal protective equipment (PPE) healthcare workers need to safely collect samples," explained lead investigator Esther Babady, PhD, D(ABMM), FIDSA, director of the Clinical Microbiology Service at Memorial Sloan Kettering Cancer Center, New York, NY, USA. "The use of self-collected saliva has the potential to minimize healthcare worker exposure and decrease the need for specialized collection devices, such as swabs and viral transport media."

The study was conducted at MSK in New York City at the peak of the regional outbreak between April 4 and May 11, 2020. Study participants were 285 MSK employees who needed to be tested for COVID-19 because they had symptoms of the virus or had been exposed to someone who had the virus. Each participant provided paired samples: a nasopharyngeal swab and oral rinse; a nasopharyngeal swab and a saliva sample; or an oropharyngeal swab and a saliva sample. All samples to be tested were stored at room temperature and transported to the laboratory within two hours.

The agreement between the saliva test and the oropharyngeal swab was 93 percent, with a sensitivity of 96.7 percent. In comparison with the nasopharyngeal swab, the agreement of the saliva test was 97.7 percent, with a sensitivity of 94.1 percent. Oral rinses were only 63 percent effective in detecting the virus, with an overall agreement with nasopharyngeal swab of only 85.7 percent. To test for stability, saliva samples and nasopharyngeal samples with a range of viral loads were stored in a transport cooler at 4°C or at room temperature. No significant difference in virus concentration was detected in any samples at the time of collection, eight hours later, and 24 hours later. These results were validated on two commercial SARS-CoV-2 PCR platforms, and overall agreement between the different testing platforms was over 90 percent.
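For readers unfamiliar with these metrics, the sketch below uses hypothetical counts (not the study's data) to show how overall percent agreement and sensitivity are computed when the swab result is treated as the reference:

```python
# Hypothetical paired-test counts, for illustration only (not the MSK study's data).
true_pos  = 30   # positive by both the swab (reference) and saliva
false_neg = 2    # positive by swab, missed by saliva
false_pos = 1    # negative by swab, positive by saliva
true_neg  = 60   # negative by both

total = true_pos + false_neg + false_pos + true_neg
agreement = (true_pos + true_neg) / total         # fraction of samples where the two tests agree
sensitivity = true_pos / (true_pos + false_neg)   # fraction of reference-positives detected in saliva
print(f"Overall agreement: {agreement:.1%}, sensitivity: {sensitivity:.1%}")
```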

Dr. Babady noted that validation of sample self-collection methods holds great promise for broad testing strategies that would mitigate infection risk and PPE resource utilization. "The current 'test, track, and trace' public health approach to surveillance relies heavily on testing for both diagnosis and surveillance," she stated. "The use of self-collected saliva provides a cheaper and less invasive option for viable sample collection. It's certainly easier to spit in a cup twice a week than undergoing frequent nasopharyngeal swabs. This can improve patient compliance and satisfaction particularly for surveillance testing, which requires frequent sample collection. Since we also showed that the virus was stable at room temperature for at least 24 hours, saliva collection has potential for use at home."

Credit: 
Elsevier

First presentation after Hayabusa2 mission return set for SPIE conference 14 December

image: MISSION ACCOMPLISHED: Hayabusa2 completed its out-of-this-world expedition near Woomera in South Australia. Credit: JAXA.

Image: 
JAXA.

BELLINGHAM, Washington, USA — As part of the opening plenary session at the SPIE Astronomical Telescopes + Instrumentation Digital Forum, Hitoshi Kuninaka of the Japan Aerospace Exploration Agency will be discussing and responding to audience questions about the successful return of the Hayabusa2 capsule from its asteroid-sample mission, only the second such success in history and one that marks exciting and innovative advances for the astronomical-instrumentation community. In addition, Kuninaka may have initial information regarding what potential treasures the capsule's sample container holds.

"I'm so excited to hear more about this mission," says SPIE Astronomical Telescopes + Instrumentation Chair Alison Peck. "The level of precision needed to pull this off — for the navigation as well as the engineering that went into the spacecraft itself — is just mind-blowing. This really ushers in a new era of 'in-person' study of the composition of these solar system bodies."

SPIE Astronomical Telescopes + Instrumentation Digital Forum, showcasing and sharing the latest in space- and ground-based telescopes and related engineering, runs 14-18 December. The interactive plenary session — as well as the rest of the digital forum — will be accessible online, and registration to the event is free and open to the public. The plenary session is scheduled for 12 to 2 PM Pacific Time on 14 December.

This is the second successful Hayabusa attempt to bring asteroid material to Earth: in 2010, the original Hayabusa capsule brought back samples from the asteroid Itokawa, a pioneering success that allowed scientists to learn Itokawa's age and geologic history. This week's sample materials stem from an 18-month rendezvous that Hayabusa2 — launched in 2014 — had with Ryugu, a carbonaceous, or C-type, asteroid, a class of asteroid believed to contain organic materials and hydrates. During its year-and-a-half meet-up, Hayabusa2 observed Ryugu remotely as well as collecting material and data from the asteroid's surface and subsurface. The Hayabusa missions are historical firsts, the only successful asteroid sample-return missions so far.

"We think that asteroids, comets, and cosmic dust played important roles in terms of transferring material inside the solar system, and Hayabusa and Hayabusa2 have inspired us to explore the above hypothesis," says Kuninaka. "If the asteroid Ryugu is indeed confirmed to be shown to be carbon rich, there is a strong possibility of resource utilization for the international human space exploration."

"They have done a fantastic job in completing all actions of this mission without major failures," notes SPIE Astronomical Telescopes + Instrumentation Chair Satoru Iguchi. "A lot of technically challenging goals were successfully achieved, including touchdown on the asteroid Ryugu, the collection of samples, and a successful return to Earth. Just as exciting, an investigation of the samples should offer us a deeper understanding of the origin of life."

Credit: 
SPIE--International Society for Optics and Photonics

How commercial vessels could become tsunami early-warning systems

Scientists may have discovered a new ally in efforts to keep coastal communities in the Pacific Northwest safe from future tsunamis, according to a new study: Fleets of commercial shipping vessels.

The research taps into an urgent need for communities like Newport, Oregon, a seaside town that is home to more than 10,000 people. If a tsunami formed along a fault line in the Pacific called the Cascadia Subduction Zone, residents there might have just minutes to get to safety, said study coauthor Anne Sheehan.

"A tsunami can take 20 or 30 minutes to reach the coastline, so the time is very short," said Sheehan, a fellow at the Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado Boulder.

In a study now under review in the journal Earth and Space Science, she and her colleagues think that they may have stumbled upon a possible solution. Drawing on computer simulations, the group reports that networks of cargo ships carrying special GPS sensors could, theoretically, work together to automatically monitor a shoreline for possible tsunami waves--spotting these dangerous surges in less than 10 minutes in some cases.

The team will present its findings Thursday at the 2020 virtual fall meeting of the American Geophysical Union (AGU).

Lead author M. Jakir Hossen added that this early warning system would be much less expensive to put in place than current tsunami forecasting tools.

"There are so many ships that are already traveling in the Cascadia Subduction Zone area," said Hossen, a visiting fellow at CIRES. "We're thinking about how we can use those existing facilities for tsunami forecasts."

When, not if

In a geologic sense, the Cascadia Subduction Zone is a bomb waiting to go off. Over the last few decades, scientists have discovered that intense energy seems to be building in the tectonic plates that lie miles below the Pacific Ocean from Northern California to British Columbia.

"They could release this energy anytime and trigger a huge earthquake. The tsunami size could be as big as the 2004 Indian Ocean tsunami," Hossen said.

But as big as such a wave could be, if you were swimming in the open ocean, you might not even know that you were in the middle of a tsunami. That can make these disasters hard to predict ahead of time, said Sheehan, also a professor in the Department of Geological Sciences.

"Even a really big tsunami wave would be only a meter tall in the open ocean, and it would take 15 minutes to pass you," she said.

Scientists currently use seafloor sensors to record when a possible tsunami might be passing overhead. But these gauges are costly to install and maintain. Sheehan and her colleagues had a different idea: Why not take advantage of all of the ships that are already out there in the ocean, delivering goods like cars and produce to towns up and down the Pacific Coast?

Global warning system

To see if that might be feasible, the team ran a mock scenario: They built computer simulations that drew on the locations of real ships near the Cascadia Subduction Zone. The researchers also imagined that each of the digitized ships was carrying a GPS sensor that could precisely measure its elevation, or how it bobbed up and down in the waves. Sheehan explained that such vessels already use satellite systems to transmit their exact locations in the ocean--so the new sensors might only be a modest upgrade.

The team then ran a synthetic experiment to see if those ships might be able to forecast a tsunami.

The test was a success. The findings show that similar networks of ships could be used to identify tsunami waves long before they reach shore, and all without the ships needing to deviate from their normal routes.

"A single ship couldn't do this," Sheehan said. "The power comes from having 100 ships in the same area that are all going up and down at the same time."

Scientists at the U.S. National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information based in Boulder collaborated with the team to provide information on current tsunami warning systems and Cascadia tsunami hazards.

The researchers calculated that, at a minimum, you'd need a web of ships traveling about 12 miles, or 20 kilometers, apart to make accurate tsunami forecasts using their method. Just how much of an advanced warning such a fleet could provide to people onshore isn't clear. It may depend on where the ships happen to be at that moment.

But Hossen noted that the team's method could, theoretically, be applied in any ocean in the world as long as it had enough shipping traffic--even in regions like the Indian Ocean where the tsunami risk is high but disaster preparedness resources are often scant.

"If we can use these ships, then it would probably be much more affordable for any country, not just developed ones," he said.

Credit: 
University of Colorado at Boulder