Tech

Stresses and flows in ultra-cold superfluids

Superfluids, which form only at temperatures close to absolute zero, have unique and in some ways bizarre mechanical properties. Yvan Buggy of the Institute of Photonics and Quantum Sciences at Heriot-Watt University in Edinburgh, Scotland, and his co-workers have developed a new quantum mechanical model of some of these properties, which illustrates how these fluids will deform as they flow around impurities. This work is published in the journal EPJ D.

Imagine that you start stirring a cup of tea, come back to it five minutes later and find that the tea is still circulating. For ordinary tea, this is clearly impossible, but if you could stir a cup of an ultra-cold liquid this is exactly what would happen. Below about -270 °C - that is, just a few degrees above the coldest possible temperature, absolute zero - the liquid becomes a superfluid: a weird substance that has no viscosity and that therefore will flow without losing kinetic energy, creep along surfaces and vessel walls, and continue to spin indefinitely around vortices.

Superfluids acquire these properties because so many of their atoms fall into the lowest energy state that quantum mechanical properties dominate over classical ones. They therefore provide a unique opportunity for studying quantum phenomena on a macroscopic level, albeit under extreme conditions. In this study, Buggy and his colleagues use the essential equations of quantum mechanics to calculate the stresses and flows in such an ultracold superfluid under changes in potential energy. They show that the fluid flow will be steady and homogeneous in the absence of impurities. If an impurity is present, however, the fluid will deform in the vicinity of that impurity.

Credit: 
Springer

'Data clouds fusion' helps robots work as a team in hazardous situations

A group of researchers and engineers has created a new way for robots to pool data gathered in real time, allowing them to 'think' collectively and navigate their way through difficult, previously unmapped obstacles as a team.

Researchers published their findings in IEEE/CAA Journal of Automatica Sinica, which is jointly published by the Institute of Electrical and Electronics Engineers (IEEE) and the Chinese Association of Automation (CAA).

Not long ago, the concept of artificially intelligent robots mimicking animal herd-like intuition and cooperation to accomplish a mutual task was limited to the realm of science fiction and our imaginations. But a recent joint effort among international researchers demonstrated that robots working together can navigate unknown terrain faster than they would as individuals.

At the heart of the researchers' new group robotic navigation system is a centralized data cloud. Each cloud-linked robot draws on data gathered in real time from all of the other robots, applying logic algorithms to help them steer clear of paths more likely to contain obstacles.
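
As a rough illustration of what such a centralized data cloud might look like in software (a hypothetical Python sketch, not the authors' implementation; the class, method names and threshold are assumptions), each robot uploads the obstacle points it detects and every robot can then ask whether a location is blocked before planning a route:

# Hypothetical sketch: robots push obstacle detections to a shared "cloud" map
# and query it before choosing a direction. Not the authors' implementation.
from collections import defaultdict

class SharedObstacleCloud:
    def __init__(self, cell_size=0.5):
        self.cell_size = cell_size          # map resolution in metres
        self.hits = defaultdict(int)        # obstacle evidence per grid cell

    def _cell(self, x, y):
        return (round(x / self.cell_size), round(y / self.cell_size))

    def report(self, robot_id, points):
        """Each robot uploads obstacle points (x, y) seen by its own sensors."""
        for x, y in points:
            self.hits[self._cell(x, y)] += 1

    def is_blocked(self, x, y, threshold=2):
        """A cell is treated as blocked once enough reports have accumulated."""
        return self.hits[self._cell(x, y)] >= threshold

# Usage: robots A and B report an obstacle that robot C has never seen.
cloud = SharedObstacleCloud()
cloud.report("robot_A", [(2.1, 3.0), (2.2, 3.1)])
cloud.report("robot_B", [(2.15, 3.05)])
print(cloud.is_blocked(2.1, 3.0))   # True: robot C can avoid the spot without sensing it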

The group took a three-pronged approach to build out the decision-making capacity of their group of robots. "We presented a solution of combined dead reckoning, data transferring, and machine vision, based on our research group's original laser-based real-time technical vision system," said Mykhailo Ivanov, an engineer and one of the lead authors of the study at Universidad Autónoma de Baja in Mexico. Previous efforts, he added, focused on each problem separately.

The study deployed simple four-wheeled robots onto obstacle courses designed to have unique blind spots for each unit. The robots' 'eyes' were based on a simple laser and pair of sensors that evaluated the reflected light for position and distance. The team chose laser vision rather than digital video cameras because the laser system can operate in total darkness and is less expensive.
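
One common way a laser plus offset light sensors can yield distance is simple triangulation; the short sketch below is a generic textbook relation and an assumption on our part, not necessarily the group's actual sensor model, with illustrative parameter values:

import math

def triangulated_distance(baseline_m, emit_angle_rad, detect_angle_rad):
    # Emitter and detector sit a known baseline apart; each measures the
    # angle to the reflected laser spot. The law of sines then gives the
    # range from the emitter to the reflecting surface.
    target_angle = math.pi - emit_angle_rad - detect_angle_rad
    return baseline_m * math.sin(detect_angle_rad) / math.sin(target_angle)

# Example: 0.2 m baseline, beam emitted at 80 degrees, detector sees the spot at 70 degrees
print(round(triangulated_distance(0.2, math.radians(80), math.radians(70)), 3), "m")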

The ability to move through difficult terrain may be helpful in a variety of settings where there is a need to collect data, yet the environment is too small or too dangerous for humans. "Here we have earthquakes very frequently," Ivanov said of California. "So such robotic groups, equipped with our technical vision system, could monitor buildings' structural integrity as well as speed rescue efforts following a catastrophic event."

The next step in this research, Mr. Ivanov said, will be to improve upon the robotic vision, which would make their cloud-based navigation teamwork system potentially useful across a wider array of industries and applications.

Credit: 
Chinese Association of Automation

Future information technologies: 3D quantum spin liquid revealed

image: One of the four magnetic interactions leads to a three-dimensional network of corner-sharing triangles, also known as the hyperkagome lattice. Combined, the magnetic interactions form a hyper-hyperkagome lattice, which allows the 3D quantum spin liquid behavior.

Image: 
HZB

The researchers found spin liquid behaviour in 3D, due to a so-called hyper-hyperkagome lattice. The experimental data fit extremely well with theoretical simulations also performed at HZB.

IT devices today are based on electronic processes in semiconductors. The next real breakthrough could be to exploit other quantum phenomena, for example interactions between tiny magnetic moments in the material, the so-called spins. So-called quantum-spin liquid materials could be candidates for such new technologies. They differ significantly from conventional magnetic materials because quantum fluctuations dominate the magnetic interactions: Due to geometric constraints in the crystal lattice, spins cannot all "freeze" together in a ground state - they are forced to fluctuate, even at temperatures close to absolute zero.

Quantum spin liquids: a rare phenomenon

Quantum spin liquids are rare and have so far been found mainly in two-dimensional magnetic systems. Three-dimensional isotropic spin liquids are mostly sought in materials where the magnetic ions form pyrochlore or hyperkagome lattices. An international team led by HZB physicist Prof. Bella Lake has now investigated samples of PbCuTe2O6, in which the magnetic ions form a three-dimensional lattice known as a hyper-hyperkagome lattice.

Magnetic interactions simulated

HZB physicist Prof. Johannes Reuther calculated the behaviour of such a three-dimensional hyper-hyperkagome lattice with four magnetic interactions and showed that the system exhibits quantum-spin liquid behaviour with a specific magnetic energy spectrum.
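
For readers who want the model behind such simulations, the generic form of a Heisenberg Hamiltonian with four exchange interactions can be written as follows (a standard textbook form; the specific coupling values for PbCuTe2O6 are reported in the paper and are not reproduced here):

\[
\hat{H} \;=\; \sum_{n=1}^{4} J_n \sum_{\langle i,j \rangle_n} \hat{\mathbf{S}}_i \cdot \hat{\mathbf{S}}_j ,
\]

where each $J_n$ is one of the four magnetic interactions and $\langle i,j\rangle_n$ runs over the corresponding bonds of the hyper-hyperkagome lattice. Quantum spin liquid behaviour means that the spins $\hat{\mathbf{S}}_i$ develop no static order in the ground state of this Hamiltonian, even as the temperature approaches zero.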

Experiments at neutron sources find 3D quantum spin liquid

With neutron experiments at ISIS (UK), ILL (France) and NIST (USA), the team was able to detect the very subtle signals of this predicted behaviour. "We were surprised at how well our data fit the calculations. This gives us hope that we can really understand what happens in these systems," explains first author Dr. Shravani Chillal, HZB.

Credit: 
Helmholtz-Zentrum Berlin für Materialien und Energie

Mathematics to keep farmers on track

image: Solid and dashed lines respectively indicate stable and unstable steering; Trajectory A under V = 1.5 m s-1 and μs = 0.8; Trajectory B under V = 3.0 m s-1 and μs = 0.8; Trajectory C under V = 1.5 m s-1 and μs = 0.4; Trajectory D under V = 3.0 m s-1 and μs = 0.4 (V: Travel velocity of tractor, μs: Friction coefficient between tire and ground).

Image: 
Kenshi Sakai / TUAT

Tokyo, Japan - Scientists at the Tokyo University of Agriculture and Technology (TUAT) used nonlinear mathematical modeling to understand the bouncing and sliding instabilities that can lead to tractor accidents. This research may help protect farmers from injury, as well as improve the control of automated agricultural systems.

The research was published in Biosystems Engineering on Feb 12th, 2020 as "Numerical analysis of steering instability in an agricultural tractor induced by bouncing and sliding".

Accidents in which tractors overturn are a leading cause of death for farmers. This is especially concerning in regions with uneven terrain. Repeated bumps of a specific frequency can lead to catastrophic bouncing or sliding instabilities. This resonant frequency reflects the natural oscillation period of the tractor; when bumps at that frequency are applied from an outside source, they can cause vibrations with dangerously large amplitudes. Similar to an earthquake that can topple one building while leaving neighboring structures undamaged, one tractor may safely travel a certain route while another will be in danger of overturning. Moreover, even a slight increase in speed may start violent vibrations that cause the front wheels to lift off the ground, preventing proper steering.

The TUAT researchers used a mathematical bicycle model that represented the front and back wheels of the tractor as damped oscillators, and calculated the vertical and pitching motion of a trailing implement, such as a rotary tiller. The equations allowed the computer to calculate the motion of each component and simulate the overall stability based on the parameters of the system, such as the stiffness of the wheel springs. Using frequency response analysis, which involved intensive numerical experiments for various parameter combinations, they showed that the stability of the tractor was strongly dependent on specific conditions, especially the travel speed of the tractor and the friction coefficient of the road. "As in many non-linear systems, the onset of instability is very sensitive to the control parameters," says first author Masahisa Watanabe. This means that small changes in velocity and friction coefficient can lead to a discontinuous increase in bouncing and sliding.
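
To make the resonance argument concrete, here is a minimal numerical sketch (illustrative Python only, with arbitrary parameter values; the paper's bicycle-and-implement model is far more detailed): a single damped wheel-spring oscillator driven by periodic bumps has a bounce amplitude that peaks sharply when the bump frequency matches its natural frequency.

import numpy as np

# Toy single-wheel model: m*x'' + c*x' + k*x = F0*sin(w*t)
m, c, k, F0 = 800.0, 1500.0, 2.0e5, 2000.0   # arbitrary illustrative values
w_nat = np.sqrt(k / m)                        # natural (resonant) frequency, rad/s

def steady_state_amplitude(w):
    """Steady-state bounce amplitude of the driven, damped oscillator."""
    return F0 / np.sqrt((k - m * w**2)**2 + (c * w)**2)

for w in (0.5 * w_nat, w_nat, 2.0 * w_nat):
    print(f"bump frequency {w:6.1f} rad/s -> amplitude {steady_state_amplitude(w)*1000:6.1f} mm")
# The amplitude peaks near w_nat: a small speed change that shifts the bump
# frequency onto the resonance produces a disproportionately large bounce.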

"We included simulations based on actual accident case studies in Japan, and found that a large increase in nonlinear response can happen when the tractor attempts to climb a 19-degree slope," says senior author Kenshi Sakai. The right figure shows the tractor trajectories in the numerical experiments of the tractor operations on the passage slope of 19 degree. In the numerical experiments, the travel velocity V was respectively set at 1.5 and 3.0 m s-1 for the low and high velocity conditions and the static friction coefficient μs was respectively set at 0.8 and 0.4 for the preferable and adverse road conditions. When the travel velocity V was 1.5 m s-1, Trajectories A and C remained on the road throughout the simulation while there was steering instability for Trajectory C. When the travel velocity was V = 3.0 m s-1, the tractor trajectory dramatically deviated outward for Trajectories B and D. In particular, for Trajectory D, the steering instability continued until the tractor reached the edge of the road. This project aims not only to protect the safety of farmers, but also to advance the understanding of autonomous control theory.

Credit: 
Tokyo University of Agriculture and Technology

Artificial synapses on design

image: Dr. Ilia Valov (front left) in the Oxide Cluster at Forschungszentrum Jülich, where experiments were carried out for the current work. In the background: Michael Lübben (center) and Prof. Rainer Waser (right)

Image: 
Copyright: RWTH Aachen / Peter Winandy

Scientists around the world are working intensively on memristive devices, which are capable of extremely low-power operation and behave similarly to neurons in the brain. Researchers from the Jülich Aachen Research Alliance (JARA) and the German technology group Heraeus have now discovered how to systematically control the functional behaviour of these elements. The smallest differences in material composition turn out to be crucial: differences so small that until now experts had failed to notice them. The researchers' design directions could help to increase the variety, efficiency, selectivity and reliability of memristive technology-based applications, for example energy-efficient, non-volatile storage devices or neuro-inspired computers.

Memristors are considered a highly promising alternative to conventional nanoelectronic elements in computer chips. Because of their advantageous functionalities, their development is being eagerly pursued by many companies and research institutions around the world. The Japanese corporation NEC already installed the first prototypes in space satellites back in 2017. Many other leading companies such as Hewlett Packard, Intel, IBM, and Samsung are working to bring innovative types of computer and storage devices based on memristive elements to market.

Fundamentally, memristors are simply "resistors with memory", in which high resistance can be switched to low resistance and back again. This means in principle that the devices are adaptive, similar to a synapse in a biological nervous system. "Memristive elements are considered ideal candidates for neuro-inspired computers modelled on the brain, which are attracting a great deal of interest in connection with deep learning and artificial intelligence," says Dr. Ilia Valov of the Peter Grünberg Institute (PGI-7) at Forschungszentrum Jülich.

In the latest issue of the open access journal Science Advances, he and his team describe how the switching and neuromorphic behaviour of memristive elements can be selectively controlled. According to their findings, the crucial factor is the purity of the switching oxide layer. "Depending on whether you use a material that is 99.999999 % pure, and whether you introduce one foreign atom into ten million atoms of pure material or into one hundred atoms, the properties of the memristive elements vary substantially," says Valov.

This effect had so far been overlooked by experts. It can be used very specifically for designing memristive systems, in a similar way to doping semiconductors in information technology. "The introduction of foreign atoms allows us to control the solubility and transport properties of the thin oxide layers," explains Dr. Christian Neumann of the technology group Heraeus. He has been contributing his materials expertise to the project ever since the initial idea was conceived in 2015.

"In recent years there has been remarkable progress in the development and use of memristive devices, however that progress has often been achieved on a purely empirical basis," according to Valov. Using the insights that his team has gained, manufacturers could now methodically develop memristive elements selecting the functions they need. The higher the doping concentration, the slower the resistance of the elements changes as the number of incoming voltage pulses increases and decreases, and the more stable the resistance remains. "This means that we have found a way for designing types of artificial synapses with differing excitability," explains Valov.

Design specification for artificial synapses

The brain's ability to learn and retain information can largely be attributed to the fact that the connections between neurons are strengthened when they are frequently used. Memristive devices, of which there are different types such as electrochemical metallization cells (ECMs) or valence change memory cells (VCMs), behave similarly. When these components are used, the conductivity increases as the number of incoming voltage pulses increases. The changes can also be reversed by applying voltage pulses of the opposite polarity.
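
A toy numerical sketch of this pulse-driven behaviour (purely illustrative Python, not the devices' physical model; the way the "doping" parameter scales the update step is an assumption chosen only to mirror the trend described above):

def apply_pulses(conductance, n_pulses, polarity=+1, doping_ppm=0):
    """Illustrative synapse-like update: each voltage pulse nudges the
    conductance up (positive polarity) or down (negative polarity).
    Higher doping -> smaller step per pulse -> slower, more stable change."""
    step = 0.05 / (1.0 + doping_ppm / 1000.0)   # assumed scaling, not a measured law
    for _ in range(n_pulses):
        conductance += polarity * step * (1.0 - conductance if polarity > 0 else conductance)
        conductance = min(max(conductance, 0.0), 1.0)
    return conductance

g_undoped = apply_pulses(0.1, 20)                     # potentiation, undoped oxide
g_doped   = apply_pulses(0.1, 20, doping_ppm=10000)   # same pulses, heavily doped: slower change
g_reset   = apply_pulses(g_undoped, 20, polarity=-1)  # opposite polarity reverses the change
print(round(g_undoped, 3), round(g_doped, 3), round(g_reset, 3))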

The JARA researchers conducted their systematic experiments on ECMs, which consist of a copper electrode, a platinum electrode, and a layer of silicon dioxide between them. Thanks to the cooperation with Heraeus researchers, the JARA scientists had access to different types of silicon dioxide: one with a purity of 99.999999 % - also called 8N silicon dioxide - and others containing 100 to 10,000 ppm (parts per million) of foreign atoms. The precisely doped glass used in their experiments was specially developed and manufactured by quartz glass specialist Heraeus Conamic, which also holds the patent for the procedure. Copper and protons acted as mobile dopants, while aluminium and gallium were used as non-volatile dopants.

Record switching time confirms theory

Based on their series of experiments, the researchers were able to show that the ECMs' switching times change as the amount of doping atoms changes. If the switching layer is made of 8N silicon dioxide, the memristive component switches in only 1.4 nanoseconds. To date, the fastest value ever measured for ECMs had been around 10 nanoseconds. By doping the oxide layer of the components with up to 10,000 ppm of foreign atoms, the switching time was prolonged into the range of milliseconds. "We can also explain our results theoretically. This is helping us to understand the physico-chemical processes on the nanoscale and apply this knowledge in practice," says Valov. Based on generally applicable theoretical considerations, supported by experimental results, some of which are also documented in the literature, he is convinced that the doping/impurity effect occurs and can be employed in all types of memristive elements.

Credit: 
Forschungszentrum Juelich

Material manufacturing from particles takes a giant step forward

image: Nanocellulose can also form structures known from pulp technology with the particles.

Image: 
Bruno Mattos / Aalto University

Cohesion, the ability to keep things together from the scale of nanoparticles to that of building sites, is inherent to cellulose nanofibrils, which can act as a mortar for a nearly unlimited range of particle types, as described in the study. The ability of nanocelluloses to bring particles together into cohesive materials is at the root of the study, which links decades of research in nanoscience to manufacturing.

The research reveals the universality of the cohesion provided by nanocelluloses

In a paper just published in Science Advances, the authors demonstrate how nanocellulose can organize itself in a multitude of different ways by assembling around particles to form highly robust materials. As pointed out by the main author, Dr Bruno Mattos, 'This means that nanocelluloses induce high cohesion in particulate materials in a constant and controlled manner for all particle types. Because of such strong binding properties, such materials can now be built with predictable properties and therefore easily engineered'.

Whenever a material is created from particles, one first has to come up with a way to generate cohesion, and until now that has been very particle-dependent. 'Using nanocellulose, we can overcome any particle dependency', Mattos adds.

The universal potential of nanocellulose as a binding component arises from its ability to form networks at the nanoscale that adapt to the given particles. Nanocelluloses bind micrometric particles by forming sheet-like structures, much like the papier-mâché made in schools. Nanocellulose can also form tiny fishnets to entrap smaller particles, such as nanoparticles. Using nanocellulose, materials built from particles can be formed into any shape using an extremely easy and spontaneous process that only needs water. Importantly, the study describes how these nanofibers form networks following precise scaling laws, which facilitates their implementation.

This development is especially timely in the era of nanotechnology, where combining nanoparticles into larger structures is essential. As Dr Blaise Tardy points out, 'New property limits and new functionalities are regularly showcased at the nanoscale, but implementation in the real world is rare. Unraveling the physics associated with the scaling of the cohesion of nanofibers is therefore a very exciting first step towards connecting laboratory findings with current manufacturing practices'. Any such success requires strong binding among the particles, an opportunity that nanocellulose offers here.

Nanofibers extracted from plants are used as universal binders for particles to form a variety of functional or structural materials

The team has shown a pathway to achieve scalability in the production of materials, from particles as small as 20 nm in diameter to those 20,000 times larger. Furthermore, everything from inert particles such as metallic nanoparticles to living entities such as baker's yeast can be compounded. The particles can be of different shapes, from 1D to 3D, hydrophilic or hydrophobic. They can comprise living microorganisms, functional metallic particles, or pollen, achieving new combinations and functionalities.

According to the team leader, Prof. Orlando Rojas, 'This is a powerful and generic method, a new alternative that bridges colloidal science, material development and manufacturing'.

Credit: 
Aalto University

Single-cell RNA seq developed to accurately quantify cell-specific drug effects in pancreatic islets

image: Workflow of identification and removal of contaminating reads in single cell sequencing and the islet cell type-specific drug responses of the decontaminated profiles.

Image: 
© Brenda Marquina-Sánchez and Nikolaus Fortelny / CeMM

The pancreas is an abdominal organ that produces digestive enzymes as well as hormones that regulate blood sugar levels. This hormone-producing function is localized to the islets of Langerhans, which constitute clusters of different endocrine cell types. Among those are beta cells, which produce the hormone insulin needed to lower glucose (a type of sugar) levels in our blood, as well as alpha cells, which generate the hormone glucagon in charge of raising glucose levels in the blood.

Type 1 diabetes is a chronic disease in which the body's immune system mistakenly attacks and destroys the pancreas' insulin-producing beta cells. Regenerative medicine aims to replenish beta cell mass, and thus support and ultimately substitute the current insulin replacement therapies. Alterations to islet composition, including insufficient beta cell function and beta cell dedifferentiation, also contribute to type 2 diabetes. Therefore, a deeper understanding of the identity and crosstalk of the different islet cell types leads to a better characterization of both forms of diabetes and may contribute to the development of novel therapeutic concepts.

Single-cell transcriptomics is a powerful technique to characterize cellular identity. Previously, CeMM researchers from Christoph Bock's and Stefan Kubicek's groups published the first single cell transcriptomes from primary human pancreatic islet cells (EMBO Rep. 2016 Feb;17(2):178-87. DOI: 10.15252/embr.201540946). Advances in the technology have since enabled its application to the generation of global human and mouse single cell transcriptome atlases. Despite these advances, single cell approaches remain technologically challenging given that the minuscule amount of RNA present is entirely used up in the experiment. It is therefore essential to ensure the quality and purity of the resulting single cell transcriptomes.

CeMM researchers in the two contributing laboratories identified unexpectedly high hormone expression in non-endocrine cell types, both in their own dataset and in other published single cell studies. They set out to elucidate whether this was the result of contamination by RNA molecules, for example from dying cells, and how it could be removed to obtain a more reliable dataset. Such contamination seems to be present in single cell RNA-seq data from most tissues but was most visible in pancreatic islets. Islet endocrine cells are exclusively devoted to the production of single hormones, and insulin in beta cells and glucagon in alpha cells are expressed at higher levels than typical "housekeeping" genes. Thus, redistribution of these transcripts to other cell types was highly pronounced. Based on this observation, their goal was to develop, validate and apply a method to experimentally determine and computationally remove such contamination.

In their investigation, the CeMM researchers added spike-in cells from different cell types, both mouse and human, to their pancreatic islet samples. Importantly, the transcriptomes of these spike-in cells were fully characterized. This allowed them to internally and accurately control the level of RNA contamination in single cell RNA-seq, given that the human transcripts detected in the mouse spike-in cells constitute contaminating RNA. In this way, they found that the samples had a contamination level of up to 20%, and they were able to define the contamination in each sample. They then developed a novel bioinformatics approach to computationally remove contaminating reads from single cell transcriptomes.
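
The logic of the spike-in control can be sketched in a few lines (hypothetical Python with made-up counts and column names; the published decontamination algorithm is more sophisticated): in a mouse spike-in cell, any read that maps to the human genome must be ambient contamination, so the cross-species read fraction gives a direct per-sample contamination estimate.

import pandas as pd

# counts: one row per cell, with read counts split by genome of origin
counts = pd.DataFrame({
    "cell_type": ["mouse_spikein", "mouse_spikein", "human_islet"],
    "human_reads": [150, 220, 9500],
    "mouse_reads": [4850, 5780, 300],
})

spikein = counts[counts.cell_type == "mouse_spikein"]
# In a mouse cell, human reads can only come from ambient contamination
contamination_rate = (spikein.human_reads /
                      (spikein.human_reads + spikein.mouse_reads)).mean()
print(f"estimated ambient contamination: {contamination_rate:.1%}")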

Having obtained a "decontaminated" transcriptome, from which the spurious signal had been removed, they proceeded to characterize how cellular identity in the different cell types responded to treatment with three different drugs. They found that a small molecule inhibitor of the transcription factor FOXO1 induces dedifferentiation of both alpha and beta cells. Furthermore, they studied artemether, which had been found to diminish the function of alpha cells and to induce insulin production in both in vivo and in vitro studies (Cell. 2017 Jan 12;168(1-2):86-100.e15. DOI: 10.1016/j.cell.2016.11.010). The effects of artemether were species-specific and cell-type-specific. In alpha cells, a fraction of cells increases insulin expression and gains aspects of beta cell identity, in both mouse and human samples. Importantly, the researchers found that in human beta cells there is no significant change in insulin expression, whereas in mouse islets, beta cells reduce their insulin expression and overall beta cell identity.

This study is the result of a cross-disciplinary collaboration between the laboratories of Stefan Kubicek and Christoph Bock at CeMM and Patrick Collombat at the Institute of Biology Valrose (France). It is the first study to apply single cell sequencing to analyze dynamic drug responses in intact isolated tissue, and it benefitted from the high quantitative accuracy of the decontamination method. It thus provides not only a novel method for single-cell decontamination and highly quantitative single-cell analysis of drug responses in intact tissues, but also addresses an important current question in islet cell biology and diabetes research. These findings could open up potential therapeutic avenues to treat type 1 diabetes in the future.

Credit: 
CeMM Research Center for Molecular Medicine of the Austrian Academy of Sciences

COVID-19 lockdowns significantly impacting global air quality

WASHINGTON--Levels of two major air pollutants have been drastically reduced since lockdowns began in response to the COVID-19 pandemic, but a secondary pollutant - ground-level ozone - has increased in China, according to new research.

Two new studies in AGU's journal Geophysical Research Letters find nitrogen dioxide pollution over northern China, Western Europe and the U.S. decreased by as much as 60 percent in early 2020 as compared to the same time last year. Nitrogen dioxide is a highly reactive gas produced during combustion that has many harmful effects on the lungs. The gas typically enters the atmosphere through emissions from vehicles, power plants and industrial activities.

In addition to nitrogen dioxide, one of the new studies finds particulate matter pollution (particles smaller than 2.5 microns) has decreased by 35 percent in northern China. Particulate matter is composed of solid particles and liquid droplets that are small enough to penetrate deep into the lungs and cause damage.

The two new papers are part of an ongoing special collection of research in AGU journals related to the current pandemic.

Such a significant drop in emissions is unprecedented since air quality monitoring from satellites began in the 1990s, said Jenny Stavrakou, an atmospheric scientist at the Royal Belgian Institute for Space Aeronomy in Brussels and co-author of one of the papers. The only other comparable events are short-term reductions in China's emissions due to strict regulations during events like the 2008 Beijing Olympics.

The improvements in air quality will likely be temporary, but the findings give scientists a glimpse into what air quality could be like in the future as emissions regulations become more stringent, according to the researchers.

"Maybe this unintended experiment could be used to understand better the emission regulations," Stavrakou said. "It is some positive news among a very tragic situation."

However, the drop in nitrogen dioxide pollution has caused an increase in surface ozone levels in China, according to one of the new studies. Ozone is a secondary pollutant formed when sunlight and high temperature catalyze chemical reactions in the lower atmosphere. Ozone is harmful to humans at ground-level, causing pulmonary and heart disease.

In highly polluted areas, particularly in winter, surface ozone can be destroyed by nitrogen oxides, so ozone levels can increase when nitrogen dioxide pollution goes down. As a result, although air quality has largely improved in many regions, surface ozone can still be a problem, according to Guy Brasseur, an atmospheric scientist at the Max Planck Institute for Meteorology in Hamburg, Germany, and lead author of one of the new studies.

"It means that by just reducing the [nitrogen dioxide] and the particles, you won't solve the ozone problem," Brasseur said.

Worldwide emissions

Stavrakou and her colleagues used satellite measurements of air quality to estimate the changes in nitrogen dioxide pollution over the major epicenters of the outbreak: China, South Korea, Italy, Spain, France, Germany, Iran and the United States.

They found that nitrogen dioxide pollution decreased by an average of 40 percent over Chinese cities and by 20 to 38 percent over Western Europe and the United States during the 2020 lockdown, as compared to the same time in 2019.

However, the study found nitrogen dioxide pollution did not decrease over Iran, one of the earliest and hardest-hit countries. The authors suspect this is because complete lockdowns weren't in place until late March and before that, stay-at-home orders were largely ignored. The authors did see a dip in emissions during the Iranian New Year holiday after March 20, but this dip is observed during the celebration every year.

Air quality in China

The second study looked at air quality changes in northern China where the virus was first reported and where lockdowns have been most strict.

Brasseur analyzed levels of nitrogen dioxide and several other types of air pollution measured by 800 ground-level air quality monitoring stations in northern China.

Brasseur and his colleague found particulate matter pollution decreased by an average of 35 percent and nitrogen dioxide decreased by an average of 60 percent after the lockdowns began on January 23.

However, they found the average surface ozone concentration increased by a factor of 1.5-2 over the same time period. At ground level, ozone forms from complex reactions involving nitrogen dioxide and volatile organic compounds (VOCs), gases emitted by a variety of household and industrial products, but ozone levels can also be affected by weather conditions and other factors.

Credit: 
American Geophysical Union

COVID-19, digital technologies, and the future of disease surveillance

Several data-driven epidemiological approaches that have been proposed or trialed for COVID-19 are justified if implemented through transparent processes that involve oversight, write Michelle M. Mello and C. Jason Wang in this Policy Forum. However, say the authors, "we question the necessity to conduct contact tracing using cellphone and other private data without users' consent." Around the globe, select countries have undertaken a range of innovative uses of personal data from outside the health sector to meet the challenge posed by the novel coronavirus. In this Policy Forum, Mello and Wang ask: "Should these approaches be in wider use?" The ethical issues raised by digital epidemiology center on a core tension: these novel uses of people's data can involve both personal and social harms, but so does failing to harness the enormous power of data to arrest epidemics. Considerations in this debate, say the authors, raise important questions for governments, such as what the exit strategy is for any approach applied. Also, when it comes to respecting autonomy - asking people for permission to access their personal information (or not) - whether such infringements are likely to be effective is an important question, among others. A final question is how to ensure that those involved in epidemiologic analysis of novel data sources are accountable for what they do. "Ordinary presumptions about what kind of data uses are ethically acceptable for governments and companies to pursue may need to flex" in these times, say the authors. "But the key principles guiding decision making remain the same." The authors believe some novel epidemiological approaches described for COVID-19 are justified if implemented with the right parameters. "When the epidemic abates, it will be important to reconsider these approaches in light of what has been learned about their benefits and the public's attitudes toward them, so that we are prepared to deploy cutting-edge methods responsibly during the next epidemic."

Credit: 
American Association for the Advancement of Science (AAAS)

El Niño-linked decreases in soil moisture could trigger massive tropical-plant die offs

image: Figure 6: In the Los Alamos National Laboratory team's analysis of soil moisture changes during "Super El Niño" events in 1982-83, 1997-98 and 2015-16, cell colors indicate the magnitude of change shown in the corresponding histograms. Red cells indicate more consistent strong decreases relative to normal for these seasons; green cells indicate more consistent strong increases. The top map shows October to December and the bottom map shows January to March.

Image: 
Los Alamos National Laboratory

LOS ALAMOS, N.M., May 11, 2020--New research has found that El Niño events are often associated with droughts in some of the world's more vulnerable tropical regions. Associated with warmer than average ocean temperatures in the eastern Pacific, El Niños can in turn influence global weather patterns and tropical precipitation, and these changes can lead to massive plant die-offs if other extreme factors are also at play.

"We know a lot about El Niño in terms of its impact on weather and surface water resources," said Kurt Solander, a research hydrologist in the Computational Earth Science group at Los Alamos National Laboratory and lead author of the paper. "This new study drills down to reveal how El Niño can affect the moisture content of soil, which controls the growth of plants, the food we eat, and how much water from land gets fed back into the atmosphere through evaporation."

In the paper, Solander and Los Alamos colleagues Brent Newman and Chonggang Xu analyzed changes in soil moisture content in the humid tropics after three "Super El Niño" events from the past--1982-83, 1997-98, and 2015-16. They found that during these years the most severe and consistent decreases in soil moisture occurred in regions like the Amazon basin and maritime Southeast Asia, with some of the changes potentially being significant enough to become a factor responsible for large-scale plant die off. In contrast, some other tropical areas, such as tropical East Africa, will likely see an increase in soil moisture during major El Niño events.

The team used a global dataset based on computer models and historic satellite observations of near-surface soil moisture. By extracting data for the rooting zone of the humid tropics, predicted soil moisture changes during the super El Niños could be examined at local scales. The team combined these data with on-site measurements collected across the tropics to verify the accuracy of the satellite and computer models. They were then able to identify ways to improve the estimates of soil moisture changes during El Niño events, and showed that El Niño-induced responses ranged from significant increases or decreases to minimal change relative to non-El Niño years, depending on spatial location.
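
The anomaly comparison described here can be sketched as follows (an illustrative Python toy with synthetic numbers, not the study's dataset or code): soil moisture in the Super El Niño seasons is compared against the average of the same seasons in all other years.

import numpy as np

years = np.arange(1980, 2017)
# Fake seasonal-mean soil moisture values standing in for the real product
soil_moisture = np.random.default_rng(0).normal(0.30, 0.03, size=years.size)
el_nino_years = np.isin(years, [1982, 1997, 2015])     # "Super El Niño" onset years

baseline = soil_moisture[~el_nino_years].mean()        # non-El Niño climatology
anomaly = soil_moisture[el_nino_years] - baseline      # change relative to normal
percent_change = 100 * anomaly / baseline
print(np.round(percent_change, 1))   # negative values would indicate drier-than-normal soil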

Super El Niño events typically happen every 15 to 20 years, with mild to moderate events coming every three to five years. The most immediate impact of this new information is that it can help governments or farmers in these areas prepare for the consequences of decreased soil moisture, or understand that crops will need more water during these events.

"Scientists can predict these events with a moderate degree of confidence three to six months in advance," Solander said. "With this new information, water managers in these areas can, for example, regulate how much water they retain in a reservoir to compensate for the expected decreases in available moisture for local agricultural crops."

The work is part of ongoing research at Los Alamos studying spatial patterns of precipitation recycling, which effectively determines how much moisture plants return to the atmosphere. In plant-dense regions like the Amazon basin, researchers at Los Alamos hope to provide insight on atmospheric moisture feedbacks from vegetation as plants adjust to climatic warming, which in turn helps researchers understand how precipitation will change on a global scale.

Credit: 
DOE/Los Alamos National Laboratory

Healthy eating behaviors in childhood may reduce the risk of adult obesity and heart disease

DALLAS, May 11, 2020 -- How children are fed may be just as important as what they are fed, according to a new scientific statement from the American Heart Association, "Caregiver Influences on Eating Behaviors in Young Children," published today in the Journal of the American Heart Association.

The statement is the first from the Association focused on providing evidence-based strategies for parents and caregivers to create a healthy food environment for young children that supports the development of positive eating behaviors and the maintenance of a healthy weight in childhood, thereby reducing the risks of overweight, obesity and cardiovascular disease later in life.

Although many children are born with an innate ability to stop eating when they are full, they are also influenced by the overall emotional atmosphere, including caregiver wishes and demands during mealtimes. If children feel under pressure to eat in response to caregiver wants, it may be harder for them to listen to their individual internal cues that tell them when they are full.

Allowing children to choose what and especially how much to eat within an environment composed of healthy options encourages children to develop and eventually take ownership of their decisions about food and may help them develop eating patterns linked to a healthy weight for a lifetime, according to the statement authors.

"Parents and caregivers should consider building a positive food environment centered on healthy eating habits, rather than focusing on rigid rules about what and how a child should eat," said Alexis C. Wood, Ph.D., the writing group chair for the scientific statement and assistant professor at the U.S. Department of Agriculture/Agriculture Research Services Children's Nutrition Research Center and the department of pediatrics (nutrition section) at Baylor College of Medicine in Houston.

The statement suggests that parents and caregivers should be positive role models by creating an environment that demonstrates and supports healthy food choices, rather than an environment focused on controlling children's choices or highlighting body weight. Parents and caregivers should encourage children to eat healthy foods by:

providing consistent timing for meals;

allowing children to select what foods they want to eat from a selection of healthy choices;

serving healthy or new foods alongside foods children already enjoy;

regularly eating new, healthy foods while eating with the child and demonstrating enjoyment of the food;

paying attention to a child's verbal or non-verbal hunger and fullness cues; and

avoiding pressuring children to eat more than they wish to eat.

Wood noted that some parents and caregivers may find it challenging to allow children to make their own food decisions, especially if the children become reluctant to try new foods and/or become picky eaters. These behaviors are common and considered normal in early childhood, ages 1 to 5 years, as children are learning about the tastes and textures of solid foods. Imposing rigid, authoritarian rules around eating and using tactics such as rewards or punishments may feel like successful tactics in the short term. However, research does not support this approach; rather, it may have long-term, negative consequences. An authoritarian eating environment does not allow a child to develop positive decision-making skills and can reduce their sense of control, which are important developmental processes for children.

In addition, the authoritarian approach has been linked to children being more likely to eat when they are not hungry and eating less healthy foods that are likely higher in calories, which increase the risk of overweight and obesity and/or conditions of disordered eating.

On the other hand, an indulgent approach, where a child is allowed to eat whatever they want whenever they want, does not provide enough boundaries for children to develop healthy eating habits. Research has also linked this "laissez-faire" approach to a greater risk of children becoming overweight or having obesity.

Research does suggest that some strategies can increase children's dietary variety during the early years if they are "picky" or "fussy" about foods. Repeatedly offering children a wide variety of healthy foods increases the likelihood they will accept them, particularly when served with foods they prefer. In addition, caregivers or parents who enthusiastically eat a food may also help a child accept this food. Modeling eating healthy foods - by caregivers, siblings and peers - is a good strategy for helping children to be open to a wider variety of food options.

"Children's eating behaviors are influenced by a lot of people in their lives, so ideally, we want the whole family to demonstrate healthy eating habits," said Wood.

It is important to note that not all strategies work for all children, and parents and caregivers should not feel undue stress or blame for children's eating behaviors. "It is very clear that each child is an individual and differs in their tendency to make healthy decisions about food as they grow. This is why it is important to focus on creating an environment that encourages decision-making skills and provides exposure to a variety of healthy, nutritious foods throughout childhood, and not place undue attention on the child's individual decisions," concluded Wood.

Caregivers can be a powerful force in helping children develop healthy eating habits, and yet their role is limited by other factors. The statement authors encourage policies that address barriers to implementing the statement's recommendations within the wider socioeconomic context, including social determinants of health such as socio-economic status, food insecurity and others. While efforts that encourage caregivers to provide a responsive, structured feeding environment could be an important component of reducing obesity and cardiometabolic risk across the lifespan, they note that they will be most effective as part of a multi-level, multi-component prevention strategy.

Credit: 
American Heart Association

NIST scientists create new recipe for single-atom transistors

video: To realize the full potential of tiny transistors, researchers must find a way to make many copies of these notoriously difficult-to-fabricate components. This animation shows the step-by-step recipe designed by NIST scientists and their colleagues to produce these atomic-scale devices.

Image: 
S. Kelley/NIST

Once unimaginable, transistors consisting only of several-atom clusters or even single atoms promise to become the building blocks of a new generation of computers with unparalleled memory and processing power. But to realize the full potential of these tiny transistors -- miniature electrical on-off switches -- researchers must find a way to make many copies of these notoriously difficult-to-fabricate components.

Now, researchers at the National Institute of Standards and Technology (NIST) and their colleagues at the University of Maryland have developed a step-by-step recipe to produce the atomic-scale devices. Using these instructions, the NIST-led team has become only the second in the world to construct a single-atom transistor and the first to fabricate a series of single electron transistors with atom-scale control over the devices' geometry.

The scientists demonstrated that they could precisely adjust the rate at which individual electrons flow through a physical gap or electrical barrier in their transistor -- even though classical physics would forbid the electrons from doing so because they lack enough energy. That strictly quantum phenomenon, known as quantum tunneling, only becomes important when gaps are extremely tiny, such as in the miniature transistors. Precise control over quantum tunneling is key because it enables the transistors to become "entangled" or interlinked in a way only possible through quantum mechanics and opens new possibilities for creating quantum bits (qubits) that could be used in quantum computing.

To fabricate single-atom and few-atom transistors, the team relied on a known technique in which a silicon chip is covered with a layer of hydrogen atoms, which readily bind to silicon. The fine tip of a scanning tunneling microscope then removed hydrogen atoms at selected sites. The remaining hydrogen acted as a barrier so that when the team directed phosphine gas (PH3) at the silicon surface, individual PH3 molecules attached only to the locations where the hydrogen had been removed (see animation). The researchers then heated the silicon surface. The heat ejected hydrogen atoms from the PH3 and caused the phosphorus atom that was left behind to embed itself in the surface. With additional processing, the bound phosphorus atoms created the foundation of a series of highly stable single- or few-atom devices that have the potential to serve as qubits.

Two of the steps in the method devised by the NIST team -- sealing the phosphorus atoms with protective layers of silicon and then making electrical contact with the embedded atoms -- appear to have been essential to reliably fabricate many copies of atomically precise devices, NIST researcher Richard Silver said.

In the past, researchers have typically applied heat as all the silicon layers are grown, in order to remove defects and ensure that the silicon has the pure crystalline structure required to integrate the single-atom devices with conventional silicon-chip electrical components. But the NIST scientists found that such heating could dislodge the bound phosphorus atoms and potentially disrupt the structure of the atomic-scale devices. Instead, the team deposited the first several silicon layers at room temperature, allowing the phosphorus atoms to stay put. Only when subsequent layers were deposited did the team apply heat.

"We believe our method of applying the layers provides more stable and precise atomic-scale devices," said Silver. Having even a single atom out of place can alter the conductivity and other properties of electrical components that feature single or small clusters of atoms.

The team also developed a novel technique for the crucial step of making electrical contact with the buried atoms so that they can operate as part of a circuit. The NIST scientists gently heated a layer of palladium metal applied to specific regions on the silicon surface that resided directly above selected components of the silicon-embedded device. The heated palladium reacted with the silicon to form an electrically conducting alloy called palladium silicide, which naturally penetrated through the silicon and made contact with the phosphorus atoms.

In a recent edition of Advanced Functional Materials, Silver and his colleagues, who include Xiqiao Wang, Jonathan Wyrick, Michael Stewart Jr. and Curt Richter, emphasized that their contact method has a nearly 100% success rate. That's a key achievement, noted Wyrick. "You can have the best single-atom-transistor device in the world, but if you can't make contact with it, it's useless," he said.

Fabricating single-atom transistors "is a difficult and complicated process that maybe everyone has to cut their teeth on, but we've laid out the steps so that other teams don't have to proceed by trial and error," said Richter.

In related work published today in Communications Physics, Silver and his colleagues demonstrated that they could precisely control the rate at which individual electrons tunnel through atomically precise tunnel barriers in single-electron transistors. The NIST researchers and their colleagues fabricated a series of single-electron transistors identical in every way except for differences in the size of the tunneling gap. Measurements of current flow indicated that by increasing or decreasing the gap between transistor components by less than a nanometer (billionth of a meter), the team could precisely control the flow of a single electron through the transistor in a predictable manner.
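
The exponential sensitivity behind this result follows from the textbook expression for tunneling through a simple rectangular barrier (a standard approximation, not the paper's detailed device model): for an electron of mass $m$ facing a barrier of height $\Phi$ and width $d$, the transmission probability is roughly

\[
T \;\approx\; e^{-2\kappa d}, \qquad \kappa = \frac{\sqrt{2 m \Phi}}{\hbar},
\]

so changing $d$ by even a fraction of a nanometer rescales $T$, and with it the single-electron current, by a large but predictable factor.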

"Because quantum tunneling is so fundamental to any quantum device, including the construction of qubits, the ability to control the flow of one electron at a time is a significant achievement," Wyrick said. In addition, as engineers pack more and more circuitry on a tiny computer chip and the gap between components continues to shrink, understanding and controlling the effects of quantum tunneling will become even more critical, Richter said.

Credit: 
National Institute of Standards and Technology (NIST)

Study finds rising rate of mental health visits among youth to emergency departments

(COLUMBUS, Ohio) - While the number of pediatric emergency department (ED) visits across the nation has remained stable over the last 10 years, visits for mental health disorders have risen 60% and the rate of visits for deliberate self-harm has increased 329%.

In a study published today in Pediatrics, Nationwide Children's Hospital researchers looked at the number and reason for mental health-related ED visits. They also examined the geographic location of EDs and the overall number of children coming to each ED. Previous studies have shown that low pediatric volume EDs and EDs in rural settings are less prepared for all pediatric emergencies, and only one third of rural facilities have pediatric mental health policies or mental health transfer agreements.

Rachel Stanley, MD, division chief of Emergency Medicine at Nationwide Children's and the study's senior author said, "We would like children to go to their primary care provider or a psychiatrist, but EDs are the safety net for children with mental health disorders, and we need to be able to take care of them. Knowing why children are going to the ED is essential to making sure the EDs are prepared to treat them appropriately."

Over the 10-year study period, most visits occurred at non-children's EDs in both metropolitan and non-urban settings. The study looked at children 5 to 17 years old, and data is representative of all U.S. emergency departments.

Findings showed the highest jump in ED visits was among 15- to 17-year-olds (68% increase), and while the rate grew among both males and females, it was more pronounced in girls (74% increase). Additionally, visits for substance-related disorders rose by 75% overall, with alcohol-related disorder visits decreasing by nearly 40% and visits for other substance use disorders increasing significantly (over 150%). The rate of visits for deliberate self-harm increased 329%.

"Examining the characteristics of EDs that children present to was important because outcomes have been shown to be directly linked to the volume and geographic location of the EDs," said Charmaine Lo, PhD, MPH, the study's lead author and senior research scientist in Emergency Medicine at Nationwide Children's.

Further research is needed to identify solutions that will better equip all EDs with the tools, personnel and resources to better manage pediatric cases, particularly those related to mental health.

Universal screening for suicidal ideation, a recent requirement of the Joint Commission, is one step toward improving the quality of care for those being treated for behavioral health conditions.

Study authors say telehealth services can also provide an avenue for increasing access to behavioral health specialists who can provide screening, assist with acute interventions, and support connections to continued care within the community, thereby avoiding long distance transfers, transportation costs and delays in care.

"The overall goal of our work is to improve preparedness of EDs for children," said Dr. Stanley. "Large children's hospitals with psychiatric providers can offer outreach services to these smaller EDs in the form of telehealth. Another solution is more training for emergency physicians and nurses so they know how to treat and triage children."

Recently, a team of experts from areas including Emergency Medicine, Psychiatry and Behavioral Health at Nationwide Children's collaborated to develop and open a Psychiatric Crisis Department dedicated solely for youth and adolescents in a behavioral or mental health crisis. Designed with safety for its patients as a top priority, the Psychiatric Crisis Department opened in March 2020 at the Big Lots Behavioral Health Pavilion at Nationwide Children's, the largest center of its kind on a pediatric medical campus.

The Pavilion, and others like it around the country, are essential, as one in five children is living with a significantly impairing mental illness that interferes with everyday life. Half of all lifetime mental health concerns start by age 14. Children don't wear their mental health on their sleeves, so there is an unprecedented need for evidence-informed resources and support for children. On Our Sleeves™ is the movement to transform children's mental health through education, advocacy and research. Its mission is to provide every community in America with free resources necessary for breaking child mental health stigmas and educating families and advocates.

Credit: 
Nationwide Children's Hospital

Chameleon materials: The origin of color variation in low-dimensional perovskites

image: This is Maria Loi, Professor of Photophysics and Optoelectronics at the University of Groningen

Image: 
Sylvia Germes

Some light-emitting diodes (LEDs) created from perovskite, a class of optoelectronic materials, emit light over a broad wavelength range. Scientists from the University of Groningen have now shown that in some cases, the explanation of why this happens is incorrect. Their new explanation should help scientists to design perovskite LEDs capable of broad-range light emission. The study was published in the journal Nature Communications on 11 May.

Low-dimensional (2D or 1D) perovskites emit light in a narrow spectral range and are therefore used to make light-emitting diodes of superior colour purity. However, in some cases, a broad emission spectrum at energies below the narrow spectrum has been observed. This process has attracted great interest as it could be used to produce white light LEDs more easily than the processes that are currently being used. To design perovskites for specific purposes, however, it is necessary to understand why some perovskites produce broad-spectrum emissions while others emit a narrow spectrum.

Quantum confinement

Perovskites are a versatile group of materials with a very distinctive crystal structure, known as the perovskite structure. In an idealized cubic unit cell, anions form an octahedron around a central cation while the corners of the cube are occupied by other, larger cations. Different ions can be used to create different perovskites.

In hybrid perovskites, the cations are organic molecules of different sizes. When the size exceeds a certain dimension, the structure becomes two-dimensional or layered. The resulting quantum confinement has large consequences for the materials' physical properties and, in particular, for the optical properties.

Emissions

'There are many reports in the literature where, in addition to the narrow emission of these low-dimensional systems, there is a broad low-energy spectrum. And this is thought to be an intrinsic property of the material,' says Maria Loi, Professor of Photophysics and Optoelectronics at the University of Groningen. It has been proposed that the vibrations of the octahedron's atoms can 'trap' an excited state in a self-trapped exciton, or self-trapped excited state, causing the broad-spectrum photoluminescence, especially in these two-dimensional systems and in systems where the octahedrons are isolated from each other (zero-dimensional).

However, observations made in Loi's laboratory appear to contradict this theory, says Simon Kahmann, a postdoctoral researcher in her team. 'One of our students studied single crystals of a lead-iodide-based 2D perovskite and noticed that some crystals emitted green light and others emitted red light. This is not what you would expect if the broad red emission were an intrinsic property of this material.'

Colour

The research team proposed that defects in these perovskites could change the colour of the emitted light. Therefore, they decided to test the mainstream interpretation with an ad hoc experiment. Loi: 'In the accepted theoretical explanation, the excitation energy should be larger than the bandgap to produce broad emission.' The bandgap is the energy difference between the top of the valence band and the bottom of the conduction band.

Using laser light of different colours, and therefore of different energies, they studied the emission of the crystals. 'We noted that when we used photons below the bandgap energy, the broad emission still occurred,' says Loi. 'This should not have happened according to the mainstream interpretation.'
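
The reasoning behind this test can be stated compactly (a standard energy argument consistent with the description above): a photon with energy below the bandgap cannot excite an electron across the gap,

\[
h\nu < E_g \;\Rightarrow\; \text{no band-to-band absorption},
\]

so if the broad emission still appears under such excitation, the states that absorb and emit must lie inside the gap, which points to defect levels rather than self-trapped excitons created from above-gap excitation.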

Consequences

Their explanation is that a defect state with an energy level inside the bandgap is governing the broad emission and the large colour variation of the crystals. 'We think that it is a chemical defect in the crystal, probably related to iodide, which causes states inside the band gap,' says Kahmann. Thus, the broad emissions are not an intrinsic property of the material but are caused by an extrinsic effect. Kahmann: 'At this point, we cannot totally rule out that this is a quirk of lead iodide perovskites but it is likely to be a general property of low-dimensional perovskites.' This finding has profound consequences, explains Loi: 'If we want to predict new and better compounds that broadly emit light, we need to understand the origin of this emission. We should not be tricked by this chameleon.'

Simple Science Summary

Just over 10 years ago, a class of materials moved into the spotlight of scientific research: hybrid perovskites, materials that can convert light into electricity or electricity into light. They can be used in solar cells and in detectors of light or X-rays, but also as light-emitting diodes. Some perovskites emit light over a narrow wavelength band while others produce broadband emissions that could be used to produce white light. Scientists from the University of Groningen have now shown that the broad emission in 2D lead iodide perovskites is not an intrinsic property of the material but is caused by defects, which means that this emission is not very efficient and that optical investigations of this class of materials should be interpreted with care.

Credit: 
University of Groningen

Cancer research breakthrough as DNA behavior is uncovered in 3D models

image: Scientists have used 3D models to break down the DNA behavior of cancer cells, in a breakthrough new study which could revolutionize treatment for the disease.

Image: 
Dr Manel Esteller

Scientists have used 3D models to break down the DNA behavior of cancer cells, in a breakthrough new study which could revolutionize treatment for the disease.

In what is a first for science, a research team led by Dr Manel Esteller, Director of the Josep Carreras Leukaemia Research Institute (IJC), demonstrated how 3D models (known as organoids) can now be used to develop a characterization of the DNA make-up - or the epigenetic fingerprint - of human cancer.

Published in Epigenetics, the research validates the use of these 3D samples for cancer research that could deliver new oncology treatments.

Dr Esteller, who is also Chairman of Genetics at the University of Barcelona, explains: "Frequently, promising cancer therapies fail when applied to patients in the real clinical setting. This occurs despite many of these new treatments demonstrating promising results at the preclinical stage in the lab. One explanation is that many of the tumor models used in early research phases are established cell lines that have been growing for many decades and in two-dimensional (2D) culture flasks. These cancer cells might not completely resemble the features of real tumors from patients that expand into three dimensions (3D). Very recently, it has become possible to grow cancers in the laboratory while respecting the 3D structure: these models are called 'organoids'. We know very little about these cells and whether they actually mimic the conformation of the tumor within the body, particularly the chemical behaviors (known as modifications) of DNA that are called epigenetics ("beyond the genetics"), such as DNA methylation.

"What our article solves is this unmet biomedical need in the cancer research field: the characterization of the epigenetic fingerprint of human cancer organoids. The developed study shows that these tumor models can be very useful for the biomedical research community and the pharmaceutical companies developing anti-cancer drugs."

Specifically looking at 25 human cancer organoids, made available from the American Type Culture Collection (ATCC), Dr Esteller, an ICREA Research Professor, states that during their research the team made some interesting findings about the properties of the cancer cells.

"First, we found that every cancer organoid retains the properties of the tissue of origin, so this shows that if the samples were obtained from the surgery of a colon or pancreatic cancer, the organoid closely resembles the original primary tumor.

"Second, we discovered that there is no contamination of normal cells, thus, the malignant pure transformed cells can be analyzed without interferences. And finally, the 3D organoid cancers are closer to the patient tumors than the commonly used 2D cell lines."

The study will now be used to help build Big Data resources, as the team's samples will be shared between researchers in easily accessible public databases to promote more collaborative studies.

"This will enable further data mining to produce new cancer discoveries using different biometric approaches or focusing on particular genes," explains Dr Esteller.

"And most importantly, the characterized cancer organoids can be readily obtainable from a reliable provider (the ATCC) researchers around the world can use the epigenetic information of these sharable samples to develop their own investigations."

Credit: 
Taylor & Francis Group