Tech

Printed perovskite LEDs

Microelectronics utilise various functional materials whose properties make them suitable for specific applications. For example, transistors and data storage devices are made of silicon, and most photovoltaic cells used for generating electricity from sunlight are also currently made of this semiconductor material. In contrast, compound semiconductors such as gallium nitride are used to generate light in optoelectronic elements such as light-emitting diodes (LEDs). The manufacturing processes also differ for the various classes of materials.

Transcending the materials and methods maze

Hybrid perovskite materials promise simplification - by arranging the organic and inorganic components of a semiconducting crystal in a specific structure. "They can be used to manufacture all kinds of microelectronic components by modifying their composition", says Prof. Emil List-Kratochvil, head of a Joint Research Group at HZB and Humboldt-Universität.

What's more, processing perovskite crystals is comparatively simple. "They can be produced from a liquid solution, so you can build the desired component one layer at a time directly on the substrate", the physicist explains.

First solar cells from an inkjet printer, now light-emitting diodes too

Scientists at HZB have already shown in recent years that solar cells can be printed from a solution of semiconductor compounds - and are worldwide leaders in this technology today. Now for the first time, the joint team of HZB and HU Berlin has succeeded in producing functional light-emitting diodes in this manner. The research group used a metal halide perovskite for this purpose. This is a material that promises particularly high efficiency in generating light - but on the other hand is difficult to process.

"Until now, it has not been possible to produce these kinds of semiconductor layers with sufficient quality from a liquid solution", says List-Kratochvil. For example, LEDs could be printed just from organic semiconductors, but these provide only modest luminosity. "The challenge was how to cause the salt-like precursor that we printed onto the substrate to crystallise quickly and evenly by using some sort of an attractant or catalyst", explains the scientist. The team chose a seed crystal for this purpose: a salt crystal that attaches itself to the substrate and triggers formation of a gridwork for the subsequent perovskite layers.

Significantly better optical and electronic characteristics

In this way, the researchers created printed LEDs that possess far higher luminosity and considerably better electrical properties than could be previously achieved using additive manufacturing processes. But for List-Kratochvil, this success is only an intermediate step on the road to future micro- and optoelectronics that he believes will be based exclusively on hybrid perovskite semiconductors. "The advantages offered by a single universally applicable class of materials and a single cost-effective and simple process for manufacturing any kind of component are striking", says the scientist. He is therefore planning to eventually manufacture all important electronic components this way in the laboratories of HZB and HU Berlin.

List-Kratochvil is Professor of Hybrid Devices at the Humboldt-Universität zu Berlin and head of a Joint Lab founded in 2018 that is operated by HU together with HZB. In addition, a team jointly headed by List-Kratochvil and HZB scientist Dr. Eva Unger is working in the Helmholtz Innovation Lab HySPRINT on the development of coating and printing processes - also known in technical jargon as "additive manufacturing" - for hybrid perovskites. These are crystals possessing a perovskite structure that contain both inorganic and organic components.

Credit: 
Helmholtz-Zentrum Berlin für Materialien und Energie

Plant cell gatekeepers' diversity could be key to better crops

image: Tagging aquaporins with fluorescent protein shows their location in the cell membrane. These gatekeepers help move substances in and out of the cell.

Image: 
Annamaria De Rosa, CoETP

Scientists have shed new light on how the network of gatekeepers that controls the traffic in and out of plant cells works - knowledge researchers believe is key to developing food crops with bigger yields and a greater ability to cope with extreme environments.

Everything that a plant needs to grow first needs to pass through its cells' membranes, which are guarded by a sieve of microscopic pores called aquaporins.

"Aquaporins (AQPs) are ancient channel proteins that are found in most organisms, from bacteria to humans. In plants, they are vital for numerous plant processes including, water transport, growth and development, stress responses, root nutrient uptake, and photosynthesis," says former PhD student Annamaria De Rosa from the ARC Centre of Excellence for Translational Photosynthesis (CoETP) at The Australian National University (ANU).

"We know that if we are able to manipulate aquaporins, it will open numerous useful applications for agriculture, including improving crop productivity, but first we need to know more about their diversity, evolutionary history and the many functional roles they have inside the plant," Ms De Rosa says.

Their research, published this week in the journal BMC Plant Biology, did just that. They identified all the different types of aquaporins found in tobacco (Nicotiana tabacum), a model plant species closely related to major economic crops such as tomato, potato, eggplant and capsicum.

"We described 76 types of these microscopic hour-glass shape channels based on their gene structures, protein composition, location in the plant cell and in the different organs of the plant and their evolutionary origin. These results are extremely important as they will help us to transfer basic research to applied agriculture," says Ms De Rosa, whose PhD project focused on aquaporins.

"The Centre (CoETP) is really interested in understanding aquaporins because we believe they are a key player in energy conversion through photosynthesis and also control how a plant uses water. That is why we think we can use aquaporins to enhance plant performance and crop resilience to environmental changes," says lead researcher Dr Michael Groszmann from the Research School of Biology and the CoETP at ANU.

Aquaporins are found everywhere in the plant, from the roots to flowers, transporting very different molecules in each location, at an astonishing 100 million molecules per second. The configuration of an aquaporin channel determines the substrate it transports and therefore its function, from the transport of water and nutrients from roots to shoots, to stress signalling or seed development.

"We focused on tobacco because it is a fast-growing model species that allows us to scale from the lab to the field, allowing us to evaluate performance in real-world scenarios. Tobacco is closely related to several important commercial crops, which means we can easily transfer the knowledge we obtain in tobacco to species like tomato and potato. Tobacco itself has own commercial applications and there is a renewed interest in the biofuel and plant-based pharmaceutical sectors," he says.

"This research is extremely exciting because the diversity of aquaporins in terms of their function and the substrates they transport, mean they have many potential applications for crop improvement ranging from improved salt tolerance, more efficient fertiliser use, improved drought tolerance, and even more effective response to disease infection. They are currently being used in water filtration systems and our results could help to expand these applications. The future of aquaporins is full of possibilities," says Dr Groszmann.

This research has been funded by the Australian Research Council (ARC) Centre of Excellence for Translational Photosynthesis (CoETP), led by the Australian National University, which aims to improve the process of photosynthesis to increase the production of major food crops such as sorghum, wheat and rice.

Credit: 
ARC Centre of Excellence for Translational Photosynthesis

Animal territorial behavior could play larger role in disease spread than formerly thought

Territorial behaviors in animals, such as a puma using its scent to mark its domain, may help to decrease the severity of a potential disease outbreak--but not without the cost of increased persistence of the disease within the population.

This finding comes from a new study, led by Lauren White, of the University of Maryland's National Socio-Environmental Synthesis Center, published in PLOS Computational Biology.

While disease research often addresses direct social contact without considering individual animals' movement, White and her colleagues implemented a unique approach in this study by using a mathematical model to link animal movement and the spread of disease. The model aimed to help the researchers understand more clearly the relationship between animals' indirect contact and disease spread.

While animals' territorial behavior has the potential to stop disease spread through direct transmission, pathogens remaining in the environment could still be infectious. Therefore, the researchers created a model in which infected animals could indirectly transmit the disease to others from pathogens left behind from the animals' deposited scent marks. By simulating the animals' movement across their territories, the researchers were able to see how such movement impacted the spread of disease.
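
To make the mechanism concrete, the gist of such a model can be sketched in a few lines of Python. This is a toy illustration only, not the authors' published movement model, and every rate below is a hypothetical value:

    # Toy compartmental model with indirect (environmental) transmission.
    # S/I/R are susceptible/infected/recovered hosts; P is the pathogen
    # load left in the environment by scent marks. All rates are invented.
    def simulate(days=365, dt=0.1, beta_env=0.002, deposit=1.0,
                 decay=0.05, recovery=0.02, pop=100, seed_infected=1):
        S, I, R, P = pop - seed_infected, float(seed_infected), 0.0, 0.0
        for _ in range(int(days / dt)):
            new_inf = beta_env * S * P * dt      # infection at marked sites
            recov = recovery * I * dt            # slow recovery = sick longer
            S -= new_inf
            I += new_inf - recov
            R += recov
            P += (deposit * I - decay * P) * dt  # marks deposited, then decay
        return S, I, R, P

    print("after one year (S, I, R, P): %.1f %.1f %.1f %.1f" % simulate())

In the published work, territoriality enters through explicit movement across marked space rather than the single mixing parameter beta_env used in this sketch.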

As part of the study, the researchers simulated a scenario with conditions conducive to a disease outbreak: a high density of animals and a slower disease-recovery rate (animals are sick for longer). They found that territorial movement resulted in fewer animals becoming infected, but at the cost of the disease persisting for longer within the population. The researchers wrote that such results suggest that indirect contact among animals, through behavior such as scent marking, could have a more important role in the transmission of disease than previously thought.

"It was exciting to be able to incorporate a movement-ecology perspective into a disease-modeling framework," White says. "Our findings support the possibility that pathogens could evolve to co-opt indirect communication systems to overcome social barriers in territorial species."

These findings could have implications for how disease spreads more broadly by revealing the importance of factoring movement behavior into studies of disease transmission. The researchers said they would continue to expand the model by including other factors, such as varying habitat quality and prey kill sites.

Credit: 
University of Maryland

The disease pyramid: Environment, pathogen, individual and microbiome

image: The microbiome is also an important part of the immune response in amphibians.

Image: 
Dirk Schmeller

Researchers from the Leibniz-Institute of Freshwater Ecology and Inland Fisheries (IGB), the Université de Toulouse and the Helmholtz Centre for Environmental Research (UFZ) show how the microbial colonisation of the organism influences the interactions between living organisms, the environment and pathogens, using amphibians like frogs as examples. This is basic research for health prophylaxis.

Biotic and abiotic environmental factors have a strong influence on the dynamics of diseases in humans and animals. In their study, the researchers focus on an important component: the microbiome. The individual microbiome of a living being is a vital component of immunity. Especially on the skin and in the bowel, i.e. directly at the interface between the individual and pathogens, endogenous bacteria and viruses are highly active.

The international team presents the concept of a disease pyramid with the four cornerstones of environment, pathogen, host and microbiome. For the first time, the different functions of the microbiome are taken into account. The researchers illustrate this using the amphibian disease chytridiomycosis, caused by the fungal pathogen Batrachochytrium dendrobatidis.

"The microbiome of living organisms is highly variable. It is only in recent years that researchers have succeeded in using genetic methods to determine the totality of microorganisms. We are now only gradually beginning to understand their role in health prophylaxis and how they interact, for example, with the environmental microbiome, pathogens and the host," explains IGB researcher Dr. Adeline Loyau, who led the study.

Diversity of microbiome and habitats strengthens resistance

The authors emphasise that more diverse microbiomes can make the host more resistant to pathogens because they are better able to keep potential pathogens at bay. The study also shows that individuals who inhabit complex and therefore species-rich habitats have a lower mortality rate. The team shows that the microbiome can act very specifically against pathogens: The symbiotic skin bacterium Janthinobacterium spp. forms an anti-fungal agent as a metabolic product and thus prevents infection with Batrachochytrium dendrobatidis.

Climate change modifies the microbiome of amphibians

It is assumed that the adaptability of the microbiome can in turn increase the organism's adaptability to environmental influences. There are several examples of this in the animal kingdom. However, environmental changes such as climate change can also throw the microbiome out of balance: "A microbiome in equilibrium can protect against infection in changing environmental conditions," explains the study's first author, Adriana P. Bernardo-Cravo from the Université de Toulouse and the UFZ.

"However, it is also shown that environmental changes - especially temperature - have a significant impact on the composition of the microbiome, and thus on the resistance of amphibians to Batrachochytrium dendrobatidis. Climate change will significantly change the distribution of this fungal disease in amphibians," the ecologist predicts.

Axa-Professor Dirk Schmeller from the Université de Toulouse further explains: "We have to be aware that climate change and biodiversity loss are stress factors for ecosystems, for humans, for animals and for the microbiome. Our research shows that if the different axes of the disease pyramid are destabilised, new infectious diseases can be expected, including for humans." The disease pyramid concept presented here therefore points the way for research on human-animal-plant-environment interactions and the resulting risks for biodiversity and humans.

Supplementary information:

The decline of amphibians, the most endangered vertebrates, causes cascade effects in food webs and can change the environmental balance in the long term, affecting, for example, water quality or the occurrence of pests and pathogens. In some ecosystems, such as North American boreal forests, amphibians are the most common terrestrial vertebrates. There they help regulate the carbon balance. The fungus Batrachochytrium dendrobatidis is responsible for the decline of over 500 frog species worldwide. It damages the amphibians' skin and disrupts its basic functions, which ultimately leads to cardiac arrest.

Credit: 
Forschungsverbund Berlin

Sound waves transport droplets for rewritable lab-on-a-chip devices

video: Droplets are manipulated and moved on a lab-on-a-chip device through tunnels in oil carefully created by vibrating transducers.

Image: 
Ken Kingery, Duke University

DURHAM, N.C. - Engineers at Duke University have demonstrated a versatile microfluidic lab-on-a-chip that uses sound waves to create tunnels in oil to touchlessly manipulate and transport droplets. The technology could form the basis of a small-scale, programmable, rewritable biomedical chip that is completely reusable to enable on-site diagnostics or laboratory research.

The results appear online on June 10 in the journal Science Advances.

"Our new system achieves rewritable routing, sorting and gating of droplets with minimal external control, which are essential functions for the digital logic control of droplets," said Tony Jun Huang, the William Bevan Distinguished Professor of Mechanical Engineering and Materials Science at Duke. "And we achieve it with less energy and a simpler setup that can control more droplets simultaneously than previous systems."

Automated fluid handling has driven the development of many scientific fields such as clinical diagnostics and large-scale compound screening. While ubiquitous in the modern biomedical research and pharmaceutical industries, these systems are bulky, expensive and do not handle small volumes of liquids well.

Lab-on-a-chip systems have been able to fill this space to some extent, but most are hindered by one major drawback--surface adsorption. Because these devices rely on solid surfaces, the samples being transported inevitably leave traces of themselves behind that can lead to contamination.

The new lab-on-a-chip platform uses a thin layer of inert, immiscible oil to stop droplets from leaving behind any trace of themselves. Just below the oil, a grid of piezoelectric transducers vibrates when electricity is passed through it. Just as the surface of a subwoofer does, the vibrating transducers create sound waves in the thin layer of oil above them.

These sound waves form complex patterns when they bounce off the top and bottom of the chip as well as when they run into one another. By meticulously planning the design of the transducers and controlling the frequency and strength of the vibrations causing the waves, the researchers are able to create vortices that, when combined, form tunnels that can push and pull droplets in any direction along the surface of the device.
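
As a rough cartoon of how interfering waves produce a patterned field, consider two crossed standing waves. The sketch below is purely illustrative: it ignores the acoustic streaming vortices that actually move the droplets, and the frequency and sound speed are assumed values, not specifications of the device:

    # Superposition of two crossed standing waves -- a cartoon of a
    # patterned acoustic field, not a model of the actual device.
    import numpy as np

    freq = 2.0e6                  # assumed transducer frequency, Hz
    c_oil = 1400.0                # approximate speed of sound in oil, m/s
    k = 2 * np.pi * freq / c_oil  # wavenumber

    x = np.linspace(0, 2e-3, 400)          # a 2 mm x 2 mm patch of the chip
    X, Y = np.meshgrid(x, x)
    field = np.sin(k * X) + np.sin(k * Y)  # snapshot of the combined wave

    # Regions where the combined amplitude nearly cancels are candidate
    # low-pressure sites along which droplets could be guided.
    print("fraction of the patch near a node: %.3f" % (np.abs(field) < 0.05).mean())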

"The new system uses dual-mode transducers, which can transport droplets along x or y axis based on two different streaming patterns," said Huang. "This is a big step up from our previous system, which simply created a series of dimples in the oil to pass droplets along on a single axis."

Aiding Huang in the creation of this upgraded system were Krishnendu Chakrabarty, the John Cocke Distinguished Professor of Electrical and Computer Engineering at Duke, and his PhD student Zhanwei Zhong. The pair helped design the electronics at the heart of the new lab-on-a-chip demonstration, and greatly upgraded and miniaturized the wire connections, controllers and other hardware used in the system.

By using dual-mode transducers, the researchers were able to move droplets along two axes while reducing the complexity of the electronics four-fold. They also reduced the operating voltage of the transducers to between three and seven times lower than in their previous system, which allowed them to control eight droplets simultaneously. And by introducing a microcontroller to the setup, the researchers were able to program and automate much of the droplet movement.

The researchers show off the capabilities of their new device in a series of videos. In one, a droplet is quickly whisked around the exterior of a square. Others show droplets coming to a "T" intersection and turning right or left, and the creation of a "logic gate" that can either interrupt a droplet's movement along a corridor or allow it to pass through.

The ability to control droplets in a manner similar to the logic systems found on a computer chip is essential to a wide variety of clinical and research procedures.

"Our next step is to combine the miniaturized radio-frequency power-supply and control board designed by Professor Chakrabarty's team for large-scale integration and dynamic planning," said Huang. "We're also planning to integrate the ability to split droplets into two without having to touch them."

Credit: 
Duke University

Rice lab turns fluorescent tags into cancer killers

image: The design of thio-based photosensitizers, at left, by Rice University chemists shows promise for photodynamic cancer therapy, among other applications. One thiocarbonyl substitution -- trading an oxygen atom for a sulfur atom -- of a variety of fluorophores can dramatically enhance their ability to generate reactive oxygen species that kill cancer cells. At right, images of multicellular tumor spheroids treated with photosensitizers and light (in the bottom row) show how the compounds, when excited by light, damage the cells.

Image: 
Xiao Lab/Rice University

HOUSTON - (June 11, 2020) - A Rice University lab's project to make better fluorescent tags has turned into a method to kill tumors. Switching one atom in the tag does the trick.

Rice chemist Han Xiao and his colleagues found that replacing a single oxygen atom with a sulfur atom in a common fluorophore turns it into a photosensitizing molecule. When exposed to light, the molecule generated reactive oxygen species (ROS) that destroyed breast cancer cells in the lab.

The study led by co-lead authors Juan Tang and Lushun Wang, both Rice postdoctoral researchers, appears in the Royal Society of Chemistry flagship journal Chemical Science.

Photodynamic therapy of this kind is already in use, as light-triggered molecules are known to generate cytotoxic ROS. But most current photosensitizers require the incorporation of heavy atoms, which make them difficult and costly to synthesize; they also remain toxic in the dark, potentially damaging healthy cells, Xiao said.

The Rice lab's one-step compounds contain no heavy atoms, generate ROS at a high yield when triggered and shut off when the light is turned off. The lab's various thio-based fluorophores absorb light at visible to near-infrared wavelengths that penetrate up to 5 millimeters into tissue.

"This work comes through our previous study to make better fluorogenic dyes," Xiao said. "That was a totally new discovery, but once we went deeper into the mechanism, we found that our thio-based fluorophores can lead to a dramatic generation of singlet oxygen when excited with light. This is the real mediator."

For testing, the researchers combined their photosensitizers with trastuzumab, an antibody used to target and treat early and advanced breast cancer. The combination showed "robust cytotoxicity" against HER2-positive (cancerous) cell lines but almost no activity against HER2-negative cells.

Xiao said the experiments showed their photosensitizers targeted both monolayer cancer cells and multicellular tumor spheroids. "We think a big application for this photosensitizer will be skin cancers," he said. "It should be easy for light to penetrate basal cell carcinomas on the surface."

The researchers noted solar cells, photocatalytic applications and organic chemistry may benefit from their photosensitizers.

Credit: 
Rice University

Rice engineers offer smart, timely ideas for AI bottlenecks

image: Yingyan Lin led a team that demonstrated methods for both designing data-centric computing hardware and co-designing hardware with machine-learning algorithms that together can improve energy efficiency in artificial intelligence hardware by as much as two orders of magnitude.

Image: 
Rice University

HOUSTON -- (June 11, 2020) -- Rice University researchers have demonstrated methods for both designing innovative data-centric computing hardware and co-designing hardware with machine-learning algorithms that together can improve energy efficiency by as much as two orders of magnitude.

Advances in machine learning, the form of artificial intelligence behind self-driving cars and many other high-tech applications, have ushered in a new era of computing -- the data-centric era -- and are forcing engineers to rethink aspects of computing architecture that have gone mostly unchallenged for 75 years.

"The problem is that for large-scale deep neural networks, which are state-of-the-art for machine learning today, more than 90% of the electricity needed to run the entire system is consumed in moving data between the memory and processor," said Yingyan Lin, an assistant professor of electrical and computer engineering.

Lin and collaborators proposed two complementary methods for optimizing data-centric processing, both of which were presented June 3 at the International Symposium on Computer Architecture (ISCA), one of the premier conferences for new ideas and research in computer architecture.

The drive for data-centric architecture is related to a problem called the von Neumann bottleneck, an inefficiency that stems from the separation of memory and processing in the computing architecture that has reigned supreme since mathematician John von Neumann invented it in 1945. By separating memory from programs and data, von Neumann architecture allows a single computer to be incredibly versatile; depending upon which stored program is loaded from its memory, a computer can be used to make a video call, prepare a spreadsheet or simulate the weather on Mars.

But separating memory from processing also means that even simple operations, like adding 2 plus 2, require the computer's processor to access the memory multiple times. This memory bottleneck is made worse by massive operations in deep neural networks, systems that learn to make humanlike decisions by "studying" large numbers of previous examples. The larger the network, the more difficult the task it can master, and the more examples the network is shown, the better it performs. Deep neural network training can require banks of specialized processors that run around the clock for more than a week. Performing tasks based on the learned networks -- a process known as inference -- on a smartphone can drain its battery in less than an hour.

"It has been commonly recognized that for the data-centric algorithms of the machine-learning era, we need innovative data-centric hardware architecture," said Lin, the director of Rice's Efficient and Intelligent Computing (EIC) Lab. "But what is the optimal hardware architecture for machine learning?

"There are no one-for-all answers, as different applications require machine-learning algorithms that might differ a lot in terms of algorithm structure and complexity, while having different task accuracy and resource consumption -- like energy cost, latency and throughput -- tradeoff requirements," she said. "Many researchers are working on this, and big companies like Intel, IBM and Google all have their own designs."

One of the presentations from Lin's group at ISCA 2020 offered results on TIMELY, an innovative architecture she and her students developed for "processing in-memory" (PIM), a non-von Neumann approach that brings processing into memory arrays. A promising PIM platform is "resistive random access memory" (ReRAM), a nonvolatile memory similar to flash. While other ReRAM PIM accelerator architectures have been proposed, Lin said experiments run on more than 10 deep neural network models found TIMELY was 18 times more energy efficient and delivered more than 30 times the computational density of the most competitive state-of-the-art ReRAM PIM accelerator.

TIMELY, which stands for "Time-domain, In-Memory Execution, LocalitY," achieves its performance by eliminating two major contributors to inefficiency: frequent access to the main memory to handle intermediate input and output, and the data conversions required at the interface between local and main memories.

In the main memory, data is stored digitally, but it must be converted to analog when it is brought into the local memory for processing in-memory. In prior ReRAM PIM accelerators, the resulting values are converted from analog to digital and sent back to the main memory. If they are called from the main memory to local ReRAM for subsequent operations, they are converted to analog yet again, and so on.

TIMELY avoids paying overhead for both unnecessary accesses to the main memory and interfacing data conversions by using analog-format buffers within the local memory. In this way, TIMELY mostly keeps the required data within local memory arrays, greatly enhancing efficiency.

The group's second proposal at ISCA 2020 was for SmartExchange, a design that marries algorithmic and accelerator hardware innovations to save energy.

"It can cost about 200 times more energy to access the main memory -- the DRAM -- than to perform a computation, so the key idea for SmartExchange is enforcing structures within the algorithm that allow us to trade higher-cost memory for much-lower-cost computation," Lin said.

"For example, let's say our algorithm has 1,000 parameters," she added. "In a conventional approach, we will store all the 1,000 in DRAM and access as needed for computation. With SmartExchange, we search to find some structure within this 1,000. We then need to only store 10, because if we know the relationship between these 10 and the remaining 990, we can compute any of the 990 rather than calling them up from DRAM.

"We call these 10 the 'basis' subset, and the idea is to store these locally, close to the processor to avoid or aggressively reduce having to pay costs for accessing DRAM," she said.

The researchers used the SmartExchange algorithm and their custom hardware accelerator to experiment on seven benchmark deep neural network models and three benchmark datasets. They found the combination reduced latency by as much as 19 times compared to state-of-the-art deep neural network accelerators.

Credit: 
Rice University

Shift to online consultations helps patients with chronic pain receive support in lockdown

The covid-19 pandemic has exacerbated conditions for people living with chronic pain around the world and its long-term consequences are likely to be substantial, according to a new paper from researchers at the University of Bath's Centre for Pain Research.

Their Topical Review, published recently in the journal PAIN, suggests that with many doctors specialising in pain being redeployed to focus on the immediate crisis, access to traditional services for patients suffering from conditions such as nerve damage or arthritis has been severely disrupted. Whilst this creates an immediate capacity challenge for healthcare professionals, it has also provided them with an opportunity to move towards greater use of 'telemedicine' with online consultation, say the researchers.

Chronic or persistent pain is characterised as pain that carries on for longer than 12 weeks despite medication or treatment. Whereas most people get back to normal following an injury or operation, sometimes pain carries on for longer, or comes on without any history of an injury or operation. Common examples include lower back pain, arthritis, fibromyalgia and persistent, frequent headaches. Globally, as many as 1 in 4 adults live with chronic pain, and data for young people are similar.

For those suffering, access to healthcare professionals who can advise on physical therapy, psychological support or prescriptions for painkillers has, to date, relied heavily on face-to-face consultations. With the recent shift towards conducting many of our daily interactions on online web conferencing platforms, the researchers see an opportunity to maintain vital access to services at a time of crisis.

The team from the University have been working with healthcare providers locally, nationally and internationally on how best to manage that process and to support patients.

Professor Christopher Eccleston, Professor of Medical Psychology and Director of the Centre for Pain Research at the University of Bath explains: "There is clearly an opportunity to reform how consultations for patients with chronic pain are delivered through new online platforms and technologies. This has come to the fore as a result of covid-19, the immediate public health challenge we are facing and the abrupt shifts we have seen in people adopting new ways of working and interacting. Applying telemedicine to practice, which our team at Bath has assisted with, has enabled doctors to keep their doors open, in a virtual way, to patients who are desperately in need of help and support. It is having important impacts."

Yet Professor Eccleston and his team argue that the broader application of telemedicine is complex and now requires further research, in particular into how it can best be coordinated, financially supported and integrated with traditional practice.

He adds: "Changing practice in such an unplanned way will have positive and negative consequences, many unforeseen. Systems can establish protocols that can enable them to oversee, monitor, and capture important patient and provider outcomes and perspectives. When we come to redesign services after the pandemic, we will need to share that experience and use it to learn what works, to modify what does not work, and to build new models of care for people living with chronic pain."

Credit: 
University of Bath

Utah's arches continue to whisper their secrets

video: This animation shows exaggerated modes of vibration of Utah's Moonshine Arch.

Image: 
Utah Geohazards Research Group

Two new studies from University of Utah researchers show what can be learned from a short seismic checkup of natural rock arches and how erosion sculpts some arches--like the iconic Delicate Arch--into shapes that lend added strength.

A study published in Geophysical Research Letters begins with thorough measurements of vibrations at an arch in Utah, and applies those measurements to glean insights from 17 other arches with minimal scientific equipment required.

The second study, published in Geomorphology, compares the strength of arch shapes, specifically beam-like shapes versus inverted catenary shapes (like Delicate Arch or Rainbow Bridge).

A seismological stethoscope

The Geohazards Research Group at the University of Utah measures small vibrations in rock structures, which come from earthquakes, wind and other sources both natural and man-made, to construct 3-D models of how the structures resonate.

Part of the reason for these measurements is to assess the structural health of the rock feature. In studying 17 natural arches, doctoral candidates Paul Geimer, Riley Finnegan and their colleagues set seismometers on the arches for a few hours to a few days. The data from those measurements, coupled with the 3-D models, gave important information about the modes, or major movement directions, of the arches as well as the frequencies for those modes of vibration.

"This is all possible using noninvasive methods," Geimer says, "that form the first step in improving our ability to detecting and identifying damage within arches and similar features." The noninvasive nature of the tests--with the seismometers sitting on the arch's surface without damaging the rock--is important, as many of Utah's rock arches are culturally significant.

The studies of the 17 arches used just one or two seismometers each, so with permission from the National Park Service, the researchers went to Musselman Arch in Canyonlands National Park to verify their earlier measurements. The arch is flat across the top and easily accessible, so they dotted it with 30 seismometers and listened.

"This added wealth of information helped us to confirm our assumptions that arch resonant modes closely follow simple predictive models, and surrounding bedrock acts as rigid support," Geimer says. "To my knowledge, it was the first measurement of its kind for a natural span, after decades of similar efforts at man-made bridges."

All of the arches studied exhibited low damping, Geimer says, which means that they continue to vibrate long after a gust of wind or a seismic wave from a far-off earthquake has passed. The results also help researchers infer the mechanical properties of rocks without having to drill into the rock to take a sample. For example, the stiffness of the Navajo Sandstone, widespread in Southern Utah, seems to be related to the amount of iron in the rock.

Sculpted for stability

Natural arches come in a range of shapes, including beam-like spans that stretch between two rock masses and classic freestanding or partly freestanding inverted catenary arches. A catenary is the arc formed by a hanging chain or rope--so flip it upside down and you've got an inverted catenary.
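
In mathematical terms, a uniform hanging chain traces the curve

    y(x) = a \cosh\!\left(\frac{x}{a}\right)

where the single parameter a sets the curve's proportions; mirrored vertically, the same curve gives the idealized compression-only arch profile discussed next.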

"In its ideal form, the inverted catenary eliminates all tensile stresses," Geimer says, creating a stable curved span supported solely by compression, which the host sandstone can resist most strongly. The idea that inverted catenary arches are sculpted by erosion into strong shapes is not new. But the U team's approach to analyzing them is. Returning back to their 3-D models of arches and analysis of their vibration modes, the researchers simulated the gravitational stresses in detail on each arch and calculated a number, called the mean principle stress ratio, or MSR, that classifies whether the arch is more like a beam or more like an inverted catenary.

The structure of the rock in which the arch is carved can also influence its shape. Inverted catenary arches are more likely to form in thick massive rock formations. "This allows gravitational stresses to be the dominant sculpting agent," Geimer says, "leaving behind a smooth arc of rock held in compression." Beam-like arches typically form in rock formations with multiple layers with varying strengths. "Weaker layers are removed by erosion more quickly," he adds, "leaving behind a layer of stronger material too thin to form a catenary curve."

While the inverted catenary shape can lend an arch stability in its current form, Geimer and associate professor Jeff Moore are quick to point out that the arch is still vulnerable to other means of eventual collapse. "At Delicate Arch," Moore says, "the arch rests on a very thin, easily eroded clayey layer, which provides a weak connection to the ground, while Rainbow Bridge is restrained from falling over by being slightly connected to an adjoining rock knoll."

Still, the MSR metric can help researchers and public lands managers evaluate an arch's stability due to its shape. The Geohazards Research Group is continuing to study other factors that can influence rock features' stability, including how cracks grow in rock and how arches have collapsed in the past.

Credit: 
University of Utah

Scientists carry out first space-based measurement of neutron lifetime

image: Artist's impression of NASA's MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft in orbit at Mercury.

Image: 
NASA/Johns Hopkins University Applied Physics Laboratory/Carnegie Institution of Washington

Scientists have found a way of measuring neutron lifetime from space for the first time - a discovery that could teach us more about the early universe.

Knowing the lifetime of neutrons is key to understanding the formation of elements after the Big Bang that formed the universe 13.8 billion years ago.

Scientists at Durham University, UK, and Johns Hopkins Applied Physics Laboratory, USA, used data from NASA's MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft to make their discovery.

As MESSENGER flew over Venus and Mercury it measured the rates at which neutron particles were leaking out from the two planets.

The number of neutrons detected depended on the time it took them to fly up to the spacecraft relative to the neutron lifetime, giving the scientists a way of calculating how long the subatomic particles could survive.

The findings, published in the journal Physical Review Research, could provide a route to end a decades-long stalemate that has seen researchers disagree - by a matter of seconds - over how long neutrons are capable of surviving.

Dr Vincent Eke, in the Institute for Computational Cosmology, at Durham University, said: "The lifetime of free neutrons provides a key test of the Standard Model of particle physics, and it also affects the relative abundances of hydrogen and helium formed in the early universe just minutes after the Big Bang, so it has wide-ranging implications.

"Space-based methods offer the possibility of breaking the impasse between the two competing Earth-based measurement techniques."

Neutrons are normally found in the nucleus of an atom but quickly disintegrate into electrons and protons when outside the atom.

Scientists have previously used two lab-based methods - the so-called "bottle" method and "beam" technique - to try to determine the lifetime of neutrons.

The bottle method - which traps neutrons in a container and counts how many remain after a given time - suggests they can survive on average for 14 minutes 39 seconds.

Using the alternative beam technique - which fires a beam of neutrons and counts the number of protons created by radioactive decay - gives about 14 minutes and 48 seconds, nine seconds longer than the bottle method.

While this might seem a small difference, the implications of the gap could be enormous. As the Standard Model of particle physics requires the neutron lifetime to be about 14 minutes 39 seconds, any deviation from this would demand a fundamental change in our understanding of this model.

MESSENGER carried a neutron spectrometer to detect neutrons set loose into space by cosmic rays colliding with atoms on Mercury's surface as part of research to determine the existence of water on the planet.

On its way the spacecraft first flew by Venus, where it collected neutron measurements for the first time ever.

Dr Jacob Kegerreis, in the Institute for Computational Cosmology, at Durham University, said: "Even though MESSENGER was designed for other purposes, we were still able to use the data to estimate the neutron lifetime. The spacecraft made observations over a large range of heights above the surfaces of Venus and Mercury, which allowed us to measure how the neutron flux changes with distance from the planets."

Using models, the team estimated the number of neutrons MESSENGER should count at its altitude above Venus for neutron lifetimes of between 10 and 17 minutes. For shorter lifetimes, fewer neutrons survive long enough to reach MESSENGER's neutron detector.

They found the neutron lifetime to be 13 minutes, with an uncertainty of about 130 seconds arising from statistical and other uncertainties, such as whether the neutron flux changes over the course of the day and the uncertain chemical make-up of Mercury's surface.
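
The counting argument behind this estimate can be illustrated with a short back-of-the-envelope script. This is a toy version only, not the team's full transport model: the altitude and neutron speed below are assumed round numbers, and gravity's braking of the neutrons is ignored:

    # Fraction of neutrons surviving the climb from the surface to the
    # spacecraft, for candidate lifetimes. A back-of-the-envelope toy;
    # altitude and speed are assumed, gravitational deceleration ignored.
    import math

    altitude = 340e3   # assumed flyby altitude, m
    speed = 2200.0     # typical thermal-neutron speed, m/s
    t_flight = altitude / speed

    for minutes in (10, 13, 17):
        tau = minutes * 60.0
        surviving = math.exp(-t_flight / tau)
        print("lifetime %2d min -> %4.1f%% survive the %.0f s climb"
              % (minutes, 100 * surviving, t_flight))

Even over a climb of a couple of minutes, the surviving fraction shifts by several percent across the candidate lifetimes, which is the sensitivity the measurement exploits.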

Their estimated neutron lifetime falls right near the range of the "bottle" and "beam" method estimates.

Lead author Dr Jack Wilson, of the Johns Hopkins Applied Physics Laboratory, said: "It's like a large bottle experiment, but instead of using walls and magnetic fields, we use Venus's gravity to confine neutrons for times comparable to their lifetime."

As systematic errors in space-based measurements are unrelated to those in the bottle and beam methods, the researchers said their new method could provide a way to break the deadlock between the existing, competing measurements.

The researchers added that more precise measurements would require a dedicated space mission, possibly to Venus, as its thick atmosphere and large mass trap neutrons around the planet.

They hope to design and build an instrument that can make a high-precision measurement of the neutron lifetime using their new technique.

Credit: 
Durham University

Elite gamers share mental toughness with top athletes, study finds

video: A new QUT psychology study has shown the overlap between top esports gamers and traditional athletes, like Olympians, in terms of mental toughness.

Image: 
QUT Media

High-performing esports professionals may require the same mental stamina it takes to be a top Olympian, according to the latest QUT research.

A new study, published in Frontiers in Psychology, indicated an overlap between the mental toughness and stress-coping processes in traditional sports and competitive esports athletes.

Key findings:

- Competitive esports athletes appear to cope with stressors similarly to high-performing sports athletes.
- Esports players with higher ranks tended to have higher levels of mental toughness.
- Sports psychology interventions for high-performing sports athletes may also be beneficial to competitive esports athletes.

QUT esports researcher Dylan Poulus said 316 esports players aged 18 and over were studied from among the top 40 per cent of players.

"A disposition considered to be influential in sporting success is mental toughness and it appears to be important for success in esports," Mr Poulus said.

"To be a millionaire esports gamer you deal with stress similar as if you are getting ready to go to the Olympics.

"It is one of the fastest growing sports in the world, and with the coronavirus pandemic there has been huge interest."

The study used athletes who played Overwatch, Counter-Strike: Global Offensive, Rainbow Six: Siege, Defence of the Ancients 2 and League of Legends competitively.

Events can draw more than 60 million online views.

Mr Poulus said the study identified some of the mental skills required for optimal performance among gamers, including high levels of mental toughness, emotional control, and life control.

"Similar to traditional sports athletes, esports athletes with higher mental toughness employed more problem-focused coping strategies which aided in their success," he said.

However, the findings also showed how esports athletes with high mental toughness employed emotion-focused coping strategies like acceptance coping.

"By accepting the elements of their game that are beyond their control could lead to better performance," Mr Poulus said.

"Everything we see in sports psychology interventions that work with traditional sports is likely going to work with esports athletes."

Mr Poulus is completing his PhD at QUT's Faculty of Health, School of Exercise and Nutrition Sciences.

The research, titled Stress and Coping in Esports and the Influence of Mental Toughness, was co-authored by Michael Trotter, Dr Tristan Coulter and Professor Remco Polman.

This was one of the first studies to investigate mental toughness, stress and coping in high-performing esports athletes.

Further research is investigating what specifically causes stress for high-performing esports athletes.

QUT was home to Australia's first official university-endorsed esports program, with five $10,000 scholarships and the first dedicated on-campus gaming arena.

Credit: 
Queensland University of Technology

New insight into the Great Dying

A new study shows for the first time that the collapse of terrestrial ecosystems during Earth's most deadly mass extinction event was directly responsible for disrupting ocean chemistry.

The international study, led by the University of Leeds, highlights the importance of understanding the inter-connectedness of ecosystems as our modern environment struggles with the devastating effects of a rapidly warming planet.

The Permian-Triassic extinction, also known as the Great Dying, took place roughly 252 million years ago. It saw the loss of an estimated 90% of marine species, 70% of land species, widespread loss of plant diversity and extreme soil erosion.

While the exact cause of the terrestrial mass extinction is still debated, it is becoming apparent that the terrestrial ecosystems were wiped out prior to the marine ecosystems. However, until now it was unclear if or how the terrestrial extinction consequently impacted the chemistry of Earth's ancient oceans.

The team built a computer model that mapped chemical changes in Earth's oceans during the period of the Permian-Triassic extinction. The model tracks the cycling of the poisonous element mercury, which is emitted from volcanoes but also gets incorporated into living organisms. By tracing both the mercury and carbon cycles, and comparing to measurements in ancient rocks, the team were able to separate out biological and volcanic events.
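
The flavor of such a box model can be conveyed in a few lines. The sketch below is purely illustrative: a single ocean reservoir with a steady volcanic mercury input, a transient pulse standing in for the terrestrial collapse, and first-order burial in sediments. None of the rates come from the paper:

    # One-box sketch of the mercury-cycle idea: steady volcanic input,
    # a transient pulse from collapsing land ecosystems, first-order
    # burial in sediments. Purely illustrative; all rates are invented.
    volcanic = 1.0        # steady volcanic Hg input (arbitrary units/kyr)
    burial_rate = 0.02    # fraction of the ocean reservoir buried per kyr

    hg = volcanic / burial_rate  # start at the pre-event steady state
    for t in range(500):         # time in kyr
        pulse = 5.0 if 100 <= t < 120 else 0.0  # terrestrial collapse
        hg += volcanic + pulse - burial_rate * hg
        if t in (99, 119, 200, 499):
            print("t = %3d kyr: ocean Hg = %5.1f" % (t, hg))

Comparing the shape of a modeled excursion like this with mercury measured in ancient rocks is, in spirit, how biological and volcanic contributions can be told apart.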

This revealed that a massive collapse of terrestrial ecosystems sent organic matter, nutrients and other biologically important elements cascading into the marine system.

While further research is needed to understand the exact effect this had on marine life, the fact that many marine species rely on chemical stability in their environment means that it is unlikely it was without consequence.

Study co-author Dr Jacopo Dal Corso, who conceived the study during a research placement at Leeds, said: "In this study we show that during the Permian-Triassic transition, roughly 252 million years ago, the widespread collapse of the terrestrial ecosystems caused sudden changes in marine chemistry.

"This likely played a central role in triggering the most severe known marine extinction in Earth's history. This deep-time example shows how important the terrestrial reservoir is in regulating global biogeochemical cycles and calls for the greater conservation of these ecosystems."

Study co-author Dr Benjamin Mills, from the School of Earth and Environment at Leeds, said: "252 million years ago the effects of mass plant death and soil oxidation appear to have seriously altered the chemistry of the oceans. This is an uncomfortable parallel with our own human-driven land use change, and we too are transferring large quantities of nutrients and other chemicals to the oceans.

"As we look to re-start the world's economies in the wake of the current pandemic, protecting our life-sustaining ecosystems should be a priority."

Credit: 
University of Leeds

Freshly printed magnets

image: Precisely magnetized: Iron filings stick to this mini chessboard with a four-millimeter edge length. The partially magnetic structure was produced from a single type of steel powder at different temperatures.

Image: 
Empa

It looks quite inconspicuous to the casual beholder, hardly like a groundbreaking innovation: a small metallic chessboard, four millimeters on each side. At first glance, it shines like polished steel; at second glance, minute differences in color are visible: The tiny chessboard has 16 squares, eight of which appear slightly darker, the other eight a bit lighter.

The unassuming material sample goes to show that 3D printing with the help of laser beams and metal powder is not only suitable for creating new geometric shapes, but also for producing new materials with completely new functionalities. The small chessboard is a particularly obvious example: Eight squares are magnetic, eight non-magnetic - the entire piece has been 3D-printed from a single grade of metal powder. Only the power and duration of the laser beam varied.

As a starting point, an Empa team led by Aryan Arabi-Hashemi and Christian Leinenbach used a special type of stainless steel, which was developed some 20 years ago by the company Hempel Special Metals in Dübendorf, among others. The so-called P2000 steel contains no nickel, but around one percent nitrogen. P2000 steel does not cause allergies and is well suited for medical applications. It is particularly hard, which makes conventional milling more difficult. Unfortunately, at first glance it also seems unsuitable as a base material for 3D laser printing: In the melting zone of the laser beam, the temperature quickly peaks. This is why a large part of the nitrogen within the metal normally evaporates, and the P2000 steel changes its properties.

Turning a problem into an advantage

Arabi-Hashemi and Leinenbach managed to turn this drawback into an advantage. They modified the scanning speed of the laser and the intensity of the laser beam, which melts the particles in the metal powder bed, and thus varied the size and lifetime of the liquid melt pool in a specified manner. In the smallest case, the pool was 200 microns in diameter and 50 microns deep; in the largest case, 350 microns wide and 200 microns deep. The larger melt pool allows much more nitrogen to evaporate from the alloy; the solidifying steel crystallizes with a high proportion of magnetizable ferrite. In the case of the smallest melt pool, the melted steel solidifies much faster. The nitrogen remains in the alloy; the steel crystallizes mainly in the form of non-magnetic austenite.

During the experiment, the researchers had to determine the nitrogen content in tiny, millimeter-sized metal samples very precisely and measure the local magnetization to within a few micrometers, as well as the volume ratio of austenitic and ferritic steel. A number of highly developed analytical methods available at Empa were used for this purpose.

Shape Memory Alloys become smart

The experiment, which may seem like a mere gimmick, could soon add a crucial tool to the methodology of metal production and processing. "In 3D laser printing, we can easily reach temperatures of more than 2500 degrees Celsius locally," says Leinenbach. "This allows us to vaporize various components of an alloy in a targeted manner - e.g. manganese, aluminium, zinc, carbon and many more - and thus locally change the chemical composition of the alloy." The method is not limited to stainless steels, but can also be useful for many other alloys.

Leinenbach is thinking, for instance, of certain nickel-titanium alloys known as shape memory alloys. The temperature at which such an alloy "remembers" its programmed shape depends on just 0.1 percent more or less nickel in the mixture. Using a 3D laser printer, structural components could be manufactured that react locally, and in a staggered manner, to different temperatures.

Fine structures for the electric motor of the future

The ability to produce different alloy compositions with micrometer precision in a single component could also be helpful in the design of more efficient electric motors. For the first time, it is now possible to build the stator and the rotor of the electric motor from magnetically finely structured materials and thus make better use of the geometry of the magnetic fields.

The crucial factor in the discovery of the relationship between laser power, the size of the melt pool and the material's properties was the expertise in the field of additive manufacturing built up at Empa over the last nine years. Throughout that time, Christian Leinenbach and his team, one of the world's leading research groups in the field, have devoted themselves to materials science issues related to 3D laser printing processes. At the same time, Empa researchers have gained experience in process monitoring, especially in measuring the melt pools, whose size and lifetime are crucial for the targeted modification of alloys.

Credit: 
Swiss Federal Laboratories for Materials Science and Technology (EMPA)

Nickel-based catalysts tested at Boca de Jaruco oilfield in Cuba

image: This is a schematic representation of a nickel-based catalyst.

Image: 
Kazan Federal University

Kazan University continues its extensive research into catalysts for non-traditional hydrocarbons - viscous and heavy oils.

In this publication, the authors studied transformations of asphaltenes, the compounds that determine the viscosity of petroleum. The catalyst proved to be effective for in-situ conversion of asphaltenes.

The nickel-based catalyst precursor was introduced in order to intensify the conversion processes of heavy oil components. The active form of such catalysts--nickel sulfides--is achieved after steam treatment of crude oil at reservoir conditions. The experiments were carried out on a rock sample extracted from a depth of 1,900 m. Changes in the composition and structure of heavy oil after the conversion were identified using SARA analysis, gas chromatography-mass spectrometry and FTIR spectroscopy of the saturated fractions, and MALDI of the resins. It was revealed that the catalyst particles reduce the content of resins and asphaltenes through the destruction of carbon-heteroatom bonds. Moreover, the destruction of aromatic C=C bonds and interactions with aromatic rings are enhanced. In contrast, experiments in the absence of catalysts showed polymerization and condensation of aromatic rings. The most remarkable result to emerge from the thermo-catalytic treatment is the irreversible viscosity reduction of the produced crude oil, which enhances the oil recovery factor. Moreover, the introduction of catalysts increases the gas factor due to additional gas generation from aquathermolysis reactions. The yield of methane is significantly higher in experimental runs with oil-saturated rocks than in experiments with crude oil alone.

Co-author Aleksey Vakhin, Senior Research Associate at KFU's In-Situ Combustion Lab, explains, "This is a continuation of a three-part work by Kazan Federal University and Zarubezhneft which appeared in Petroleum Science and Technology. Those publications covered transformations of petroleum under the influence of various catalysts."

The results will also be used in a doctoral thesis prepared by another lab employee, Firdaws Aliev.

In addition, 14 metric tons of the experimental aquathermolysis catalyst (trademarked as UniCat) have been produced for field testing.

Credit: 
Kazan Federal University

Bacteria in Chinese pickles can prevent cavities -- Ben-Gurion University study

BEER-SHEVA, Israel...June 11, 2020 - Can a probiotic derived from Chinese pickles prevent cavities? That seems to be the case, according to a study by researchers at Ben-Gurion University of the Negev and Chengdu University in China.

Pickles are an integral part of the diet in the southwest of China. When fruits and vegetables are fermented, healthy bacteria break down the natural sugars. These bacteria, also known as probiotics, not only preserve foods but also offer numerous benefits, including immune system regulation, stabilization of the intestinal microbiota, reduction of cholesterol levels and, now, inhibition of tooth decay.

According to the study published in Frontiers in Microbiology, a strain of Lactobacilli (L. plantarum K41) found in Sichuan pickles reduced S. mutans by 98.4%. Dental caries (cavities) are caused by Streptococcus mutans (S. mutans), a bacterium commonly found in the human oral cavity as plaque and a significant contributor to tooth decay.

Prof. Ariel Kushmaro of the BGU Avram and Stella Goldstein-Goren Department of Biotechnology Engineering and the Chinese research team evaluated 14 different types of Sichuan pickles from southwest China. They extracted 54 different strains of Lactobacilli and found that one, L. plantarum K41, significantly reduced the incidence and severity of cavities. K41 was also highly tolerant of acids and salts, an additional benefit as a probiotic for harsh oral conditions. It also could have potential commercial value when added to dairy products.

According to Doug Seserman, chief executive officer of American Associates, Ben-Gurion University of the Negev based in New York City, "the researchers currently have no plans to evaluate Jewish deli pickles."

Credit: 
American Associates, Ben-Gurion University of the Negev