Researchers use "swarmalation" to design active materials for self-regulating soft robots

video: Two fully coated sheets are initially placed in symmetric locations about the patch.

Image: 
Raj Kumar Manna

PITTSBURGH (March 16, 2021) ... During the swarming of birds or fish, each entity coordinates its location relative to the others, so that the swarm moves as one larger, coherent unit. Fireflies, on the other hand, coordinate their temporal behavior: within a group, they eventually all flash on and off at the same time and thus act as synchronized oscillators.

Few entities, however, coordinate both their spatial movements and inherent time clocks; the limited examples are termed "swarmalators," which simultaneously swarm in space and oscillate in time. Japanese tree frogs are exemplar swarmalators: each frog changes both its location and rate of croaking relative to all the other frogs in a group.

Moreover, the frogs change shape when they croak: the air sac below their mouth inflates and deflates to make the sound. This coordinated behavior plays an important role during mating and hence, is vital to the frogs' survival. In the synthetic realm there are hardly any materials systems where individual units simultaneously synchronize their spatial assembly, temporal oscillations and morphological changes. Such highly self-organizing materials are important for creating self-propelled soft robots that come together and cooperatively alter their form to accomplish a regular, repeated function.

Chemical engineers at the University of Pittsburgh Swanson School of Engineering have now designed a system of self-oscillating flexible materials that display a distinctive mode of dynamic self-organization. In addition to exhibiting the swarmalator behavior, the component materials mutually adapt their overall shapes as they interact in a fluid-filled chamber. These systems can pave the way for fabricating collaborative, self-regulating soft robotic systems.

The group's research was published this week in the journal Proceedings of the National Academy of Sciences (DOI: 10.1073/pnas.2022987118). Principal investigator is Anna C. Balazs, Distinguished Professor of Chemical and Petroleum Engineering and the John A. Swanson Chair of Engineering. Lead author is Raj Kumar Manna and co-author is Oleg E. Shklyaev, both post-doctoral associates.

"Self-oscillating materials convert a non-periodic signal into the material's periodic motion," Balazs explained. "Using our computer models, we first designed micron and millimeter sized flexible sheets in solution that respond to a non-periodic input of chemical reactants by spontaneously undergoing oscillatory changes in location, motion and shape. For example, an initially flat, single sheet morphs into a three-dimensional shape resembling an undulating fish tail, which simultaneously oscillates back and forth across the microchamber."

The self-oscillations of the flexible sheets are powered by catalytic reactions in a fluidic chamber. The reactions on the surfaces of the sheet and chamber initiate a complex feedback loop: chemical energy from the reaction is converted into fluid flow, which transports and deforms the flexible sheets. The structurally evolving sheets in turn affect the motion of the fluid, which continues to deform the sheets.

"What is really intriguing is that when we introduce a second sheet, we uncover novel forms of self-organization between vibrating structures," Manna adds. In particular, the two sheets form coupled oscillators that communicate through the fluid to coordinate not only their location and temporal pulsations, but also synchronize their mutual shape changes. This behavior is analogous to that of the tree frog swarmalators that coordinate their relative spatial location, and time of croaking, which also involves a periodic change in the frog's shape (with an inflated or deflated throat).

"Complex dynamic behavior is a critical feature of biological systems," Shklyaev says. Stuff does not just come together and stop moving. Analogously, these sheets assemble in the proper time and space to form a larger, composite dynamic system. Moreover, this structure is self-regulating and can perform functions that a single sheet alone cannot carry out."

"For two or more sheets, the collective temporal oscillations and spatial behavior can be controlled by varying the size of the different sheets or the pattern of catalyst coating on the sheet," says Balazs. These variations permit control over the relative phase of the oscillations, e.g., the oscillators can move in-phase or anti-phase.

"These are very exciting results because the 2D sheets self-morph into 3D objects, which spontaneously translate a non-oscillating signal into "instructions" for forming a larger aggregate whose shape and periodic motion is regulated by each of its moving parts," she notes. "Our research could eventually lead to forms of bio-inspired computation - just as coupled oscillators are used to transmit information in electronics - but with self-sustained, self-regulating behavior."

Credit: 
University of Pittsburgh

FSU researchers enhance quantum machine learning algorithms

image: William Oates, the Cummins Inc. Professor in Mechanical Engineering and chair of the Department of Mechanical Engineering at the FAMU-FSU College of Engineering.

Image: 
FAMU-FSU College of Engineering/Mark Wallheiser

A Florida State University professor's research could help quantum computing fulfill its promise as a powerful computational tool.

William Oates, the Cummins Inc. Professor in Mechanical Engineering and chair of the Department of Mechanical Engineering at the FAMU-FSU College of Engineering, and postdoctoral researcher Guanglei Xu found a way to automatically infer parameters used in an important quantum Boltzmann machine algorithm for machine learning applications.

Their findings were published in Scientific Reports.

The work could help build artificial neural networks that could be used for training computers to solve complicated, interconnected problems like image recognition, drug discovery and the creation of new materials.

"There's a belief that quantum computing, as it comes online and grows in computational power, can provide you with some new tools, but figuring out how to program it and how to apply it in certain applications is a big question," Oates said.

Quantum bits, unlike binary bits in a standard computer, can exist in more than one state at a time, a concept known as superposition. Measuring the state of a quantum bit -- or qubit -- causes it to lose that special state, so quantum computers work by calculating the probability of a qubit's state before it is observed.

Specialized quantum computers known as quantum annealers are one tool for doing this type of computing. They work by mapping a problem onto the energy states of their qubits; the lowest-energy configuration of the qubits gives the solution to the problem. The result is a machine that could handle complicated, interconnected systems that would take a regular computer a very long time to calculate -- like building a neural network.

One way to build neural networks is by using a restricted Boltzmann machine, an algorithm that uses probability to learn based on inputs given to the network. Oates and Xu found a way to automatically calculate an important parameter associated with effective temperature that is used in that algorithm. Users of restricted Boltzmann machines typically guess at that parameter instead, a guess that requires testing to confirm and that can change whenever the computer is asked to investigate a new problem.
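
The role of that effective-temperature parameter can be sketched in a few lines. The Python snippet below is an illustration of the general idea only, not the authors' method: it assumes annealer readouts approximately follow a Boltzmann distribution at an unknown effective temperature T_eff, and shows how a guessed value of T_eff re-weights the sample statistics that feed a restricted Boltzmann machine's training update. All variable names and the toy energy function are assumptions for demonstration.

```python
# Illustrative sketch (not the paper's code): how an assumed effective
# temperature T_eff enters RBM training when "model" samples come from a
# quantum annealer rather than from Gibbs sampling.
import numpy as np

rng = np.random.default_rng(0)

n_visible, n_hidden = 6, 4
W = 0.01 * rng.standard_normal((n_visible, n_hidden))  # RBM weights
b = np.zeros(n_visible)                                # visible biases
c = np.zeros(n_hidden)                                 # hidden biases

def energy(v, h):
    """Classical RBM energy E(v, h) = -v.W.h - b.v - c.h."""
    return -(v @ W @ h + b @ v + c @ h)

# Stand-ins for annealer readouts (random here, purely for illustration).
samples_v = rng.integers(0, 2, size=(100, n_visible))
samples_h = rng.integers(0, 2, size=(100, n_hidden))

# Assume the readouts follow exp(-E / T_eff); if T_eff is guessed wrongly,
# the model-side statistics used in the gradient are mis-weighted.
T_eff = 0.5  # placeholder value; the paper estimates this parameter automatically

# Importance weights to convert samples drawn at T_eff into statistics at T = 1.
E = np.array([energy(v, h) for v, h in zip(samples_v, samples_h)])
w_imp = np.exp(-E * (1.0 - 1.0 / T_eff))
w_imp /= w_imp.sum()

# Weighted model-side correlation <v h> that enters the RBM weight update.
vh_model = np.einsum("n,ni,nj->ij", w_imp, samples_v, samples_h)
print("weighted <v h> under assumed T_eff:\n", vh_model.round(3))
```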

"That parameter in the model replicates what the quantum annealer is doing," Oates said. "If you can accurately estimate it, you can train your neural network more effectively and use it for predicting things."

Credit: 
Florida State University

Semiconductor nanogrooves enhanced broad spectral band mmW and THz detection

image: a Structure of the epitaxially grown InSb on GaAs substrate. b Schematic of the spiral antenna assisted device. R and r are respectively the outer and inner radii of the antenna. β1 and β2 represent curves of the arm. c The central ohmic metal-semiconductor-metal (OMSM) structure. s (50 μm) and w (30 μm) are the length and width of the mesa between ohmic contacts. E denotes the TM polarization orientation (to excite SPPs in the nanogroove array) of incident electromagnetic waves. d Nanogroove array. The period p, width d, and depth t of the nanogroove are 700 nm, 350 nm, and 250 nm, respectively. e Distribution of simulated optical field for the nanogroove InSb device at 0.171 THz. f Distribution of the field in the xy plane (z=250 nm, bottom plane of the nanogroove array). g Distribution of the field in the xz (y=0) plane. Optical field distribution along cut-line I (h) and cut-line II (i) as denoted in (f) and (g), respectively. j Scanning electron microscopy image of the plasmonic nanogroove InSb device. The bottom panel is the zoom-in view of the nanogroove array area.

Image: 
by Jinchao Tong, Fei Suo, Tianning Zhang, Zhiming Huang, Junhao Chu and Dao Hua Zhang

Millimetre and terahertz wave detectors have a wide range of applications in areas such as communications, security, biological diagnosis, spectroscopy, and remote sensing. They are the components that transform the information carried by long-wavelength millimetre and terahertz waves into electrical signals. High-performance room-temperature detectors with high sensitivity, fast response, broad spectral bandwidth, and the possibility of being extended to large-format arrays are always pursued. They are the building blocks for a wide range of millimetre and terahertz wave systems, including communication networks, deep-space exploration equipment, security screening systems, spectroscopy systems, and material composition inspection. However, the efficient photoexcitation conventionally used in optoelectronic semiconductors is not readily applicable here, owing to the small quantum energy of millimetre and terahertz waves and strong background thermal disturbances. Although Golay cells, pyroelectrics, bolometers, and Schottky barrier diodes (SBDs) are in widespread use, they suffer from poor noise equivalent power (NEP) (only the 10^-9 to 10^-10 W Hz^-1/2 level for Golay cells and pyroelectrics), slow response (ms level for Golay cells and pyroelectrics), or narrow spectral bandwidth (SBDs require multiple modules to achieve broad spectral coverage).

In a new paper published in Light: Science & Applications, Professor Dao Hua Zhang and Presidential Postdoctoral Fellow Jinchao Tong from the School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore and co-workers reported millimetre and terahertz wave detectors based on InSb/AlInSb/GaSb/GaAs epitaxially grown by molecular-beam epitaxy (MBE), with a nanogroove array for enhancement. The InSb films in this structure possess high electron mobility and negative permittivity over a broad millimetre and terahertz wave band, and the structure is suitable for fabrication of large-format arrays. A broadband planar equiangular spiral antenna is designed to efficiently couple millimetre and terahertz waves into the device. A nanogroove array fabricated in the InSb layer arouses strong excitation of millimetre and terahertz wave surface plasmon polaritons (SPPs), especially at the InSb-air interfaces, leading to an overall improvement of 50-100% in detection performance. An NEP of 2.2×10^-14 W Hz^-1/2, or a detectivity (D*) of 2.7×10^12 cm Hz^1/2 W^-1, is achieved at 1.75 mm (0.171 THz) at room temperature. The device also shows broad spectral band detection from 0.9 mm (0.330 THz) to 9.4 mm (0.032 THz) and a fast response speed of 3.5 μs. By moderately cooling the device to 200 K, a temperature reachable with thermoelectric cooling, the corresponding NEP, D* and response speed can be further improved to 3.8×10^-15 W Hz^-1/2, 1.6×10^13 cm Hz^1/2 W^-1 and 780 ns, respectively.
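
For readers unfamiliar with these figures of merit, NEP and D* are related through the standard textbook definitions below (given here for context, not taken from the paper), where A is the detector's active area, Δf is the measurement bandwidth, i_n is the noise current spectral density, and R is the responsivity:

```latex
\mathrm{NEP} = \frac{i_n}{\mathcal{R}} \;\left[\mathrm{W\,Hz^{-1/2}}\right],
\qquad
D^{*} = \frac{\sqrt{A\,\Delta f}}{\mathrm{NEP}} \;\left[\mathrm{cm\,Hz^{1/2}\,W^{-1}}\right]
```

A smaller NEP therefore means a weaker signal can be detected in a given bandwidth, and a larger D* means better area-normalized sensitivity.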

Detection in this device relies on nonequilibrium electrons induced by millimetre and terahertz wave SPPs. Under an external bias, the unidirectional drift of these carriers forms a photocurrent. The newly developed detector has several advantages over current technologies. High sensitivity: the achieved NEP is two to three orders of magnitude better than the state of the art. Uncooled operation: no cooling is required for normal operation. Broad spectral band detection: a single detector can operate across 0.9-9.4 mm. Scalability: the detector is based on wafer-scale InSb, so it can readily be extended to large-format arrays. Fast response: the detector responds on the microsecond level at room temperature. Simple configuration: the detector uses a very simple two-terminal structure.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

From a window to a mirror: new material paves the way to faster computing

Research led by the Cavendish Laboratory at the University of Cambridge has identified a material that could help tackle speed and energy, the two biggest challenges for computers of the future.

Research in the field of light-based computing - using light instead of electricity for computation to go beyond the limits of today's computers - is moving fast, but barriers remain in developing optical switching, the process by which light would be easily turned 'on' and 'off', reflecting or transmitting light on-demand.

The study, published in Nature Communications, shows that a material known as Ta2NiSe5 could switch between a window and a mirror in a quadrillionth of a second when struck by a short laser pulse, paving the way for the development of ultra-fast switching in computers of the future.

The material looks like a chunk of pencil lead and acts as an insulator at room temperature, which means that when infrared light strikes the material in this insulating state, it passes straight through like a window. However, when heated, the material becomes a metal which acts like a mirror and reflects light.

"We knew that Ta2NiSe5 could switch between a window and a mirror when it was heated up, but heating an object is a very slow process," said Dr Akshay Rao, Harding University Lecturer at the Cavendish Laboratory, who led the research. "What our experiments have shown is that a short laser pulse can also trigger this 'flip' in only 10-15 seconds. This is a million times faster than switches in our current computers."

The researchers were looking into the material's behaviour to show the existence of a new phase of matter called an 'excitonic insulator', which has been experimentally challenging to find since it was first theorised in the 1960s.

"This excitonic insulating phase looks in many ways like a very normal insulator, but one way to distinguish between an unusual and ordinary insulator is to see exactly how long it takes for it to become a metal," said Rao. "For normal matter, going from an insulator to a metal is like melting an ice cube. The atoms themselves move positions and rearrange, making it a slow process. But in an excitonic insulator, this could happen very fast because the atoms themselves do not need to move to switch phases. If we could find a way to measure how fast this transition occurs, we could potentially unmask the excitonic insulator."

To do these experiments, the researchers used a sequence of very short laser pulses to first perturb the material and then measure how its reflection changed. At room temperature, they found that when Ta2NiSe5 was struck by a strong laser pulse, it exhibited signatures of the metallic state immediately, becoming a mirror on a timescale faster than they could resolve. This provided strong evidence for the excitonic insulating nature of Ta2NiSe5.

"Not only does this work remove the material's camouflage, opening up further studies into its unusual quantum mechanical behaviour, it also highlights this material's unique capability of acting as an ultrafast switch," said first author Hope Bretscher, also from the Cavendish Laboratory. "In fact, for the optical switch to be effective, not only must it transition quickly from the insulating to the metallic phase, but the reverse process must also be fast.

"We found that Ta2NiSe5 returned to an insulating state rapidly, much faster than other candidate switch materials. This ability to go from mirror, to window, to mirror again, make it extremely enticing for computing applications."

"Science is a complicated and evolving process--and we think we've been able to take this discussion a step forward. Not only we can now better understand the properties of this material, but we also uncovered an interesting potential application for it," said co-author Professor Ajay Sood, from the Indian Institute of Science in Bangalore.

"While practically producing quantum switches with Ta2NiSe5 may still be a long way off, having identified a new approach to the growing challenge of computer's speed and energy use is an exciting development," said Rao.

Credit: 
University of Cambridge

New perovskite LED emits a spin-polarized glow

The inclusion of a special new perovskite layer has enabled scientists to create a "spin-polarized LED" without needing a magnetic field or extremely low temperatures, potentially clearing the path to a raft of novel technologies.

Details of the research conducted at the National Renewable Energy Laboratory (NREL) and the University of Utah appear in the journal Science.

Researchers at NREL and around the world have been investigating the use of perovskite semiconductors for solar cells that have proven to be highly efficient at converting sunlight to electricity. Since a solar cell is one of the most demanding applications of any semiconductor, scientists are discovering other uses exist as well.

"We are exploring the fundamental properties of metal-halide perovskites, which has allowed us to discover new applications, beyond photovoltaics," said Joseph Luther, a co-author of the new paper, "Chiral-induced spin selectivity enabling a room-temperature spin light-emitting diode." "Because metal-halide perovskites, and other related systems, are some of the most fascinating semiconductors, they exhibit a host of novel phenomena that can be utilized in transforming energy."

The other co-authors from NREL are Matthew Beard, a senior research fellow and director of the Center for Hybrid Organic Inorganic Semiconductors for Energy (CHOISE), Young-Hoon Kim, Yaxin Zhai, Haipeng Lu, Chuanxiao Xiao, E. Ashley Gaulding, Steven Harvey, and Joseph Berry. Valy Vardeny and Xin Pan are co-authors from Utah. All are part of CHOISE, an Energy Frontier Research Center (EFRC) funded by the Office of Science within DOE.

The goals of the CHOISE EFRC are to control the interconversion of charge, spin, and light using carefully designed chemical systems. Most opto-electronic devices in use today only control charge and light and not the spin of the electron. An electron can have either "up" or "down" spins. Using two different perovskite layers, the researchers were able to control the spin by creating a filter that blocks electrons "spinning" in the wrong direction.

One way to produce spin-polarized currents is through a "chiral-induced spin selectivity" layer, where the transport of electrons with "up" or "down" spin states depends upon the chirality of the transporting materials. Chirality refers to a structure that is not identical to its mirror image. For example, a "left-handed" oriented chiral system may allow transport of electrons with "up" spins but block electrons with "down" spins, and vice versa.

The filter enabled the researchers to inject spin-polarized charges into a light-emitting diode (LED) at room temperature--instead of at hundreds of degrees below zero Fahrenheit--and without the use of magnetic fields or ferromagnetic contacts that are typically needed to control the spin degree of freedom. The LED, in response, emits light with special chiral properties. The concept proves that these chiral-hybrid systems give control over spin without magnets and have "broad implications for applications such as quantum-based optical computing, bioencoding, and tomography," according to Beard.

Credit: 
DOE/National Renewable Energy Laboratory

Researchers find a better way to measure consciousness

MADISON, Wis. -- Millions of people are administered general anesthesia each year in the United States alone, but it's not always easy to tell whether they are actually unconscious.

A small proportion of those patients regain some awareness during medical procedures, but a new study of the brain activity that represents consciousness could prevent that potential trauma. It may also help both people in comas and scientists struggling to define which parts of the brain can claim to be key to the conscious mind.

"What has been shown for 100 years in an unconscious state like sleep are these slow waves of electrical activity in the brain," says Yuri Saalmann, a University of Wisconsin-Madison psychology and neuroscience professor. "But those may not be the right signals to tap into. Under a number of conditions -- with different anesthetic drugs, in people that are suffering from a coma or with brain damage or other clinical situations -- there can be high-frequency activity as well."

UW-Madison researchers recorded electrical activity in about 1,000 neurons surrounding each of 100 sites throughout the brains of a pair of monkeys at the Wisconsin National Primate Research Center during several states of consciousness: under drug-induced anesthesia, light sleep, resting wakefulness, and roused from anesthesia into a waking state through electrical stimulation of a spot deep in the brain (a procedure the researchers described in 2020).

"With data across multiple brain regions and different states of consciousness, we could put together all these signs traditionally associated with consciousness -- including how fast or slow the rhythms of the brain are in different brain areas -- with more computational metrics that describe how complex the signals are and how the signals in different areas interact," says Michelle Redinbaugh, a graduate student in Saalman's lab and co-lead author of the study, published today in the journal Cell Systems.

To sift out the characteristics that best indicate whether the monkeys were conscious or unconscious, the researchers used machine learning. They handed their large pool of data over to a computer, told the computer which state of consciousness had produced each pattern of brain activity, and asked the computer which areas of the brain and patterns of electrical activity corresponded most strongly with consciousness.
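
As a hedged illustration of this style of analysis (synthetic data and assumed feature names, not the study's actual recordings or pipeline), the Python sketch below trains a simple classifier on per-epoch feature vectors labeled by consciousness state and then asks which features carry the most weight:

```python
# Illustrative sketch only: classifying consciousness state from neural-activity
# features and inspecting which features matter most. Synthetic data, not the study's.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

feature_names = ["frontal_slow_power", "posterior_slow_power",
                 "posterior_complexity", "area_to_area_coupling"]  # assumed names

n_epochs = 400
X = rng.standard_normal((n_epochs, len(feature_names)))
y = rng.integers(0, 2, size=n_epochs)   # 0 = unconscious, 1 = awake (label known per epoch)

# Make the synthetic "complexity" and "coupling" features informative,
# loosely mimicking the study's finding that these track consciousness.
X[y == 1, 2] += 1.5
X[y == 1, 3] += 1.0

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(2))

clf.fit(X, y)
for name, importance in sorted(zip(feature_names, clf.feature_importances_),
                               key=lambda t: -t[1]):
    print(f"{name}: {importance:.2f}")
```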

The results pointed away from the frontal cortex, the part of the brain typically monitored to safely maintain general anesthesia in human patients and the part most likely to exhibit the slow waves of activity long considered typical of unconsciousness.

"In the clinic now, they may put electrodes on the patient's forehead," says Mohsen Afrasiabi, the other lead author of the study and an assistant scientist in Saalmann's lab. "We propose that the back of the head is a more important place for those electrodes, because we've learned the back of the brain and the deep brain areas are more predictive of state of consciousness than the front."

And while both low- and high-frequency activity can be present in unconscious states, it's complexity that best indicates a waking mind.

"In an anesthetized or unconscious state, those probes in 100 different sites record a relatively small number of activity patterns," says Saalmann, whose work is supported by the National Institutes of Health.

A larger -- or more complex -- range of patterns was associated with the monkey's awake state.

"You need more complexity to convey more information, which is why it's related to consciousness," Redinbaugh says. "If you have less complexity across these important brain areas, they can't convey very much information. You're looking at an unconscious brain."

More accurate monitoring of patients undergoing anesthesia is one possible outcome of the new findings, and the researchers are part of a collaboration supported by the National Science Foundation working on applying the knowledge of key brain areas.

"Beyond just detecting the state of consciousness, these ideas could improve therapeutic outcomes from people with consciousness disorders," Saalmann says. "We could use what we've learned to optimize electrical patterns through precise brain stimulation and help people who are, say, in a coma maintain a continuous level of consciousness."

Credit: 
University of Wisconsin-Madison

Image release: Cosmic lens reveals faint radio galaxy

image: Composite image of galaxy cluster MACSJ0717.5+3745, with VLA radio image superimposed on visible-light image from Hubble Space Telescope. Pullout is detail of distant galaxy VLAHFF-J071736.66+374506.4 -- likely the faintest radio-emitting object yet found -- revealed by the magnifying effect of the gravitational lens.

Image: 
Heywood et al.; Sophia Dagnello, NRAO/AUI/NSF; STScI.

Radio telescopes are the world's most sensitive radio receivers, capable of finding extremely faint wisps of radio emission coming from objects at the farthest reaches of the universe. Recently, a team of astronomers used the National Science Foundation's Karl G. Jansky Very Large Array (VLA) to take advantage of a helping hand from nature to detect a distant galaxy that likely is the faintest radio-emitting object yet found.

The discovery was part of the VLA Frontier Fields Legacy Survey, led by NRAO Astronomer Eric Murphy, which used distant clusters of galaxies as natural lenses to study objects even farther away. The clusters served as gravitational lenses, using the gravitational pull of the galaxies in the clusters to bend and magnify light and radio waves coming from the more-distant objects.

In this composite, a VLA radio image is superimposed on a visible-light image from the Hubble Space Telescope. The prominent red-orange objects are radio relics -- large structures possibly caused by shock waves -- inside the foreground galaxy cluster, called MACSJ0717.5+3745, which is more than 5 billion light-years from Earth.

Detailed VLA observations showed that many of the galaxies in this image are emitting radio waves in addition to visible light. The VLA data revealed that one of these galaxies, shown in the pullout, is more than 8 billion light-years distant. Its light and radio waves have been bent by the intervening cluster's gravitational-lensing effect.

The radio image of this distant galaxy, called VLAHFF-J071736.66+374506.4, has been magnified more than 6 times by the gravitational lens, the astronomers said. That magnification is what allowed the VLA to detect it.

"This probably is the faintest radio-emitting object ever detected," said Ian Heywood, of Oxford University in the UK. "This is exactly why we want to use these galaxy clusters as powerful cosmic lenses to learn more about the objects behind them."

"The magnification provided by the gravitational lens, combined with extremely sensitive VLA imaging, gave us an unprecedented look at the structure of a galaxy 300 times less massive than our Milky Way at a time when the universe was less than half its current age. This is giving us valuable insights on star formation in such low-mass galaxies at that time and how they eventually assembled into more massive galaxies," said Eric Jimenez-Andrade, of NRAO.

The scientists are reporting their work in a pair of papers accepted for publication in the Astrophysical Journal.

Credit: 
National Radio Astronomy Observatory

Tired at the office? Take a quick break; your work will benefit

Recent research shows that people are more likely to take "microbreaks" at work on days when they're tired - but that's not a bad thing. The researchers found microbreaks seem to help tired employees bounce back from their morning fatigue and engage with their work better over the course of the day.

At issue are microbreaks, which are short, voluntary and impromptu respites in the workday. Microbreaks include discretionary activities such as having a snack, chatting with a colleague, stretching or working on a crossword puzzle.

"A microbreak is, by definition, short," says Sophia Cho, co-author of a paper on the work and an assistant professor of psychology at North Carolina State University. "But a five-minute break can be golden if you take it at the right time. Our study shows that it is in a company's best interest to give employees autonomy in terms of taking microbreaks when they are needed - it helps employees effectively manage their energy and engage in their work throughout the day."

The new paper is based on two studies that explored issues related to microbreaks in the workday. Specifically, the studies were aimed at improving our understanding of how people boost or maintain their energy levels throughout the day in order to engage with work even when they start the day already exhausted. The studies also examined which factors might play a role in determining whether people took microbreaks, or what they did during those microbreaks.

The first study surveyed 98 workers in the United States. Study participants were asked to fill out two surveys per day for 10 consecutive workdays. The surveys were completed in the morning and at the end of the workday. The second study included 222 workers in South Korea. This study had participants complete three surveys per day for five workdays. Study participants completed the surveys in the morning, after lunch and at the end of the workday.

Survey questions in both studies were aimed at collecting data about each study participant's sleep quality and levels of fatigue, as well as their engagement with their work and their experiences at the workplace that day. In the studies, the researchers analyzed the survey data with statistical tools to examine day-to-day fluctuations in sleep quality, fatigue, work behavior and engagement in varying types of microbreaks.

The results were straightforward: on days that people were already fatigued when they arrived at work, they tended to take microbreaks more frequently. And taking microbreaks helped them maintain their energy level. This, in turn, helped them meet work demands and engage with work better.

"Basically, microbreaks help you manage your energy resources over the course of the day - and that's particularly beneficial on days when you're tired," Cho says.

In addition, the researchers found that people were more likely to take microbreaks if they felt their employer cared about the health and well-being of its workers.

"When people think their employer cares about their health, they feel more empowered to freely make decisions about when to take microbreaks and what type of microbreaks to take," Cho says. "And that is ultimately good for both the employer and the employee."

Credit: 
North Carolina State University

Army, Air Force fund research to pursue quantum computing

RESEARCH TRIANGLE PARK, N.C. -- Joint Army- and Air Force-funded researchers have taken a step toward building a fault-tolerant quantum computer, which could provide enhanced data processing capabilities.

Quantum computing has the potential to deliver new computing capabilities for how the Army plans to fight and win in what it calls multi-domain operations. It may also advance materials discovery, artificial intelligence, biochemical engineering and many other disciplines needed for the future military; however, because qubits, the fundamental building blocks of quantum computers, are intrinsically fragile, a longstanding barrier to quantum computing has been effective implementation of quantum error correction.

Researchers at University of Massachusetts Amherst, with funding from the Army Research Office and the Air Force Office of Scientific Research, identified a way to protect quantum information from a common error source in superconducting systems, one of the leading platforms for the realization of large-scale quantum computers. The research, published in Nature, realized a novel way for quantum errors to be spontaneously corrected.

ARO is an element of the U.S. Army Combat Capabilities Development Command, known as DEVCOM, Army Research Laboratory. AFOSR supports basic research for the Air Force and Space Force as part of the Air Force Research Laboratory.

"This is a very exciting accomplishment not only because of the fundamental error correction concept the team was able to demonstrate, but also because the results suggest this overall approach may amenable to implementations with high resource efficiency, said Dr. Sara Gamble, quantum information science program manager, ARO. "Efficiency is increasingly important as quantum computation systems grow in size to the scales we'll need for Army relevant applications."

Today's computers are built with transistors representing classical bits, either a 1 or 0. Quantum computing is a new paradigm of computation using quantum bits or qubits, where quantum superposition and entanglement can be exploited for exponential gains in processing power.

Existing demonstrations of quantum error correction are active, meaning that they require periodically checking for errors and immediately fixing them. This demands hardware resources and thus hinders the scaling of quantum computers.

In contrast, the researchers' experiment achieves passive quantum error correction by tailoring the friction or dissipation experienced by the qubit. Because friction is commonly considered the nemesis of quantum coherence, this result may appear surprising. The trick is that the dissipation has to be designed specifically in a quantum manner.
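
One textbook way to write down such engineered quantum dissipation (given here purely as a generic illustration, not as the specific scheme used in this experiment) is a Lindblad master equation in which, alongside the usual Hamiltonian evolution, a tailored jump operator L continuously damps errors:

```latex
\frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho]
\;+\; \kappa\left( L\,\rho\,L^{\dagger} - \tfrac{1}{2}\{L^{\dagger}L,\,\rho\}\right)
```

When the jump operator is engineered so that the encoded states are steady states of this dynamics (a well-known example is two-photon loss, L = a^2, stabilizing a cat-state manifold in a superconducting cavity), errors are damped away continuously without any measurement-and-feedback cycle, which is what makes the correction "passive."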

This general strategy has been known in theory for about two decades, but a practical way to obtain such dissipation and put it in use for quantum error correction has been a challenge.

"Demonstrating such non-traditional approaches will hopefully spur more clever ideas for overcoming some of the most challenging issues for quantum science," said Dr. Grace Metcalfe, program officer for Quantum Information Science at AFOSR.

Looking forward, researchers said the implication is that there may be more avenues to protect qubits from errors and do so less expensively.

"Although our experiment is still a rather rudimentary demonstration, we have finally fulfilled this counterintuitive theoretical possibility of dissipative QEC," said Dr. Chen Wang, University of Massachusetts Amherst physicist. "This experiment raises the outlook of potentially building a useful fault-tolerant quantum computer in the mid to long run."

Credit: 
U.S. Army Research Laboratory

Could birth control pills ease concussion symptoms in female athletes?

Higher progesterone level is protective in mild traumatic brain injury

Blood flow in brain is linked to progesterone and stress symptom levels

Most concussion research has been focused on male athletes

CHICAGO --- Could birth control pills help young female athletes recover faster from concussions and reduce their symptoms?

A new Northwestern Medicine pilot study has shown that when a female athlete has a concussion injury during the phase of her menstrual cycle when progesterone is highest, she feels less stress. Feeling stressed is one symptom of a concussion. Feeling less stressed is a marker of recovery.

The study also revealed for the first time that the physiological reason for the neural protection is increased blood flow to the brain as a result of higher levels of progesterone.

"Our findings suggest being in the luteal phase (right after ovulation) of the menstrual cycle when progesterone is highest -- or being on contraceptives, which artificially increase progesterone -- may mean athletes won't have as severe symptoms when they have a concussion injury," said co-author Amy Herrold, research assistant professor of psychiatry and behavioral sciences at Northwestern University Feinberg School of Medicine.

"Resolving those symptoms is especially problematic for our athletes who are trying to return to school, their sports and everyday life after a concussion," said lead author Jennie Chen, research assistant professor of radiology at Feinberg.

The study was published in the Journal of Neurotrauma.

The athletes in the study were in soccer, ultimate frisbee, crew, triathlon, lacrosse, women's rugby and tennis clubs. The focus on club athletes is important because more college students take part in club athletics than varsity athletics, Herrold said. In addition, club athletics are not as tightly monitored, possibly leading to increased exposure and under-reporting of concussion.

Northwestern investigators found increased blood flow in the brain when a female athlete had a higher level of progesterone due to her menstrual cycle phase. The region, the middle temporal gyrus, is important for information processing and integrating visual and auditory stimuli. It also has been implicated in social anxiety disorder.
Recovering from concussion is stressful for athletes

Following a concussion or mild traumatic brain injury (mTBI), athletes are pulled from classes for a period of time and then struggle to catch up on their coursework.

"When they are recovering from a concussion, they get very stressed trying to keep up with coursework and making up for lost time," Herrold said. "Their ratings on perceived stress are really important for their overall recovery from the injury and getting back to normal."

Big gap in research on female concussion

The bulk of sports-related concussion research has been focused on male athletes. The study fills a big gap in literature by studying female club athletes, Herrold said. "The trajectory of recovery from mild traumatic brain injury is different in female athletes than male athletes. Male athletes have shorter length of recovery than females, despite similar symptom severity."

For the study, investigators enrolled 30 female collegiate athletes and assessed them three to 10 days after a concussion or mTBI. Assessments included an MRI scan to examine brain blood flow, a blood draw to examine progesterone levels and self-reported mTBI symptom questionnaires including the perceived stress questionnaire. Once an injured athlete was studied, the investigators enrolled a healthy control athlete who was matched based on age, ethnicity, contraceptive use and type, and menstrual cycle phase.

Clinicians may consider menstrual cycle phase when caring for injured athletes

It may be helpful for clinicians caring for injured athletes to consider the phase of the athlete's menstrual cycle and what, if any, hormonal contraceptives they are on, Herrold said. Both will affect progesterone levels and could affect brain blood flow and perceived stress.

"Clinicians also may want to evaluate wider use of hormonal contraceptives that raise progesterone levels for athletes who are at risk for incurring a concussion or mild TBI as there could be potential for neuroprotection," Herrold said.

In future research, Chen and Herrold plan to study whether these results can be replicated in a larger, more heterogeneous sample of female athletes. They also want to compare findings between male and female athletes competing in sports with concussion risk, such as soccer.

Credit: 
Northwestern University

Leaders take note: Feeling powerful can have a hidden toll

New research from the University of Florida Warrington College of Business finds that feeling psychologically powerful makes leaders' jobs seem more demanding. And perceptions of heightened job demands both help and hurt powerful leaders.

Trevor Foulk of the University of Maryland Robert H. Smith School of Business and Klodiana Lanaj, Martin L. Schaffel Professor at UF, note that while power-induced job demands are key to helping leaders more effectively pursue their goals and feel that their jobs are meaningful each day at work, these demands can also cause pain and discomfort, felt in the evening at home.

"Power is generally considered a desirable thing, as leaders often seek power, and it's very rare for leaders to turn powerful roles down," Foulk said. "However, this view is qualified by the fact that many leaders feel exhausted and overburdened by their work. Our work helps shed light on this paradox, as it helps us understand why leaders might want powerful positions (they achieve more goal progress and feel that their work is more meaningful), but also face substantial consequences (their jobs feel more demanding in a way that causes anxiety and physical pain)."

The study shows that leaders who are higher in neuroticism - a personality trait that captures one's propensity to worry and to experience stress - are particularly sensitive to both the costs and the benefits that come with feeling powerful at work.

"Neuroticism is generally associated with negative outcomes like stress, job dissatisfaction, and a focus on failures and frustrations," Foulk and Lanaj write. "However, our results demonstrate that neuroticism can strengthen the indirect effect of power on goal progress and meaningfulness, highlighting that neuroticism can also have positive implications for powerful employees at work."

With these findings in mind, Foulk and Lanaj offer options for how leaders and organizations can help powerful employees deal with the negative effects of experienced power - anxiety and physical pain. For those in positions of power dealing with anxiety, the researchers suggest giving these individuals access to increased social support and help in developing strategies for dealing with anxiety like practicing mindfulness or participating in stress management programs.

As for reducing physical discomfort and pain, Foulk and Lanaj recommend that organizations consider encouraging powerful leaders to take more breaks during work or providing them with physical resources like ergonomic chairs and office equipment.

"Such strategies may help employees and organizations realize the positive effects of power-induced job demands, while minimizing or mitigating their negative effects," Foulk and Lanaj write.

Taken together, these findings shed light on nuanced ways that power impacts leaders at work. Leaders feeling burdened by their power are likely to feel like something is awry or that they may just not be up to the task. This may be particularly likely for leaders high in neuroticism, but this work shows that feeling under pressure at work is a natural consequence of feeling powerful. Therefore, managers and organizations should recognize the discordant effects that power has on employees and realize that the experience of power is neither universally positive nor universally negative for powerholders.

This research is forthcoming in the Journal of Applied Psychology.

Credit: 
University of Florida

Structural insights into how an early SARS-CoV-2 variant gained its advantage

In an analysis that explores the structural underpinnings of a SARS-CoV-2 strain, G614, that quickly became dominant early in the pandemic, researchers discovered interactions that prevent this strain’s spike from shedding its host binding domain too early. This may explain the enhanced infectivity of the G614 virus, they say. Throughout the COVID-19 pandemic, epidemiologists have monitored evolution of the SARS-CoV-2 virus with particular focus on the spike (S) protein. Spike trimers decorate the viral surface and facilitate host cell entry. An early variant with a single-residue substitution (G614) in its spike protein rapidly became the dominant strain throughout the world, and studies have also suggested it is more infectious than the original strain. Puzzlingly, studies have shown that it does not bind more tightly to recombinant ACE2, the host cell receptor. Jun Zhang et al. investigated the structural basis for the spread of the G614 virus. Structural and biochemical studies on a full-length G614 S trimer revealed interactions not present in D614, the original strain, which was described in a paper published in Science in July 2020. In particular, a loop wedges between domains in the G614 spike, in an added interaction that appears to stabilize the spike to prevent premature dissociation of the G614 trimer. This effectively increases the number of functional spikes. “[W]e suggest that the enhanced infectivity of the G614 virus largely results from the increased stability of the S trimer,” conclude the authors.

Journal

Science

DOI

10.1126/science.abd4251

Credit: 
American Association for the Advancement of Science (AAAS)

Jupiter's "dawn storm" auroras are surprisingly Earth-like

video: A study conducted by researchers from the Laboratory for Planetary and Atmospheric Physics of the University of Liège shows for the first time global views of a dawn storm, a spectacular auroral phenomenon that occurs on Jupiter.

Image: 
University of Liège

The storms, which consist of brightenings and broadenings of the dawn flank of an oval of auroral activity that encircles Jupiter's poles, evolve in a pattern surprisingly reminiscent of familiar surges in the aurora that undulate across Earth's polar skies, called auroral substorms, according to the authors.

The new study is the first to track the storms from their birth on the nightside of the giant planet through their full evolution. It was published today in AGU Advances, AGU's journal for high-impact, short-format reports with immediate implications spanning all Earth and space sciences.

During a dawn storm, Jupiter's quiet and regular auroral arc transforms into a complex and intensely bright auroral feature. It emits hundreds to thousands of gigawatts of ultraviolet light into space as it rotates from the night side to the dawn side and ultimately to the day side of the planet over the course of 5-10 hours. A gigawatt is the power produced by a typical modern nuclear reactor. This colossal brightness implies that at least ten times more energy than usual was being transferred from the magnetosphere into Jupiter's upper atmosphere.

Previously, dawn storms had only been observed from ground-based telescopes on Earth or the Hubble Space Telescope, which can only offer side views of the aurora and cannot see the night side of the planet. Juno revolves around Jupiter every 53 days along a highly elongated orbit that brings it right above the poles every orbit.

"This is a real game changer," said Bertrand Bonfond, a researcher from the University of Liège and lead author of the new study. "We finally got to find out what was happening on the night side, where the dawn storms are born."

Familiar auroral sequences, different engines

Polar auroras on Earth and on Jupiter are visible manifestations of processes occurring in the magnetic fields that surround the two planets. Both planets generate magnetic fields that capture charged particles.

Earth's magnetosphere is shaped by charged particles flowing out of the sun called the solar wind. Bursts of solar wind stretch Earth's magnetic field into a long tail on the nightside of the planet. When that tail snaps back, it fires charged particles into the nightside ionosphere, which appear as spectacular auroral light shows.

The new study found the timing of the dawn storms on Jupiter did not correlate with solar wind fluctuations. Jupiter's magnetosphere is mostly populated by particles escaping from its volcanic moon Io, which then get ionized and trapped around the planet by its magnetic field.

The sources of mass and energy fundamentally differ between these two magnetospheres, leading to auroras that usually look quite different. However, the dawn storms, as unraveled by Juno's ultraviolet spectrograph, looked familiar to the researchers.

"When we looked at the whole dawn storm sequence, we couldn't help but notice that the dawn storm auroras at Jupiter are very similar to a type of terrestrial auroras called substorms" said Zhonghua Yao, co-author of the study and scientific collaborator at the University of Liège.

The substorms result from the explosive reconfiguration of the tail of the magnetosphere. On Earth, they are strongly related to the variations of the solar wind and of the orientation of the interplanetary magnetic field. On Jupiter, such explosive reconfigurations are rather related to an overspill of the plasma originating from Io.

These findings demonstrate that, whatever their sources, particles and energy do not always circulate smoothly in planetary magnetospheres. They often accumulate until the magnetospheres collapse and generate substorm-like responses in the planetary aurorae.

"Even if their engine is different, showing for the first time the link between these two very different systems allows us to identify the universal phenomena from the peculiarities specific to each planet," Bonfond said.

Credit: 
University of Liège

Study uncovers safety concerns with some air purifiers

image: The environmental test chamber where researchers conducted air purifier experiments. (a) is the exterior with instruments set up outside, and (b) is inside the chamber with mock-up furnishings and materials.

Image: 
Illinois Tech

The market for air purifiers is booming, but a new study has found that some air cleaning technologies marketed for COVID-19 may be ineffective and have unintended health consequences.

The study, authored by researchers at Illinois Tech, Portland State University, and Colorado State University, found that cleaning up one harmful air pollutant can create a suite of others.

Both chamber and field tests found that an ionizing device led to a decrease in some volatile organic compounds (VOCs), including xylenes, but an increase in others, most prominently oxygenated VOCs (e.g., acetone, ethanol) and toluene, substances commonly found in paints, paint strippers, aerosol sprays and pesticides. According to the EPA, exposure to VOCs has been linked to a range of health effects, from eye, nose and throat irritation, headaches, loss of coordination and nausea, to damage to the liver, kidneys and central nervous system; some organics can cause cancer in animals, and some are suspected or known to cause cancer in humans.

The study, published this week in Building and Environment, mimicked real-world operating conditions for these ionization devices to test the effectiveness and potential to form chemical byproducts in environments similar to where we all live, work, and learn.

One of the most popular types of air purifiers on the market right now is the ion-generating system, including 'bipolar ionization' devices that electrically charge particles so they settle out of the air faster, and that are generally marketed to kill bacteria, fungi, and viruses.

Understandably, the "virus-killing" capability has drawn attention and been heavily featured in advertising over the past year and led to a flood of new and revamped products on the market.

However, the study finds that the air purifier marketplace is fraught with inadequate test standards, confusing terminology, and a lack of peer-reviewed studies of their effectiveness and safety. Unlike air filtration (where air is pushed through a filter to remove airborne pollutants), there has been very little research on the effectiveness and side effects of "additive" air cleaning methods like ionizing devices.

"Manufacturers and third-party test labs commonly demonstrate their product's effectiveness using chamber tests, but these test reports often don't use experimental conditions that could show how the device actually performs in real-world conditions," said Brent Stephens, Chair of the Department of Civil, Architectural, and Environmental Engineering at Illinois Tech. "To the extent that there are testing standards for ionization and other devices, those are largely industry-led standards that remain underdeveloped at this point, focused mostly on ensuring just one pollutant, ozone, is not generated during operation."

In everyday operating conditions, ions added to occupied environments such as a school or office building can react with other compounds present in indoor air, which can potentially lead to the formation of harmful byproducts such as formaldehyde and ozone. Ions can also rapidly bind to other gases and spur the formation of new 'ultrafine' particles, which are known air pollutants. But little independent data exists on these mechanisms.

The research team conducted a series of experiments on the operation of a commercially available in-duct bipolar ionization device. Air sampling of particulate matter and gases was carried out both in a large semi-furnished laboratory chamber and in a field test, with an ionizer device installed in an air handling unit serving an occupied office building. The research also found that despite small changes in particle concentrations, there was very little net effect on the overall concentration of PM2.5 in the air.

According to the EPA, particulate matter contains microscopic solids or liquid droplets that are so small that they can be inhaled and cause serious health problems. Particles less than 2.5 micrometers in diameter, also known as fine particles or PM2.5, pose the greatest risk to health as they can get deep into your lungs and some may even get into your bloodstream. Numerous scientific studies have linked fine particle pollution exposure to a range of health impacts, including premature death in people with heart or lung disease, nonfatal heart attacks, irregular heartbeat, aggravated asthma, decreased lung function, irritation of the airways, coughing or difficulty breathing.

Health impacts of air ionizers are largely unknown, although a small number of recent studies give cause for concern. In August 2020, a study concluded that exposure to negative ions was associated with increased systemic oxidative stress levels (a marker of cardiovascular health), and despite reduced indoor particulate matter concentrations, there were no beneficial changes to respiratory health.

In another recent study of air ionizers in school classrooms, reduced particulate matter concentrations led to some improvements in respiratory health among 11-14 year old children, but the ionizers had an adverse effect on heart rate variability (a measure of cardiovascular health), meaning that any benefit to the lungs came at a cost to the heart.

"We should have a much better understanding of these effects before widespread use of these types of devices," said Delphine Farmer, Associate Professor in the Department of Chemistry at Colorado State University and a co-lead author of the paper.

"Without peer-reviewed research into the health impacts of these devices, we risk substituting one harmful agent for another," said Stephens. "We urge others to follow guidance from organizations like the U.S. EPA and ASHRAE, which generally recommend the use of established, evidence-based measures to clean indoor air, including high efficiency particle filtration and enhanced ventilation, in addition to face coverings and physical distancing, to help reduce airborne transmission of COVID-19."

Credit: 
Colorado State University

Meandering rivers create "counter-point bars" no matter the underlying geology

video: A sequence of satellite images showing the changing path of the Mamoré River of Bolivia from 1986 to 2018. The sequence on the left is colored to track the formation of sediment deposits in the form of point bars (red) and counter-point bars (blue).

Image: 
Zoltán Sylvester/The University of Texas at Austin.

It's not uncommon for crescent-shaped swaths of sand to dot the shorelines of meandering rivers. These swaths usually appear along the inner side of a river bend, where the bank wraps around the sandy patch, forming deposits known as "point bars."

When they appear along an outer bank, which curves the opposite way, they form "counter-point" bars, which are usually interpreted by geoscientists as an anomaly: a sign that something - such as a patch of erosion-resistant rocks - is interfering with the river's usual manner of sediment deposition.

But according to research led by The University of Texas at Austin, counter-point bars are not the oddities they're often made out to be. In fact, they're a perfectly normal part of the meandering process.

"You don't need a resistant substrate, you can get beautiful [counter-point] bars without it," said Zoltán Sylvester, a research scientist at UT's Bureau of Economic Geology who led the study.

The finding suggests that counter-point bars - and the unique geology and ecology associated with them - are more common than previously thought. Building awareness around that fact can help geoscientists be on the lookout for counter-point bars in geological formations deposited by rivers in the past, and understand how they may be influencing the flow of hydrocarbons and water passing through them.

The research was published in the Geological Society of America Bulletin on March 12.

The co-authors are David Mohrig, a professor at the UT Jackson School of Geosciences; Paul Durkin, a professor at the University of Manitoba; and Stephen Hubbard, a professor at the University of Calgary.

Rivers are constantly on the move. For meandering rivers, this means carving out new paths and reactivating old ones as they snake across a landscape over time.

The researchers observed this behavior in both an idealized computer model and in nature, using satellite photos of a stretch of Bolivia's Mamoré River, which is known for quickly changing its path. The satellite photos captured how the river changed over 32 years, from 1986 to 2018.

In both the model and the Mamoré, counter-point bars appeared. The researchers found that the appearance was linked directly to short, high curvature bends: little spikes in a river's path.

The researchers observed that these spikes frequently form when the river's course is abruptly changed, such as when a new oxbow lake forms through cutoff, or after reconnecting with an old oxbow lake.

But the sharp bends don't stay put; they start migrating in the downstream direction. And as they rapidly move downstream, they create the conditions for sediment to accumulate around the bend as a counter-point bar.

The study shows a number of instances of this happening in the Mamoré. For example, in 2010, a sharp bend (bend 2 in the image) forms when an oxbow lake reconnects with a downstream portion of the river. By 2018, the bend has moved about 1.5 miles downstream, with counter-point deposits along the shoreline marking its path.

Geomorphologists and engineers have known for some time that long-term change along a river can be described in terms of local and upstream values of curvature (places where the river seems to wrap around a small circle have high curvatures). In the study, the researchers used a formula that uses these curvature values to determine the likelihood of a counter-point bar forming at a particular location.
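
As a rough illustration of what "curvature values" means in practice, the Python sketch below computes signed curvature along a digitized centerline of x, y coordinates. This is a generic discrete-curvature calculation, not the authors' specific predictor, and the toy centerline is an assumption for demonstration.

```python
# Generic sketch: local curvature along a digitized river centerline.
# High-curvature spikes like these are the short, sharp bends the study links
# to counter-point bar formation (the actual predictor also uses upstream values).
import numpy as np

def centerline_curvature(x, y):
    """Signed curvature k = (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2) from point arrays."""
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    return (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

# Toy meandering centerline (assumed, for illustration only).
s = np.linspace(0, 4 * np.pi, 500)
x = s
y = np.sin(s) + 0.3 * np.sin(3 * s)   # secondary term adds short, sharp bends

k = centerline_curvature(x, y)
print("max |curvature| along the toy centerline:", np.abs(k).max().round(3))
```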

Sylvester said that he was surprised at how well this formula - and the simplified models used in part to derive it - worked to explain what was thought to be a complex phenomenon.

"Natural rivers, they are actually not that far from what these really simple models predict," Sylvester said.

This is not the first time that Sylvester's research has revealed that river behavior can be governed by relatively simple rules. In 2019, he led a study published in Geology that described a direct relationship between bend sharpness and river migration.

Superficially, point bars and counter-point bars look quite similar and frequently blend into one another. But counter-point bars are distinct environments: compared to point bars, they have finer sediments and lower topography, making them more prone to flooding and hosting lakes. These characteristics create unique ecological niches along rivers. But they are also geologically important, with ancient counter-point bar deposits preserved underground influencing the flow of fluids, such as water and oil and gas.

Mathieu Lapôtre, a geoscientist and assistant professor at Stanford University, said that recognizing that counter-point bars can readily form in meandering rivers - and having a formula for predicting where they will form - is a significant advancement.

"Altogether, the results of Sylvester et al. have important implications for a range of scientific and engineering questions," he said.

Credit: 
University of Texas at Austin