Tech

When light and atoms share a common vibe

video: 1. A laser generates a very short pulse of light.
2. A fraction of this pulse is sent to a nonlinear device to change its color.
3. The two laser pulses are overlapped on the same path again, creating a "write & read" pair of pulses.
4. Each pair is split into a short and a long path, yielding an "early" and a "late" time slot, overlapping once again.
5. Inside the diamond, during the "early" time slot, one photon from the "write" pulse may generate a vibration, while one photon from the "read" pulse converts the vibration back into light.
6. The same sequence may also happen during the "late" slot. But in this experiment, the scientists made sure that only one vibration was excited in total (across both early and late time slots).
7. By overlapping the photons in time again, it becomes impossible to discriminate the early vs. late moment of the vibration. The vibration is now in a quantum superposition of early and late times.
8. In the detection apparatus, "write" and "read" photons are separated according to their different colors and analyzed with single-photon counters to reveal their entanglement.

Image: 
Santiago Tarrago Velez (EPFL)

An especially counter-intuitive feature of quantum mechanics is that a single event can exist in a state of superposition - happening both here and there, or both today and tomorrow.

Such superpositions are hard to create, as they are destroyed if any kind of information about the place and time of the event leaks into the surroundings - even if nobody actually records this information. But when superpositions do occur, they lead to observations very different from those of classical physics, challenging our very understanding of space and time.

Scientists from EPFL, MIT, and CEA Saclay, publishing in Science Advances, demonstrate a state of vibration that exists simultaneously at two different times, and evidence this quantum superposition by measuring the strongest class of quantum correlations between light beams that interact with the vibration.

The researchers used a very short laser pulse to trigger a specific pattern of vibration inside a diamond crystal. Each pair of neighboring atoms oscillated like two masses linked by a spring, and this oscillation was synchronous across the entire illuminated region. To conserve energy during this process, light of a new color was emitted, shifted toward the red end of the spectrum.

This classical picture, however, is inconsistent with the experiments. Instead, both light and vibration should be described as particles, or quanta: light energy is quantized into discrete photons while vibrational energy is quantized into discrete phonons (named after the ancient Greek "photo = light" and "phono = sound").

The process described above should therefore be seen as the fission of an incoming photon from the laser into a photon-phonon pair - akin to the nuclear fission of an atom into two smaller pieces.

But this is not the only shortcoming of classical physics. In quantum mechanics, particles can exist in a superposition state, like the famous Schrödinger cat being alive and dead at the same time.

Even more counterintuitive: two particles can become entangled, losing their individuality. The only information that can be collected about them concerns their joint correlations. Because both particles are described by a common state (the wavefunction), these correlations are stronger than anything possible in classical physics. This can be demonstrated by performing appropriate measurements on the two particles: if the results violate a classical limit, one can be sure they were entangled.
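To see what "violating a classical limit" means in practice, here is a minimal sketch of the textbook CHSH test - an illustration only, not the study's actual photon-phonon measurement scheme. For two qubits in the singlet state, the correlation between analyzers set at angles a and b is E(a, b) = -cos(a - b), and a suitable combination of four settings exceeds the bound of 2 that any classical model must obey.

```python
import numpy as np

# Illustrative CHSH-type Bell test (not the paper's measurement settings).
# For the two-qubit singlet state, the correlation between analyzers at
# angles a and b is E(a, b) = -cos(a - b).
def E(a, b):
    return -np.cos(a - b)

a1, a2 = 0.0, np.pi / 2            # the two settings for particle 1
b1, b2 = np.pi / 4, 3 * np.pi / 4  # the two settings for particle 2

# Any classical (local hidden variable) theory satisfies |S| <= 2.
S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(S)  # ~2.83 = 2*sqrt(2): the quantum prediction violates the bound
```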

In the new study, EPFL researchers managed to entangle the photon and the phonon (i.e., light and vibration) produced in the fission of an incoming laser photon inside the crystal. To do so, the scientists designed an experiment in which the photon-phonon pair could be created at either of two different instants. Classically, this would result in a situation where the pair is created at time t1 with 50% probability, or at a later time t2 with 50% probability.

But here comes the "trick" played by the researchers to generate an entangled state. By a precise arrangement of the experiment, they ensured that not even the faintest trace of the light-vibration pair creation time (t1 vs. t2) was left in the universe. In other words, they erased information about t1 and t2. Quantum mechanics then predicts that the phonon-photon pair becomes entangled, and exists in a superposition of time t1 and t2. This prediction was beautifully confirmed by the measurements, which yielded results incompatible with the classical probabilistic theory.

By showing entanglement between light and vibration in a crystal that one could hold between their fingers during the experiment, the new study creates a bridge between our daily experience and the fascinating realm of quantum mechanics.

"Quantum technologies are heralded as the next technological revolution in computing, communication, sensing, says Christophe Galland, head of the Laboratory for Quantum and Nano-Optics at EPFL and one of the study's main authors. "They are currently being developed by top universities and large companies worldwide, but the challenge is daunting. Such technologies rely on very fragile quantum effects surviving only at extremely cold temperatures or under high vacuum. Our study demonstrates that even a common material at ambient conditions can sustain the delicate quantum properties required for quantum technologies. There is a price to pay, though: the quantum correlations sustained by atomic vibrations in the crystal are lost after only 4 picoseconds -- i.e., 0.000000000004 of a second! This short time scale is, however, also an opportunity for developing ultrafast quantum technologies. But much research lies ahead to transform our experiment into a useful device -- a job for future quantum engineers."

Credit: 
Ecole Polytechnique Fédérale de Lausanne

Developing smarter, faster machine intelligence with light

image: A massively parallel amplitude-only Fourier neural network

Image: 
Volker Sorger/GWU

SUMMARY

Researchers at the George Washington University, together with researchers at the University of California, Los Angeles, and the deep-tech venture startup Optelligence LLC, have developed an optical convolutional neural network accelerator capable of processing large amounts of information, on the order of petabytes, per second. This innovation, which harnesses the massive parallelism of light, heralds a new era of optical signal processing for machine learning with numerous applications, including in self-driving cars, 5G networks, data-centers, biomedical diagnostics, data-security and more.

THE SITUATION

Global demand for machine learning hardware is dramatically outpacing current computing power supplies. State-of-the-art electronic hardware, such as graphics processing units and tensor processing unit accelerators, helps mitigate this, but is intrinsically challenged by serial, iterative data processing and by delays from wiring and circuit constraints. Optical alternatives to electronic hardware could help speed up machine learning by processing information non-iteratively. However, photonic-based machine learning is typically limited by the number of components that can be placed on photonic integrated circuits, which constrains interconnectivity, while free-space spatial light modulators are restricted to slow programming speeds.

THE SOLUTION

To achieve a breakthrough in this optical machine learning system, the researchers replaced spatial light modulators with digital mirror-based technology, thus developing a system over 100 times faster. The non-iterative timing of this processor, in combination with rapid programmability and massive parallelization, enables this optical machine learning system to outperform even the top-of-the-line graphics processing units by over one order of magnitude, with room for further optimization beyond the initial prototype.

Unlike the current paradigm in electronic machine learning hardware, which processes information sequentially, this processor uses Fourier optics, a frequency-filtering concept that allows the required convolutions of the neural network to be performed as much simpler element-wise multiplications using the digital mirror technology.
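The shortcut at work here is the convolution theorem: a convolution in real space is an element-wise product in the Fourier domain, and a lens performs the Fourier transform optically. The sketch below uses NumPy as a stand-in for the optics, with illustrative array sizes, to verify the equivalence:

```python
import numpy as np

# Convolution theorem demo: convolving in real space equals multiplying
# element-wise in frequency space. In the optical processor the FFTs come
# "for free" from a lens; here NumPy stands in for the optics.
rng = np.random.default_rng(0)
x = rng.random((8, 8))  # input image tile (illustrative size)
k = rng.random((8, 8))  # convolution kernel (filter)

# Frequency-domain path: two FFTs, one element-wise product, one inverse FFT.
y_fft = np.real(np.fft.ifft2(np.fft.fft2(x) * np.fft.fft2(k)))

# Direct circular convolution, for verification (much slower: O(N^4)).
n, m = x.shape
y_direct = np.zeros_like(x)
for i in range(n):
    for j in range(m):
        for a in range(n):
            for b in range(m):
                y_direct[i, j] += x[a, b] * k[(i - a) % n, (j - b) % m]

assert np.allclose(y_fft, y_direct)  # both paths give the same result
```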

FROM THE RESEARCHERS

"This massively parallel amplitude-only Fourier optical processor is heralding a new era for information processing and machine learning. We show that training this neural network can account for the lack of phase information."
-Volker Sorger, associate professor of electrical and computer engineering at the George Washington University.

"Optics allows for processing large-scale matrices in a single time-step, which allows for new scaling vectors of performing convolutions optically. This can have significant potential for machine learning applications as demonstrated here."
-Puneet Gupta, professor & vice chair of computer engineering at UCLA

"This prototype demonstration shows a commercial path for optical accelerators ready for a number of applications like network-edge processing, data-centers and high-performance compute systems."
-Hamed Dalir, Co-founder, Optelligence LLC

Credit: 
George Washington University

Nanotechnology -- nanoparticles as weapons against cancer

Many chemotherapeutic agents used to treat cancers are associated with side-effects of varying severity, because they are toxic to normal cells as well as malignant tumors. This has motivated the search for effective alternatives to the synthetic pharmaceuticals with which most cancers are currently treated. The use of calcium phosphate and citrate for this purpose has been under discussion for some years now, since these compounds lead to cell death when delivered directly into cells, while their presence in the circulation has little or no toxic effect. The challenge consists in finding ways to overcome the mechanisms that control the uptake of these compounds into cells, and ensuring that the compounds act selectively on the cells one wishes to eliminate. Researchers in the Department of Chemistry at LMU, led by Dr. Constantin von Schirnding, Dr. Hanna Engelke and Prof. Thomas Bein, now report the development of a class of novel amorphous nanoparticles made up of calcium and citrate, which are capable of breaching the barriers to uptake, and killing tumor cells in a targeted fashion.

Both calcium phosphate and citrate are involved in the regulation of many cellular signaling pathways. Hence, the levels of these substances present in the cytoplasm are tightly controlled, in order to avoid disruption of these pathways. Crucially, the nanoparticles described in the new study are able to bypass these regulatory controls. "We have prepared amorphous and porous nanoparticles consisting of calcium phosphate and citrate, which are encapsulated in a lipid layer," von Schirnding explains. The encapsulation ensures that these particles are readily taken up by cells without triggering countermeasures. Once inside the cell, the lipid layer is efficiently broken down, and large amounts of calcium and citrate are deposited in the cytoplasm.

Experiments on cultured cells revealed that the particles are selectively lethal - killing cancer cells, but leaving healthy cells (which also take up particles) essentially unscathed. "Clearly, the particles can be highly toxic to cancer cells. Indeed, we found that the more aggressive the tumor, the greater the killing effect," says Engelke.

During cellular uptake, the nanoparticles acquire a second membrane coat. The authors of the study postulate that an unknown mechanism - which is specific to cancer cells - causes a rupture of this outer membrane, allowing the contents of the vesicles to leak into the cytoplasm. In healthy cells, on the other hand, this outermost layer retains its integrity, and the vesicles are subsequently excreted intact into the extracellular medium.

"The highly selective toxicity of the particles made it possible for us to successfully treat two different types of highly aggressive pleural tumors in mice. With only two doses, administered locally, we were able to reduce tumor sizes by 40 and 70%, respectively," says Engelke. Many pleural tumors are the metastatic products of lung tumors, and they develop in the pleural cavity between the lung and the ribcage. Because this region is not supplied with blood, it is inaccessible to chemotherapeutic agents. "In contrast, our nanoparticles can be directly introduced into the pleural cavity," says Bein. Furthermore, over the course of a 2-month treatment, no signs of serious side-effects were detected. Overall, these results suggest that the new nanoparticles have great potential for the further development of novel treatments for other types of cancer.

Credit: 
Ludwig-Maximilians-Universität München

Water limitations in the tropics offset carbon uptake from arctic greening

image: A map of the world shows the changes in global gross primary productivity (GPP), an indicator of carbon uptake, from 1982-2016. Each dot indicates a region with a statistically significant trend.

Image: 
NASA/Nima Madani

More plants and longer growing seasons in the northern latitudes have turned parts of Alaska, Canada and Siberia deeper shades of green. Some studies interpret this Arctic greening as a sign of greater global carbon uptake. But new research shows that as Earth's climate changes, increased carbon absorption by plants in the Arctic is being offset by a corresponding decline in the tropics.

"This is a new look at where we can expect carbon uptake to go in the future," said scientist Rolf Reichle with the Global Modeling and Assimilation Office (GMAO) at NASA's Goddard Space Flight Center in Greenbelt, Maryland.

Reichle is one of the authors of a study, published Dec. 17 in AGU Advances, that combines 35 years of satellite observations from the National Oceanic and Atmospheric Administration's (NOAA) Advanced Very High Resolution Radiometer (AVHRR) with computer models, including water limitation data from NASA's Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2).

Together, these provide a more accurate estimate of global "primary productivity" - a measure of how well plants convert carbon dioxide and sunlight to energy and oxygen via photosynthesis - for the time span from 1982 to 2016.

Arctic gains and tropical losses

Plant productivity in the frigid Arctic landscape is limited by the lengthy periods of cold. As temperatures warm, the plants in these regions have been able to grow more densely and extend their growing season, leading to an overall increase in photosynthetic activity, and subsequently greater carbon absorption in the region over the 35-year time span.

However, buildup of atmospheric carbon concentrations has had several other rippling effects. Notably, as carbon has increased, global temperatures have risen, and the atmosphere in the tropics (where plant productivity is limited by the availability of water) has become drier. Recent increases in drought and tree mortality in the Amazon rainforest are one example of this, and productivity and carbon absorption over land near the equator have gone down over the same time period as Arctic greening has occurred, canceling out any net effect on global productivity.

Adding Satellites to Productivity Models

Previous model estimates suggested that the increasing productivity of plants in the Arctic could partially compensate for human activities that release atmospheric carbon, like the burning of fossil fuels. But these estimates relied on models that calculate plant productivity on the assumption that plants photosynthesize (convert carbon dioxide and light into energy) at a fixed efficiency rate.

In reality, many factors can affect plants' productivity. Including satellite records like those from AVHRR provides scientists with consistent measurements of global photosynthetic plant cover and can help account for variable events such as pest outbreaks and deforestation, which previous models do not capture and which can alter global vegetation cover and productivity.

"There have been other studies that focused on plant productivity at global scales," said Nima Madani from NASA's Jet Propulsion Laboratory, (JPL) Pasadena, California, and lead author of the study, which also includes scientists from the University of Montana. "But we used an improved remote sensing model to have a better insight into changes in ecosystem productivity." This model uses an enhanced light use efficiency algorithm, which combines multiple satellites' observations of photosynthetic plant cover and variables such as surface meteorology.

"The satellite observations are critical especially in regions where our field observations are limited, and that's the beauty of the satellites," Madani said. "That's why we are trying to use satellite remote sensing data as much as possible in our work."

It was only recently that the satellite records began to show these emerging trends in shifting productivity. According to Reichle, "The modelling and the observations together, what we call data assimilation, is what really is needed." The satellite observations train the models, while the models can help depict Earth system connections such as the opposing productivity trends observed in the Arctic and tropics.

Brown Is the New Green

The satellite data also revealed that water limitations and decline in productivity are not confined to the tropics. Recent observations show that the Arctic's greening trend is weakening, with some regions already experiencing browning.

"I don't expect that we have to wait another 35 years to see water limitations becoming a factor in the Arctic as well," said Reichle. We can expect that the increasing air temperatures will reduce the carbon uptake capacity in the Arctic and boreal biomes in the future. Madani says Arctic boreal zones in the high latitudes that once contained ecosystems constrained by temperature are now evolving into zones limited by water availability like the tropics.

These ongoing shifts in productivity patterns across the globe could affect numerous plants and animals, altering entire ecosystems. That can impact food sources and habitats for various species, including endangered wildlife, and human populations.

Credit: 
NASA/Goddard Space Flight Center

Humpback whale songs provide insight to population changes

image: Humpback whale off Maui.

Image: 
HIHWNMS/NMFS ESA Permit #782-1719

Approximately 8,000-12,000 whales of the North Pacific humpback whale stock visit the shallow waters of the Hawaiian Islands seasonally to breed. During this time, mature males produce an elaborate acoustic display known as "song," which becomes the dominant source of ambient underwater sound between December and April. Following reports of unusually low whale numbers that began in 2015-16, researchers at the University of Hawaiʻi at Mānoa, in collaboration with the Hawaiian Islands Humpback Whale National Marine Sanctuary, Oceanwide Science Institute and Woods Hole Oceanographic Institution, examined song chorusing recorded through long-term passive acoustic monitoring at six sites off Maui as a proxy for whale populations between September 2014 and May 2019. The findings were published in Endangered Species Research.

Using autonomous acoustic recorders called "Ecological Acoustic Recorders," researchers calculated root-mean-square sound pressure levels (RMS SPL), a metric of the average amount of acoustic energy (how loud the soundscape is) per day.
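The release does not give the processing details, but conceptually the daily RMS SPL reduces to the root-mean-square of the recorded pressure samples, expressed in decibels relative to the standard 1 µPa underwater reference, roughly as in this sketch (values are illustrative):

```python
import numpy as np

# Sketch of the RMS SPL metric: root-mean-square pressure over one day's
# samples, in dB re 1 uPa (the underwater reference). Calibration and
# windowing details of the actual study are omitted; values are illustrative.
P_REF = 1.0  # reference pressure, 1 micropascal

def rms_spl(pressure_upa):
    """RMS sound pressure level in dB re 1 uPa."""
    rms = np.sqrt(np.mean(np.square(pressure_upa)))
    return 20.0 * np.log10(rms / P_REF)

# Halving the pressure amplitude lowers the level by ~6 dB:
day1 = np.random.default_rng(1).normal(0.0, 2.0e5, 48_000)  # louder chorus
day2 = 0.5 * day1                                           # half amplitude
print(rms_spl(day1) - rms_spl(day2))  # ~6.02 dB
```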

Over the course of the season, RMS SPL levels mirror the whales' migratory patterns. Levels increase starting in November through January when whales start arriving in the waters around the archipelago, peaking in February and March, before decreasing in April through May when whales start migrating back to their high-latitude feeding grounds. Researchers compared overall differences of this pattern and monthly averages of RMS SPL levels among years.

"Between the 2014-15 and 2017-18 seasons, we saw a continuous decrease in overall chorusing levels during the peak months of February and March of between -3 and -9 dB depending on location over the course of this four-year period," said Anke Kügler, a PhD candidate in marine biology, research assistant at the Hawai?i Institute of Marine Biology and lead author of the paper. "Only in the 2018-19 season did levels increase again, reaching 2015-16 at most and even 2014-15 levels at some recording sites. Further, we saw a shift in the seasonal pattern, with peaks shifting to early- and mid-February from late February to early March. Overall, chorusing levels were not only significantly lower during the peak of the season, whales also appeared to depart the islands earlier than in the past."

Acoustic energy decreased by more than 50%

When anecdotal reports from the on-water community initially showed lower numbers of whales in 2015-16, this coincided with an El Niño event in the North Pacific. Researchers did not expect to see a decreasing trend for the subsequent two seasons, before chorusing levels seemed to bounce back in 2018-19.

Further, a decrease in acoustic energy of 6 dB corresponds to a decrease of 50%. While this does not automatically translate into half the number of whales, other researchers visually assessed numbers of mother-calf pairs off Maui and overall whale numbers off Hawaiʻi Island, and reported declines of similar magnitude over the same period, indicating that the acoustic data captured changes in population levels, not just changes in singing behavior.

"The Hawai?i 'distinct population segments' has been delisted from the Endangered Species Act in 2016, assuming sustainable levels after decades of population increase," said Kügler. "However, in light of global change, continued monitoring is necessary to detect potential negative changes early and implement mitigation and adjust protection measures within Hawaiian waters, if necessary."

An ambassador species

Humpback whales are considered charismatic megafauna that hold a unique place in society, particularly in modern Hawaiian culture. Further, whale watching is an important economic resource in Hawaiʻi. As such, humpback whales are what is called a "flagship species": they can serve as ambassadors for the entire region's ecosystem by raising awareness of threats and global change impacts on them, on the other species that share their habitat, and on their migration areas.

"The University of Hawai?i has been a global leader in marine mammal research since the 1970s, therefore doing this kind of work and continuing on this tradition of high-impact marine mammal research enables the university to maintain that status as one of world's prime research universities," said Kügler.

"In addition, this collaborative project highlights and strengthens UH's existing long-term connection to NOAA," added Kügler. "I was able to do this research due to this close partnership and collaboration with NOAA and the Hawaiian Islands Humpback Whale National Marine Sanctuary."

Credit: 
University of Hawaii at Manoa

Shifting gears toward chemical machines

image: Animation from simulation demonstrating spatio-temporal control of rotors via a cascade reaction. GOx-coated rotor (magenta) lies on the left side of the chamber, while CAT-coated rotor (green) lies on the right side. Background color map indicates spatial distribution of H2O2 in the solution at y = 3mm for side views and at z = 0.4 mm for top views. Introduction of D-glucose in the solution activates the GOx-coated rotor, which morphs into a 3D structure and starts rotating spontaneously. CAT-coated rotor stays flat and stationary. H2O2 is produced by the first reaction, constituting the first step of the cascade reaction. In the presence of H2O2, CAT-coated rotor becomes active and starts rotating, while the GOx-coated rotor becomes flat and stationary as glucose in the solution is depleted. With time, H2O2 in the solution is also depleted and consequently, the motion of the CAT-coated rotor stops and sheet becomes flat.

Image: 
A. Laskar

PITTSBURGH (December 18, 2020) ... The gear is one of the oldest mechanical tools in human history and led to machines ranging from early irrigation systems and clocks to modern engines and robotics. For the first time, researchers at the University of Pittsburgh Swanson School of Engineering have utilized a catalytic reaction that causes a two-dimensional, chemically coated sheet to spontaneously "morph" into a three-dimensional gear that performs sustained work.

The findings indicate the potential to develop chemically driven machines that do not rely on external power, but simply require the addition of reactants to the surrounding solution. Published today in the Cell Press journal Matter (DOI: 10.1016/j.matt.2020.11.04), the research was developed by Anna C. Balazs, Distinguished Professor of Chemical and Petroleum Engineering and the John A. Swanson Chair of Engineering. Lead author is Abhrajit Laskar and co-author is Oleg E. Shklyaev, both post-doctoral associates.

"Gears help give machines mechanical life; however, they require some sort of external power, such as steam or electricity, to perform a task. This limits the potential of future machines operating in resource-poor or remote environments," Balazs explains. "Abhrajit's computational modeling has shown that chemo-mechanical transduction (conversion of chemical energy into motion) at active sheets presents a novel way to replicate the behavior of gears in environments without access to traditional power sources."

In the simulations, catalysts are placed at various points on a two-dimensional sheet resembling a wheel with spokes, with heavier nodes on the sheet's circumference. The flexible sheet, approximately a millimeter in length, is then placed in a fluid-filled microchamber. A reactant is added to the chamber that activates the catalysts on the flat "wheel", thereby causing the fluid to spontaneously flow. The inward fluid flow drives the lighter sections of the sheet to pop up, forming an active rotor that catches the flow and rotates.

"What is really distinctive about this research is the coupling of deformation and propulsion to modify the object's shape to create movement," Laskar says. "Deformation of the object is key; we see in nature that organisms use chemical energy to change their shape and move. For our chemical sheet to move, it also has to spontaneously morph into a new shape, which allows it to catch the fluid flow and perform its function."

Additionally, Laskar and Shklyaev found that not all the gear parts needed to be chemically active for motion to occur; in fact, asymmetry is crucial to creating movement. By determining the design rules for catalyst placement, Laskar and Shklyaev could direct the rotation to be clockwise or counterclockwise. This added "program" enabled control of independent rotors so that they move sequentially or in a cascade, with active and passive gear systems. This more complex action is controlled by the internal structure of the spokes and by the placement within the fluid domain.

"Because a gear is a central component to any machine, you need to start with the basics, and what Abhrajit has created is like an internal combustion engine at the millimeter scale," Shklyaev says. "While this won't power your car, it does present the potential to build the basic mechanisms for driving small-scale chemical machines and soft robots."

In the future, Balazs will investigate how the relative spatial organization of multiple gears can lead to greater functionality and potentially designing a system that appears to act as if it were making decisions.

"The more remote a machine is from human control, the more you need the machine itself to provide control in order to complete a given task," Balazs said. "The chemo-mechanical nature of our devices allows that to happen without any external power source."

These self-morphing gears are the latest evolution of chemo-mechanical processes developed by Balazs, Laskar, and Shklyaev. Other advances include creating crab-like sheets that mimic feed, flight, and fight responses; and sheets resembling a "flying carpet" that wrap, flap, and creep.

Credit: 
University of Pittsburgh

Media Alert: The CRISPR Journal publishes special issue on expanding the CRISPR toolbox

image: outstanding research and commentary on all aspects of CRISPR and gene editing, including CRISPR biology, technology, and genome editing, and commentary and debate of key policy, regulatory, and ethical issues affecting the field.

Image: 
Mary Ann Liebert, Inc., publishers

The CRISPR Journal announces the publication of its December 2020 issue, a Special Issue on Expanding the CRISPR Toolbox. The Journal is dedicated to validating and publishing outstanding research and commentary on all aspects of CRISPR and gene editing, including CRISPR biology, technology, and genome editing, as well as commentary and debate of key policy, regulatory, and ethical issues affecting the field. The Journal, led by Editor-in-Chief Rodolphe Barrangou, PhD (North Carolina State University) and Executive Editor Kevin Davies, PhD, is published bimonthly in print and online. Visit The CRISPR Journal website for more information.

This press release is copyright Mary Ann Liebert, Inc. Its use is granted only for journalists and news media receiving it directly from The CRISPR Journal. For full-text copies of articles or to arrange interviews with Dr. Barrangou, Dr. Davies, authors, or members of the editorial board, contact Kathryn Ryan at the Publisher.

1 Special Issue: Expanding the CRISPR Toolbox

The December 2020 issue of The CRISPR Journal features a special collection of eight research articles under the theme: "Expanding the CRISPR Toolbox." The collection includes articles from CRISPR Therapeutics, Metagenomi, and research groups in Denmark and China. The guest editor is Dr. Stanley Qi (Stanford University).

As the Journal's chief editor Rodolphe Barrangou notes in this issue's editorial: "This series captures many such developments, notably the mining of novel Type V-A Cas12 enzymes from a group at Metagenomi. The collection also features new in silico tools that enable CRISPR-Cas system identification and practical exploitation from groups in Denmark and China." In addition to tool development, articles include advances in guide design and selection (from China), Cas fusion to effectors (from the US), off-target method assessments (CRISPR Therapeutics), and applications for DNA detection and exogenous DNA integration.

Contact: Rodolphe Barrangou (NCSU/The CRISPR Journal) or Stanley Qi (Stanford University).

2 Assessing CRISPR Off-Targets

Understanding the scope and prevalence of off-target genome editing using CRISPR-Cas9 is a critical undertaking to ensure the safety of the technology in patients. In the lead article in the "Expanding the CRISPR Toolbox" special issue, researchers from CRISPR Therapeutics present a meticulous comparison of three popular "off-target site nomination assays." After treating HEK293T cells with Cas9 and various guide RNAs, the authors compared the performance of three homology-independent off-target nomination methods: the cell-based assay GUIDE-seq, and the biochemical assays CIRCLE-seq and SITE-seq. While the three methods performed similarly, there were significant differences in the total number of sites nominated. Nevertheless, the authors conclude that all three methods provide reliable and "comprehensive assessment of off-target activity."

Contact: Andrew Kernytsky (CRISPR Therapeutics)

3 Mining for New CRISPR Tools

Cas12a enzymes are increasingly gaining popularity as go-to tools in the CRISPR toolbox. Conducting a large-scale metagenomic analysis including uncultivated organisms, Christopher Brown and coworkers at Metagenomi in California have identified novel families of Type V-A CRISPR nucleases and initiated their analysis in gene editing platforms. The novel nucleases display extensive protein variation and can be programmed by a single-guide RNA. Moreover, some exhibit unexpected protospacer adjacent motif (PAM) diversity. These systems, the authors suggest, will facilitate a variety of genome-engineering applications including gene and cell therapies.

Contact: Christopher Brown (Metagenomi)

4 Isolated and Orphan CRISPR Arrays

The study of CRISPR arrays continues apace, revealing some surprises along the way. In a new report in The CRISPR Journal, veteran bioinformatician Eugene Koonin and colleagues at the National Center for Biotechnology Information (NCBI) have surveyed more than 13,000 microbial genome sequences looking for isolated CRISPR arrays that are not adjacent to Cas genes. Using a bioinformatic pipeline, the team identified 116 unique bona fide arrays distributed across 89 clusters, whose repeats show no similarity to known CRISPR sequences. These are considered "orphans" until the associated Cas gene is discovered.

Contact: Eugene Koonin (NCBI)

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Study finds growing numbers of critically endangered sawfish in Miami waters

image: Figure 3 from the paper: (A) Photograph taken by W. A. Fishbaugh in the 1920s, recorded as taken in Miami (courtesy of State Library & Archives of Florida, Florida: https://www.floridamemory.com/items/show/165364). (B) Photograph taken by 2 national park rangers in Biscayne Bay National Park near Elliott Key on 23 November 2018, showing a smalltooth sawfish entangled in fishing gear (courtesy of Biscayne National Park: https://www.fisheries.noaa.gov/feature-story/saving-endangered-sawfish)

Image: 
see above

MIAMI--A new collaborative study led by scientists at the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science and the National Oceanic and Atmospheric Administration (NOAA) found evidence of growing numbers of critically endangered smalltooth sawfish within coastal waters off Miami, Florida, an area where the regular presence of this rare species had gone largely undocumented until now. The new findings are part of a NOAA initiative to support and enhance the recovery of smalltooth sawfish in and around Biscayne Bay, a coastal lagoon off Miami that was designated a Habitat Focus Area by NOAA in 2015.

A shark-like ray, the smalltooth sawfish (Pristis pectinata) is unique for its long, flat rostrum edged with roughly 22-29 teeth on either side, which it uses to detect and catch prey. The species can reach 16 feet in length. NOAA estimates that smalltooth sawfish populations in U.S. waters have declined by as much as 95 percent from a combination of overfishing, bycatch in fishing gear, and habitat loss from increasing coastal development.

The research team compiled sighting records dating as far back as 1895 and recent encounters of sawfish in the Biscayne Bay Habitat Focus Area.

"Our analysis showed sightings have increased exponentially in recent decades, with some individuals even appearing to be making returning annual visits," said Laura McDonnell, the study's lead author and a PhD student at UM Abess Center for Ecosystem Science & Policy and researcher at the UM Rosenstiel School. "These findings demonstrate that smalltooth sawfish have been using these waters with some regularity, largely unnoticed prior to the compilation of these records.

"However, the extent to which sawfish use Biscayne Bay and reason for their occurrence remains unknown," said Joan Browder, a fisheries biologist at NOAA's Southeast Fisheries Science Center and senior author of the study. "Understanding this would be a valuable next research step."

Many of the smalltooth sawfish documented in this study were found in waters very close to Miami, where they were exposed to high levels of pollution, boat traffic, and fishing.

"These results highlight a need to understand the effects of coastal urbanization on smalltooth sawfish and the conservation implications for this and other endangered species using the area," said Neil Hammerschlag, research associate professor at the UM Rosenstiel School and UM Abess Center for Ecosystem Science & Policy and co-author of the study.

"Given the documented use of smalltooth sawfish in and around Biscayne Bay, we hope the area will receive informative signage to help inform the public about their endangered status, the importance of reporting encounters, and the dangers of harming sawfish," said McDonnell.

Credit: 
University of Miami Rosenstiel School of Marine, Atmospheric, and Earth Science

New class of cobalt-free cathodes could enhance energy density of next-gen lithium-ion batteries

image: Oak Ridge National Laboratory researchers have developed a new class of cobalt-free cathodes called NFA that are being investigated for making lithium-ion batteries for electric vehicles.

Image: 
Andy Sproles/ORNL, U.S. Dept. of Energy

Oak Ridge National Laboratory researchers have developed a new family of cathodes with the potential to replace the costly cobalt-based cathodes typically found in today's lithium-ion batteries that power electric vehicles and consumer electronics.

The new class called NFA, which stands for nickel-, iron- and aluminum-based cathode, is a derivative of lithium nickelate and can be used to make the positive electrode of a lithium-ion battery. These novel cathodes are designed to be fast charging, energy dense, cost effective, and longer lasting.

With the rise in the production of portable electronics and electric vehicles throughout the world, lithium-ion batteries are in high demand. According to Ilias Belharouak, the ORNL scientist leading the NFA research and development, more than 100 million electric vehicles are anticipated to be on the road by 2030. Cobalt is a metal currently needed for the cathode, which makes up a significant portion of a lithium-ion battery's cost.

Cobalt is rare and largely mined overseas, making it difficult to acquire and making cathode production costly. As a result, finding an alternative to cobalt that can be manufactured cost-effectively has become a lithium-ion battery research priority.

ORNL scientists tested the performance of the NFA class of cathodes and determined they are promising substitutes for cobalt-based cathodes, as described in Advanced Materials and the Journal of Power Sources. Researchers used neutron diffraction, Mössbauer spectroscopy and other advanced characterization techniques to investigate NFA's atomic- and micro-structures as well as its electrochemical properties.

"Our investigations into the charging and discharging behavior of NFA showed that these cathodes undergo similar electrochemical reactions as cobalt-based cathodes and deliver high enough specific capacities to meet the battery energy density demands," said Belharouak.

Although research on the NFA class is in the early stages, Belharouak said that his team's preliminary results to date indicate that cobalt may not be needed for next-generation lithium-ion batteries.

"We are developing a cathode that has similar or better electrochemical characteristics than cobalt-based cathodes while utilizing lower cost raw materials," he said.

Belharouak added that not only does NFA perform as well as cobalt-based cathodes, but the process to manufacture the NFA cathodes can be integrated into existing global cathode manufacturing processes.

"Lithium nickelate has long been researched as the material of choice for making cathodes, but it suffers from intrinsic structural and electrochemical instabilities," he said. "In our research, we replaced some of the nickel with iron and aluminum to enhance the cathode's stability. Iron and aluminum are cost-effective, sustainable and environmentally friendly materials."

Future research and development on the NFA class will include testing the materials in large-format cells to validate the lab-scale results and further explore the suitability of these cathodes for use in electric vehicles.

Credit: 
DOE/Oak Ridge National Laboratory

Cannabis could reduce fentanyl use, reduce overdose risk: Study

New research suggests that cannabis use by people in care for opioid addiction might improve their treatment outcomes and reduce their risk of being exposed to fentanyl in the contaminated unregulated drug supply.

In a paper published today in the peer-reviewed journal Drug and Alcohol Dependence, researchers from the BC Centre on Substance Use (BCCSU) and the University of British Columbia (UBC) found that 53 per cent of the 819 study participants in Vancouver's Downtown Eastside were intentionally or inadvertently using fentanyl, despite being on opioid agonist treatments (OAT) like methadone or buprenorphine/naloxone. These evidence-based treatments aim to support people who want to eliminate their use of unregulated opioids; the findings therefore suggest people may be supplementing their treatment through the unregulated drug supply, putting them at risk of overdose.

However, researchers found that those in the study who had urine tests positive for THC (the primary psychoactive component of cannabis) were approximately 10 per cent less likely to have fentanyl-positive urine, putting them at lower risk of a fentanyl overdose.

"These new findings suggest that cannabis could have a stabilizing impact for many patients on treatment, while also reducing the risk of overdose," said Dr. Eugenia Socías, a clinician scientist at BCCSU and lead author of the study. "With overdoses continuing to rise across the country, these findings highlight the urgent need for clinical research to evaluate the therapeutic potential of cannabinoids as adjunctive treatment to OAT to address the escalating opioid overdose epidemic."

Untreated opioid use disorder is a key driver of the overdose crisis in BC and across the United States and Canada, and expanding access to evidence-based addiction care like OAT has been identified as an urgent need and a key part of BC's response. Research has found that without access to and rapid scale-up of take home naloxone, overdose prevention services, and OAT, the number of overdose deaths in B.C. would be 2.5 times as high. However, while more British Columbians diagnosed with an opioid use disorder are being connected to evidence-based treatments, retention on these medications remains a challenge. People who are retained in OAT face much lower risks of dying from an overdose, acquiring HIV or suffering other harms of drug use compared to people who are out of treatment.

Cannabis may play an important role in supporting retention on OAT. Previous research from the BCCSU found that individuals initiating OAT who reported using cannabis on a daily basis were approximately 21 per cent more likely to be retained in treatment at six months than non-cannabis users. This was the first study to find a beneficial link between high-intensity cannabis use and retention in treatment among people initiating OAT.

The findings published today add to an emerging body of research suggesting cannabis could have a stabilizing impact for many patients on treatment, while also reducing the risk of overdose.

Researchers from BCCSU will soon be able to confirm these preliminary results, as the Canadian Institutes of Health Research, Canada's federal health research funder, recently approved funding for a Vancouver-based pilot study evaluating the feasibility and safety of cannabis as an adjunct therapy to OAT.

"Scientists are only just beginning to understand the role cannabis might play in supporting people's wellbeing, particularly those who use other substances," says Dr. M-J Milloy, study co-author and the Canopy Growth professor of cannabis science at UBC, who will lead the new study with Dr. Socías. "This study will help us understand if and how cannabis might have a role in addressing the overdose crisis."

Credit: 
University of British Columbia

Researchers deconstruct ancient Jewish parchment using multiple imaging techniques

image: UV fluorescence examination

Image: 
The authors

A picture may be worth a thousand words, but capturing multiple images of an artifact across the electromagnetic spectrum can tell a rich story about the original creation and degradation of historical objects over time. Researchers recently demonstrated how this was possible using several complementary imaging techniques to non-invasively probe a Jewish parchment scroll. The results were published in the journal Frontiers in Materials.

A team of scientists from Romania's National Institute for Research and Development in Optoelectronics extracted details about the manuscript's original materials and manufacturing techniques by employing various spectroscopic instruments. These specialized cameras and devices capture images that the human eye normally can't see.

"The goal of the study was ... to understand what the passing of time has brought upon the object, how it was degraded, and what would be the best approach for its future conservation process," explained Dr Luminita Ghervase, a co-author on the paper and research scientist at the institute.

The manuscript the team investigated was a poorly preserved but sacred scroll containing several chapters of the Book of Esther from the Hebrew Bible. The scroll came from a private collection, and little was known of its provenance or history.

"The use of complementary investigation techniques can shed light on the unknown history of such an object," Ghervase noted. "For some years now, non-invasive, non-destructive investigation techniques are the first choice in investigating cultural heritage objects, to comply with one of the main rules of the conservation practice, which is to not harm the object."

One of the more common imaging techniques is multispectral imaging, which involves scanning an object within specific parts of the electromagnetic spectrum. Such images can show otherwise invisible details about the manuscript's wear and tear. Different ultraviolet modes, for example, revealed a dark stain on the scroll that might indicate a repair using an organic material such as a resin, because the spot strongly absorbs UV light.

A related technique, hyperspectral imaging, was used to determine the material basis of the ink on the aged parchment. The scientists detected two distinct types of ink, another indication that someone may have attempted to repair the item in the past. They also used a computer algorithm to help characterize the spectral signals of individual pixels to further discriminate the materials - a method that holds promise for reconstructing the text itself.

"The algorithm used for materials classification has the potential of being used for identifying traces of the ink to infer the possible original shape of the letters," Ghervase said.

The team also employed an imaging technique known as X-ray fluorescence (XRF), which can identify the kinds of chemicals used in both the ink and the manufacture of the parchment. For instance, XRF found rich concentrations of zinc, a chemical often linked to the bleaching process but possibly another indication of past restoration efforts. Finally, the scientists employed a Fourier-transform infrared (FTIR) spectrometer, which measures the absorption of infrared light, to identify other chemicals present. Specifically, the FTIR analysis provided an in-depth view of the deterioration rate of the collagen in the scroll, which is made from animal skin, among other insights.

Employing these various imaging techniques to dissect the parchment could help conservators restore the object closer to its original condition by identifying the materials used to create it.

"They can wisely decide if any improper materials had been used, and if such materials should be removed," Ghervase said. "Moreover, restorers can choose the most appropriate materials to restore and preserve the object, ruling out any possible incompatible materials."

Credit: 
Frontiers

Cell atlas of tropical disease parasite may hold key to new treatments

The first cell atlas of an important life stage of Schistosoma mansoni, a parasitic worm that poses a risk to hundreds of millions of people each year, has been developed by researchers at the Wellcome Sanger Institute and their collaborators.

The study, published today (18 December 2020) in Nature Communications, identified 13 distinct cell types within the worm at the start of its development into a dangerous parasite, including new cell types in the nervous and muscular systems. The atlas provides an instruction manual for better understanding the biology of S. mansoni that will enable research into new vaccines and treatments.

S. mansoni has a complex life cycle that begins when larval forms of the parasite emerge from snails into rivers and lakes. These larvae then enter humans through the skin after contact with infested water. Once inside the body, the parasite begins what is known as the intra-mammalian stage of its life cycle, undergoing a series of developmental transitions as it matures to adulthood.

Adult worms live in human blood vessels and reproduce, releasing eggs that pass from the body into water to continue the life cycle. But some eggs remain trapped in the body, leading to the disease schistosomiasis.

Schistosomiasis is a debilitating long-term illness that can lead to inability to work, organ damage and death. It affects hundreds of millions of people each year, primarily in sub-Saharan Africa, and is listed by the World Health Organization (WHO) as one of the neglected tropical diseases. Currently, only one drug is available to treat the disease, but it is inappropriate for use in very young children, and there are fears that overreliance on a single treatment will allow the parasites to develop resistance to the drug.

Researchers have been looking at ways to find new drug targets, but until now there has been no high-resolution understanding of the parasite's biology.

This new study sought to map all of the cells in the first intra-mammalian stage of the parasite using single-cell technology, which identifies different cell types present in an organism or tissue.

The early-stage parasites were broken apart into individual cells, which scientists at the Wellcome Sanger Institute characterised by single-cell RNA sequencing. The data were then analysed to identify cell types according to the genes expressed by individual cells, and where in the body these cells were located.
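The release does not name the analysis software. A typical single-cell clustering workflow of this kind - sketched here with the widely used scanpy toolkit and a hypothetical input file, not the study's actual pipeline or parameters - looks roughly like this:

```python
import scanpy as sc  # a standard single-cell RNA-seq toolkit (assumed here)

# Hypothetical input: a cells x genes count matrix from the sequencing run.
adata = sc.read_h5ad("parasite_counts.h5ad")

sc.pp.normalize_total(adata, target_sum=1e4)  # correct for sequencing depth
sc.pp.log1p(adata)                            # stabilize variance
sc.pp.highly_variable_genes(adata, n_top_genes=2000)
adata = adata[:, adata.var.highly_variable]   # keep informative genes

sc.pp.pca(adata, n_comps=50)  # reduce dimensionality
sc.pp.neighbors(adata)        # k-nearest-neighbor graph of cells
sc.tl.leiden(adata)           # graph clustering -> candidate cell types

# Genes enriched in each cluster suggest markers for in-situ validation,
# analogous to the fluorescent probes used in the study.
sc.tl.rank_genes_groups(adata, groupby="leiden")
```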

The team identified 13 distinct cell types, including previously unknown cell types in the nervous system and the parenchymal system. Individual fluorescent probes were made for genes specifically expressed by each cell type. Scientists at the Morgridge Institute for Research in the USA then used these probes to confirm the position of the discovered cells within whole parasites under the microscope.

Dr Carmen Diaz Soria, a first author of the study from the Wellcome Sanger Institute, said: "Though significant advances in our understanding of Schistosoma mansoni have been made in recent years, we have yet to identify targets leading to a viable vaccine. Single-cell RNA sequencing provides a whole new level of biological detail, including previously unidentified cell types, that will allow us to better understand each cell population in the parasite."

To identify new drug targets, researchers most often look for differences between a pathogen and its human host. However, S. mansoni is far closer to us in evolutionary terms than most major parasites, such as those that cause malaria. It is hoped that these findings will reveal areas of the parasite's genetic code that are sufficiently different from our own to be viable treatment targets.

Dr Jayhun Lee, a first author of the study from the Morgridge Institute for Research, Wisconsin USA, said: "We found genes in the muscular system of Schistosoma mansoni that might be specific to schistosomes. Because they are found in these parasites but not in humans, they are one possible treatment target identified by the study. The muscle allows the parasite to travel through our bodies, so if we were able to hinder that ability, we may be able to halt its life cycle before reproduction takes place."

The authors also shed light on the parenchymal tissue of S. mansoni, the 'filler' tissue that connects all the tissues of the parasite together. Previous studies had found it difficult to isolate parenchymal cells for analysis. The cell atlas found that some genes that are important for the parasite to digest food are also associated with the parenchymal tissue. Disrupting how the parasite feeds by targeting these cells could be another avenue for therapies.

Dr Matt Berriman, senior author of the paper from the Wellcome Sanger Institute, said: "Schistosomiasis is one of the most serious neglected parasitic diseases and gaining a deeper understanding of the parasite's biology will help to expose vulnerabilities that could one day be targeted by new treatments. We hope that this cell atlas for the first intra-mammalian stage of Schistosoma mansoni will provide researchers with valuable clues to help accelerate the development of new treatments and eliminate this parasite from the lives of hundreds of millions of affected people each year."

Credit: 
Wellcome Trust Sanger Institute

Simple and cost-effective extraction of rare metals from industrial waste

Kanazawa, Japan - Many rare metals are in scarce supply, yet demand for use in electronics, medical instrumentation, and other purposes continues to increase. As waste, these metals pollute the environment and harm human health. Ideally, we would recycle the metals from waste for reuse. Unfortunately, current recycling methods are some combination of complex, expensive, toxic, wasteful, and ultimately inefficient.

In an upcoming study in Chemical Engineering Journal, researchers from Kanazawa University report a major improvement in recovering silver and palladium ions from aqueous acidic waste. Recovery of the metals in elemental, metallic form is straightforward--simply burn the extraction material and collect the remaining metal after further heating.

The researchers chemically modified ultrasmall particles of cellulose, an abundant and nontoxic biopolymer, to selectively adsorb silver and palladium ions at room temperature. Adsorption was nearly complete at acidic pH with acid concentrations of around 1 to 13 percent by volume. These are reasonable experimental conditions.

"The adsorbent selectively chelated the soft acid silver and palladium cations," explains lead author Foni Biswas. "Of the 11 competing base metals we tested, only copper and lead cations were also adsorbed, but we removed them with ease."

Maximum metal ion adsorption was fast--e.g., an hour for silver. Maximum adsorption commonly requires many hours with other approaches.

"Intraparticle diffusion did not hinder adsorption, which is an endothermic, spontaneous chemical process," explains senior author Hiroshi Hasegawa. "Maximum metal adsorption capacities--e.g., 11 mmol/g for silver--are substantially higher than that reported in prior research."

After adsorption, the researchers simply incinerated the cellulose particles to obtain elemental silver or palladium powder. Subsequent higher-temperature incineration converted the powder into pellets. Cyanide or other toxic extractants were not required. Spectroscopic analyses indicated that the final metal pellets were in metallic rather than oxide form.

"We removed nearly all of the silver and palladium from real industrial waste samples," says lead author Biswas. "Obtaining pure and elemental metals proceeded as smoothly as in our trial runs."

Palladium and silver are valuable metals, yet natural supplies are increasingly limited. Meeting future needs will require recycling the metals we already have in a practical manner. The research reported here is an important step toward avoiding supply and distribution difficulties that will only grow in the coming years.

Credit: 
Kanazawa University

Artificial Intelligence that can run a simulation faithful to physical laws

image: Diagram showing how the developed technology could be utilized.

Image: 
Takashi Matsubara

A research group led by Associate Professor YAGUCHI Takaharu (Graduate School of System Informatics) and Associate Professor MATSUBARA Takashi (Graduate School of Engineering Science, Osaka University) has succeeded in developing technology to simulate phenomena whose detailed mechanisms or governing equations are unknown. They did this by using artificial intelligence (AI) to create, from observational data, a model that is faithful to the laws of physics.

It is hoped that this development will make it possible to predict phenomena that have been difficult to simulate up until now because their detailed underlying mechanisms were unknown. It is also expected to increase the speed of the simulations themselves.

These research achievements were presented on December 7 at the Thirty-fourth Conference on Neural Information Processing Systems (NeurIPS 2020), a prestigious meeting on artificial intelligence. Of the 9,454 papers submitted to NeurIPS 2020, 1,900 were accepted, and this paper was one of only 105 chosen for oral presentation, placing it in the top 1.1% of submissions.

Main Points

Being able to apply artificial intelligence to the prediction of physical phenomena could result in extremely precise, high-speed simulations.

Prediction methods to date have been prone to overestimates and underestimates because the process of digitizing a phenomenon for the computer breaks the laws of physics (such as the energy conservation law).

This research group developed AI-based technology that runs simulations while preserving the laws of physics. They did so by re-expressing the physics, through discrete ("digital") analysis, in a form the computer can recognize exactly in the digital world.

It is expected that this technology will enable phenomena for which the detailed mechanism or formula is unclear (e.g. wave motion, fracture mechanics (such as crack growth) and the growth of crystal structures) to be simulated as long as there is sufficient observational data.

Research Background

Ordinarily, predictions of physical phenomena are carried out via simulations on supercomputers, and these simulations use equations based on the laws of physics. Even though these equations are highly versatile, they cannot always perfectly replicate the distinct characteristics of individual phenomena. For example, many people learn about the physics behind the motion of a pendulum in high school. However, if you were to actually build a pendulum and swing it, a slight manufacturing defect could cause it to deviate from the theory, resulting in an error in the simulation's prediction. Consequently, research into feeding observational data of phenomena into simulations via artificial intelligence has been advancing in recent years. If this can be fully realized, it will be possible to develop custom simulations of real phenomena, which should improve the accuracy of simulations' predictions.

However, it is difficult to build the laws of physics that govern real-world phenomena into current AI-based prediction because computers are digital: physical laws such as the energy conservation law cannot be replicated exactly once a phenomenon is discretized. Consequently, unnatural increases or decreases in energy may occur in long-term predictions. This can cause quantities such as object speed or wave height to be overestimated or underestimated, and leaves the prediction's reliability uncertain.
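The kind of drift being described is easy to reproduce. In the minimal sketch below (our illustration, not the researchers' system), a frictionless oscillator is simulated with the simplest digital update, the explicit Euler method; the energy, which physics says must stay constant, instead grows by orders of magnitude:

    # Frictionless harmonic oscillator: energy E = (p**2 + q**2) / 2 is conserved
    # by the true physics, but not by a naive digital update.
    q, p, dt = 1.0, 0.0, 0.1
    for _ in range(1000):
        q, p = q + dt * p, p - dt * q      # explicit Euler step
    print(0.5 * (p**2 + q**2))             # ~1e4, versus the true value 0.5

Each step multiplies the energy by (1 + dt**2), so the error compounds instead of averaging out.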

Research Findings

This research group developed a new artificial intelligence-based technology that can be utilized to predict various phenomena by strictly preserving physical laws such as the energy conservation law.

This newly developed approach was born from the notion "if the world were digital". Following this way of thinking, the researchers asked which physical laws must still hold in such a digital world. Noting that physical laws are written in the language of calculus, in terms such as "differentiation" and "integration", they rewrote those laws using a discrete, digital version of calculus.

To do this technically, the researchers developed a digital version of backpropagation (*1), the automatic-differentiation machinery used in machine learning. With this approach, physical laws such as the energy conservation law can be preserved exactly in the digital world, and therefore in AI-driven simulations. This makes highly reliable predictions possible and prevents the unnatural increases and decreases in energy seen in conventional models.
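The release does not spell out the method's equations, but a classical example of a digital update that preserves energy exactly, shown here purely for contrast with the Euler sketch above, is the implicit midpoint rule:

    # Implicit midpoint rule: forces are evaluated at the midpoint of the step.
    # For a quadratic energy like the oscillator's, it conserves energy exactly.
    # (A classical integrator shown for illustration; not the paper's learned model.)
    def midpoint_step(q, p, dt, iters=50):
        qn, pn = q, p
        for _ in range(iters):             # fixed-point iteration solves the implicit update
            qm, pm = 0.5 * (q + qn), 0.5 * (p + pn)
            qn, pn = q + dt * pm, p - dt * qm
        return qn, pn

    q, p = 1.0, 0.0
    for _ in range(1000):
        q, p = midpoint_step(q, p, 0.1)
    print(0.5 * (p**2 + q**2))             # stays at 0.5, the initial energy

The contribution of this research is to bring the same exactness to models that are learned from data rather than written down by hand.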

In the technique developed in this study, the AI learns the energy function from observational data of the physical phenomenon and then generates equations of motion in the digital world. These equations of motion can be used as-is by the simulation program, with no rewriting for the computer, so physical laws such as the energy conservation law are preserved; analysis of the learned equations is also expected to lead to new scientific discoveries (Figure 1).
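As a sketch of what "learning the energy function" can look like (a minimal continuous-time analogue using the PyTorch library; the study's actual model works in discrete time, and the network, sizes, and training loop here are illustrative assumptions), a neural network H(q, p) can be trained so that Hamilton's equations reproduce observed motion:

    import torch

    # Tiny network standing in for the learned energy function H(q, p).
    H = torch.nn.Sequential(torch.nn.Linear(2, 64), torch.nn.Tanh(),
                            torch.nn.Linear(64, 1))
    opt = torch.optim.Adam(H.parameters(), lr=1e-3)

    def predicted_velocity(x):
        # Hamilton's equations: dq/dt = dH/dp, dp/dt = -dH/dq, with the
        # gradient of H obtained by automatic differentiation.
        x = x.detach().requires_grad_(True)
        g = torch.autograd.grad(H(x).sum(), x, create_graph=True)[0]
        return torch.stack([g[:, 1], -g[:, 0]], dim=1)

    def train_step(x, x_dot):
        # x: observed states (q, p); x_dot: observed rates of change.
        loss = ((predicted_velocity(x) - x_dot) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()
        return loss.item()

Because the model outputs an energy rather than raw trajectories, the learned dynamics inherit the structure of Hamilton's equations by construction.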

To introduce physical laws into the digital world, geometric approaches such as those of symplectic geometry (*2) and Riemannian geometry (*3) were also utilized, which widens the range of phenomena the technique can predict. For example, the phenomenon of two droplets merging into one can be explained in terms of the energy lost when they become a single droplet, and such dissipative behavior is well described by Riemannian geometry. In fact, from a geometric viewpoint, energy-conserving and energy-dissipating phenomena can be written in a similar form, which allows a unified system that handles both types of phenomena. By incorporating this way of thinking, the model developed through this research was extended to handle energy dissipation as well, making it possible to accurately estimate the reduction in energy.
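The unifying idea can be sketched in standard geometric notation (ours, not quoted from the paper). Writing the state as x and the energy as H(x), both kinds of dynamics take the form

    dx/dt = G * grad H(x)

so that the energy changes at the rate dH/dt = (grad H)^T G (grad H). If G is skew-symmetric, this rate is exactly zero (energy conservation); if G is negative semi-definite, it is never positive (energy dissipation). The same learned energy function can therefore serve both cases, with only G changing.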

Examples of such phenomena include the structural organization of materials, crystal growth and crack extension mechanics, and it is hoped that further developments in AI technology will enable these kinds of phenomena to be predicted.

Moreover, the research group also made the AI's learning more efficient: in experiments it was 10 times faster than current methods.

Further Research

The approach developed by this research suggests that, when predicting physical phenomena, it would be possible to produce custom simulations that imitate detailed aspects of those phenomena that are difficult for humans to model by hand. This would increase the accuracy of simulations while also making predictions more efficient, improving the calculation time of various physics simulations.

Furthermore, using AI to extract physical laws from observational data will make it possible to predict phenomena that were previously difficult to simulate due to their detailed mechanisms being unknown.

Predictions made by AI have often been termed "black boxes" and are prone to reliability issues. The approach developed through this research, however, is highly reliable because it accurately replicates phenomena while adhering to physical laws such as the energy conservation law, meaning that overestimates and underestimates are unlikely to occur.

This technique also extends backpropagation itself, which is used throughout machine learning, so it could speed up various types of machine learning beyond the simulations studied here.

Credit: 
Kobe University

Compressive fluctuations heat ions in space plasma

image: Artist's impression of the ions and electrons in various space plasmas.

Image: 
Yohei Kawazura

New simulations carried out in part on the ATERUI II supercomputer in Japan have found that ions exist at higher temperatures than electrons in space plasma because they are better able to absorb energy from compressive turbulent fluctuations in the plasma. These findings have important implications for understanding observations of various astronomical objects, such as the images of the accretion disk and shadow of the M87 supermassive black hole captured by the Event Horizon Telescope.

In addition to the three normal states of matter (solid, liquid, and gas) that we see around us every day, there is a fourth state called plasma, which exists only at high temperatures. Under these conditions, electrons become separated from their parent atoms, leaving behind positively charged ions. In space plasma the electrons and ions rarely collide with each other, meaning that they can coexist in different conditions, such as at different temperatures. However, there is no obvious reason why they should have different temperatures unless some force affects them differently, so why ions are usually hotter than electrons in space plasma has long been a mystery.

One way to heat plasma is through turbulence. Chaotic turbulent fluctuations mix with the particles, and their energy is converted into heat. To determine the roles of different types of fluctuations in plasma heating, an international team led by Yohei Kawazura at Tohoku University in Japan performed the world's first simulations of space plasma that include both types of fluctuations: transverse oscillations of magnetic field lines and longitudinal oscillations of pressure. They used nonlinear hybrid gyrokinetic simulations, which are particularly good at modeling slow fluctuations. These simulations were conducted on several supercomputers, including ATERUI II at the National Astronomical Observatory of Japan.

The results showed that the longitudinal fluctuations mix readily with ions but leave the electrons alone, whereas the transverse fluctuations can mix with both ions and electrons. "Surprisingly, the longitudinal fluctuations are picky about the partner species to mix with," says Kawazura. This is a key result for understanding the ion-to-electron heating ratios in plasmas observed in space, like that around the supermassive black hole in the galaxy M87.

Credit: 
National Institutes of Natural Sciences