Tech

Sounding rocket CLASP2 elucidates solar magnetic field

image: Measuring the magnetic field strength at four different heights (horizontal planes) by using data from the CLASP2 and Hinode space telescopes allowed astronomers to map the spreading of magnetic field lines (shown in green) in the chromosphere.

Image: 
NAOJ

Coordinated observations by a solar observation satellite and a sounding-rocket telescope have measured the magnetic field strength in the photosphere and chromosphere above an active solar plage region. This is the first time that the magnetic field in the chromosphere has been charted all the way up to its top. This finding brings us closer to understanding how energy is transferred between layers of the Sun.

Despite being the brightest object in the sky, the Sun still holds many mysteries for astronomers. It is generally believed that magnetic fields play an important role in heating the solar corona, but the details of the process are still unclear. To solve this mystery it is important to understand the magnetic field in the chromosphere, which is sandwiched between the corona and the photosphere, the visible surface of the Sun.

An international team led by Ryohko Ishikawa, an assistant professor at the National Astronomical Observatory of Japan, and Javier Trujillo Bueno, a professor at the Instituto de Astrofísica de Canarias, analyzed data collected by the CLASP2 sounding rocket experiment during its six and a half minutes of observations on April 11, 2019. They determined the longitudinal component of the magnetic field above an active region plage and its surroundings by analyzing the signature that the magnetic field imprinted on ultraviolet light from the chromosphere. The uniquely precise data from CLASP2 allowed the team to determine the magnetic field strengths in the lower, middle, and upper regions of the chromosphere. Simultaneously acquired data from the Japanese solar observation satellite Hinode provided information about the magnetic field in the plage itself in the photosphere. The team found that the plage magnetic field is highly structured in the photosphere but expands in the chromosphere, rapidly merging and spreading horizontally. This new picture brings us closer to understanding how magnetic fields transfer energy to the corona from the lower layers of the Sun.
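
The release does not spell out the inversion method, but a standard way to obtain a longitudinal field from such polarization signatures is the weak-field approximation, in which the circularly polarized (Stokes V) profile that the Zeeman effect imprints on a spectral line is proportional to the line-of-sight field and the spectral gradient of the intensity (Stokes I):

\[
V(\lambda) \;\approx\; -4.67\times10^{-13}\,\bar{g}\,\lambda_{0}^{2}\,B_{\parallel}\,\frac{\partial I(\lambda)}{\partial\lambda},
\]

with the line-centre wavelength \(\lambda_{0}\) in ångströms, \(\bar{g}\) the effective Landé factor, and \(B_{\parallel}\) the longitudinal field in gauss. Fitting the measured V profile against the derivative of I yields \(B_{\parallel}\) at the height where the line forms, and repeating this for lines formed at different heights is what produces a layer-by-layer map like the one described above. (This is a textbook sketch; the actual CLASP2 analysis of the Mg II ultraviolet lines involves more detailed atomic modeling.)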

Credit: 
National Institutes of Natural Sciences

Atomic nuclei in the quantum swing

image: A team from the Max Planck Institute in Heidelberg excites nuclei of iron atoms with a flash of X-ray light and then sends a second such flash onto the sample with different delays and detuning. Then, over a period of about 200 nanoseconds, the researchers measure the intensity of the light with which the nuclei release the absorbed energy (light yellow: high intensity; violet: low intensity). They can choose the delay so that the second flash reduces the excitation and the nuclei release their energy quickly and with high intensity (a). After only 50 nanoseconds, the emission has decreased significantly. In contrast, they still emit a relatively large amount of light after more than 100 nanoseconds if the second pulse amplifies the excitation from the first (b).

Image: 
MPI for Nuclear Physics

From atomic clocks to secure communication to quantum computers: these developments are based on the increasingly precise control of the quantum behaviour of electrons in atomic shells with the help of laser light. Now, for the first time, physicists at the Max Planck Institute for Nuclear Physics in Heidelberg have succeeded in precisely controlling quantum jumps in atomic nuclei using X-ray light. Compared with electron systems, nuclear quantum jumps are extreme - with energies up to millions of times higher and incredibly fast, zeptosecond-scale dynamics. A zeptosecond is one trillionth of a billionth of a second. The rewards include profound insight into the quantum world, ultra-precise nuclear clocks, and nuclear batteries with enormous storage capacity. The experiment required a sophisticated X-ray flash facility, developed by a Heidelberg group led by Jörg Evers as part of an international collaboration.

One of the great successes of modern physics is the increasingly precise control of dynamic quantum processes. It enables a deeper understanding of the quantum world with all its oddities and is also a driving force of new quantum technologies. But from the perspective of the atoms, "coherent control" has so far remained superficial: it is only the quantum jumps of the electrons in the outer shells of atoms that have become increasingly controllable by lasers. Yet as Christoph Keitel explains, the atomic nuclei themselves are also quantum systems in which the nuclear building blocks can make quantum jumps between different quantum states.

Energy-rich quantum jumps for nuclear batteries

"In addition to this analogy to electron shells, there are huge differences", explains the Director at the Max Planck Institute for Nuclear Physics in Heidelberg: "They've got us so excited!" Quantum jumps between different quantum states are actually jumps on a kind of energy ladder. "And the energies of these quantum jumps are often six orders of magnitude greater than in the electron shell", says Keitel. A single quantum jump made by a nuclear component can thus pump up to a million times more energy into it - or get it out again. This has given rise to the idea of nuclear batteries with an unprecedented storage capacity.

Such technical applications are still visions of the future. At the moment, research entails addressing and controlling these quantum jumps in a targeted manner. This requires precisely controlled, high-energy X-ray light. The Heidelberg team has been working on such an experimental technique for over 10 years. It has now been used for the first time.

Accurate frequencies enable ultra-precise atomic clocks

The quantum states of atomic nuclei offer another important advantage over electron states. Compared with the electronic quantum jumps, they are much more sharply defined. Because this translates directly into more accurate frequencies according to the laws of physics, they can, in principle, be used for extremely precise measurements. For example, this could enable the development of ultra-precise nuclear clocks that would make today's atomic clocks look like antiquated pendulum clocks. In addition to technical applications of such clocks (e.g. in navigation), they could be used to examine the fundamentals of today's physics much more precisely. This includes the fundamental question of whether the constants of nature really are constant. However, such precision techniques require the control of quantum transitions in the nuclei.

Coordinated light flashes enhance or reduce the excitation

The principle of the Heidelberg experimental technique sounds quite simple at first. It uses pulses (i.e. flashes) of high-energy X-ray light, which are currently provided by the European Synchrotron Radiation Source ESRF in Grenoble. The experiment splits these X-ray pulses in a first sample in such a way that a second pulse follows behind the rest of the first pulse with a time delay. One after the other, both encounter a second sample, the actual object of investigation.

The first pulse is very brief and contains a broad mix of frequencies. Like a shotgun blast, it stimulates a quantum jump in the nuclei; in the first experiment, this was a special quantum state in nuclei of iron atoms. The second pulse is much longer and has an energy that is precisely tuned to the quantum jump. In this way, it can specifically manipulate the quantum dynamics triggered by the first pulse. The time span between the two pulses can be adjusted, allowing the team to control whether the second pulse reinforces the excitation or suppresses it.

The Heidelberg physicists compare this control mechanism to a swing. The first pulse sets it moving. Depending on the phase of its oscillation at which you give it a second push, it swings even higher or is slowed down.
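
The swing picture maps directly onto the textbook physics of a driven two-level system. The toy simulation below (an illustrative sketch, not the authors' code) applies two resonant pulses to a two-level "nucleus"; the phase of the second pulse, which in the experiment is set by the delay, decides whether the excitation left by the first pulse is built up or undone:

# Toy model: coherent control of a two-level system by two pulses.
# The relative phase of the second pulse (set by the delay) either
# enhances the excitation or drives the system back to the ground state.
import numpy as np

def pulse(theta, phase):
    # Rotation of the state by angle theta; the pulse phase sets the axis.
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * np.exp(-1j * phase) * s],
                     [-1j * np.exp(1j * phase) * s, c]])

ground = np.array([1.0, 0.0], dtype=complex)

for phase2, label in [(0.0, "in phase (swing pushed higher)"),
                      (np.pi, "out of phase (swing stopped)")]:
    state = pulse(np.pi / 4, 0.0) @ ground    # first, broadband pulse
    state = pulse(np.pi / 4, phase2) @ state  # delayed, finely tuned pulse
    print(f"{label}: excited-state population = {abs(state[1])**2:.3f}")

Run as written, the in-phase case doubles the rotation (excited-state population 0.5), while the out-of-phase case returns the population to zero - exactly the amplified or stopped swing of the analogy.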

Pulse control accurate to a few zeptoseconds

But what sounds simple is a technical challenge that required years of research. A controlled change in the quantum dynamics of an atomic nucleus requires that the delay of the second pulse be stable on the unimaginably short time scale of a few zeptoseconds, because only then do the two pulses act together in a controlled way.

A zeptosecond is one trillionth of a billionth of a second - a decimal point followed by 20 zeroes and a 1. In one zeptosecond, light does not even manage to cross one per cent of a medium-sized atom. How can you picture this in relation to our world? "If you imagine that an atom were as big as the Earth, the distance light travels in one zeptosecond would be about 50 km," says Jörg Evers, who initiated the project.

The sample is shifted by 45 trillionths of a metre

The second X-ray pulse is delayed by a tiny displacement of the first sample, also containing iron nuclei with the appropriate quantum transition. "The nuclei selectively store energy from the first X-ray pulse for a short period of time during which the sample is rapidly shifted by about half a wavelength of X-ray light", explains Thomas Pfeifer, Director at the Max Planck Institute for Nuclear Physics in Heidelberg. This corresponds to about 45 trillionths of a metre. After this tiny movement, the sample emits the second pulse.
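
These numbers are mutually consistent, assuming (as the iron nuclei suggest) the 14.4 keV Mössbauer transition of iron-57, whose wavelength is roughly 86 picometres. Shifting the sample by half a wavelength delays the re-emitted pulse by

\[
\Delta t \;=\; \frac{\Delta x}{c} \;\approx\; \frac{45\times10^{-12}\,\mathrm{m}}{3\times10^{8}\,\mathrm{m/s}} \;\approx\; 1.5\times10^{-19}\,\mathrm{s} \;=\; 150\ \mathrm{zeptoseconds},
\]

i.e. half an oscillation cycle of the X-ray light, which flips the relative phase of the two pulses. Keeping that phase steady to within a few per cent of a cycle is precisely the few-zeptosecond stability quoted above.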

The physicists compare their experiment to two tuning forks that are at different distances from a firecracker (Figure 2). The bang first strikes the closer tuning fork, making it vibrate, and then moves on to the second tuning fork. In the meantime, the first tuning fork, now excited, emits sound waves itself, which arrive with a delay at the second fork. Depending on the delay time, this sound either amplifies or dampens the vibrations of the second fork - just like the second push on the swing, and just as for the excited nuclei.

With this experiment, Jörg Evers, Christoph Keitel, and Thomas Pfeifer and their team from the Max Planck Institute for Nuclear Physics, in cooperation with researchers from DESY in Hamburg and the Helmholtz Institute/Friedrich Schiller University in Jena, succeeded for the first time in demonstrating coherent control of nuclear excitations. In addition to synchrotron facilities such as those at the ESRF, free-electron lasers (FELs) such as the European XFEL at DESY have recently begun to provide powerful sources of X-ray radiation - even in laser-like quality. This opens up a dynamic future for the emerging field of nuclear quantum optics.

Credit: 
Max-Planck-Gesellschaft

Animal evolution -- glimpses of ancient environments

Although amber looks like a somewhat unusual inorganic mineral, it is actually derived from an organic source - tree resins. Millions of years ago, when this aromatic and sticky substance was slowly oozing from coniferous trees, insects and other biological material could become trapped in it. That is why some samples of amber contain fossilized specimens, preserved in a virtually pristine state, which afford fascinating snapshots of the flora and fauna of long-gone forests. Now, a research team led by LMU zoologists Viktor Baranov and Joachim Haug has made exciting finds in samples of amber from the Baltic region and Myanmar, which provide new insights into the ecology of two groups of ancient insects.

In the Eocene epoch - between 56 and 33.9 million years ago - the Baltic amber forests, most likely around 38 million years ago, covered large areas of what is now Northern Europe and were the source of most amber found in Europe. In one sample, the LMU team identified no fewer than 56 fly larvae, all of which were entombed while feasting on a single chunk of mammalian dung. "This fossil is particularly interesting, because the dung is full of plant residues, which implies the presence of at least moderately large herbivores in these forests," Baranov explains. On this basis, he and his colleagues assume that there must have been open areas of grassland nearby, corroborating earlier hypotheses. "The Baltic amber forest is often portrayed as a densely overgrown and humid jungle landscape. But it is much more likely that it was a more open, warm-to-temperate habitat," Baranov says.

In other samples, the researchers found insect larvae whose modern descendants are mainly found in association with plants that are under chronic stress. "It has long been suspected that forests which produced large amounts of amber were ecologically under stress," says Haug. "That would be perfectly compatible with the presence of these larvae." High temperature and dry conditions are the most probable source of such stress.

The unusual butterfly larva that Haug identified in amber from Myanmar is considerably older than the specimens from the Baltic. It dates to the Cretaceous, more than 100 million years ago, at a time when dinosaurs still dominated the Earth. Up until now, only four caterpillars from the Cretaceous had been discovered, and the new find is very different from all of them. "All of the previously discovered caterpillars were relatively naked", says Haug. "Our caterpillar is the first 'armored' specimen that has turned up - it bears spines dorsally on some of its segments." The new specimen thus supports the idea that butterflies underwent an early phase of diversification and also reveals some aspects of their ecology. In modern caterpillars, such spines serve as a deterrent to predators - more particularly, songbirds. "The rapid diversification of birds only set in after the demise of the large dinosaurs, but small birds that may have fed on caterpillars were already extant during the Cretaceous," Haug points out.

Credit: 
Ludwig-Maximilians-Universität München

How the brain processes sign language

The ability to speak is one of the essential characteristics that distinguishes humans from other animals. Many people would probably intuitively equate speech and language. However, cognitive science research on sign languages since the 1960s paints a different picture: today it is clear that sign languages are fully autonomous languages with a complex organization on several linguistic levels, such as grammar and meaning. Previous studies on the processing of sign language in the human brain had already found some similarities as well as differences between sign languages and spoken languages. Until now, however, it has been difficult to derive a consistent picture of how both forms of language are processed in the brain.

Researchers at the MPI CBS now wanted to know which brain regions are actually involved in the processing of sign language across different studies - and how large the overlap is with brain regions that hearing people use for spoken language processing. In a meta-study recently published in the journal Human Brain Mapping, they pooled data from sign language processing experiments conducted around the world. "A meta-study gives us the opportunity to get an overall picture of the neural basis of sign language. So, for the first time, we were able to statistically and robustly identify the brain regions that were involved in sign language processing across all studies," explains Emiliano Zaccarella, last author of the paper and group leader in the Department of Neuropsychology at the MPI CBS.
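
The release does not name the algorithm, but coordinate-based meta-analyses of this kind are commonly run with activation likelihood estimation (ALE); the hypothetical, one-dimensional sketch below shows the core idea. Each study's reported activation peaks are blurred with a Gaussian to reflect spatial uncertainty, and the per-study maps are combined so that locations reported consistently across studies score highly:

# Toy 1D sketch in the spirit of ALE (assumed method; the MPI CBS
# pipeline may differ). Real analyses work on 3D brain coordinates.
import numpy as np

GRID, SIGMA = 64, 2.0   # 1D "brain" and smoothing width

def study_map(peaks):
    # Modeled activation map for one study: a Gaussian blob per peak,
    # scaled to a per-study activation probability below one.
    x = np.arange(GRID)
    m = np.zeros(GRID)
    for p in peaks:
        m = np.maximum(m, 0.5 * np.exp(-(x - p) ** 2 / (2 * SIGMA ** 2)))
    return m

# Fake peak coordinates from three studies, all near voxel 20
studies = [[20, 41], [19], [21, 55]]
maps = [study_map(s) for s in studies]

# ALE score: probability that at least one study activates each voxel
ale = 1.0 - np.prod([1.0 - m for m in maps], axis=0)
print("most consistent voxel:", int(np.argmax(ale)))

Voxel 20 wins because all three simulated studies report a peak near it - the one-dimensional analogue of Broca's area turning up across almost every evaluated experiment.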

The researchers found that especially the so-called Broca's area in the frontal brain of the left hemisphere is one of the regions that was involved in the processing of sign language in almost every study evaluated. This brain region has long been known to play a central role in spoken language, where it is used for grammar and meaning. In order to better classify their results from the current meta-study, the scientists compared their findings with a database containing several thousand studies with brain scans.

The Leipzig-based researchers were indeed able to confirm that there is an overlap between spoken and signed language in Broca's area. They also succeeded in showing the role played by the right frontal brain - the counterpart to Broca's area on the left side of the brain. This region also appeared repeatedly in many of the sign language studies evaluated, because it processes non-linguistic aspects of communication, such as spatial or social information about one's counterpart. This means that movements of the hands, face and body - of which signs consist - are in principle perceived similarly by deaf and hearing people. Only in deaf people, however, do they additionally activate the language network in the left hemisphere of the brain, including Broca's area. Deaf people therefore perceive the gestures as gestures with linguistic content, rather than as pure movement sequences, as would be the case for hearing people.

The results demonstrate that Broca's area in the left hemisphere is a central node in the language network of the human brain. Depending on whether people use language in the form of signs, sounds or writing, it works together with other networks. Broca's area thus processes not only spoken and written language, as has been known up to now, but also abstract linguistic information in any form of language in general. "The brain is therefore specialized in language per se, not in speaking," explains Patrick C. Trettenbrein, first author of the publication and doctoral student at the MPI CBS. In a follow-up study, the research team now aims to find out whether the different parts of Broca's area are also specialized in either the meaning or the grammar of sign language in deaf people, similar to hearing people.

Credit: 
Max Planck Institute for Human Cognitive and Brain Sciences

What impact will robots and autonomous systems have on urban ecosystems?

The University of Leeds has coordinated a study with 170 experts from 35 countries, including E.T.S. Agronomic Engineering lecturer Luis Perez Urrestarazu. The study conclusions have just been published in the journal Nature Ecology & Evolution.

The researchers highlighted opportunities to improve the way green spaces are monitored and maintained and to help people interact with and appreciate the natural world around them. Similarly, as autonomous vehicles become more widely used in cities, pollution and traffic congestion are expected to fall.

But they also warn that advances in robotics and automation could be harmful to the environment. They may, for example, generate new sources of waste and pollution, with potentially substantial negative implications for urban nature. Cities may have to be re-planned to provide enough space for robots and drones to operate, possibly leading to a loss of planted areas. They could also deepen existing social inequalities, such as unequal access to green spaces.

In any event, robots are likely to transform many of the ways in which citizens experience and benefit from urban nature.

Credit: 
University of Seville

3D biopsies to better understand brain tumors

video: 3D biopsies to better understand brain tumors.

Image: 
INc-UAB

Researchers at the Institut de Neurociències of the Universitat Autònoma de Barcelona (INc-UAB) obtained a highly accurate recreation of human glioblastoma's features using a novel 3D microscopy analysis. The study, published in the journal Acta Neuropathologica Communications, provides new information to aid diagnosis, identify therapeutic targets and design immunotherapy strategies.

This new analysis of 3D images and quantitative data "will help to appreciate from within how the tumor is built in its full dimensionality, and to identify where different cell types are located", explains George Paul Cribaro, first author of the study. "It provides more complete information than the usual 2D analyses performed for neuropathological diagnosis."

With this new approach, the authors show alterations in tumor blood vessels and demonstrate that these vascular wall abnormalities do not hinder the entry of T lymphocytes (a potential defense against tumor cells), which is relevant for the design and use of immunotherapies targeting malignant cells.

Moreover, the images allow the tumor to be differentiated into two areas: the tumor tissue proper and the stroma, which gives support to the tumor and harbors distinct immunological microenvironments. "Immune cells like microglia and macrophages are seen in both areas, but they are made up of different subpopulations. This local and populational differentiation could be an important factor that may help diagnosis and aid in the search for new therapeutic targets", indicates Carlos Barcia, coordinator of the work.

The work provides a set of resource images that will facilitate the understanding of the complexity of this tumor, showing some aspects to be considered when designing new therapeutic approaches.

Credit: 
Universitat Autonoma de Barcelona

New technology enables predictive design of engineered human cells

image: Synthetic biologists achieve a breakthrough in the design of living cells

Image: 
Justin Muir

As a child, Northwestern University synthetic biologist Joshua Leonard built devices using electronic kits. Now he and his team have developed a design-driven process that uses parts from a very different kind of toolkit to build complex genetic circuits for cellular engineering.

One of the most exciting frontiers in medicine is the use of living cells as therapies. Using this approach to treat cancer, for example, many patients have been cured of previously untreatable disease. These advances employ the approaches of synthetic biology, a growing field that blends tools and concepts from biology and engineering.

The new Northwestern technology uses computational modeling to more efficiently identify useful genetic designs before building them in the lab. Faced with myriad possibilities, modeling points researchers to designs that offer real opportunity.

"To engineer a cell, we first encode a desired biological function in a piece of DNA, and that DNA program is then delivered to a human cell to guide its execution of the desired function, such as activating a gene only in response to certain signals in the cell's environment," Leonard said. He led a team of researchers from Northwestern in collaboration with Neda Bagheri from the University of Washington for this study.

Leonard is an associate professor of chemical and biological engineering in the McCormick School of Engineering and a leading faculty member within Northwestern's Center for Synthetic Biology. His lab is focused on using this kind of programming capability to build therapies, such as engineered cells that activate the immune system to treat cancer.

Bagheri is an associate professor of biology and chemical engineering and a Washington Research Foundation Investigator at the University of Washington Seattle. Her lab uses computational models to better understand -- and subsequently control -- cell decisions. Leonard and Bagheri co-advised Joseph Muldoon, a recent doctoral student and the paper's first author.

"Model-guided design has been explored in cell types such as bacteria and yeast, but this approach is relatively new in mammalian cells," Muldoon said.

The study, in which dozens of genetic circuits were designed and tested, will be published Feb. 19 in the journal Science Advances. Like other synthetic biology technologies, a key feature of this approach is that it is intended to be readily adopted by other bioengineering groups.

To date, it remains difficult and time-consuming to develop genetic programs when relying upon trial and error. It is also challenging to implement biological functions beyond relatively simple ones. The research team used a "toolkit" of genetic parts invented in Leonard's lab and paired these parts with computational tools for simulating many potential genetic programs before conducting experiments. They found that a wide variety of genetic programs, each of which carries out a desired and useful function in a human cell, can be constructed such that each program works as predicted. Not only that, but the designs worked the first time.
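
To give a flavor of what simulating candidate genetic programs before building them can look like (a minimal sketch with made-up parts and parameters, not the Northwestern toolkit or its models): each candidate design is reduced to an ordinary-differential-equation model of an inducible reporter, simulated to steady state, and ranked by its predicted on/off fold change, so that only promising designs go on to be built in the lab.

# Hypothetical model-guided screen of gene-circuit designs.
# A design = (max production rate, degradation rate, half-max inducer
# level K, Hill coefficient n); all parameters are illustrative only.
import numpy as np
from scipy.integrate import odeint

def reporter(y, t, k_max, k_deg, K, n, inducer):
    production = k_max * inducer**n / (K**n + inducer**n)  # Hill activation
    return [production - k_deg * y[0]]

def steady_state(design, inducer):
    t = np.linspace(0.0, 200.0, 200)
    return odeint(reporter, [0.0], t, args=(*design, inducer))[-1, 0]

candidates = [(10.0, 0.1, 1.0, 1.0),
              (10.0, 0.1, 1.0, 4.0),   # steeper, more switch-like response
              (2.0, 0.5, 1.0, 2.0)]

for design in candidates:
    off = steady_state(design, inducer=0.01)
    on = steady_state(design, inducer=10.0)
    print(design, "predicted fold change:", round(on / max(off, 1e-9), 1))

In the real workflow the model would be calibrated against the measured behavior of the toolkit's parts, which is what lets predicted rankings carry over to cells.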

"In my experience, nothing works like that in science; nothing works the first time. We usually spend a lot of time debugging and refining any new genetic design before it works as desired," Leonard said. "If each design works as expected, we are no longer limited to building by trial and error. Instead, we can spend our time evaluating ideas that might be useful in order to hone in on the really great ideas."

"Robust representative models can have disruptive scientific and translational impact," Bagheri added. "This development is just the tip of the iceberg."

The genetic circuits developed and implemented in this study are also more complex than the previous state of the art. This advance creates the opportunity to engineer cells to perform more sophisticated functions and to make therapies safer and more effective.

"With this new capability, we have taken a big step in being able to truly engineer biology," Leonard said.

Credit: 
Northwestern University

Data show lower daily temperatures lead to higher transmission of COVID-19

LOUISVILLE, Ky. - The SARS-CoV-2 pandemic has caused tremendous upheaval, leading to more than 2.3 million deaths worldwide and 465,000 in the United States. Understanding the impact of seasonal temperature changes on transmission of the virus is an important factor in reducing the virus's spread in the years to come.

SARS-CoV-2 belongs to a large family of human coronaviruses, most of which are characterized by increased transmission in cooler, less humid months and decreased transmission in warmer, more humid months. With this understanding, researchers at the University of Louisville's Christina Lee Brown Envirome Institute, the Johns Hopkins University School of Medicine, the U.S. Department of Defense Joint Artificial Intelligence Center and others theorized that atmospheric temperature also would affect transmission of SARS-CoV-2.

The researchers compared daily low temperature data and logged cases of COVID-19 in 50 countries in the Northern Hemisphere between Jan. 22 and April 6, 2020. Their research, published this week in PLOS ONE, showed that as temperatures rose, the rate of new cases of COVID-19 decreased.

The data analysis showed that, between 30 and 100 degrees Fahrenheit, a 1-degree Fahrenheit increase in daily low temperature was associated with a 1% decrease in the rate of increase in COVID-19 cases, while a 1-degree decrease in temperature was associated with a 3.7% increase in that rate. Because the data came from early in the pandemic, the results were largely free of the influence of lockdowns, masking and other social efforts to contain the virus.
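
As a rough worked example of what those associations imply (a sketch of the reported percentages only, not the paper's statistical model), one can scale a hypothetical day-over-day growth rate with temperature:

# Illustration of the reported temperature associations (hypothetical
# baseline; not the paper's regression model). Valid only within the
# 30-100 F range that the study analyzed.
base_growth = 0.20   # assumed 20% day-over-day growth in new cases

def adjusted_growth(delta_temp_f):
    # +1 F of daily low temperature: about 1% lower rate of increase;
    # -1 F: about 3.7% higher, per the study's reported associations.
    if delta_temp_f >= 0:
        return base_growth * (1 - 0.01) ** delta_temp_f
    return base_growth * (1 + 0.037) ** (-delta_temp_f)

for dT in (10, 0, -10):
    print(f"{dT:+3d} F change: growth rate = {adjusted_growth(dT):.3f}")

Note that compounding the effect across degrees is an assumption of this sketch; the asymmetry (1% versus 3.7% per degree) is taken directly from the text above.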

"Although COVID-19 is an infectious disease that will have non-temperature dependent transmission, our research indicates that it also may have a seasonal component," said Aruni Bhatnagar, Ph.D., co-author and director of the Brown Envirome Institute. "Of course, the effect of temperature on the rate of transmission is altered by social interventions like distancing, as well as time spent indoors and other factors. A combination of these factors ultimately determines the spread of COVID-19."

The researchers concluded that summer months are associated with slowed transmission of COVID-19, as in other seasonal respiratory viruses. This seasonal effect could be useful in local planning for social interventions and timing of resurgence of the virus.

In the United States, sharp spikes in COVID-19 were seen over the summer, but the researchers noted that, based on the data they analyzed, case numbers might have been even higher had summer temperatures been cooler. The data also indicate that the correlation between temperature and transmission was much greater than the association between temperature and recovery or death from COVID-19.

"This understanding of of the SARS-CoV-2 temperature sensitivity has important implications for anticipating the course of the pandemic," said Adam Kaplin, M.D., Ph.D., of Johns Hopkins, first author of the study. "We do not know how long the currently available vaccines will sustain their benefits, nor what the risks are of new variants developing over time if the Northern and Southern Hemispheres continue to exchange COVID-19, back and forth across the equator, due to their opposing seasons. But it is reasonable to conclude that this research suggests that, like other seasonal viruses, SARS-CoV-2 could prove to be extremely difficult to contain over time unless there is a concerted and collaborative global effort to work to end this pandemic."

Credit: 
University of Louisville

New review compiles immunogenicity data on leading SARS-CoV-2 vaccine candidates

In a new Review, P.J. Klasse and colleagues present an extensive overview of the immunogenicity profiles of several leading SARS-CoV-2 vaccine candidates, including several developed under the auspices of the U.S. Government's "Operation Warp Speed" program, as well as leading candidates from China and Russia. Since the paper was submitted, two of these vaccines - from Pfizer/BioNTech and Moderna - have been authorized for use by the FDA. The authors review data from evaluations in non-human primates as well as human clinical trials, summarizing what is known about antibody and T cell immunogenicity for roughly a dozen leading candidates. Noting the variability in the methods used to assess each vaccine, which makes direct comparisons challenging, Klasse et al. nonetheless attempt to qualitatively compare and contrast the vaccines' performance. Notably, the authors also provide some updated information in the supplemental material for the paper.

Credit: 
American Association for the Advancement of Science (AAAS)

Targeting MAPK4 emerges as a promising therapy for prostate cancer

The battle against late-stage prostate cancer might have found a potential new strategy to combat this deadly disease. Research led by Baylor College of Medicine reveals in the Journal of Clinical Investigation that the enzyme MAPK4 concertedly activates androgen receptor (AR) and AKT, molecules at the core of two cellular signaling pathways known to promote prostate cancer growth and resistance to standard therapy. Importantly, inhibiting MAPK4 simultaneously inactivated both AR and AKT and stopped cancer growth in animal models. The findings open the possibility that targeting MAPK4 in human prostate cancer might provide a novel therapeutic strategy for this disease that is the second leading cause of cancer death in American men.

"Scientists already knew that both the AR and the AKT pathways can drive prostate cancer," said corresponding author Dr. Feng Yang, assistant professor of molecular and cellular biology and member of the Dan L Duncan Comprehensive Cancer Center at Baylor. "One complication with targeting AR (for instance, with medical castration therapy, including the most advanced agents such as enzalutamide, apalutamide and abiraterone) or AKT is that there is a reciprocal crosstalk between these pathways. When AR is inhibited, AKT gets activated, and vice-versa, therefore tackling these pathways to control cancer growth is complex."

In previous work, the Yang lab studied the little-known enzyme MAPK4.

"One interesting aspect of MAPK4 is that it is rather unique because it does not work as conventional MAPK enzymes do," Yang said. "To our knowledge, we are one of the few groups studying MAPK4 and the first to uncover its critical roles in human cancers."

In their previous study, Yang and his colleagues discovered that MAPK4 can trigger the AKT pathway, not only in prostate cancer but in other cancers as well, such as lung and colon cancers.

In the current study, the researchers found that MAPK4 also activates the AR signaling pathway by enhancing the production and stabilization of GATA2, a factor that is crucial for the synthesis and activation of AR.

Further experiments showed that MAPK4 triggered the concerted activation of both AR and AKT pathways by independent mechanisms, and this promoted prostate cancer growth and resistance to castration therapy, a standard medical treatment for advanced/metastatic prostate cancer. Importantly, genetically knocking down MAPK4 reduced the activation of both AR and AKT pathways and inhibited the growth, including castration-resistant growth, of prostate cancer in animal models. The researchers anticipate that knocking down MAPK4 also could reduce the growth of other cancer types in which MAPK4 is involved.

"Our findings suggest the possibility that regulating MAPK4 activity could result in a novel therapeutic approach for prostate cancer," Yang said. "We are interested in finding an inhibitor of MAPK4 activity that could help better treat prostate cancer and other cancer types in the future."

Credit: 
Baylor College of Medicine

Making sense of the mass data generated from firing neurons

image: Dr Miguel Aguilera, Marie Sklodowska-Curie research fellow in the School of Engineering and Informatics at the University of Sussex.

Image: 
University of Sussex.

Scientists have achieved a breakthrough in predicting the behaviour of neurons in large networks operating at the mysterious edge of chaos.

New research from the University of Sussex and Kyoto University outlines a new method capable of analysing the masses of data generated by thousands of individual neurons.

The new framework outperforms previous models in predicting and assessing network properties by more accurately estimating a system's fluctuations with greater sensitivity to parameter changes.

As new technologies allow recording of thousands of neurons from living animals, there is a pressing demand for mathematical tools to study the non-equilibrium, complex dynamics of the high-dimensional data sets they generate. In this endeavour, the researchers hope to help answer key questions about how animals process information and adapt to environmental changes.

The researchers also believe their work could be effective in reducing the massive computational cost and carbon footprint of training large AI models - making such models much more widely available to smaller research labs or companies.

Dr Miguel Aguilera, Marie Sklodowska-Curie research fellow in the School of Engineering and Informatics at the University of Sussex, said: "Only very recently have we had the technology to record thousands of individual neurons in animals while they interact with their environment, which is a tremendous stride forward from studying networks of neurons isolated in laboratory cultures or in immobilized or anaesthetized animals.

"This is a very exciting advancement but we don't have the methods yet to analyse and understand the massive amount of data created by non-equilibrium behaviour. Our contribution offers the possibility to advance the technology forward to find models that explain how neurons process information and generate behaviour."

The paper, published today in Nature Communications, develops methods to quickly approximate the complex dynamics of neural network models that capture how real neurons observed in the lab behave, how they are connected and how they process information.

In a significant step forward, the research team have created a method which works in significantly fluctuating, non-equilibrium situations that animals operate in when interacting with their environment in the real world.

Dr Aguilera said: "The most efficient manner of learning how large systems work is using statistical models and approximations, and the most common of these are mean field methods, where the effect of all interactions in a network is approximated by a simplified average effect.

"But these techniques often just work in very idealized conditions. Brains are in constant change, development and adaptation, displaying complex fluctuating patterns and interacting with rapidly changing environments. Our model aims to capture precisely the fluctuations in these non-equilibrium situations that we expect from freely behaving animals in their natural surroundings."

The statistical method captures the dynamics of large networks specifically in the region at the edge of chaos, a special region of behaviour between chaotic and ordered activity, where intense fluctuations in neuronal activity, known as neuronal avalanches, take place.

As opposed to previous mathematical models, the researchers applied an information-geometric approach to better capture network correlations, which allowed them to create simplified maps approximating the trajectories of neural activity - trajectories that in reality follow extremely complex routes that are difficult to compute directly.

Dr. S. Amin Moosavi, research fellow in the Graduate School of Informatics at Kyoto University, said: "Information geometry provides us a clear path to systematically advance our methods and suggest novel approaches, resulting in more accurate data analysis tools."

Prof Hideaki Shimazaki, Associate Professor in the Graduate School of Informatics at Kyoto University, said: "In addition to providing advanced calculation methods for large systems, the framework unifies many existing approaches from which we can further advance neuroscience and machine learning. We are glad to offer such a unifying view that expresses a hallmark of scientific progress as a product of this intense international collaboration."

Dr Aguilera will next apply these methods to model thousands of neurons of zebrafish in the lab interacting with a virtual reality setup as part of the EU-funded DIMENSIVE project, which aims to develop generative models of large-scale behaviour and provide important insights into how behaviour arises from the dynamical interaction of an organism's nervous system, body and environment.

Credit: 
University of Sussex

Seeing stable topology using instabilities

image: The spatial intensity profile of a laser beam propagating in a nonlinear medium spontaneously becomes nonuniform due to the process of modulational instability.

Image: 
Institute for Basic Science

We are most familiar with the four conventional phases of matter: solid, liquid, gas, and plasma. Changes between two phases, known as phase transitions, are marked by abrupt changes in material properties such as density. In recent decades a wide body of physics research has been devoted to discovering new unconventional phases of matter, which typically emerge at ultra-low temperatures or in specially-structured materials. Exotic "topological" phases exhibit properties that can only change in a quantized (step-wise) manner, making them intrinsically robust against impurities and defects.

In addition to topological states of matter, topological phases of light can emerge in certain optical systems such as photonic crystals and optical waveguide arrays. Topological states of light are of interest as they can form the basis for future energy-efficient light-based communication technologies such as lasers and integrated optical circuits.

However, at high intensities light can modify the properties of the underlying material. One example of such a phenomenon is the damage that high-power lasers can inflict on mirrors and lenses. This in turn affects the propagation of the light, forming a nonlinear feedback loop. Nonlinear optical effects are essential for the operation of certain devices such as lasers, but they can lead to the emergence of disorder from order in a process known as modulational instability, as shown in Figure 1. Understanding the interplay between topology and nonlinearity is a fascinating subject of ongoing research.
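
Modulational instability is straightforward to reproduce numerically. The sketch below (with illustrative parameters, not ones from the study) propagates a nearly uniform beam according to the focusing nonlinear Schrödinger equation using the standard split-step Fourier method; the tiny initial noise is exponentially amplified until the intensity profile becomes strongly non-uniform, as in Figure 1:

# Split-step Fourier sketch of modulational instability in the focusing
# 1D nonlinear Schroedinger equation (illustrative parameters only).
import numpy as np

N, L, dz, steps = 512, 50.0, 0.01, 2000
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)

rng = np.random.default_rng(1)
psi = 1.0 + 1e-3 * rng.normal(size=N)   # plane wave plus weak noise

for _ in range(steps):
    # one step of dispersion in Fourier space, then one step of the
    # Kerr nonlinearity (first-order splitting)
    psi = np.fft.ifft(np.exp(-0.5j * k**2 * dz) * np.fft.fft(psi))
    psi = psi * np.exp(1j * np.abs(psi)**2 * dz)

contrast = (np.abs(psi)**2).max() / (np.abs(psi)**2).min()
print("intensity contrast after propagation:", float(contrast))

Starting from an almost flat beam (contrast near 1), the run ends with order-of-magnitude intensity variations - the spontaneous emergence of disorder from order described above.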

Daniel Leykam, Aleksandra Maluckov, and Sergej Flach at the Center for Theoretical Physics of Complex Systems (PCS) within the Institute for Basic Science (IBS, South Korea), along with their colleagues Ekaterina Smolina and Daria Smirnova from the Institute of Applied Physics, Russian Academy of Sciences and the Australian National University, have proposed a novel method to characterize topological phases of light using nonlinear instabilities exhibited by bright beams of light. This research was published in Physical Review Letters.

In this work, the researchers addressed the fundamental question of how topological phases of light in nonlinear optical media undergo the process of modulational instability. It was shown theoretically that certain features of the instability, such as its growth rate, can differ between different topological phases. The researchers performed numerical simulations of the modulational instability and demonstrated that it can be used as a tool to identify different topological phases of light. An example of this idea is shown in Figure 2: while the light beams generated by the instability have seemingly random patterns of intensity, they exhibit hidden order in their polarization in the form of robust vortices. The number of vortices appearing as a result of the instability is quantized, and they can be used to distinguish different topological phases.

The most common way to identify topological phases of light has been to look at the edges of the material, where certain optical wavelengths become localized. However, a complete characterization requires measuring the bulk properties of the material, which is a much harder task. The light in the bulk material undergoes complicated wave interference and is highly sensitive to defects, which obscures its topological properties. Counterintuitively, the researchers have shown how nonlinear instabilities may be used to tame this unwanted interference and spontaneously encode the bulk topological properties of the material into beams of light. This approach provides a simpler way to probe and perhaps even generate topological states of light.

The next step will be to test this proposal in an experiment. For example, optical waveguide arrays inscribed in a glass will be an ideal platform for this purpose. By shining a bright pulsed laser beam into the glass, it should be possible to directly observe the modulational instability and thereby measure the topological properties of the waveguide array. The research group is currently discussing possible designs for the experimental verification of their theory with collaborators.

Credit: 
Institute for Basic Science

Scientists identify over 140,000 virus species in the human gut

Viruses are the most numerous biological entities on the planet. Now researchers at the Wellcome Sanger Institute and EMBL's European Bioinformatics Institute (EMBL-EBI) have identified over 140,000 viral species living in the human gut, more than half of which have never been seen before.

The paper, published today (18 February 2021) in Cell, contains an analysis of over 28,000 gut microbiome samples collected in different parts of the world. The number and diversity of the viruses the researchers found was surprisingly high, and the data opens up new research avenues for understanding how viruses living in the gut affect human health.

The human gut is an incredibly biodiverse environment. In addition to bacteria, hundreds of thousands of viruses called bacteriophages, which can infect bacteria, also live there.

It is known that imbalances in our gut microbiome can contribute to diseases and complex conditions such as Inflammatory Bowel Disease, allergies and obesity. But relatively little is known about the role our gut bacteria, and the bacteriophages that infect them, play in human health and disease.

Using a DNA-sequencing method called metagenomics, researchers at the Wellcome Sanger Institute and EMBL's European Bioinformatics Institute (EMBL-EBI) explored and catalogued the biodiversity of the viral species found in 28,060 public human gut metagenomes and 2,898 bacterial isolate genomes cultured from the human gut.

The analysis identified over 140,000 viral species living in the human gut, more than half of which have never been seen before.

Dr Alexandre Almeida, Postdoctoral Fellow at EMBL-EBI and the Wellcome Sanger Institute, said: "It's important to remember that not all viruses are harmful; they represent an integral component of the gut ecosystem. For one thing, most of the viruses we found have DNA as their genetic material, which is different from the pathogens most people know, such as SARS-CoV-2 or Zika, which are RNA viruses. Secondly, these samples came mainly from healthy individuals who didn't share any specific diseases. It's fascinating to see how many unknown species live in our gut, and to try and unravel the link between them and human health."

Among the tens of thousands of viruses discovered, a new highly prevalent clade - a group of viruses believed to have a common ancestor - was identified, which the authors refer to as the Gubaphage. This was found to be the second most prevalent virus clade in the human gut, after the crAssphage, which was discovered in 2014.

Both of these viruses seem to infect similar types of human gut bacteria, but without further research it is difficult to know the exact functions of the newly discovered Gubaphage.

Dr Luis F. Camarillo-Guerrero, first author of the study from the Wellcome Sanger Institute, said: "An important aspect of our work was to ensure that the reconstructed viral genomes were of the highest quality. A stringent quality control pipeline coupled with a machine learning approach enabled us to mitigate contamination and obtain highly complete viral genomes. High-quality viral genomes pave the way to better understand what role viruses play in our gut microbiome, including the discovery of new treatments such as antimicrobials from bacteriophage origin."

The results of the study form the basis of the Gut Phage Database (GPD), a highly curated database containing 142,809 non-redundant phage genomes that will be an invaluable resource for those studying bacteriophages and the role they play on regulating the health of both our gut bacteria and ourselves.
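
"Non-redundant" means that near-identical genomes recovered from different samples are collapsed onto one representative. The hypothetical sketch below shows the general shape of such a dereplication step using k-mer overlap (GPD's actual tools and thresholds are not reproduced here):

# Hypothetical greedy dereplication by k-mer similarity (illustrative;
# not the GPD pipeline). A genome too similar to one already kept is
# treated as redundant and dropped.
def kmers(seq, k=15):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def dereplicate(genomes, threshold=0.95):
    kept, kept_kmers = [], []
    for name, seq in genomes:
        ks = kmers(seq)
        redundant = any(len(ks & other) / max(len(ks), 1) >= threshold
                        for other in kept_kmers)
        if not redundant:
            kept.append(name)
            kept_kmers.append(ks)
    return kept

genomes = [("phage_A", "ATGCGTACGTTAGCATCGATCGTACGATCGTAGCTAGT"),
           ("phage_A_dup", "ATGCGTACGTTAGCATCGATCGTACGATCGTAGCTAGT"),
           ("phage_B", "TTGACCGGTATCGGCTAAGGCTTACGGATCCGTTAAGC")]
print(dereplicate(genomes))   # ['phage_A', 'phage_B']

Real pipelines operate on full genomes with dedicated sketching tools, and add the completeness and contamination checks the authors describe.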

Dr Trevor Lawley, senior author of the study from the Wellcome Sanger Institute, said: "Bacteriophage research is currently experiencing a renaissance. This high-quality, large-scale catalogue of human gut viruses comes at the right time to serve as a blueprint to guide ecological and evolutionary analysis in future virome studies."

Credit: 
Wellcome Trust Sanger Institute

Unexpected decrease in ammonia emissions due to COVID-19 lockdowns

image: Celebration of Spring Festival in China.

Image: 
Zheng Lin

Most Chinese working in the cities return to work today after the seven-day Spring Festival public holiday. The annual Spring Festival, which also marks the start of the Chinese New Year, traditionally begins with the second new moon following the winter solstice, usually in January or February. Like Westerners at Thanksgiving and Christmas, people across China return to their hometowns to reunite with family and friends. However, the sudden outbreak of COVID-19 last year halted the largest holiday migration in the world. In response to the crisis, in early 2020, local governments launched lockdowns and behavior restrictions that reduced short-term economic and social activity. Despite the negative aspects of the pandemic, the reduction in human activity provided a unique opportunity for atmospheric scientists to study the impact of an unprecedented intervention on air quality.

COVID-19 lockdowns have reduced emissions of carbon dioxide, nitrogen oxides and sulfur dioxide, all byproducts of the fuel combustion used in transportation and manufacturing. These gases are closely linked to human activity, so, not surprisingly, their atmospheric concentrations respond quickly to economic change. Studies are showing significantly reduced airborne ammonia as well, despite its presumed strong link to agricultural sources, especially in rural areas.

"We don't think this finding is surprising as the major source of ammonia in urban Beijing was found to be combustion sources rather than agricultural emissions," says atmospheric scientist Dr. Yuepeng Pan of the Institute of Atmospheric Physics, Chinese Academy of Sciences.

Still, some researchers are startled by the substantial decrease of ammonia amid lockdowns, given that idle agricultural activities during the Spring Festival holidays typically reduce ammonia levels. In an earlier study published in Advances in Atmospheric Sciences, Dr. Pan and his group tracked and corrected isotopic signatures of ammonia sources with updated active sampling. They found that non-agricultural emissions contributed 66% of the ammonia in urban Beijing.

While debate continues regarding ammonia sources in the urban atmosphere, the lockdowns that canceled Spring Festival celebrations provided an unprecedented opportunity to check whether fossil fuel combustion is a major source of ammonia in the air within urban regions. Ideally, relatively low ammonia concentrations should be observed if vehicular emissions are reduced due to travel restrictions during the Spring Festival.

"In addition to nitrogen isotopic evidence, the new finding in lockdowns offers additional insight for the prioritization of future clean air actions on ammonia reduction," said Dr. Pan. However, quantifying this unique scenario remains challenging as meteorological processes may mask the effective change in observed ammonia concentrations.

Pan and his collaborators introduced machine learning algorithms to models that separated these meteorological influences. They confirmed that the actual atmospheric ammonia concentration dropped to a new minimum during the 2020 Spring Festival at both urban (Beijing) and rural (Xianghe) sites. In a scenario analysis without lockdowns, ammonia concentration calculations were 39.8% and 24.6% higher than the observed values in 2020 at urban and rural sites, respectively. Their recent findings are published in Atmospheric Research.
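
The release does not detail the algorithms, but machine-learning "weather normalization" is commonly set up along the following lines (a hypothetical sketch on synthetic data, not the study's model): a regressor learns the ammonia concentration as a function of meteorology during normal periods, then predicts the counterfactual business-as-usual concentration for the lockdown period, so the observed shortfall can be attributed to emission changes rather than weather:

# Hypothetical weather-normalization sketch on synthetic data (the
# study's features, model and data are not reproduced here).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 1000
met = np.column_stack([rng.normal(5, 8, n),      # temperature
                       rng.uniform(0, 10, n),    # wind speed
                       rng.uniform(20, 90, n)])  # relative humidity
nh3 = 10 + 0.3 * met[:, 0] - 0.5 * met[:, 1] + rng.normal(0, 1, n)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(met[:800], nh3[:800])          # train on the "normal" period

expected = model.predict(met[800:])      # counterfactual: no lockdown
observed = nh3[800:] * 0.7               # pretend emissions fell by 30%
drop = 100 * (expected.mean() - observed.mean()) / expected.mean()
print(f"estimated lockdown-driven NH3 reduction: {drop:.1f}%")

The counterfactual-minus-observed gap plays the same role as the 39.8% (urban) and 24.6% (rural) differences reported above.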

"Future control strategies should consider the emissions of ammonia from the transportation, industrial and residential sectors, considering that agricultural emissions are minor in cold seasons." remarked Pan.

The significant difference between the two sites indicates a larger reduction of ammonia emissions in urban areas than in rural areas due to COVID-19 lockdown measures, which reduced human activity.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Blueprint for fault-tolerant qubits

image: Proposed hardware implementation of the QEC code. The circuit consists of two Josephson junctions coupled by a gyrator, highlighted in red.

Image: 
M. Rymarz et al., Phys Rev X (2021), https://doi.org/10.1103/PhysRevX.11.011032 (CC BY 4.0)

Building a universal quantum computer is a challenging task because of the fragility of quantum bits, or qubits for short. To deal with this problem, various types of error correction have been developed, conventionally based on active correction techniques. In contrast, researchers led by Prof. David DiVincenzo from Forschungszentrum Jülich and RWTH Aachen University, together with partners from the University of Basel and QuTech Delft, have now proposed a design for a circuit with passive error correction. Such a circuit would already be inherently fault-protected and could significantly accelerate the construction of a quantum computer with a large number of qubits.

To encode quantum information in a reliable way, several imperfect qubits are usually combined to form a so-called logical qubit. Quantum error correction codes, or QEC codes for short, make it possible to detect errors and subsequently correct them, so that the quantum information is preserved over a longer period of time.

In principle, the techniques work in a similar way to active noise cancellation in headphones: In a first step, any fault is detected. Then, a corrective operation is performed to remove the error and restore the information to its original pure form.
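
The detect-then-correct cycle can be made concrete with the simplest QEC code, the three-qubit bit-flip repetition code, simulated classically below (a generic illustration of active correction, not of the passive circuit proposed in this work). Parity checks between neighbouring qubits reveal which qubit flipped without reading out the encoded bit itself:

# Toy active error correction: three-qubit bit-flip repetition code.
# (Generic illustration of detect-and-correct; not the passive
# superconducting circuit proposed in the paper.)
import random

def encode(bit):
    return [bit, bit, bit]               # logical 0 -> 000, logical 1 -> 111

def noisy(qubits, p_flip=0.1):
    return [q ^ (random.random() < p_flip) for q in qubits]

def correct(qubits):
    s1 = qubits[0] ^ qubits[1]           # parity checks: the "detect" step
    s2 = qubits[1] ^ qubits[2]
    if s1 and not s2: qubits[0] ^= 1     # syndrome (1,0): flip qubit 0 back
    elif s1 and s2:   qubits[1] ^= 1     # syndrome (1,1): flip qubit 1 back
    elif s2:          qubits[2] ^= 1     # syndrome (0,1): flip qubit 2 back
    return qubits

random.seed(0)
trials = 100_000
fails = sum(correct(noisy(encode(0))).count(1) >= 2 for _ in range(trials))
print("logical error rate:", fails / trials, "vs physical rate 0.1")

With a 10 per cent physical flip probability, the logical error rate lands near 3 per cent, showing why redundancy pays off - and also hinting at the overhead, since every logical qubit needs this detection machinery running constantly, which is exactly what the passive design avoids.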

However, applying such active error correction in a quantum computer is very complex and comes with extensive hardware overhead. Typically, complex error-correcting electronics are required for each qubit, making it difficult to build circuits with many qubits, as required for a universal quantum computer.

The proposed design for a superconducting circuit, on the other hand, has a kind of built-in error correction. The circuit is designed in such a way that it is already inherently protected against environmental noise while still controllable. The concept thus bypasses the need for active stabilization in a highly hardware-efficient manner, and would therefore be a promising candidate for a future large-scale quantum processor that has a large number of qubits.

"By implementing a gyrator - a two port device that couples current on one port to voltage on the other - in between two superconducting devices (so called Josephson junctions), we could waive the demand of active error detection and stabilization: when cooled down, the qubit is inherently protected against common types of noise," said Martin Rymarz, a PhD student in the group of David DiVincenzo and first author of the paper, published in Physical Review X.

"I hope that our work will inspire efforts in the lab; I recognize that this, like many of our proposals, may be a bit ahead of its time", said David DiVincenzo, Founding Director of the JARA-Institute for Quantum Information at RWTH Aachen University and Director of the Institute of Theoretical Nanoelectronics (PGI-2) at Forschungszentrum Jülich. "Nevertheless, given the professional expertise available, we recognize the possibility to test our proposal in the lab in the foreseeable future".

David DiVincenzo is considered a pioneer in the development of quantum computers. Among other things, his name is associated with the criteria that a quantum computer must fulfil, the so-called "DiVincenzo criteria".

Credit: 
Forschungszentrum Juelich