
UCLA and Carnegie Mellon researchers develop real-time physics engine for soft robotics

image: Sequence of a simulation showing a soft robot with seven flexible limbs planning its forward motion. Technology developed by UCLA and Carnegie Mellon researchers

Image: 
Khalid Jawed/UCLA

Motion picture animation and video games are impressively lifelike nowadays, capturing a wisp of hair falling across a heroine's eyes or a canvas sail snapping crisply in the wind. Collaborators from the University of California, Los Angeles (UCLA) and Carnegie Mellon University have adapted this sophisticated computer graphics technology to simulate the movements of soft, limbed robots for the first time.

The findings were published May 6 in Nature Communications in a paper titled, "Dynamic Simulation of Articulated Soft Robots."

"We have achieved faster than real-time simulation of soft robots, and this is a major step toward such robots that are autonomous and can plan out their actions on their own," said study author Khalid Jawed, an assistant professor of mechanical and aerospace engineering at UCLA Samueli School of Engineering. "Soft robots are made of flexible material which makes them intrinsically resilient against damage and potentially much safer in interaction with humans. Prior to this study, predicting the motion of these robots has been challenging because they change shape during operation."

Movies often use an algorithm named discrete elastic rods (DER) to animate free-flowing objects. DER can predict hundreds of movements in less than a second. The researchers wanted to create a physics engine using DER that could simulate the movements of bio-inspired robots and robots in challenging environments, such as the surface of Mars or underwater.
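The paper's DER formulation is far more sophisticated than can be shown here, but the core idea of treating a flexible body as discrete vertices connected by stretching and bending forces can be sketched in a few lines. The toy rod below is only an illustration of that idea, not the authors' implementation; all parameter values (stiffnesses, damping, time step) are assumptions chosen for stability.

```python
import numpy as np

# Toy discretization of an elastic rod: n point masses joined by
# stretching springs, with a bending penalty at each interior vertex.
# A sketch in the spirit of discrete elastic rods (DER); all numbers
# below are illustrative assumptions.

n = 20                      # number of vertices
rest_len = 0.05             # rest length of each segment (m)
k_stretch = 500.0           # stretching stiffness (assumed)
k_bend = 0.01               # bending stiffness (assumed)
mass = 0.01                 # mass per vertex (kg)
dt = 1e-4                   # time step for semi-implicit Euler (s)
damping = 0.001             # per-step velocity damping (assumed)
gravity = np.array([0.0, -9.81])

# Rod starts horizontal; the first vertex is clamped at the origin.
x = np.stack([np.arange(n) * rest_len, np.zeros(n)], axis=1)
v = np.zeros_like(x)

def forces(x):
    f = np.tile(gravity * mass, (n, 1))
    # Stretching: Hooke's law along each segment.
    for i in range(n - 1):
        d = x[i + 1] - x[i]
        length = np.linalg.norm(d)
        fs = k_stretch * (length - rest_len) * (d / length)
        f[i] += fs
        f[i + 1] -= fs
    # Bending: penalize deviation of each vertex from the midpoint
    # of its neighbours (a crude discrete-curvature proxy).
    for i in range(1, n - 1):
        curv = x[i - 1] - 2.0 * x[i] + x[i + 1]
        f[i] += (k_bend / rest_len**2) * curv
        f[i - 1] -= 0.5 * (k_bend / rest_len**2) * curv
        f[i + 1] -= 0.5 * (k_bend / rest_len**2) * curv
    return f

for step in range(2000):            # 0.2 s of simulated time
    a = forces(x) / mass
    v = (1.0 - damping) * (v + dt * a)
    x = x + dt * v
    x[0] = (0.0, 0.0)               # re-clamp the first vertex
    v[0] = 0.0

print("tip position after 0.2 s:", x[-1])
```

Running the loop, the free end of the clamped rod droops under gravity, which is the kind of passive deformation a soft-robot simulator must capture thousands of times per second.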

Another algorithm-based technology, the finite element method (FEM), can simulate the movements of solid and rigid robots, but it is not well suited to capturing the intricacies of soft, natural movements. It also requires significant time and computational power.

Until now, roboticists have relied on a painstaking trial-and-error process to investigate the dynamics of soft material systems and to design and control soft robots.

"Robots made out of hard and inflexible materials are relatively easy to model using existing computer simulation tools," said Carmel Majidi, an associate professor of mechanical engineering in Carnegie Mellon's College of Engineering. "Until now, there haven't been good software tools to simulate robots that are soft and squishy. Our work is one of the first to demonstrate how soft robots can be successfully simulated using the same computer graphics software that has been used to model hair and fabrics in blockbuster films and animated movies."

The researchers started working together in Majidi's Soft Machines Lab more than three years ago. Continuing their collaboration on this latest work, Jawed ran the simulations in his research lab at UCLA while Majidi performed the physical experiments that validated the simulation results.

The research was funded in part by the Army Research Office.

"Experimental advances in soft-robotics have been outpacing theory for several years," said Dr. Samuel Stanton, a program manager with the Army Research Office, an element of the U.S. Army Combat Capabilities Development Command's Army Research Laboratory. "This effort is a significant step in our ability to predict and design for dynamics and control in highly deformable robots operating in confined spaces with complex contacts and constantly changing environments."

The researchers are currently working to apply this technology to other kinds of soft robots, such as ones inspired by the movements of bacteria and starfish. Such swimming robots could be fully untethered and used in oceanography to monitor seawater conditions or inspect the status of fragile marine life.

The new simulation tool can significantly reduce the time it takes to bring a soft robot from drawing board to application. While robots are still very far from matching the efficiency and capabilities of natural systems, computer simulations can help to reduce this gap.

Credit: 
University of California - Los Angeles

Effect on quality of life of watching Disney movies during chemotherapy

What The Study Did: In this randomized clinical trial, researchers assessed the effect on measures of quality of life among women who watched Disney movies during chemotherapy for gynecologic cancer.

Authors: Johannes Ott, M.D., of the Medical University of Vienna in Austria, is the corresponding author.

To access the embargoed study: Visit our For The Media website at https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2020.4568)

Editor's Note: The article includes conflict of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

New research determines our species created earliest modern artifacts in Europe

image: Stone artifacts from the Initial Upper Paleolithic at Bacho Kiro Cave. 1-3, 5-7: Pointed blades and fragments from Layer I; 4: Sandstone bead with morphology similar to bone beads; 8: The longest complete blade.

Image: 
Tsenka Tsanova, MPI-EVA Leipzig, License: CC-BY-SA 2.0

Blade-like tools and animal tooth pendants previously discovered in Europe, and once thought to possibly be the work of Neanderthals, are in fact the creation of Homo sapiens, or modern humans, who emigrated from Africa, finds a new analysis by an international team of researchers.

Its conclusions, reported in the journal Nature, add new clarity to the arrival of Homo sapiens into Europe and to their interactions with the continent's indigenous and declining Neanderthal population.

The analysis centers on an earlier discovery of bones and other artifacts found in the Bacho Kiro cave in what is modern-day Bulgaria.

"Our findings link the expansion of what were then advanced technologies, such as blade tools and pendants made from teeth and bone, with the spread of Homo sapiens more than 45,000 years ago," explains Shara Bailey, a professor in NYU's Department of Anthropology and one of the paper's co-authors. "This confirms that Homo sapiens were mostly responsible for these 'modern' creations and that similarities between these and other sites in which Neanderthals made similar things are due to interaction between the populations."

The findings offer a new understanding of both the nature of these species and their interactions.

"If Neanderthals had created these 'modern' tools and jewelry, it would have indicated they had more advanced cognitive abilities than previously recognized," explains Bailey. "Nonetheless, there are some similarities in manufacturing techniques used by Homo sapiens at Bacho Kiro and Neanderthals elsewhere, which makes clear that there was cultural transmission going on between the two groups."

The analysis was led by researchers at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.

The team, which included scientists from Europe, the United States, and the United Kingdom, focused on the transition from the Middle to Upper Palaeolithic period, between 50,000 and 30,000 years ago. During this time, the European continent experienced the replacement and partial absorption of local Neanderthals by Homo sapiens populations from Africa. However, this process, anthropologists say, likely varied across regions, and the details of this transition remain largely unknown.

To better comprehend a piece of this transition, the team focused on one of several places--Bacho Kiro--where discoveries of the earliest modern technologies, such as pendants and blades, have been made.

To ascertain which species occupied the area of these discoveries, the scientists deployed several methodologies. Bailey, an expert in tooth analysis, and her colleagues examined teeth and bones that had been found in Bacho Kiro.

Using state-of-the-art technology called ZooMS (collagen peptide mass fingerprinting), they identified human bone fragments and concluded that they were at least 45,000 years old--a period coinciding with the arrival of multiple waves of Homo sapiens into Europe. Subsequent shape analyses of the tooth and DNA examination of the fragments determined that they belonged to Homo sapiens and not Neanderthals, whose presence was not evident among the discovered fossils.

"ZooMS allows us to identify previously unidentifiable bone fragments as some form of human," explains Bailey. "From there, we can apply more sophisticated techniques to identify the species and more accurately date human bones."

Credit: 
New York University

Water loss in northern peatlands threatens to intensify fires, global warming

image: An example of boreal peatlands and forest in Canada's Northwest Territories.

Image: 
Manuel Helbig, McMaster University

HAMILTON, ON, May 11, 2020 - A group of 59 international scientists, led by researchers at Canada's McMaster University, has uncovered new information about the distinct effects of climate change on boreal forests and peatlands, which threaten to worsen wildfires and accelerate global warming.

Manuel Helbig and Mike Waddington from McMaster's School of Geography and Earth Sciences gathered observational data from collaborators in countries across the boreal biome. Their study of how ecosystems lose water to the atmosphere appears today in the journal Nature Climate Change.

The unprecedented detail of their work has highlighted dramatic differences in the ways forests and peatlands regulate water loss to the atmosphere in a warming climate, and how those differences could in turn accelerate the pace of warming.

Most current global climate models treat the biome as if it were all forest, an oversimplification that could seriously compromise their projections, Helbig says.

"We need to account for the specific behavior of peatlands if we want to understand the boreal climate, precipitation, water availability and the whole carbon cycle," he says.

"Peatlands are so important for storing carbon, and they are so vulnerable."

Until now, Helbig says, it had not been possible to capture such a comprehensive view of these water-cycle dynamics, but with the support of the Global Water Futures Initiative and participation from so many research partners in Canada, Russia, the US, Germany and Scandinavia, new understanding is emerging.

As the climate warms, air gets drier and can take up more water. In response to the drying of the air, forest ecosystems - which make up most of the world's natural boreal regions - retain more water. Their trees, shrubs and grasses are vascular plants that typically take up carbon dioxide and release water and oxygen through microscopic pores in their leaves. In warmer, drier weather, though, those pores close, slowing the exchange to conserve water.

Together with lakes, the spongy bogs and fens called peatlands make up the remainder of the boreal landscape. Peatlands store vast amounts of water and carbon in layers of living and dead moss. They serve as natural firebreaks between sections of forest, as long as they remain wet.

Peatland mosses are not vascular plants, so as warming continues, they are more prone to drying out. Unlike forests, they have no active mechanism to protect themselves from losing water to the atmosphere. Dehydration exposes their dense carbon stores to accelerated decomposition, and turns them from firebreaks into fire propagators, as shown in previous research from Waddington's ecohydrology lab.

Drier peatlands mean bigger, more intense fires that can release vast amounts of carbon into the atmosphere, accelerating global warming, Helbig says.

"It's crucial to consider the accelerated water loss of peatlands in a warming climate as we project what will happen to the boreal landscape in the next 100 to 200 years," he says.

Credit: 
McMaster University

New test identifies lobster hybrids

image: American lobsters in a holding warehouse in Maine, bound for export to Europe

Image: 
Charlie Ellis

Scientists have developed a test that can identify hybrids resulting from crossbreeding between European and American lobsters.

American lobsters have occasionally escaped or been released into European waters after being imported for the seafood market.

Experts have long feared they could threaten European lobsters by introducing disease or establishing as an invasive species.

Hybridisation - interbreeding between a "pure" species and a different but related species, which threatens the original species at a genetic level - had been less of a concern because lab studies suggested European and American lobsters were reluctant to mate.

However, when an American lobster female was found bearing eggs in a fjord in Sweden, University of Exeter researchers tested the offspring and found they were "clearly distinct" from both European and American lobsters.

"We had just developed a genetic test for seafood traceability that could separate any American lobsters mislabelled as more expensive European equivalents once they've been cooked and shell colouration is no longer a useful indicator of the species," said Dr Charlie Ellis, of the University of Exeter.

"What we found when we tested these offspring is that they came out exactly in the middle of this separation - half American and half European - so these lobsters were hybrids."

This has potentially concerning implications for the lobster industry and conservation efforts, and Dr Ellis says further research is required to assess the extent of the threat.

"Until recently, it was thought that American and European lobsters would avoid crossbreeding, but this introduced American female has mated with a native European male, probably because she was unable to find an American male," he said.

"We now need to check whether any mature adult hybrids are fertile, because if they are then they have the ability to spread these unwanted American genes far and wide across our native lobster stocks."

Working with collaborators from the University of Gothenburg who originally found the hybrid egg clutch, the researchers say their study, published in the journal Scientific Reports, highlights the vital use of genetics to distinguish hybrid lobsters which might look almost identical to a pure strain.

"It is particularly concerning that we seem to have found American lobster genes in one of our lobster reserves," said Linda Svanberg of the Gothenburg team.

"The better news is we now have this genetic tool to test lobsters or their eggs for hybridisation", added Dr Jamie Stevens, leader of the research which was funded by an EU grant through the Agritech Cornwall scheme, "so we can use it to track the spread of these 'alien' genes to assess how big a threat this presents to our native lobster species."

The team advise that, for a range of conservation reasons including potential contact with American lobsters, the general public should never release a marketed lobster back into the wild, even one of our native species.

Dr Tom Jenkins said: "Although we appreciate that all animal-lovers have concern for the fate of individual animals, in this case the rescue of one animal might endanger the health of the entire wild population, so once a lobster has entered the seafood supply chain that's where it should stay."

Credit: 
University of Exeter

Unraveling the magnetism of a graphene triangular flake

image: A triangular piece of graphene is predicted to be magnetic

Image: 
CIC nanoGUNE

Graphene is a diamagnetic material, that is, one unable to become magnetic. However, a triangular piece of graphene is predicted to be magnetic. This apparent contradiction is a consequence of "magic" shapes in the structure of graphene flakes, which force electrons to "spin" more easily in one direction. Triangulene is a triangular graphene flake that possesses a net magnetic moment: it is a nanometer-sized graphene magnet. This magnetic state opens fascinating perspectives on the use of these pure-carbon magnets in technology.

However, the robust predictions of triangulene's magnetism long lacked clear experimental proof, because producing triangulene by organic synthesis in solution is difficult. The bi-radical character of the molecule makes it very reactive and hard to fabricate, and in the few successful cases its magnetism proved elusive.

In a new study, published in Physical Review Letters [1], this challenge was revisited using a scanning tunneling microscope (STM). After assembling a triangular-like piece of graphene on a clean gold surface, high-resolution scanning tunneling spectroscopy measurements revealed that this compound has a net magnetic state characterized by a spin S=1 ground state and, therefore, that this molecule is a small, pure carbon paramagnet. These results are the first experimental demonstration of a high-spin graphene flake.

The findings were further complemented with atomic manipulation steps of hydrogen-passivated triangulene side-products occasionally found in the experiment. By controlled removal of these additional hydrogen atoms in the experiments, the spin state of the flake could be modified from a closed-shell, doubly hydrogenated structure, to an intermediate S=1/2 spin state, and finally to the high-spin S=1 state of the ideal molecular structure.

The experimental proof of a spin-state in the absence of a magnetic quantization axis (detectable by spin-polarized STM) or magnetic anisotropy (detectable by spin-flip inelastic tunneling spectroscopy) is not simple. In this work, the spin signature was obtained from the underscreened Kondo effect - an exotic version of the standard Kondo effect described in the 1960s - that can arise in high-spin systems. Its observation in a graphene flake on a metal has not been reported before and brings here novel insights to understanding spins interacting with surfaces.

Credit: 
Elhuyar Fundazioa

Towards a new generation of vegetation models

Plants and vegetation play a critical role in supporting life on Earth, but there is still a lot of uncertainty in our understanding of how exactly they affect the global carbon cycle and ecosystem services. A new IIASA-led study explored the most important organizing principles that control vegetation behavior and how they can be used to improve vegetation models.

We rely on the plants that make up our planet's ecosystems to release oxygen into the atmosphere, absorb carbon dioxide (CO2), and provide habitat and food for wildlife and humans. These services are critical in the future management of climate change, especially in terms of CO2 uptake and release, but due to the many complex, interacting processes that affect the ability of vegetation to provide these services, they remain difficult to predict.

In an IIASA-led perspective published in the journal Nature Plants, an international team of researchers endeavored to address this problem by exploring approaches to master this complexity and improve our ability to predict vegetation dynamics. They explored key organizing principles that govern these processes - specifically, natural selection; self-organization (controlling collective behavior among individuals); and entropy maximization (controlling the outcome of a large number of random processes). In general, an organizing principle determines or constrains how components of a system, such as different plants in an ecosystem or different organs of a plant, behave together. Mathematically, such a principle can be seen as an additional equation added to a system of equations, allowing one or more previously unknown variables in the system to be determined and thereby reducing the uncertainty of the solution.
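The point that an organizing principle acts like an extra equation that closes an underdetermined system can be made concrete with a deliberately simple example, not taken from the paper: three unknown probabilities constrained only by normalization and a prescribed mean. Entropy maximization supplies the missing equation and singles out a Boltzmann-type distribution. The state values and the target mean below are arbitrary assumptions for illustration.

```python
import math

# Three states carry values 1, 2, 3; their probabilities p1..p3 are
# unknown. Normalization plus a mean constraint give two equations for
# three unknowns; maximizing entropy supplies the third, yielding
# p_k proportional to exp(-lam * v_k) for some multiplier lam.

values = [1.0, 2.0, 3.0]
target_mean = 2.5           # assumed constraint

def mean_for(lam):
    """Mean of the Boltzmann distribution with multiplier lam."""
    w = [math.exp(-lam * v) for v in values]
    z = sum(w)
    return sum(v * wi for v, wi in zip(values, w)) / z

# mean_for is monotonically decreasing in lam, so solve
# mean_for(lam) = target_mean by bisection.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) > target_mean:
        lo = mid                # mean too high -> increase lam
    else:
        hi = mid
lam = 0.5 * (lo + hi)

w = [math.exp(-lam * v) for v in values]
z = sum(w)
p = [wi / z for wi in w]
print("lambda =", round(lam, 3), "p =", [round(pi, 3) for pi in p])
```

Without the entropy principle, infinitely many probability triples satisfy the two constraints; with it, the system has a unique, physically motivated solution - the same role the paper assigns to organizing principles in vegetation models.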

A lot of research has gone into understanding and predicting how plant processes combine to determine the dynamics of vegetation on larger scales. To integrate process understanding from different disciplines, dynamic vegetation models (DVMs) have been developed that combine elements from plant biogeography, biogeochemistry, plant physiology, and forest ecology. DVMs have been widely used in many fields including the assessment of impacts of environmental change on plants and ecosystems; land management; and feedbacks from vegetation changes to regional and global climates. However, previous attempts to improve vegetation models have mainly focused on improving realism by including more processes and more data. This has not led to the expected success because each additional process comes with uncertain parameters, which has in turn caused an accumulation of uncertainty and therefore unreliable model predictions.

"Despite the ever-increasing availability of data, and the fact that vegetation science, like many other scientific fields, is benefitting from increasing access to big data sets and new observation technologies, we also need to understand governing principles like evolution to make sense of the big data. Current models are not able to reliably predict long-term vegetation responses," explains lead author Oskar Franklin, a researcher in the IIASA Ecosystems Services and Management Program.

The study found that by representing the principles of evolution, self-organization, and entropy maximization in models, they could better predict complex plant behavior and resulting vegetation as an emerging result of environmental conditions. Although each of these principles had previously been used to explain a particular aspect of vegetation dynamics, their combined implications were not fully understood. This approach means that a lot of complex variation and behavior at different scales, from leaves to landscapes, can now be better predicted without additional understanding of underlying details or more measurements.

The authors expect that apart from leading to better tools for understanding and managing the biosphere, the proposed "next-generation approach" may result in different trajectories of projected climate change that both policy and the general public would have to cope with.

Credit: 
International Institute for Applied Systems Analysis

Who takes the temperature in our cells?

image: The Ded1p protein of baker's yeast changes from a diffuse state (unstressed green cells, left) to a state in which it forms dense structures (heat-stressed green cells, right). The transition is caused by the process of phase separation and is triggered by an increase in ambient temperature. The Ded1p protein was genetically labeled with a green fluorescent dye.

Image: 
(c) TUD/BIOTEC

In a world in which we are confronted with constantly rising average temperatures due to global warming, we must ask ourselves: How do organisms react to changing temperatures? What molecular mechanisms do they use?

Decades of research have shown that different organisms respond very similarly to temperature changes. When organisms are exposed to heat, their cells cease to grow and shut down the production of the housekeeping proteins required for growth and reproduction. Instead, they start to produce proteins that protect the cells from heat-related damage. In other words, the cell factory changes its protein production. However, it is not known how cells recognize heat stress and which mechanisms trigger this change in production.

Baker's yeast as model organism

Scientists at the Biotechnology Center (BIOTEC) of the TU Dresden and the Max Planck Institute for Molecular Cell Biology and Genetics (MPI-CBG), together with partners in Heidelberg and Toronto, Canada, investigated these fundamental questions. They used a popular model organism in cell research: baker's yeast as we know it from baking bread or brewing beer. This single-celled organism provides us with insights into the basic processes of life because it has almost the same composition as human and animal cells. If we understand the molecular processes within the yeast cell, we can also better understand the development of diseases in complex organisms such as humans.

"In yeast, we were able to identify one critical protein, Ded1p, which changes its structure upon heat stress and then reprograms the cell machinery. In the laboratory, we simulated the behavior of Ded1p with purified components and observed the following: Under normal conditions, Ded1p is evenly distributed in the cytoplasm of cells, but when the temperature rises, it assembles into dense structures, using the process of phase separation," explains Christiane Iserman, the lead author of the study. "The fact that Ded1p is able to sense temperature suggests that this protein is a kind of thermometer inside the cell."

Furthermore, the scientists have investigated the consequences for the cell when Ded1p forms these dense structures. "They are telling the cell to downregulate the production of housekeeping proteins, and to ensure that the production of stress-protective proteins is upregulated," explains Christine Desroches Altamirano, second author of the study.

Results may help to better understand neurodegenerative diseases

This very elegant mechanism does not seem to be limited to baker's yeast. The researchers found that the Ded1p proteins from other organisms are well adapted to the temperature of the respective habitat. "This suggests that evolution has endowed our cells with a high thermal sensitivity so that living organisms can adapt to temperature fluctuations. This gives us hope that organisms will be able to cope with global warming," explains Prof. Simon Alberti, who led the study.

Alberti: "However, our discovery may have an even more general relevance: We have discovered a mechanism within the cell that helps the organism to deal with a variety of changes in the environment, not just heat stress. Cells seem to be able to deal with a wide variety of environmental signals by using proteins that phase separate to run different gene expression programs. In further studies, we want to determine whether this mechanism can help us understand human diseases - primarily those in which our cells do not process certain stress situations properly, as appears to be the case in age-related neurodegenerative diseases".

Credit: 
Technische Universität Dresden

Photosynthesis in a droplet

image: Plant thylakoids are encapsulated in micro-droplets of approximately 90 micrometers in diameter. Equipped with a set of enzymes, the semi-synthetic chloroplasts fix carbon dioxide using solar energy, following nature's example.

Image: 
Max Planck Institute for Terrestrial Microbiology/Erb

For hundreds of millions of years plants have had the ability to harness carbon dioxide from the air using solar energy. The Max Planck research network MaxSynBio is on the trail of building artificial cells as sustainable green bioreactors. A Max Planck research team led by Tobias Erb from the Institute for Terrestrial Microbiology in Marburg has now succeeded in developing a platform for the automated construction of cell-sized photosynthesis modules. The artificial chloroplasts are capable of binding and converting the greenhouse gas carbon dioxide using light energy.

Over billions of years, microorganisms and plants evolved the remarkable process we know as photosynthesis. Photosynthesis converts solar energy into chemical energy, thus providing all life on Earth with food and oxygen. The cellular compartments housing the molecular machines, the chloroplasts, are probably the most important natural engines on Earth. Many scientists consider artificially rebuilding and controlling the photosynthetic process the "Apollo project of our time". It would mean the ability to produce clean energy - clean fuel, clean carbon compounds such as antibiotics, and other products simply from light and carbon dioxide.

But how to build a living, photosynthetic cell from scratch? Key to mimicking the processes of a living cell is to get its components to work together at the right time and place. At the Max Planck Society, this ambitious goal is pursued in an interdisciplinary multi-lab initiative, the MaxSynBio network. Now the Marburg research team led by director Tobias Erb has successfully created a platform for the automated construction of cell-sized photosynthetically active compartments, "artificial chloroplasts", that are able to capture and convert the greenhouse gas carbon dioxide with light.

Microfluidics meets Synthetic Biology

The Max Planck researchers made use of two recent technological developments: first, synthetic biology for the design and construction of novel biological systems, such as reaction networks for the capture and conversion of carbon dioxide; and second, microfluidics for the assembly of soft materials, such as cell-sized droplets. "We first needed an energy module that would allow us to power chemical reactions in a sustainable fashion. In photosynthesis, chloroplast membranes provide the energy for carbon dioxide fixation, and we planned to exploit this ability," Tobias Erb explains.

The photosynthesis apparatus isolated from the spinach plant proved to be robust enough that it could be used to drive single reactions and more complex reaction networks with light. For the dark reaction, the researchers used their own artificial metabolic module, the CETCH cycle. It consists of 18 biocatalysts that convert carbon dioxide more efficiently than the carbon metabolism naturally occurring in plants. After several optimization rounds, the team succeeded in light-controlled fixation of the greenhouse gas carbon dioxide in vitro.

The second challenge was the assembly of the system within a defined compartment on a micro scale. With a view to future applications, it should also be easy to automate production. In cooperation with Jean-Christophe Baret's laboratory at the Centre de Recherche Paul Pascal in France, the researchers developed a platform for encapsulating the semi-synthetic membranes in cell-like droplets.

More efficient than nature's photosynthesis

The resulting microfluidic platform is capable of producing thousands of standardized droplets that can be individually equipped according to the desired metabolic capabilities. "We can produce thousands of identically equipped droplets or we can give specific properties to individual droplets," said Tarryn Miller, lead author of the study. "These can be controlled in time and space by light."

In contrast to traditional genetic engineering on living organisms, the bottom-up approach offers decisive advantages: it focuses on minimal design, and it is not necessarily bound by the limits of natural biology. "The platform allows us to realize novel solutions that nature has not explored during evolution," explains Tobias Erb. In his opinion, the results hold great potential for the future. In their study, the authors were able to show that equipping the artificial chloroplast with the novel enzymes and reactions resulted in a binding rate for carbon dioxide 100 times faster than previous synthetic-biological approaches. "In the long term, life-like systems could be applied to practically all technological areas, including materials science, biotechnology and medicine - we are only at the beginning of this exciting development." Furthermore, the results are another step towards overcoming one of the greatest challenges of the future: the ever-increasing concentration of atmospheric carbon dioxide.

Credit: 
Max-Planck-Gesellschaft

GCS centres support research to mitigate impact of COVID-19 pandemic

image: Different scenarios based on a reduction of contacts by 0, 30%, and 60%. This plot shows that the curve becomes flatter and wider as the number of contacts is reduced.

Image: 
Barbarossa, et al. DOI: 10.1101/2020.04.08.20056630

In December 2019, the world learned of a new and deadly pathogen. News coming out of Wuhan, China confirmed public health experts' worst fears--a novel coronavirus appeared to have jumped from animals to humans. It was extremely contagious, and its penchant for hospitalising and killing vulnerable individuals has led to sweeping and indefinite changes to daily life around the globe.

Molecular biologists, chemists, and epidemiologists responded quickly in a race to combat the pandemic. As the full extent of the threat became clear in early March, the Gauss Centre for Supercomputing (GCS) joined the effort, announcing that it would fast-track applications for computing time aimed at stopping the spread of the virus or developing new treatments. Since then, GCS has supported roughly a dozen projects focused on epidemiological and drug discovery research, and remains committed to supporting scientists around the globe who are working tirelessly to combat the world's worst pandemic in at least a generation.

Coronaviruses are a broad class of viruses that cause illnesses ranging from the common cold to the severe acute respiratory syndrome (SARS) that first appeared in humans at the turn of the century. The virus behind the pandemic coursing across the world over the last six months is also a coronavirus, known as SARS-CoV-2, which causes the illness 'coronavirus disease 2019' (COVID-19). As of May 2020, the world has no proven course of treatment, and promising vaccine candidates are just beginning human trials.

Coronavirus spreads when droplets of infected individuals' saliva are transmitted by coughing, sneezing, or speaking to other individuals, who absorb them through the mucous membranes of the nose and mouth. Although evidence is not conclusive, the virus might also spread through contact with infected saliva droplets that land on surfaces. While medical experts largely understand how the virus spreads, humans have no effective immunity against emerging diseases stemming from novel viral strains like SARS-CoV-2. This means that containment and social isolation are the most effective tools for buying researchers time to study treatments, develop vaccines, and create tools for tracking disease spread.

While societies have shuttered businesses and populations have largely remained at home, scientists are doing everything possible to support medical professionals at the front lines of the pandemic. Computational biologists and chemists have been using high-performance computing (HPC) to understand the virus at a molecular level, in order to identify potential treatments and accelerate the development of an effective vaccine. Epidemiologists have turned to the power of supercomputers to model and predict how the disease spreads at local and regional levels in hopes of forecasting potential new hot spots and guiding policy makers' decisions in containing the disease's spread. GCS is supporting several projects focused on these goals.

Searching for the next outbreak: epidemiological modelling to track COVID-19

While researchers are beginning to understand how coronavirus spreads on a person-to-person level, modelling how it spreads in communities or regions requires significant amounts of computing power and access to quality data. Even before Germany began seeing its first COVID-19 cases, leadership at the Jülich Supercomputing Centre (JSC) started collaborating with researchers at the University of Heidelberg and the Frankfurt Institute for Advanced Studies (FIAS) who had been modelling the disease's spread in China. JSC offered its computational tools and expertise to digitalise epidemiological modelling and ultimately help predict how the virus would spread at state and local levels in Germany.

"At the very beginning of this crisis, we were interested in how we could support early reaction and detection systems like computational scientists can do with tsunami or earthquake simulations," said Prof. Thomas Lippert, Director of JSC. "As this is a very dynamic situation, we began to model system changes and we try to predict developments."

With the pandemic still actively spreading around the globe, researchers knew that running quantitative, retrospective analyses of the situation was not yet appropriate. However, supercomputers could be used to combine datasets on the infection growth rate, the so-called reproduction number (Rt), and the virus's incubation time into predictive models. With supercomputers, researchers began running suites of scenarios to predict the mortality rate at local and national levels based on the degree of social distancing measures and other actions meant to slow the virus's spread.
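A compartmental model of this kind can be sketched in a few lines. The following is a deliberately minimal SEIR toy, not the JSC/FIAS model: the population size, rates, and the mapping of contact reductions to Rt values are illustrative assumptions.

```python
# Minimal SEIR sketch (illustrative only). The same ingredients the article
# names appear here: a reproduction number R_t, an incubation time, and an
# infectious period. Real epidemiological models are far more detailed.
def seir_step(s, e, i, r, rt, incubation_days=5.0, infectious_days=7.0, dt=1.0):
    """Advance the S, E, I, R compartments by one day (forward Euler)."""
    n = s + e + i + r
    beta = rt / infectious_days           # transmission rate implied by R_t
    new_exposed = beta * s * i / n * dt
    new_infectious = e / incubation_days * dt
    new_recovered = i / infectious_days * dt
    return (s - new_exposed,
            e + new_exposed - new_infectious,
            i + new_infectious - new_recovered,
            r + new_recovered)

def peak_infectious(rt, days=365):
    """Peak number of simultaneously infectious people over one year."""
    s, e, i, r = 83_000_000 - 100, 0.0, 100.0, 0.0  # Germany-sized population
    peak = i
    for _ in range(days):
        s, e, i, r = seir_step(s, e, i, r, rt)
        peak = max(peak, i)
    return peak

# Reducing contacts lowers R_t and flattens the infectious-case curve.
for rt in (2.5, 1.75, 1.0):  # ~0%, 30%, 60% contact reduction from R_t = 2.5
    print(f"R_t = {rt}: peak infectious ~ {peak_infectious(rt):,.0f}")
```

Lowering Rt flattens and widens the curve of simultaneously infectious cases, which is the qualitative behaviour the contact-reduction scenarios describe.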

"The qualitative validity of these models comes from the fact that one can play through the different assumptions and detailed interactions, so you can validate those methods with hard data," Lippert said. "Then you put these different measures into the model and see what it is doing. We can then ask, 'when we put these measures together, are they moving things in a positive or negative direction?'"

Lippert noted that such models became less accurate the farther into the future they tried to model, but that their early results were accurate enough to help guide policy makers.

"In a paper we published based on data up to March 20, we predicted the situation in Germany for April 20 within a few percent," he said. "Because we already knew what measures were in place around the country, our work was pretty good at these predictions. Nevertheless, the model still underestimated the number of fatalities. At the policy and public health level, that means that if our data seems to overestimate the number of fatalities, it may not actually be doing that."

Lippert, JSC researchers Dr. Jan Meinke, Dr. Jan Fuhrmann, and Dr. Stefan Krieg, and principal investigator and University of Heidelberg / FIAS research group leader Dr. Maria Vittoria Barbarossa were all contributors to a position paper published April 13 by the Helmholtz Association of German Research Centres. The paper, which was signed by the leadership of the Helmholtz Association and co-authored by 17 researchers, presented 3 scenarios for German government officials with respect to easing restrictions imposed during the COVID-19 pandemic.

The team demonstrated that if contact restrictions were lifted too quickly, the Rt value would quickly rise above 1 (an Rt of 1 means each infection spawns one new infection on average), and Germany's healthcare system could become overburdened within several months. In the second scenario, the researchers modelled easing restrictions gradually and adopting an aggressive "feedback-based" contact tracing model to help slow the disease's spread throughout the country. While in principle this scenario seemed promising, it required that significant contact restrictions remain in place for an extended period of time--think months rather than weeks. The third scenario resonated most with German policy makers--keeping strong contact restrictions in place several weeks longer to help Rt drop well below 1, then beginning a gradual reopening process.

International collaborations converge on essential drug discovery work

While predicting the virus's spread over the initial weeks and months of the pandemic is essential, making it possible for society to return to normal will require development of effective treatments and scalable vaccinations to protect from infection.

University College London (UCL) Professor Dr. Peter Coveney has long leveraged supercomputers to understand drug interactions with pathogens and the human body. Since 2016, he has led the European Union's Horizon 2020-funded project CompBioMed, which stands for 'Computational Biomedicine,' and its successor project, CompBioMed2 (for more information, visit http://www.compbiomed.eu). Both projects focus on accelerating drug discovery by augmenting experimental validation with modelling and simulation.

In the face of the COVID-19 pandemic, Coveney and over a hundred of his peers jumped into action, in part focusing their knowledge and access to HPC resources on identifying existing drug compounds that could turn the tide against the virus. Specifically, Coveney and his collaborators model the binding affinities of drug compounds and pathogens. A drug's binding affinity is essentially the strength of the interaction between, for instance, a protein in the lifecycle of a virus and the active compounds in a medication--the stronger the binding affinity, the more effective the drug.

"We can compute binding affinities in a matter of hours on a supercomputer; the size of such machines means that we can reach the industrial scale of demand necessary to impact drug repurposing programs," Coveney said. "This can save us enormous amounts of wall-clock time and resources, including person hours, which are very precious in such a crisis situation."

Supercomputers allow researchers to run large numbers of binding affinity simulations in parallel. Here, they compare information about the structure of the virus with a database containing information about known drug compounds to identify those with a high likelihood of binding. This computational approach enables researchers to investigate large numbers of potential drugs much more quickly than would be possible if they had to mix individual drug samples with actual viruses in a lab. Coveney has been using the SuperMUC-NG supercomputer at the Leibniz Supercomputing Centre (LRZ) to run many of his binding calculations.
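The structure of such a screen is "embarrassingly parallel": every compound can be scored independently. The toy sketch below illustrates only that structure; the compound names and the scoring function are hypothetical stand-ins, and production workflows run ensembles of molecular-dynamics simulations distributed across supercomputer nodes rather than local threads.

```python
# Toy parallel binding-affinity screen. Each compound is scored
# independently, so throughput scales with the workers available.
from concurrent.futures import ThreadPoolExecutor

def binding_score(compound: str) -> float:
    """Deterministic stand-in for an expensive per-compound calculation.
    More negative = tighter predicted binding (hypothetical units)."""
    return -(sum(ord(c) for c in compound) % 100) / 10.0

def screen(compounds, top_n=5, workers=8):
    """Score every compound in parallel and return the top candidates."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        scores = dict(zip(compounds, pool.map(binding_score, compounds)))
    # The most negative (tightest-binding) scores go forward for follow-up.
    return sorted(scores, key=scores.get)[:top_n]

library = [f"compound_{i:03d}" for i in range(200)]
hits = screen(library)
print(hits)
```

Because the tasks share no state, doubling the available hardware roughly doubles throughput, which is why Coveney describes the workflows as "perfectly scalable" later in the article.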

"SuperMUC-NG offers us an immense capability for performing a large number of binding affinity calculations using our precise, accurate and reproducible workflows--ESMACS (Enhanced Sampling of Molecular dynamics with Approximation of Continuum Solvent) and TIES (Thermodynamic Integration with Enhanced Sampling)," Coveney said. "So far, we have already performed a few hundred such calculations very quickly."

Coveney has long collaborated with LRZ, developing his workflow to scale effectively on multiple generations of the SuperMUC architectures. LRZ Director Prof. Dieter Kranzlmüller saw the recent work as a continuation of Coveney's efforts. "Our long-term collaboration has enabled us to immediately identify and reach out to Peter to offer our assistance," he said. "By strongly supporting research in drug discovery activities for years, we were in a position to ensure that research toward identifying therapeutics could be accelerated right away."

Coveney has been performing his work as part of the Consortium on Coronavirus, an international effort involving researchers and resources from 9 universities, 5 United States Department of Energy national laboratories, and some of the world's fastest supercomputers, including SuperMUC-NG (currently number 9 in the Top500 list) and Summit at Oak Ridge National Laboratory in the United States (currently the world's fastest machine for open science). "This consortium is a vast effort, involving many people, supercomputers, synchrotron sources for experimental structural biology and protein structure determination, wet labs for assays, and synthetic chemists who can make new compounds," Coveney said. "In all, it's a massive 'one-stop shop' to help fight COVID-19."

Considering the team's ability to use supercomputers to run many iterations of drug binding affinity calculations, Coveney, who leads the European side of the consortium, is grateful for as much access to world-leading supercomputers as he can get. "Our workflows are perfectly scalable in the sense that the number of calculations we can perform is directly proportional to the number of cores available," he said. "Thus, having access to multiple HPC systems speeds things up substantially for us. Time is of the essence right now."

With access to HPC resources in Europe and the United States, Coveney and his collaborators have narrowed a list of several hundred drug compounds and identified several dozen that have the potential to inhibit SARS-CoV-2's replication in the body. In total, Coveney and his colleagues have scanned millions to billions of possible compounds via machine learning, ultimately helping them narrow down existing and novel compounds to find the most promising candidates. Once machine learning helps identify the most promising candidates, these are then subjected to computationally intensive, physics-based simulations, which provide more accurate calculations.
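This funnel, a cheap machine-learning filter over a huge library followed by expensive physics-based scoring of the survivors, can be sketched in miniature. Everything below is a hypothetical stand-in: the scoring functions are deterministic toys, not trained models or free-energy methods.

```python
# Toy two-stage screening funnel: a fast surrogate score prunes a large
# library to a shortlist, and only the shortlist receives the slow,
# more accurate score. Both scores here are deterministic stand-ins.
def cheap_surrogate(compound: str) -> float:
    """Fast, rough score (in real pipelines: a trained ML model)."""
    return (sum(ord(c) for c in compound) % 97) / 97.0

def expensive_score(compound: str) -> float:
    """Slow, accurate score (in real pipelines: physics-based simulation)."""
    return (sum(i * ord(c) for i, c in enumerate(compound, 1)) % 997) / 997.0

def funnel(library, keep_fraction=0.01, final_n=10):
    # Stage 1: rank everything with the cheap surrogate, keep ~1%.
    shortlist_size = max(final_n, int(len(library) * keep_fraction))
    shortlist = sorted(library, key=cheap_surrogate)[:shortlist_size]
    # Stage 2: re-rank only the shortlist with the expensive score.
    return sorted(shortlist, key=expensive_score)[:final_n]

library = [f"cmpd_{i:05d}" for i in range(50_000)]
leads = funnel(library)
print(len(leads), leads[:3])
```

The design point is economy: the expensive method runs on hundreds of compounds instead of millions, which is how machine learning lets the physics-based simulations concentrate on the most promising candidates.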

Molecules in motion: molecular dynamics simulations for observing drug-virus interactions

As a traditional leader in computational engineering, the High-Performance Computing Center Stuttgart (HLRS) has extensive experience supporting molecular dynamics (MD) simulations. In the realm of engineering, MD allows researchers to understand how combustion processes happen from the moment of ignition; in the realm of computational biology, researchers can turn to these computationally intensive simulations to investigate how molecular structures in proteins move and interact at extremely high resolution.

A team led by Prof. Dr. José Antonio Encinar Hidalgo at the Universidad Miguel Hernandez in Elche, Spain has been using HPC resources at HLRS to run molecular dynamics simulations and molecular docking models for 9,000 different drug candidates to fight COVID-19.

Proteins on human cells and viruses come in distinctive shapes, and designing effective treatments requires that researchers understand the molecular configurations most likely to bind to one another. Molecular docking simulations serve as a basis for determining drug binding affinities--by simulating the structures of panels of drug compounds in various molecular positions, researchers can assess their potential to bind to and inhibit the function of viral proteins.

Encinar noted that while some molecular docking simulations could be performed on more modest computing resources, HLRS's supercomputer enabled the team to put these snapshots of molecular docking configurations in motion through the use of molecular dynamics simulations.

"Our calculations consisted of some 90 molecular dynamics simulations," Encinar said. "On Hawk, a simulation takes approximately 5 days to calculate. But Hawk also enables us to calculate about 50 simulations at a time. In two weeks, we have all the necessary data. This work is not approachable in adequate time without high-performance computing resources."
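The wall-clock arithmetic behind that estimate is simple batch scheduling; the sketch below just formalises the numbers quoted above, and the "two weeks" figure presumably includes setup and queue time on top of the raw ten days.

```python
import math

def campaign_days(n_sims: int, concurrent: int, days_per_sim: float) -> float:
    """Wall-clock days for a campaign run in fully saturated batches."""
    return math.ceil(n_sims / concurrent) * days_per_sim

# 90 simulations, 50 running at a time, ~5 days each:
# 2 batches of 5 days -> ~10 days of wall-clock time.
print(campaign_days(90, 50, 5))   # 10
# The same campaign run one simulation at a time:
print(campaign_days(90, 1, 5))    # 450
```

The serial figure, well over a year, is what Encinar means by the work not being "approachable in adequate time" without HPC resources.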

The team recently published a journal article on its scan of 9,000 different drug compounds, identifying roughly 34 candidates that appear to have a high likelihood of inhibiting one of the key proteins of SARS-CoV-2.

Dreams of vaccines and hope for the future

In addition to the work described above, dozens of researchers focusing on other aspects of drug discovery and epidemiology related to COVID-19 have been granted access to HPC resources at the GCS centres through GCS's fast-track program as well as PRACE's calls for fast-tracked access to Europe's leading HPC resources. (A full list of COVID-19-related projects running at the GCS centres is available from GCS.)

The ultimate goal for scientists, medical professionals, and government officials, though, lies in developing an effective vaccine and scaling up production on a global scale. Coveney indicated that supercomputers have already helped pave the way for vaccine trials, enabling researchers to comb through 30,000 DNA sequences and to design vaccine candidates that are currently entering the testing phase. There are some aspects of fighting a pandemic that supercomputing cannot accelerate, though, and as vaccine candidates enter clinical trials, societies around the globe can only hope that the foundational work done by computational scientists has helped make identifying and designing a vaccine as efficient as possible.

Coveney was encouraged by the degree of collaboration among researchers across the globe. "Drug design involves a long and tedious pipeline of tasks with a large number of steps that require a different type of expertise at each level," he said. "Working in a large consortium has obvious advantages for such projects. We are part of a well-organized project where each partner having a clear idea of his role leads to quick turnaround. Proper and clear communication is vital for the success of our project. We are using online repositories for sharing of codes as well as data or information. Weekly video conferencing allows us to make checks on progress and remain in sync, along with frequent chats between concerned subsets of people, and this has made it possible to successfully move forward in step."

For GCS leadership, this crisis has shown that making sure that computing resources are quickly and efficiently deployed for researchers in the midst of a crisis is of the utmost importance. "At LRZ, we have discussed the need for detailed plans to address the next crisis, not necessarily a pandemic," Kranzlmüller said. "We had an internal plan, and were able to send all staff into home office within a few days, but we also have a long tradition focusing on biological and virology research, computational climate science, and other research areas that could be relevant for future disasters or crises. We want to ensure when the next crisis comes, supercomputers are among the first technological resources capable of being deployed in support of front-line efforts."

Lippert, who has studied both computational science and quantum physics at the PhD level and is a vocal advocate for science, remains positive due to his trust in the international scientific community.

"Any vaccine will come from science, any measure to be validated for epidemiology will come from science, any pharmaceutical therapy will come from science, any understanding of the hygienic aspects needed to redesign or rebuild public places where people are gathering together, these are all things that are to be understood scientifically," he said. "And I believe we will be successful because of the strength of science in Germany, Europe, and around the world."

Credit: 
Gauss Centre for Supercomputing

Defective graphene has high electrocatalytic activity

image: This is defective graphene.

Image: 
Daria Sokol/MIPT Press Office

Scientists from the Moscow Institute of Physics and Technology, Skoltech, and the Russian Academy of Sciences Joint Institute for High Temperatures have conducted a theoretical study of the effects of defects in graphene on electron transfer at the graphene-solution interface. Their calculations show that defects can increase the charge transfer rate by an order of magnitude. Moreover, by varying the type of defect, it is possible to selectively catalyze the electron transfer to a certain class of reagents in solution. This can be very useful for creating efficient electrochemical sensors and electrocatalysts. The findings were published in Electrochimica Acta.

Carbon is widely used in electrochemistry. A new type of carbon-based electrodes, made of graphene, has great potential for biosensors, photovoltaics, and electrochemical cells. For example, chemically modified graphene can be used as a cheap and effective analogue of platinum or iridium catalysts in fuel cells and metal-air batteries.

The electrochemical characteristics of graphene strongly depend on its chemical structure and electronic properties, which have a significant impact on the kinetics of redox processes. The interest in studying the kinetics of heterogeneous electron transfer on the graphene surface has recently been stimulated by new experimental data showing the possibility of accelerating the transfer at structural defects, such as vacancies, graphene edges, impurity heteroatoms, and oxygen-containing functional groups.

A recent paper co-authored by three Russian scientists presents a theoretical study of the kinetics of electron transfer on the surface of graphene with various defects: single and double vacancies, the Stone-Wales defect, nitrogen impurities, and epoxy and hydroxyl groups. All these defects significantly affected the transfer rate constant. The most pronounced effect was associated with a single vacancy: The transfer rate was predicted to grow by an order of magnitude relative to defect-free graphene (fig. 1). This increase should only be observed for redox processes with a standard potential of -0.2 volts to 0.3 volts -- relative to the standard hydrogen electrode. The calculations also showed that due to the low quantum capacitance of the graphene sheet, the electron transfer kinetics can be controlled by changing the capacitance of the electric double layer.

"In our calculations, we tried to establish a relation between the kinetics of heterogeneous electron transfer and the changes in the electronic properties of graphene caused by defects. It turned out that introducing defects into a pristine graphene sheet can lead to an increase in the density of electronic states near the Fermi level and catalyze electron transfer," said Associate Professor Sergey Kislenko of the Department for Physics of High-Temperature Processes, MIPT.

"Also, depending on the kind of defect, it affects the density of electronic states across various energy regions in different ways. This suggests a possibility for implementing selective electrochemical catalysis. We believe that these effects can be useful for electrochemical sensor applications, and the theoretical apparatus that we are developing can be used for targeted chemical design of new materials for electrochemical applications," the scientist added.

Credit: 
Moscow Institute of Physics and Technology

Copper ion unlocks magnesium's potential in next-generation battery

image: By adding a copper ion, new magnesium battery demonstrates dramatic improvement of performance

Image: 
CUI Guanglei

Researchers at the Qingdao Institute of Bioenergy and Bioprocess Technology (QIBEBT) of the Chinese Academy of Sciences (CAS) have come a step closer to making a viable, high-output battery based on magnesium (Mg), an element the United States Geological Survey reports is far more abundant than lithium.

The researchers published their findings in Angewandte Chemie, a peer-reviewed journal of the German Chemical Society on April 11, 2020.

Recent attempts to develop a viable Mg battery have stumbled, because the discharge products are insulators, hampering output and slowing down the charge cycle.

The QIBEBT researchers found that using a copper ion (Cu+) originating from the cathode addresses the issue of discharge product build-up. As their Mg battery discharges, Cu+ dissolves into the electrolyte, exchanges chemically with Mg2+, and becomes metallic copper as it receives electrons, forming a coating on the electrode. Since copper is highly conductive, electricity flows freely, allowing for high energy output.

The team's findings showed excellent performance in the newly developed Mg/Cu+ battery. After initial conditioning, their experimental battery retained 80 percent of its original capacity after 200 charge/discharge cycles. A typical commercial lithium-ion battery holds at least 80 percent of its original capacity after 1000 cycles.

Prof. CUI Guanglei said his team's Mg battery is not yet commercially viable, but it is on track to compete with lithium-ion batteries. "We expect to achieve the 1,000-cycle milestone in the next two years," he said.

The day-to-day price of magnesium averages about $5,000 USD per ton - about half the cost of lithium. Beyond being cheaper, magnesium-based batteries would also be safer. Poorly made lithium batteries can overheat and explode, creating a liability for industries ranging from telecom to aerospace. "I have every confidence to say that the employment of Cu+/Mg chemistry can lead to safer battery products," Prof. CUI said.

Prof. CUI said the next step toward making Cu+/Mg batteries a commercial reality will be to design them as flexible pouches. To do so, the team will need to create a gel form of its Cu+ electrolyte solution.

"As we can see, a gel electrolyte would be suitable for the Cu+ driven cathode chemistry," Prof. CUI said. Once the battery can function in a gel pouch form, it will become easier to engineer it into the odd and often very thin shapes demanded by today's consumer devices.

"The ultimate goal of this study is to commercialize the Mg metal battery as the next-generation energy storage device beyond lithium-ion technology," Prof. CUI said.

Credit: 
Chinese Academy of Sciences Headquarters

Newly discovered cell type plays crucial role in immune response to respiratory infections

With a discovery that could rewrite the immunology textbooks, an international group of scientists, including the teams of Bart Lambrecht, Martin Guilliams, Hamida Hammad, and Charlotte Scott (all from the VIB-UGent Center for Inflammation Research), identified a new type of antigen-presenting immune cell. These cells, which are part of an expanding family of dendritic cells, play a crucial role in presenting antigens to other immune cells during respiratory virus infections, and could explain how convalescent plasma helps to boost immune responses in virus-infected patients.

Inflammation and immunity

When our body faces an infection, it responds with inflammation and fever. This is a sign that the immune system is doing its work, leading to the activation of many cells, like soldiers in an army. Dendritic cells (DCs) are the generals of that army. They can precisely activate and instruct the soldiers to kill infected cells by presenting antigens derived from the 'invaders' to cells of the immune system.

Mistaken identity

There are several types of DCs that perform antigen-presenting functions in the body. A first type, conventional DCs, continuously scans the body for dangerous invaders, even when there is no infection. When infection triggers inflammation, another subset of DCs emerges from inflammatory monocytes. Because monocyte-derived DCs are easily prepared in vitro from monocytes isolated from human blood, it was always assumed these cells were very important antigen-presenting cells. Clinical trials using monocyte-derived DCs in cancer therapy have, however, been disappointing.

A study by the teams of Bart Lambrecht, Martin Guilliams, Hamida Hammad, and Charlotte Scott (all from the VIB-UGent Center for Inflammation Research) and international colleagues, shows that monocyte-derived DCs are poor antigen-presenting cells, but have wrongly been assumed to have these functions because of a case of mistaken identity.

The scientists studied mice with a viral respiratory infection (pneumonia virus of mice and influenza virus) with single-cell technologies. This single-cell resolution allowed them to finely separate the monocyte-derived cells from other DCs during their response to the infection. They found that monocyte-derived DCs do exist, but actually do not present antigens. The reason for all the confusion in the past is that a look-alike new DC emerges - called inflammatory type 2 conventional DC, or inf-cDC2 - that combines some of the best characteristics of monocytes, macrophages, and conventional DCs, to induce the best form of immunity.

Bart Lambrecht: "This was a big surprise for us. We've all been taught that monocyte-derived cells are excellent antigen presenting cells, certainly when there's inflammation. Now, we show that it's actually a new hybrid DC type that's doing all the work. This really changes what we know about the immune system and is very important knowledge for understanding respiratory viral infections and other inflammatory diseases."

Martin Guilliams: "It took a massive team effort but the strength of single-cell sequencing has finally cracked the complex DC code. Many contradicting findings from the last two decades now make much more sense. This also opens tremendous therapeutic opportunities, since vaccination strategies can now be designed to trigger formation of inf-cDC2s and thus generate a stronger antiviral immune response."

Charlotte Scott: "Through the use of single cell technologies we have been able to align all the findings from the past few years and identify the distinct cell types involved. Moving forward it will be very interesting to see under what other inflammatory conditions these inf-cDC2s are generated and how they can potentially be targeted therapeutically."

Convalescent plasma and COVID-19

The findings of the researchers also have a direct relevance for the current COVID-19 pandemic, caused by another respiratory virus. An emergency treatment that is currently being explored is the use of convalescent plasma, or the blood plasma of recovered patients.

Cedric Bosteels, lead author of the new paper: "One of the unique features of the new DCs is that they express functional Fc receptors for antibodies that are found in the plasma of patients who have recovered from COVID-19."

This study is the first to show that one of the mechanisms through which convalescent plasma and the virus-specific antibodies in it work, is via boosting of inf-cDC2. Since boosted DCs induce a much stronger immune response, this study reveals a new target for therapeutic intervention for viral infections and other inflammatory diseases.

Credit: 
VIB (the Flanders Institute for Biotechnology)

Like a molecular knob: That is how a gene controls the electrical activity of the brain

image: Primary culture of neocortical excitatory and inhibitory neurons, labeled by fluorescent red and green markers, respectively.

Image: 
Wendy Tigani, SISSA

It works like a very fine "molecular knob" able to modulate the electrical activity of the neurons of our cerebral cortex, crucial to the functioning of our brain. Its name is Foxg1, it is a gene, and its newly discovered role is the subject of a study just published in the journal Cerebral Cortex. Foxg1 was already known as a "master gene" able to coordinate the action of hundreds of other genes necessary for the development of our anterior central nervous system. As this new study reports, the "excitability" of neurons, namely their ability to respond to stimuli, communicate with each other, and carry out all their tasks, also depends on this gene.

To discover this, the researchers developed and studied animal and cellular models in which Foxg1 has artificially altered activity: either a lack of activity, as happens in patients affected by a rare variant of Rett Syndrome, which leads to clinical manifestations on the autism spectrum; or excessive activity, as in a specific variant of West Syndrome, with neurological symptoms such as serious epilepsy and severe cognitive impairment. As the scientists deduced, the flaw in the "knob" results in altered electrical activity in the brain, with important consequences for the entire system, similar to what happens in the two syndromes mentioned.

Shedding light on this mechanism, say the researchers, allows a deeper understanding of how our central nervous system functions in sickness and in health, a fundamental step toward assessing possible future therapeutic interventions for these pathologies. The newly published work is the latest in a series of three studies on the Foxg1 gene recently published by SISSA researchers in Cerebral Cortex. It is the result of a project begun more than five years ago, led by the team of Professor Antonello Mallamaci of SISSA together with researchers of the University of Trento and the Neuroscience Institute of Pisa, with the support of the Telethon Foundation, the Fondation Jerome Lejeune and the FOXG1 Research Foundation.

The many abilities of the "master gene"

"We knew that this gene is important for the development of the anterior central nervous system," explains Professor Antonello Mallamaci of SISSA, who coordinated the research. "In previous studies we had already highlighted how it is involved in the development of particular brain cells, the astrocytes, as well as of the neuronal dendrites, the parts of nerve cells that carry incoming electrical signals to the cell body. The fact that it is mutated in patients affected by specific variants of the Rett and West Syndromes, in which we see, respectively, insufficient and excessive activity of this gene, made us explore the possibility that it also plays another role. And, from what has emerged, that appears to be the case."

The research findings

According to the study, Foxg1 and the electrical activity of neurons form a positive feedback circuit. Professor Mallamaci explains: "If the gene is very active, there is increased electrical activity in the cerebral cortex. In addition, when neurons are active, they tend to make the gene work even harder. One process, in short, feeds the other. Obviously, under normal conditions, the system is slowed down at a certain point. If, however, the gene functions abnormally, or is present in a number of copies other than two, as happens in the two syndromes mentioned, the point of balance shifts and the electrical activity is altered. All this, in addition to helping us understand the mechanisms of the pathology, tells us that Foxg1 acts precisely as a key regulator of electrical activity in the cerebral cortex."

The next step, the professor explains, will be to understand the role of the mediating genes, that is, some of the many genes whose action is regulated by the master gene Foxg1. This analysis is important for understanding in more detail how the gene works under normal and pathological conditions.

How the master gene produces its pathological effects, and when and how to intervene

Understanding the molecular mechanisms that Foxg1 controls is also important for identifying targets for possible therapeutic approaches. "Given that finding a therapy for these illnesses is very difficult, working at this depth you might find, for example, that most problems are caused by some of the 'operators' that Foxg1 regulates, and that we should therefore focus our attention on those targets rather than on the master gene, perhaps using drugs that already exist and have been shown to be useful in remedying those specific flaws."

In the case of a future approach that would instead correct the anomalies of the FOXG1 gene through gene therapy, explains Professor Mallamaci, "it is necessary to understand when to intervene, namely from what moment the pathological effects of the mutation become irreversible. To replace the flawed copy with the correct one, we must intervene before that moment, which might require prenatal genetic diagnosis and treatment." "The next steps we take," concludes Professor Mallamaci, "will be directed precisely toward a deeper understanding of all these aspects."

Credit: 
Scuola Internazionale Superiore di Studi Avanzati

Pangolins may possess evolutionary advantage against coronavirus

Similar to how a smoke detector sounds an alarm, certain genes sense when a virus enters the body, alerting the immune system to an intruder and triggering an immune response in most mammals. But according to a recent study published in Frontiers in Immunology, pangolins, scale-covered mammals that resemble anteaters, lack two of those virus-sensing genes. The finding is significant because while pangolins can carry coronaviruses, they appear able to tolerate them through some other, as yet unknown, mechanism. Understanding their evolutionary advantage may point to possible treatment options for coronavirus infection in humans.

Researchers focused on pangolins because the exotic animal may have transmitted the virus to humans last year, making the interspecies jump required for the current COVID-19 pandemic to take hold (bats have also been identified as possible agents of infection). To obtain their results, the team analyzed the pangolin genome sequence and compared it with those of other mammals, including humans, cats, dogs, and cattle.

"Our work shows that pangolins have survived through millions of years of evolution without a type of antiviral defense that is used by all other mammals," says co-author Dr. Leopold Eckhart, of the Medical University of Vienna in Austria. "Further studies of pangolins will uncover how they manage to survive viral infections, and this might help to devise new treatment strategies for people with viral infections."

In humans, coronavirus infection can cause an inflammatory immune response called a cytokine storm, which worsens outcomes. Pharmaceutical suppression of this gene signaling, the authors suggest, could be a possible treatment option for severe cases of COVID-19. Eckhart cautions, though, that such a remedy could open the door to secondary infections. "The main challenge is to reduce the response to the pathogen while maintaining sufficient control of the virus," he says. An overactivated immune system can be moderated, Eckhart says, "by reducing the intensity or by changing the timing of the defense reaction."

While the study identified genetic differences between pangolins and other mammals, it did not investigate the impact of those differences on the antiviral response. Scientists don't yet understand exactly how pangolins survive coronavirus infection, only that their lack of these two signaling genes might play a role. Eckhart adds that another gene, RIG-I, which also acts as a viral sensor, should be studied further, as it could defend against coronaviruses. The study offers a starting point for better understanding the characteristics of coronaviruses, the body's response, and the best options for treatment.

Credit: 
Frontiers