Earth

New amphibious centipede species discovered in Okinawa and Taiwan

image: The amphibious Scolopendra alcyona Tsukamoto & Shimano inhabits streamside environments, deep in the forests of the Ryukyu Archipelago.

Image: 
Tokyo Metropolitan University

Tokyo, Japan - Researchers from Tokyo Metropolitan University and Hosei University have discovered a new species of large, tropical centipede of the genus Scolopendra in Okinawa and Taiwan. It is only the third amphibious centipede identified in the world and the largest in the region, at 20 cm long and nearly 2 cm thick. It is also the first new centipede to be identified in Japan in 143 years, a testament to the incredible biodiversity of the Ryukyu Archipelago.

Scolopendra is a genus of large, tropical centipede, one of the original genera named by the father of modern taxonomy himself, Carl Linnaeus. They are strong predators in any soil ecosystems they inhabit, with around 100 different species found in tropical regions around the world. Of these, only five have been identified in Japan and Taiwan.

Scientists were excited when news came in of an unknown centipede species sighted around the Ryukyu Archipelago, reportedly attacking giant freshwater prawns. A team led by Sho Tsukamoto, his supervisor Associate Professor Katsuyuki Eguchi of Tokyo Metropolitan University, and Professor Satoshi Shimano of Hosei University set out to look for and identify this mystery creature.

It turned out that they had discovered an entirely new species. Genetic analysis confirmed that it was distinct from any of the other Scolopendra inhabiting the region. Approximately 20 cm in length and 2 cm in width, it is the largest centipede species to be found in Japan and Taiwan. Sporting a beautiful jade-colored shell, it has been named Scolopendra alcyona Tsukamoto & Shimano after the Greek mythological figure Alcyone, who was transformed into a kingfisher by Zeus. Its new Japanese name, ryujin-ômukade, also has a mythological origin, in homage to the region where it was found. Local myths have it that a dragon god, or ryujin, was in agony because a centipede had entered his ear. On seeing a chicken quickly devour a centipede, it was said that the god came to fear both centipedes and chickens. In the days of the kingdom of Ryukyu, people painted chickens on their boats and hoisted a centipede flag to strike fear into the dragon gods and cross the sea safely.

Notably, the scientists found that the centipedes preferred streamside environments, and exhibited amphibious characteristics, making it only the third amphibious Scolopendra in the world. This is the first discovery of a new centipede in Japan in 143 years; the fact that such a large invertebrate could go undiscovered until now is a reflection of the unexplored biodiversity of the Ryukyu Archipelago, and a strong case for its preservation. The species is most likely endangered, and currently inhabits forest streams where people do not go. The team hopes to continue to monitor and study them from a safe distance, to preserve their habitat.

Credit: 
Tokyo Metropolitan University

The future of particle accelerators is here

image: The Electron-Ion Collider (EIC) will probe the internal structure of nuclear matter as it exists today. Electrons colliding with ions will exchange virtual photons with the nuclear particles to help scientists "see" inside the nuclear particles. The collisions will produce precision 3D snapshots of the internal arrangement of quarks and gluons within ordinary nuclear matter, like a combination CT / MRI scanner for atoms. Electrons can "pick out" individual quarks from the protons that make up nuclei. Studying how those quarks recombine to form composite particles will inform our understanding of how today's visible matter evolved from the QGP studied at the Relativistic Heavy Ion Collider (RHIC).

Image: 
Brookhaven National Laboratory

When the Electron-Ion Collider received the go-ahead in January 2020, it became the only new major accelerator in the works anywhere in the world.

"All the stars aligned," said Elke-Caroline Aschenauer, Brookhaven National Laboratory Staff Scientist and a leader in developing the EIC plans. "We have the technology to build this unique particle accelerator and detector to do the measurements that, together with the underlying theory, can for the first time provide answers to longstanding fundamental questions in nuclear physics."

The EIC isn't the only Brookhaven project poised to reshape nuclear and particle physics. Forthcoming data from the Relativistic Heavy Ion Collider could finally detect the elusive chiral magnetic effect. Meanwhile, planned accelerators could run on sustainable energy, a drastic departure from today's machines.

At a press conference during the 2021 APS April Meeting, researchers will discuss how cutting-edge accelerators could collide with both energy consumption and our assumptions about the nature of matter.

A powerful new facility for nuclear physics

"The scientific advances of the EIC will help us all to understand where we come from and how the visible matter around us is composed from its elementary building blocks," said Aschenauer.

The accelerator and detector will serve as a kind of camera, taking 3D images and movies of electrons colliding with polarized protons and ions. Like a CT scanner for atoms, the EIC will let scientists see how force-carrying gluon particles hold together quarks, the internal components of protons and neutrons. It will also offer insights into the spin of fundamental particles.

Aschenauer will give status updates from the first year of the EIC project--a collaboration between BNL and Thomas Jefferson National Accelerator Facility--and an overview of its experimental equipment.

Hunting for the chiral magnetic effect

The EIC will build on the Relativistic Heavy Ion Collider, which will soon produce major results of its own.

In summer 2021, data analysis will likely conclude on an experiment searching for decisive proof of the chiral magnetic effect. This proposed effect reflects fundamental features of the Standard Model and could help unlock why our universe contains overwhelmingly more matter than antimatter, an asymmetry crucial to our existence.

Jinfeng Liao, a theoretical nuclear physicist at Indiana University Bloomington, will share key predictions about what the experiment might uncover.

"The signatures, as predicted by our theoretical study, show clear promise of unambiguously establishing the existence of chiral magnetic effect in the isobar collision experiment," said Liao.

Liao and colleagues created a custom fluid-dynamics-based computational tool to simulate experimental collisions and any changes the chiral magnetic effect would cause.

They show that the new experiment has a better chance of detecting the effect than previous attempts, long plagued by weak signals and strong background contamination. The predictions were published in Physical Review Letters.

Probing profound subatomic questions requires a lot of power.

"Large particle accelerators use a shockingly large amount of energy," said Georg Hoffstaetter, a professor at Cornell University.

He will share results from the Cornell-BNL Test Accelerator, or CBETA, the world's first to accelerate a beam multiple times while powering itself by reusing beam energy. It further reduces electricity demands with superconducting and magnetic equipment.

The Energy Recovery Linacs technology that enables the test accelerator could lead to smaller particle accelerators with higher beam currents and reduced energy consumption.

"People may benefit from the industrial applications of Energy Recovery Linacs by using better computer chips, by being cured in radiation therapy centers that guide beams with permanent magnets, or by inhaling accelerator-produced medical isotopes," said Hoffstaetter.

Building on the success of the test accelerator, its principal investigator and Brookhaven Senior Physicist Dejan Trbojevic will present designs for a new green-energy collider. Particles speed along racetrack beam lines formed from high-quality permanent magnets, which require no electrical power.

"The 'green accelerator' shows a completely new way of accelerating particles with very tight control of their motion and with an extremely high energy range. It has never been done before," said Trbojevic.

He will demonstrate how the EIC, as well as a similar accelerator under consideration at the Large Hadron Collider, could incorporate the energy-saving features.

FEATURED TALKS

The Electron-Ion Collider: A Collider to Unravel the Mysteries of Visible Matter--Its Experimental Equipment (X04.3)

11:57 a.m. - 12:33 p.m. CDT, Tuesday, April 20, 2021
Elke-Caroline Aschenauer, elke@bnl.gov
Livestream: Access here
Abstract: http://meetings.aps.org/Meeting/APR21/Session/X04.3

Signatures of Chiral Magnetic Effect in the Collisions of Isobars (SP01.5)

2:00 p.m. CDT, Monday, April 19, 2021
Jinfeng Liao, liaoji@indiana.edu
Poster: Access here
Abstract: http://meetings.aps.org/Meeting/APR21/Session/SP01.50

The Cornell-BNL ERL Test Accelerator: Demonstration of the World's First Multipass Superconducting Linear Accelerator With Energy Recovery (E15.3)

4:09 p.m. - 4:21 p.m. CDT, Saturday, April 17, 2021
Colwyn Gulliford and Georg Hoffstaetter, georg.hoffstaetter@cornell.edu
Livestream: Access here
Abstract: http://meetings.aps.org/Meeting/APR21/Session/E15.3

Green Energy Future EIC Collider (Z07.1)

3:45 p.m. - 3:57 p.m. CDT, Tuesday, April 20, 2021
Dejan Trbojevic, dejan@bnl.gov
Livestream: Access here
Abstract: http://meetings.aps.org/Meeting/APR21/Session/Z07.1

PRESS CONFERENCE

Register for the press conference to be held on Zoom at 10:00 a.m. CDT, Sunday, April 18, 2021.

Speakers:

Elke-Caroline Aschenauer (BNL)

Jinfeng Liao (Indiana University Bloomington)

Georg Hoffstaetter (Cornell University)

Dejan Trbojevic (BNL)

Credit: 
American Physical Society

CNIO researchers explain the toxicity of USP7 inhibitors, under development for cancer treatment

image: Researchers Oscar Fernandez-Capetillo (left) and Emilio Lecona (right)

Image: 
CNIO

Understanding the components that control cell division is fundamental to understanding how life works and how alterations in this delicate process can cause diseases such as cancer. It was precisely the discoveries of "key regulators of the cell cycle" and their implications for processes such as cancer, that won the British scientists R. Timothy Hunt and Paul M. Nurse and the American scientist Leland H. Hartwell the 2001 Nobel Prize in Physiology or Medicine. A study led by Óscar Fernández-Capetillo, Head of the Genomic Instability Group at the Spanish National Cancer Research Centre (CNIO) and published this week in The EMBO Journal uncovers a new cell cycle control element, the USP7 protein. It acts as a brake to prevent cells from dividing until the process of copying genetic material has been completed, and it also monitors this copying process to ensure that it happens correctly. "USP7 acts like a 'skipper' of the cells, who keeps the engines that drive cell division running at low speed," explains Fernández-Capetillo.

In addition to their importance for understanding the cell cycle, these results may have far-reaching implications for oncology as in the last three years several pharmaceutical companies around the world have been developing USP7 inhibitors for the treatment of this disease.

"Our study shows that USP7 inhibitors trigger an unbridled and premature activity of the machinery that controls the cell cycle, which, among other things, causes the genetic material to break apart as it tries to replicate," the researchers say.

"Understanding how these drugs work will help to improve the identification of patients who might benefit from their use, and also of potential combinations with other drugs that should be explored or avoided."

Cell death as a result of cell cycle over-activation

One of the most delicate and important processes that cells face in cell division is the copying of genetic material for subsequent distribution to daughter cells. If this happens in an abnormal way, cells can accumulate mutations that make them unstable and even cancerous.

In 2016, Fernández-Capetillo's team published a paper in the journal Nature Structural & Molecular Biology, in which they demonstrated that USP7 accompanies the cohort of molecules that form part of the replisome --a group of proteins involved in DNA copying-- to eliminate specific tags or signals called ubiquitins from the places in the genome where DNA is being copied, thus facilitating the replication process. Already at that time, the researchers suspected that, in addition to regulating DNA replication, USP7 inhibitors could also affect the cell cycle.

The pharmaceutical development of USP7 inhibitors as anticancer agents has been mainly based on their ability to activate the tumour suppressor protein P53, which is a potent inducer of cell suicide. However, in the 2016 paper mentioned above, Fernández-Capetillo's group already showed that the effects of these inhibitors on genome replication were not solely related to P53: "Our data indicate that USP7 is essential for genome replication in cells with or without p53," they said.

So, if not through P53, how can these inhibitors cause tumour cells to die? The paper published this week shows that the drugs have a direct effect on the cell cycle machinery that regulates cell division. Specifically, the researchers found that treatment with these inhibitors triggers premature and widespread activation of the CDK1 protein, a key driver of the cell cycle, which leads to uncontrolled cell division, DNA damage and ultimately cell death.

Potential combination therapies

The fact that USP7 inhibitors work by deregulating CDK1 opens the door to possible therapeutic combinations that could increase the efficacy of these drugs in cancer patients. According to the researchers in the article, there are several anticancer therapies in clinical trials that are acting at the same level, i.e. favouring the premature activation of CDK1, such as ATR or WEE1 inhibitors. "The effect of the combination of USP7 inhibitors together with other inhibitors that also promote CDK1 activity could be synergistic and increase the anticancer effects of these compounds in cancer patients," explains Fernández-Capetillo. "Furthermore, we also anticipate that drugs that decrease CDK1 activity will reduce the efficacy of USP7 inhibitors."

The findings of this new study also have other important implications for the use of these USP7 inhibitory compounds, as they confirm the 2016 observations that suggested these drugs can be effective regardless of whether tumours express P53 or not. The original idea that USP7 inhibitors work through stimulating P53 restricted their potential use to those patients whose tumours express this protein, which occurs in slightly less than 50% of cases. Therefore, the finding that these agents work by a P53-independent mechanism opens up their potential use to a much larger number of patients.
Currently, Fernández-Capetillo's group is focusing on uncovering new mechanisms of resistance to anticancer therapies, including USP7 inhibitors, to improve their efficacy in the clinic.

Credit: 
Centro Nacional de Investigaciones Oncológicas (CNIO)

Promising results from first-in-humans study of a novel PET radiopharmaceutical

image: The inflamed joints of a rheumatoid arthritis patient are clearly visible in the PET images with the novel 68Ga-DOTA-Siglec-9 radiopharmaceutical.

Image: 
Anne Roivainen

The preliminary trial results of a novel radiopharmaceutical for PET imaging of inflammation developed at the University of Turku, Finland, have been published. The compound, which targets the vascular adhesion protein 1 (VAP-1) that regulates inflammatory cell traffic, is the first radiopharmaceutical that has been developed completely in Finland and has advanced to clinical trials. In the study that started with healthy volunteers, the radiopharmaceutical was found to be well tolerated and safe.

The radiopharmaceutical is 68Ga-labelled Siglec-9 peptide.

"The dose of the radiopharmaceutical used in PET imaging is thousands of times lower than that of regular drugs. Studies with new radiopharmaceuticals are therefore safer than the usual drug research studies," explain Researchers Riikka Viitanen and Olli Moisio from the Turku PET Centre.

The study also included the imaging of a patient with early rheumatoid arthritis. The inflamed joints were clearly visible in the PET images, and the radiopharmaceutical seems to effectively target inflamed tissue.

"Our radiopharmaceutical is a product of long-term preclinical research work, and it is rewarding to see results that match our expectations. The research results are promising, but all novel radiopharmaceuticals must fulfil strict medical and statistical criteria before they can be considered for general research use. Therefore, we will continue the study with voluntary rheumatoid arthritis patients," says the leader of the research group, Professor Anne Roivainen from the University of Turku.

"This study is unique and has a long, innovative history at the University of Turku. Now, it has been proven that the new radiopharmaceutical works in humans," rejoices Academician of Science, Professor Sirpa Jalkanen.

The purpose of the new radiopharmaceutical is to advance both the diagnostics of inflammatory diseases and drug development with molecular imaging. The research field is developing rapidly, and the Turku PET Centre, a joint research institute of the University of Turku, Åbo Akademi University, and Turku University Hospital, is one of the field's leading research centres in Europe.

Credit: 
University of Turku

Experiments cast doubts on the existence of quantum spin liquids

image: Arrangement of the spins in a triangular lattice: Two spins each form a pair, whereby their magnetic moments cancel each other out when viewed from the outside.

Image: 
University of Stuttgart, PI1

When temperatures drop below zero degrees Celsius, water turns to ice. But does everything actually freeze if you just cool it down enough? In the classical picture, matter inherently becomes solid at low temperatures. Quantum mechanics can, however, break this rule: helium, for example, becomes liquid at around -270 degrees Celsius but never solid under atmospheric pressure. There is no helium ice.

The same is true for the magnetic properties of materials: at sufficiently low temperatures, the magnetic moments known as 'spins' arrange themselves, for example, so that each is oriented antiparallel to its neighbors. One can think of this as arrows pointing alternately up and down along a chain or in a checkerboard pattern. The arrangement becomes frustrated, however, when the lattice is based on triangles: while two spins can align in opposite directions, the third is always parallel to one of them, no matter how it is oriented.
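This frustration can be made concrete with a tiny enumeration. The sketch below (an illustrative toy model, not part of the Stuttgart experiment) checks every up/down assignment of three Ising-like spins on a triangle with antiferromagnetic bonds and confirms that at most two of the three bonds can ever be antiparallel at once:

```python
from itertools import product

# Three spins on a triangle; a bond is "satisfied" when its two spins
# point in opposite directions (antiparallel, as antiferromagnetism prefers).
bonds = [(0, 1), (1, 2), (2, 0)]

best = 0
for spins in product([+1, -1], repeat=3):
    satisfied = sum(spins[i] != spins[j] for i, j in bonds)
    best = max(best, satisfied)

print(best)  # prints 2: no configuration satisfies all three bonds
```

Because any three up/down arrows must include two that agree, one bond is always frustrated, which is exactly why the triangle cannot settle into a simple alternating pattern.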

For this problem, quantum mechanics suggests a solution: the orientation and bonding of the spins are not rigid; instead, the spins fluctuate. The resulting state, in which the spins form a quantum mechanically entangled ensemble, is called a quantum spin liquid. The idea was proposed almost fifty years ago by the American Nobel laureate Phil W. Anderson (1923-2020). After decades of research, only a handful of real materials remain in the search for this exotic state of matter. One particularly promising candidate was a triangular lattice in a complex organic compound, in which no magnetic order with a regular up-down pattern could be observed, even at extremely low temperatures. Was this proof that quantum spin liquids really exist?

One problem is that it is extremely challenging to measure electron spins down to such extremely low temperatures, especially along different crystal directions and in variable magnetic fields. All previous experiments have been able to probe quantum spin liquids only more or less indirectly, and their interpretation is based on certain assumptions and models. Therefore, a new method of broadband electron spin resonance spectroscopy has been developed over many years at the Institute of Physics 1 at the University of Stuttgart.

Using on-chip microwave lines, one can directly observe the properties of the spins down to a few hundredths of a degree above absolute zero. In doing so, the researchers found that the magnetic moments do not arrange themselves in the up-down pattern of a typical magnet, nor do they form a dynamic state resembling a liquid. "In fact, we observed the spins in spatially separated pairs. Thus, our experiments have shattered the dream of a quantum spin liquid for now, at least for this compound," summarizes Prof. Martin Dressel, head of the Institute of Physics 1.

But even though the pairs did not fluctuate as hoped, this exotic ground state of matter has lost none of its fascination for the physicists. "We want to investigate whether quantum spin liquids might be detectable in other triangular lattice compounds or even in completely different systems such as honeycomb structures", Dressel outlines the next steps. However, it could also be that such a disordered, dynamic state simply does not exist in nature. Perhaps every kind of interaction leads in one way or another to a regular arrangement if the temperature is low enough. Spins just like to pair up.

Credit: 
Universitaet Stuttgart

Wearable sensors that detect gas leaks

image: Gas sensor

Image: 
POSTECH

Gas accidents such as toxic gas leaks in factories, carbon monoxide leaks from boilers, or suffocation by toxic gases during manhole cleaning continue to claim lives and cause injuries. Developing a sensor that can quickly detect toxic gases or biochemicals remains an important issue in public health, environmental monitoring, and military applications. Recently, a research team at POSTECH has developed an inexpensive, ultra-compact wearable hologram sensor that immediately notifies the user when volatile gases are detected.

A joint research team led by Professor Junsuk Rho of departments of mechanical and chemical engineering and Dr. Inki Kim of Department of Mechanical Engineering with Professor Young-Ki Kim and Ph.D. candidate Won-Sik Kim of Department of Chemical Engineering at POSTECH has integrated metasurface with gas-reactive liquid crystal optical modulator to develop a sensor that provides an immediate visual holographic alarm when harmful gases are detected. The findings from this study were published in Science Advances on April 7, 2021.

For those working in hazardous environments such as petrochemical plants, gas sensors are a lifeline. However, conventional gas sensing devices are not widely used because the complex machinery and electronics they require make them expensive. In addition, commercial gas sensors are difficult to use and have poor portability and slow response times.

To solve these issues, the research team turned to the metasurface, a next-generation optical device known for its invisibility-cloak effect: it can make visible objects disappear by controlling the refraction of light. Metasurfaces are also used to transmit two-way holograms or 3D video images by freely controlling light.

Using the metasurface, the research team developed a gas sensor that can float a holographic image alarm in space within a few seconds. When the device is exposed to gas, the orientation of the liquid crystal molecules in its liquid crystal layer changes, altering the polarization of the transmitted light. Moreover, unlike conventional commercial gas sensors, the new sensor requires no support from external mechanical or electronic devices. As the target hazardous gas, the researchers used isopropyl alcohol, a toxic substance that can cause stomach pain, headache, dizziness, and even leukemia.

The newly developed sensor was confirmed to detect gas at concentrations as low as about 200 ppm. In an experiment using a board marker, an everyday source of volatile gas, a visual holographic alarm popped up the instant the marker was brought near the sensor.

Moreover, the research team developed a one-step nanocomposite printing method to produce this flexible and wearable gas sensor. The metasurface structure, which was previously processed on a hard substrate, was designed to enable rapid production with a single-step nanocasting process on a curved or flexible substrate.

When the flexible sensor fabricated with this method is attached like a sticker to safety glasses, it can detect gas and display a hologram alarm. It is expected to be integrable with the glass-type AR display systems under development at Apple, Samsung, Google, and Facebook.

Going a step further, the research team is developing a high-performance environmental sensor that can display the type and concentration level of gases or biochemicals in the surroundings with a holographic alarm, and is studying optical design techniques that can encode various holographic images. If these studies are successful, they can be used to reduce accidents caused by biochemical or gas leaks.

"This newly developed ultra-compact wearable gas sensor provides a more intuitive holographic visual alarm than the conventional auditory or simple light alarms," remarked Prof. Junsuk Rho. "It is anticipated to be especially effective in more extreme work environments where acoustic and visual noise are intense."

Credit: 
Pohang University of Science & Technology (POSTECH)

Scientists call for climate projections as part of more robust biodiversity conservation

image: Scientists note that climate change is expected to impact 58% of montane forest in the Peruvian Andes.

Image: 
Alliance of Bioversity and CIAT / N.Palmer

Scientists have called for the use of climate projections in conservation planning to ensure that the areas most at risk from biodiversity loss and climate impacts are protected. Protected areas are often created in remote locations with low population density rather than for their biodiversity conservation potential. Conservation planning in tropical forests especially tends to be less rigorous, with climate rarely taken into account, they said.

But climate projections and science-based information are critical to actively seek out and safeguard areas where species are most at risk and ecosystems are fast declining, said authors of the paper titled: "Assessment of Potential Climate Change Impacts on Montane Forests in the Peruvian Andes: Implications for Conservation Prioritization," published in the peer-reviewed journal Forests.

The research finds that climate change is expected to impact 58% of montane forest in the Peruvian Andes -- part of the Tropical Andes, the most biodiverse area on the planet -- particularly between 800 and 1,200 meters above sea level. For this reason, conservation plans and policies must take climate change impacts into account, to ensure habitats and species are protected in areas where shocks are expected to be most severe.

Vincent Bax, author and socio-ecological system researcher at HZ University of Applied Sciences, said: "In mountainous areas, habitats change very quickly at different elevations. With climate change, if species are forced to move higher up, they may not be able to quickly adapt to cooler temperatures or higher elevations, and eventually could become extinct. Current conservation plans usually focus on protecting habitats and species as they are now. What we're saying is that we need to use climate projections, and the latest scientific data we have, to protect the most ecologically sensitive areas, now and in future."

Including current and future climate projections in conservation efforts will help build more resilient and adaptive plans globally, to prevent massive habitat destruction, the authors said. They urge policy makers and local, regional and national planners to pursue more efficient planning that delivers more bang for the conservation buck. This means acting now to incorporate rigorous scientific data and proactive solutions to protect changing habitats and species, instead of adopting reactive conservation measures, they noted.

Author and senior Environmental Scientist at the Alliance, Wendy Francesconi, said: "This research approach to conservation planning can be used in other regions and countries, as the methods are applicable to other ecosystems and mountain forests that will likely be affected by climate change. Natural systems are in constant change, and ecosystems either must adapt or go extinct. In that sense, we must approach conservation using an adaptive management framework, as a process that is not static, but constantly changing and evolving."

"Modeling approaches that predict how climate-induced changes -- such as temperature increases, precipitation fluctuations, floods, droughts, fires and others -- affect ecosystems and species populations, help us manage these areas in a proactive manner," said Francesconi. "Such information can help design or manage protected areas in response to changes, facilitating their recovery. Protecting an area is not the end of the story. As climate change and other threats persist, our approach to conservation should be dynamic, creative and innovative."

According to the 2020 global Living Planet Index, between 1970 and 2016, there was a 68% fall in monitored populations of mammals, birds, amphibians, reptiles and fish. "That is a huge loss in biodiversity, and it's alarming," Francesconi added. "Now is the time to come together and help implement measures to reduce this detrimental loss, especially in light of potentially devastating added impacts that climate change can unfold over the next 50 - 100 years if we do not take action."

Augusto Carlos Castro-Nunez, author and senior low emissions food systems scientist at the Alliance, added: "Although this paper focuses on biodiversity conservation and adaptation action, the proposed approaches are useful for all of Peru's efforts to meet commitments under the three Rio conventions, including efforts for Reducing Emissions from Deforestation and Forest Degradation (REDD+), which mostly consider pressure on forests due to anthropogenic causes and do not incorporate climate uncertainties."

Credit: 
The Alliance of Bioversity International and the International Center for Tropical Agriculture

Scientists more confident projecting ENSO changes under global warming

image: The responses to ENSO of boundary layer humidity, rainfall, tropospheric temperature and circulation are amplified in a warmer climate

Image: 
IAP

El Niño-Southern Oscillation (ENSO) is an irregular periodic variation in winds and sea surface temperatures (SSTs) over the tropical eastern Pacific Ocean. It may lead to extreme weather events across the globe due to its ability to change global atmospheric circulation. Thus, determining how ENSO responds to greenhouse warming is crucial in climate science.

However, quantifying and understanding ENSO-related changes in a warmer climate remains challenging due to the complexity of air-sea feedbacks in the tropical Pacific Ocean and to model bias.

An international team of scientists from the Institute of Atmospheric Physics (IAP) of the Chinese Academy of Sciences, the University of Tokyo, and the University of California, San Diego reported that ENSO-related climate variability appears bound to increase under global warming.

Their findings were published in Nature Geoscience on April 15.

Recently, the climate science community has found that ENSO's changes in fact strictly obey some basic physical mechanisms, which can reduce uncertainty in ENSO projections under greenhouse warming.

"The saturation vapor pressure increases exponentially with the increase of temperature, so the same air temperature anomaly will lead to a larger saturation vapor pressure anomaly in a warmer climate," said lead author Dr. HU Kaiming from IAP. "As a result, under global warming, even if ENSO's sea surface temperature remains unchanged, the response of tropical lower tropospheric humidity to ENSO will amplify, which in turn results in major reorganization of atmospheric temperature, circulation and rainfall."
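Dr. Hu's point can be illustrated numerically. The sketch below uses the Magnus approximation to the Clausius-Clapeyron relation, a standard textbook formula rather than anything taken from the paper, to show that the same +1 °C anomaly produces a larger saturation vapor pressure anomaly on a warmer base state:

```python
import math

def saturation_vapor_pressure(temp_c):
    """Approximate saturation vapor pressure (hPa) over water
    using the Magnus formula, valid near surface conditions."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

anomaly = 1.0  # an identical ENSO-driven temperature anomaly (deg C)

# Apply the same +1 C anomaly to a present-day and a warmer base state:
for base in (26.0, 30.0):
    delta = saturation_vapor_pressure(base + anomaly) - saturation_vapor_pressure(base)
    print(f"base {base:.0f} C: vapor-pressure anomaly = {delta:.2f} hPa")
```

Because the relation is exponential, the anomaly on the 30 °C base state comes out roughly 20 percent larger than on the 26 °C base state, even though the temperature perturbation is identical.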

Based on this mechanism, the team deduced an intensification in ENSO-driven anomalies in tropical humidity, tropical rainfall, upper tropospheric temperature in the tropics, and the subtropical jets under global warming. Almost all the latest CMIP5/6 climate model projections agreed well with the theoretical deduction, indicating the mechanism and projections were robust.

"As extreme weather often results from ENSO-induced anomalous atmospheric circulation and temperature, the intensification of ENSO-driven atmospheric variability suggests that the risk of extreme weather will increase in the future," said Dr. HU.

Credit: 
Chinese Academy of Sciences Headquarters

Designing better antibody drugs with artificial intelligence

image: Machine learning helps develop optimal antibody drugs.

Image: 
ETH Zurich

Antibodies are not only produced by our immune cells to fight viruses and other pathogens in the body. For a few decades now, medicine has also been using antibodies produced by biotechnology as drugs. This is because antibodies are extremely good at binding specifically to molecular structures according to the lock-and-key principle. Their use ranges from oncology to the treatment of autoimmune diseases and neurodegenerative conditions.

However, developing such antibody drugs is anything but simple. The basic requirement is for an antibody to bind to its target molecule in an optimal way. At the same time, an antibody drug must fulfil a host of additional criteria. For example, it should not trigger an immune response in the body, it should be efficient to produce using biotechnology, and it should remain stable over a long period of time.

Once scientists have found an antibody that binds to the desired molecular target structure, the development process is far from over. Rather, this marks the start of a phase in which researchers use bioengineering to try to improve the antibody's properties. Scientists led by Sai Reddy, a professor at the Department of Biosystems Science and Engineering at ETH Zurich in Basel, have now developed a machine learning method that supports this optimisation phase, helping to develop more effective antibody drugs.

Robots can't manage more than a few thousand

When researchers optimise an entire antibody molecule in its therapeutic form (i.e. not just a fragment of an antibody), they typically start with an antibody lead candidate that binds reasonably well to the desired target structure. They then randomly mutate the gene that carries the blueprint for the antibody in order to produce a few thousand related antibody candidates in the lab. The next step is to search among them to find the ones that bind best to the target structure. "With automated processes, you can test a few thousand therapeutic candidates in a lab. But it is not really feasible to screen any more than that," Reddy says. Typically, the best dozen antibodies from this screening move on to the next step and are tested for how well they meet additional criteria. "Ultimately, this approach lets you identify the best antibody from a group of a few thousand," he says.

Candidate pool massively increased by machine learning

Reddy and his colleagues are now using machine learning to increase the initial set of antibodies to be tested to several million. "The more candidates there are to choose from, the greater the chance of finding one that really meets all the criteria needed for drug development," Reddy says.

The ETH researchers provided the proof of concept for their new method using Roche's antibody cancer drug Herceptin, which has been on the market for 20 years. "But we weren't looking to make suggestions for how to improve it - you can't just retroactively change an approved drug," Reddy explains. "Our reason for choosing this antibody is because it is well known in the scientific community and because its structure is published in open-access databases."

Computer predictions

Starting out from the DNA sequence of the Herceptin antibody, the ETH researchers created about 40,000 related antibodies using a CRISPR mutation method they developed a few years ago. Experiments showed that 10,000 of them bound well to the target protein in question, a specific cell surface protein. The scientists used the DNA sequences of these 40,000 antibodies to train a machine learning algorithm.

They then applied the trained algorithm to search a database of 70 million potential antibody DNA sequences. For these 70 million candidates, the algorithm predicted how well the corresponding antibodies would bind to the target protein, resulting in a list of millions of sequences expected to bind.

Using further computer models, the scientists predicted how well these millions of sequences would meet the additional criteria for drug development (tolerance, production, physical properties). This reduced the number of candidate sequences to 8,000.
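The screen-then-predict workflow described above can be sketched in miniature. Everything below is a toy stand-in for illustration only: the lead sequence is hypothetical, and a per-position residue-frequency profile takes the place of the deep learning model the ETH group actually trained.

```python
import random
from collections import Counter

random.seed(0)
AA = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def mutate(seq, n_mut=2):
    """Return a copy of seq with n_mut random point mutations."""
    s = list(seq)
    for pos in random.sample(range(len(s)), n_mut):
        s[pos] = random.choice(AA)
    return "".join(s)

lead = "QVQLVESGGGLVQPG"  # hypothetical lead-candidate fragment

# Step 1: a small mutagenized library with binding labels (here labeled
# by closeness to the lead, standing in for actual lab screening).
library = [mutate(lead) for _ in range(2000)]
labels = [sum(a == b for a, b in zip(s, lead)) >= len(lead) - 1 for s in library]

# Step 2: "train" a model -- a per-position residue frequency profile
# built from the binders (a toy stand-in for the neural network).
binders = [s for s, y in zip(library, labels) if y]
profile = [Counter(col) for col in zip(*binders)]

def score(seq):
    """Mean per-position frequency of seq's residues among known binders."""
    return sum(profile[i][c] for i, c in enumerate(seq)) / (len(seq) * len(binders))

# Step 3: score a much larger in-silico library and keep the top predictions.
big_library = [mutate(lead, 3) for _ in range(50000)]
predicted = sorted(big_library, key=score, reverse=True)[:100]
print(f"{len(binders)} binders used for training; "
      f"top candidate score = {score(predicted[0]):.2f}")
```

The point of the sketch is the asymmetry of scale: the experimentally labeled set stays small, while the computational scoring pass can rank a library orders of magnitude larger, exactly as in the jump from 40,000 tested antibodies to 70 million scored sequences.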

Improved antibodies found

From the list of optimised candidate sequences on their computer, the scientists selected 55 sequences from which to produce antibodies in the lab and characterise their properties. Subsequent experiments showed that several of them bound even better to the target protein than Herceptin itself, as well as being easier to produce and more stable than Herceptin. "One new variant may even be better tolerated in the body than Herceptin," says Reddy. "It is known that Herceptin triggers a weak immune response, but this is typically not a problem in this case." However, it is a problem for many other antibodies, and preventing such immune responses is an essential part of drug development.

The ETH scientists are now applying their artificial intelligence method to optimise antibody drugs that are in clinical development. To this end, they recently founded the ETH spin-off deepCDR Biologics, which partners with both early stage and established biotech and pharmaceutical companies for antibody drug development.

Credit: 
ETH Zurich

Water purification system engineered from wood, with help from a microwave oven

image: A hydrogel before and after adsorption of methylene blue in an aqueous solution.

Image: 
Giuseppe Melilli

Researchers in Sweden have developed a more eco-friendly way to remove heavy metals, dyes and other pollutants from water. The answer lies in filtering wastewater with a gel material taken from plant cellulose and spiked with small carbon dots produced in a microwave oven.

Reporting in the journal Sustainable Materials and Technologies, researchers from KTH Royal Institute of Technology, in collaboration with Politecnico di Torino, engineered a more sustainable technique for producing hydrogel composites, a type of material that is widely studied for wastewater decontamination.

Minna Hakkarainen, who leads the Division of Polymer Technology at KTH Royal Institute of Technology, says that the hydrogels remove contaminants such as heavy metal ions, dyes and other common pollutants.

"The total amount of water on Earth doesn't change with time, but demand does," she says. "These all-lignocellulose hydrogels offer a promising, sustainable solution to help ensure access to clean water."

The hydrogel composites can be made from 100 percent lignocellulose, or plant matter - the most abundant bioresource on Earth, she says.

One ingredient is cellulose gum (carboxymethyl cellulose, or CMC), a thickener and emulsifier commonly derived from wood pulp or cotton processing byproducts and used in various food products, including ice cream. Added to the hydrogel are graphene oxide-like carbon dots synthesized from biomass with the help of microwave heat. The hydrogel composites are then cured with UV light, a mild process that takes place in water at room temperature.

Hydrogels consist of a network of polymer chains that not only absorb water, but also collect molecules and ions by means of electrostatic interactions - a process known as adsorption. Hakkarainen says the new process also reinforces the stability of the hydrogel composites so that they can outlast ordinary hydrogels for repeated cycles of water purification.
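Adsorption of this kind is commonly quantified with a Langmuir isotherm, which relates the amount of pollutant captured per gram of gel to the pollutant's equilibrium concentration in solution. The article does not state which model the KTH team used, so the formula and parameters below are purely illustrative:

```python
def langmuir_q(c_eq, q_max, k_l):
    """Langmuir isotherm: adsorbed amount q (mg/g) at equilibrium
    concentration c_eq (mg/L), given capacity q_max and affinity k_l."""
    return q_max * k_l * c_eq / (1 + k_l * c_eq)

# Illustrative (assumed) parameters for a dye such as methylene blue:
q_max, k_l = 120.0, 0.05   # mg/g, L/mg

for c in (10, 50, 200):
    print(f"c_eq = {c:3d} mg/L -> q = {langmuir_q(c, q_max, k_l):6.1f} mg/g")
```

The model captures the saturation behaviour typical of adsorbents: uptake rises steeply at low concentrations and levels off toward the capacity q_max as the available binding sites fill.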

Graphene oxide has become a favored additive to this mix, because of its high adsorption capacity, but the environmental cost of graphene oxide production is high.

"Graphene oxide is a great adsorbent, but the production process is harsh," she says. "Our route is based on common bio-based raw materials and significantly milder processes with less impact on the environment."

Graphene is derived from graphite, a crystalline form of carbon that most people would recognize as the "lead" in pencils. In oxidized form it can be used in hydrogels, but the oxidation process requires harsh chemicals and conditions. Synthesizing graphene from biomass often requires temperatures of up to 1,300°C.

By contrast, the researchers at KTH found a way to carbonize biomass at much lower temperatures. They reduced sodium lignosulfonate, a byproduct of wood pulping, into carbon flakes by heating it in water in a microwave oven. The water is brought to 240°C and kept at that temperature for two hours.

Ultimately, after a process of oxidation, they produced carbon dots of about 10 to 80 nanometers in diameter, which are then mixed with the methacrylated CMC and treated with UV light to form the hydrogel.

"This is a simple, sustainable system," Hakkarainen says. "It works as well, if not better, than hydrogel systems currently in use."

Credit: 
KTH, Royal Institute of Technology

Investigating heavy quark physics with the LHCb experiment

A new review published in EPJ H by Clara Matteuzzi, Research Director at the National Institute for Nuclear Physics (INFN) and former tenured professor at the University of Milan, and her colleagues, examines almost three decades of the LHCb experiment - from its conception to operation at the Large Hadron Collider (LHC) - documenting its achievements and future potential.

The LHCb experiment was originally conceived to understand the symmetry between matter and antimatter and where this symmetry is broken - known as charge conjugation parity (CP) violation. Whilst this may seem like quite an obscure area of study, it addresses one of the Universe's most fundamental questions: how did it come to be dominated by matter when it should have equally favoured antimatter?

"LHCb wants to study by which mechanism our universe, as we see it today, is made of matter, and how antimatter disappeared despite an initial symmetry between the two states," says Matteuzzi. "The Standard Model contains a tiny amount of violation of this symmetry, whilst the observation of the universe implies a much larger one. This is one of the most fascinating open questions in the Particle Physics field."

The LHCb experiment investigates this problem by studying the behaviour of systems and particles made from so-called heavy quarks. These are produced in abundance by highly energetic collisions - explaining why the LHC is the perfect location to study them - and were also abundant in the highly energetic early Universe.

"The field in which the LHCb is active is so-called 'heavy quarks physics' which aims to study and understand the behaviour of the particles containing the c and b heavy quarks - usually named charm and beauty quarks," says Matteuzzi. "The rich sector - spectroscopy - covered by LHCb is how quarks of different types, or flavours, aggregate together to form particles in a way that is analogous to how 'Up' and 'Down' quarks in different combinations make protons and neutrons."

"It became clear that the potentiality of the LHCb detector was in other fields beyond the study of CP violation that also hinged on aspects of heavy quark interaction. One was the spectacular success of spectroscopy and the measurement of many new states composed by heavy quarks," concludes Matteuzzi. "This incredibly rich variety of results is demonstrated in our paper - we hope!"

Credit: 
Springer

Understanding the growth of disease-causing protein fibres

image: Transmission Electron Micrograph of fibrils from the protein alpha-synuclein, which is associated with Parkinson's disease.

Image: 
University of Bath

Amyloid fibrils are deposits of proteins in the body that join together to form microscopic fibres. Their formation has been linked to many serious human diseases including Alzheimer's, Parkinson's and Type 2 diabetes.

Until now, scientists have been unable to reliably measure the speed of fibril growth, as there have been no tools that could directly measure growth rate in solution. However, researchers from the UK's University of Bath and the ISIS Neutron and Muon Source have now invented a technique that does just that. Results from their study are published in RSC Chemical Biology.

"This is an important breakthrough, as information on fibre growth is key to understanding the diseases associated with amyloid fibrils," said Dr Adam Squires from the Department of Chemistry at Bath, and study co-author. "Knowing what makes these fibres grow faster or slower, or whether they break and what makes them break - in other words, understanding these fibres at a molecular level - could eventually have implications for researchers looking for treatments for these serious diseases."

He added: "This new technique will also help scientists investigating non-medical roles of protein folding and self-assembly - for instance, in biological processes such as inheritance in yeast, or for research into new nanomaterials."

WHY GROWTH RATE IS BEST MEASURED IN SOLUTION

Most experimental techniques for measuring fibril growth in solution only measure how fast proteins transform into fibril material overall, not how long each fibril is or how fast it is growing. Other techniques measure just one fibril attached to a surface such as glass or mica. These conditions do not reflect the real biological process, which occurs in solution.

Researchers for the new study used Small Angle Neutron Scattering (SANS) to study the growth rate and length of amyloid fibrils as they assembled in solution. By using the unique ways neutrons interact with hydrogen and its isotope deuterium, the researchers were able to use 'contrast matching' to make all of the fibrils invisible to neutrons apart from the growing tips. Using the SANS2D instrument at the ISIS neutron facility, they watched these tips become longer in real time. This gave a direct measurement of the growth rate, which had never been done before.
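Conceptually, the direct measurement amounts to tracking mean fibril length over time and taking the slope. The sketch below uses invented numbers purely for illustration; the study's actual SANS data are not reproduced here.

```python
# Toy time series of mean fibril length (nm) at successive times (min).
times = [0, 10, 20, 30, 40, 50]
lengths = [12.0, 31.5, 52.0, 71.8, 92.1, 111.9]

# Least-squares slope = growth rate (nm/min), via the closed-form formula.
n = len(times)
mean_t = sum(times) / n
mean_l = sum(lengths) / n
slope = sum((t - mean_t) * (l - mean_l) for t, l in zip(times, lengths)) \
        / sum((t - mean_t) ** 2 for t in times)
print(f"estimated growth rate: {slope:.2f} nm/min")
```

For these illustrative numbers the fit recovers a growth rate of about 2 nm per minute; in the real experiment the lengths come from the neutron scattering signal of the contrast-matched growing tips rather than from direct imaging.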

The results of growth rate from this study align with values estimated from other methods, indicating that SANS is a suitable tool for measuring amyloid fibril growth.

The technique also allowed the researchers to measure the number of fibril ends present in a given sample. This information told them how many separate fibres were growing, and the length of each one. The fragility of fibrils from different proteins, and how often they break into shorter fragments exposing more growing ends, is a key part of the puzzle to understand fibril disease propagation.

Lead researcher Dr Ben Eves carried out the experiments at Bath as part of his ISIS Facility Development studentship.

"I'm thrilled with the success of this method," he said. "Developing this technique was a truly amazing experience. Understanding the growth of amyloid fibrils is fundamental to understanding their pathogenic, biological and technological properties."

He added: "In future, I believe this technique could be used to investigate the effect of different factors that affect the growth rate of amyloid fibrils, as well as to measure the impact of therapeutic molecules (the building blocks of medicines) designed to slow down or prevent the growth of amyloid fibrils."

Credit: 
University of Bath

Agricultural trade across US states can mitigate economic impacts of climate change

image: Sandy Dall'Erba, agricultural economist at University of Illinois, studied how U.S. interstate trade can mitigate the economic effects of climate change on agricultural production.

Image: 
University of Illinois.

URBANA, Ill. - Agricultural producers deal firsthand with changing weather conditions, and extreme events such as drought or flooding can impact their productivity and profit. Climate change models project such events will occur more often in the future. But studies of the economic consequences of weather and climate on agriculture typically focus on local impacts only.

A new study from the University of Illinois looks at how changes in weather - including extreme events - may decrease crop profit in one state while increasing profits in other states. The secret ingredient: U.S. interstate trade. It is expected to mitigate the economic impact of climate change by up to $14.5 billion by the middle of the century.

"Our motivation for the study is twofold: Climate change brings about more frequent and intense extreme weather events, which impact agricultural production. At the same time, U.S. and global populations are growing. We need to plan for an additional 64 million people in the U.S. and 1.9 billion people worldwide by 2050, which raises concerns for future food security," says Sandy Dall'Erba, professor in the Department of Agricultural and Consumer Economics (ACE) and director of the Regional Economics Applications Laboratory (REAL) at U of I. Dall'Erba is lead author on the study, published in the American Journal of Agricultural Economics.

Researchers typically have a dire view of extreme weather effects on agricultural production. For instance, the 2012 drought in the Midwest reduced corn production by 20% in Iowa, 34% in Illinois, and 16% in Nebraska compared to the previous year. This had a disastrous impact on crop producer profits in these states.

"However, crop producers in states that did not experience drought at all or to a much smaller extent, like Minnesota, saw record profits," Dall'Erba notes. "This is because the price of crops went up that year, and the food supply chain, from livestock in Louisiana and Texas to food processing plants throughout the country, needed the raw products to carry on with their activities. As a result, drought events outside of your own state can benefit you."

Dall'Erba and study co-authors Zhangliang Chen and Noé Nava, graduate students in ACE, conducted their research within the U.S. because the country is so large that a particular weather event is unlikely to affect every state simultaneously.

"Furthermore, U.S. interstate trade has been relatively understudied - in part because data were not available until recently - even though we use around 90% of agricultural products domestically, for livestock and food processing," Dall'Erba says.

The researchers analyzed trade flows between U.S. states, based on data from the U.S. Economic Census Bureau's commodity flow survey at four time points: 1997, 2002, 2007 and 2012. They focused on crop and vegetable trade because these commodities, unlike livestock and food processing, are directly affected by weather events.

As expected, they found drought reduces the capacity of a state to sell its commodities to other states, while it increases the state's demand for imports of crops, fruits, and vegetables from other states.

"You see how a drought event affects not only yourself, but also all the places you export to and all the places you import from. The reason is that activities down the value chain, such as livestock feeding, food manufacturing, and human consumption carry on at the same level so if people cannot get their raw products from their usual producer state, they will get them from another state," Dall'Erba notes.

While agricultural profits are very sensitive to weather-induced changes in trade, not all states will experience the effects equally. Some states depend on trade more than others; for example, Illinois, Minnesota, California, and Nebraska are the largest importers and exporters in the domestic trade system, Dall'Erba explains.

The authors also projected future weather conditions based on four commonly used global and regional climate models.

"Without trade, the climate change impact forecast indicates a loss of $11.2 billion in agricultural profits at the national level by 2050. That estimate accounts only for the negative impact of a local drought, which is how current impact forecasts are usually done. However, when we include trade in the analysis, we discover that its capacity to mitigate climate change is worth $14.5 billion. This means we project a $3.3 billion gain nationwide. It's a complete shift in paradigm compared to current forecasts," Dall'Erba states.
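The arithmetic behind these headline figures can be checked directly (values in USD billions at the national level, rounded as quoted above):

```python
# The study's headline numbers (USD billions, national level, by 2050):
loss_without_trade = -11.2   # projected profit change ignoring interstate trade
trade_mitigation   = 14.5    # value of trade's capacity to offset those losses

net_effect = loss_without_trade + trade_mitigation
print(f"projected national profit change with trade: {net_effect:+.1f} billion USD")
```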

Dall'Erba and his research team are working on a website that will allow anyone to explore the data for themselves. Users can enter information about states, crops, weather, and trade to create their own forecasts and estimates.

While the current project focuses on U.S. interstate trade, Dall'Erba says the approach can also apply to global weather events and international trade as well as other sources of disruption in the supply chain such as diplomatic events or infrastructure vulnerability.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Novel muscular dystrophy gene connects to a key biological pathway

MINNEAPOLIS/ST.PAUL (04/15/2021) -- New research from the University of Minnesota Medical School found mutations in a novel gene that may help identify patients with a specific form of muscular dystrophy.

The laboratory of Peter B. Kang, MD, the new director of the Paul & Sheila Wellstone Muscular Dystrophy Center at the U of M Medical School, studies the genetics and disease mechanisms of muscular dystrophy. It uses cutting-edge genomic methods to discover disease-causing mutations in patients who cannot find answers via clinical genetic test facilities.

The Kang laboratory, together with collaborators at the Université Libre de Bruxelles, found a novel gene associated with muscular dystrophy: JAG2. They led an international coalition of scientists and physicians to describe 13 families from around the world affected by muscular dystrophy who harbor disease-causing mutations in JAG2. The group also found a distinct pattern of abnormalities on muscle MRI that may help identify other patients with this specific form of muscular dystrophy in the future. This work was begun while Kang was at the University of Florida and completed after his arrival at the U of M Medical School.

Kang, who is also a professor and vice chair of research in the Department of Neurology at the Medical School, is the senior author of the study published today in the American Journal of Human Genetics. The major findings are:

Mutations in the gene JAG2 cause a form of muscular dystrophy;

This form of muscular dystrophy may be found in a number of countries and ethnic groups;

This form of muscular dystrophy is often accompanied by a distinct pattern of abnormalities on muscle MRI;

And, evidence for interactions between the protein product Jagged2 and the Notch signaling pathway suggests new areas of exploration for potential therapeutic development.

The Notch signaling pathway is a critically important biological pathway that is involved with the regulation of developmental and healing processes in a number of organs, including skeletal muscle.

"We hope that our findings lead to JAG2 being included in genetic testing panels for muscular dystrophy and muscle MRI being a more routine diagnostic procedure for patients with suspected muscular dystrophy who do not have a clear diagnosis on initial tests," Kang said.

Kang and his team plan to continue exploring both small molecule and cutting-edge molecular therapeutic strategies that target the interactions between Jagged2 and the Notch signaling pathway, with the expectation that they will discover novel treatments for muscular dystrophy.

Credit: 
University of Minnesota Medical School

Thirdhand smoke exposure linked to fabric type, heat, and humidity

image: Thirdhand smoke refers to the residues left behind by smoking.

Image: 
University Communications, UC Riverside.

RIVERSIDE, Calif. -- A study led by scientists at the University of California, Riverside, has found chemicals in thirdhand smoke, or THS, get extracted more readily from household fabrics in a humid environment than in a dry one.

"This could have implications for human exposure to THS chemicals in areas where there is high humidity," said Prue Talbot, a professor of cell biology at UCR, who led the study published in the International Journal of Environmental Research and Public Health. "Our work shows that people living in humid environments, such as Florida, will receive greater THS exposure than those living in dry environments."

To rapidly cross-compare THS concentrations from different samples, labs, and environments, the researchers developed a simple method based on autofluorescence -- the natural emission of light -- of tobacco tar and total particulate matter in smoke. The method, which can be used to evaluate changes in the THS content of common household fabrics, showed the removal of THS from indoor environments also depends on a number of factors such as the type of household fabric and the chemicals in THS.

THS is created when exhaled smoke and smoke emanating from the tips of burning cigarettes settle on surfaces such as clothing, hair, furniture, and cars. Not strictly smoke, THS refers to the residues left behind by smoking. Children and toddlers are particularly vulnerable to THS due to their low body mass and frequent contact with indoor surfaces. Nicotine is a major component of THS.

"We introduced a rapid method to assess THS contaminants in household fabrics such as cotton, terry cloth, polyester, and wool," said Giovanna L. Pozuelos, the first author of the study and a graduate student in Talbot's lab. "Until now, there was no rapid method for making such comparisons. Absorption and extraction of THS chemicals depend on the chemical of interest, the fabric it has absorbed to, the temperature of extraction, and the ambient humidity during sorption. All these factors affect human exposure to THS."

The authors argue that understanding the dynamics of THS in fabrics can guide appropriate remediation policies to protect humans from exposure. Their findings, they add, can help develop more effective remediation methods for THS contaminated environments.

Pozuelos explained that THS chemicals absorbed by polyester tend to bind tightly and cannot be easily extracted under dry conditions.

"Under humid conditions, however, THS becomes more extractable from polyester," she said. "Our experiments also showed that cotton and terry cloth released higher concentrations of nicotine than polyester and wool carpet."

The new method relies on the use of a fluorescence spectrophotometer, which the authors note is easy to operate, requires no special training, can perform analysis in minutes, and is inexpensive. They note, too, that most labs can easily adapt the autofluorescence method with minimal expense.
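In practice, a fluorescence-based assay of this kind rests on a calibration curve: intensity is measured for standards of known concentration, fitted to a line, and the line is inverted to read off the concentration of an unknown sample. The numbers below are hypothetical, for illustration only:

```python
# Hypothetical calibration data: autofluorescence intensity (a.u.) measured
# for extracts of known THS particulate concentration (ug/mL).
conc = [0.0, 5.0, 10.0, 20.0, 40.0]
intensity = [2.1, 52.3, 103.0, 201.8, 405.5]

# Least-squares line: intensity = a * conc + b, then invert for an unknown.
n = len(conc)
mc = sum(conc) / n
mi = sum(intensity) / n
a = sum((c - mc) * (i - mi) for c, i in zip(conc, intensity)) \
    / sum((c - mc) ** 2 for c in conc)
b = mi - a * mc

unknown_intensity = 150.0
estimated_conc = (unknown_intensity - b) / a
print(f"estimated THS concentration: {estimated_conc:.1f} ug/mL")
```

This closed-form fit needs nothing beyond the spectrophotometer readings themselves, which is consistent with the authors' point that the method requires no special training and minimal equipment.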

Next, the team plans to examine the concentration of THS chemicals at different field sites, such as homes and casinos, and to determine how easily these chemicals can be removed from the fabrics found there.

Credit: 
University of California - Riverside