Tech

Many forests scorched by wildfire won't bounce back

With flames racing across hundreds of square miles throughout Colorado and California this summer and a warming climate projected to boost wildfire activity across the West, residents can't help but wonder what our beloved forests will look like in a few decades.

A new University of Colorado Boulder-led study offers an unprecedented glimpse, suggesting that when forests burn across the Southern Rocky Mountains, many will not grow back and will instead convert to grasslands and shrublands.

"We project that post-fire recovery will be less likely in the future, with large percentages of the Southern Rocky Mountains becoming unsuitable for two important tree species--ponderosa pine and Douglas fir," said lead author Kyle Rodman, who conducted the study while a PhD student in the Department of Geography.

Previous CU Boulder studies have looked at individual fire sites, including the site of the 2000 Walker Ranch fire in Boulder County, and found that forests recovered slowly or not at all. Even 15 years post-fire, as many as 80% of the plots the researchers surveyed still contained no new trees.

Rodman and his team of coauthors--including scientists from the U.S. Forest Service, Northern Arizona University, Colorado State University and the University of North Carolina Wilmington--wanted to build on those studies, projecting the future by looking at the past.

To that end, they looked at 22 burned areas encompassing 710 square miles from southern Wyoming through central and western Colorado to northern New Mexico. The team focused on ponderosa pine and Douglas fir forests, which make up about half of the forested area in the region.

"For those of us who live along Colorado's Front Range, these are the trees that we see, live near and recreate in on a daily basis," said Rodman.

Higher elevations, lower temperatures fare better

The study, published in the journal Global Ecology and Biogeography, included regions that had burned as long ago as 1988, and land ravaged by the 2002 Hayman Fire near Colorado Springs; the 1996 Buffalo Creek Fire southwest of Denver; the 2000 Eldorado Springs and Walker Ranch fires near Boulder; and the 2002 Missionary Ridge fire outside of Durango.

Using satellite images and on-the-ground measurements, the scientists first reconstructed what the forests looked like prior to the fire. Then, by counting juvenile trees and looking at tree rings, they assessed how well the forests were recovering. Not surprisingly, those at higher elevations, with lower temperatures and more precipitation, fared better. Those with more surviving trees nearby (which can spread their seeds via wind and water) were also more likely to rebound.

Meanwhile, lower-elevation forests, like those south of Pueblo or in portions of the Front Range foothills, proved less resilient.

And compared to regions that burned in the 19th and early 20th centuries, the more recent burn areas failed to bounce back.

"This study and others clearly show that the resilience of our forests to fire has declined significantly under warmer, drier conditions," said coauthor Tom Veblen, professor of geography at CU Boulder.

The team then used statistical modeling to project what might happen in the next 80 years if montane forests of ponderosa pine and Douglas fir were to burn under different scenarios. In one scenario, humans do nothing to reduce greenhouse gas emissions, and climate change escalates unchecked. In another, considered a "moderate emissions scenario," emissions begin to decline after 2040.

'The future is not written in stone'

Currently, the team estimates that about half of its study area is suitable for post-fire "recovery," meaning trees there may return to at least their lowest densities from the 1800s.

By 2051, under the moderate emissions scenario, less than 18% of Douglas fir and ponderosa pine forests will likely recover if burned. Under the higher emissions scenario, that number dips to 6.3% for Douglas fir and 3.5% for ponderosa pine forests.

Meanwhile, Veblen notes, the number and intensity of wildfires will continue their steady rise. The number of acres burned annually across the country has already doubled since the 1990s.

"The big takeaway here is that we can expect to have an increase in fire continue for the foreseeable future, and, at the same time, we are going to see much of our land convert from forest to non-forest," said Veblen.

Rodman, now a postdoctoral research associate at the University of Wisconsin-Madison, hopes the database of post-fire recovery he and his team have created can help land managers better plan where to invest their resources, or not, after a fire.

For instance, they may be better off planting seedlings in regions more likely to bounce back, rather than planting them in dry sites no longer suitable for the trees' survival.

He also hopes the projections spelled out in the paper give people one more reason to care about climate change.

"This was a hard study to write and can be a bit depressing to read, but there are some positive takeaways," he said. "If we can get a handle on some of these trends and reduce our greenhouse gas emissions, the outcomes may not look so dire. The future is not written in stone."

Credit: 
University of Colorado at Boulder

Eat more to grow more arms...if you're a sea anemone

image: These four images show the development process of the characteristic tentacle arms of a sea anemone.

Image: 
Anniek Stokkermans/EMBL

Your genetic code determines that you will grow two arms and two legs; limb number is likewise genetically fixed in all mammals. Similarly, the number of fins a fish has and the number of legs and wings an insect has are embedded in their genetic code. Sea anemones, however, defy this rule and have a variable number of tentacle arms.

Until now it's been unclear what regulates the number of tentacles a sea anemone can grow. Scientists from the Ikmi group at EMBL Heidelberg, in collaboration with researchers in the Gibson lab at the Stowers Institute for Medical Research in Kansas City, have shown that the number of tentacles is defined by the amount of food consumed. "Controlling the number of tentacle arms by food intake makes the sea anemone behave more like a plant developing new branches than an animal growing a new limb," explains group leader Aissam Ikmi. Defining what environmental factors trigger morphological changes is a particularly important question given the longevity of sea anemones, with some species living for more than 65 years. "As predominantly sessile animals, sea anemones must have evolved strategies to deal with environmental changes to sustain such a long lifespan," adds Ikmi.

The scientists have shown that the growth of new tentacles happens not only when the sea anemone is a juvenile, but also throughout adulthood. "We can conclude that the number of tentacle arms must be determined by the interplay between genetic and environmental factors," says Ikmi, who started this project when he was still a postdoc in the lab of Matt Gibson. While the sea anemone uses different strategies to build tentacles in the different stages of its life, the final arms are morphologically indistinguishable from each other. "If humans could do the same, it would mean that the more we ate, the more arms and legs we could grow," says Ikmi. "Imagine how handy it would be if we could activate this when we needed to replace damaged limbs."

When Ikmi's group studied the locations at which the new arms form, they found that muscle cells pre-mark the sites of new tentacles. These muscle cells change their gene expression signature in response to food. The same molecular signalling employed to build tentacles in sea anemones also exists in many other species - including humans. So far, however, its role has been studied mainly in embryonic development. "We propose a new biological context in which to understand how nutrient uptake impacts the function of this developmental signalling: a situation that is relevant for defining the role of metabolism in guiding the formation of organs during adulthood," explains Ikmi. "Sea anemones show us that it is possible that nutrients are not converted into excess fat storage - as is the case in mammals - but instead transformed into a new body structure."

While this finding is novel on its own, it also shows that sea anemones, which are traditionally used for evolutionary developmental studies, are well suited to study morphogenesis in the context of organism-environment interactions.

To build the branching map of new tentacles, researchers analysed more than 1000 sea anemones one by one. "Scoring such a massive number of tentacles is, in some ways, a story in itself," says Mason McMullen, laughing. McMullen, a clinical pharmacist at the University of Kansas Health System, spent months imaging sea anemones' heads to score their tentacle number and location.

Knowing that the number of tentacles in sea anemones is determined by their food intake, the group plans to define the key nutrients critical to this process. Ikmi and his group also want to further investigate the unconventional role of muscles in defining the sites where new tentacles form. "We're currently investigating this novel property of muscle cells and are eager to find out the mystery behind them," he concludes.

Credit: 
European Molecular Biology Laboratory

Circadian rhythms help guide waste from brain

New research details how the complex set of molecular and fluid dynamics that make up the glymphatic system - the brain's unique process of waste removal - is synchronized with the master internal clock that regulates the sleep-wake cycle. These findings suggest that people who rely on sleeping during daytime hours are at greater risk for developing neurological disorders.

"These findings show that glymphatic system function is not solely based on sleep or wakefulness, but by the daily rhythms dictated by our biological clock," said neuroscientist Maiken Nedergaard, M.D., D.M.Sc., co-director of the Center for Translational Neuromedicine at the University of Rochester Medical Center (URMC) and senior author of the study, which appears in the journal Nature Communications.

The findings add to a growing understanding of the operation and function of the glymphatic system, the brain's self-contained waste removal process, which was first discovered in 2012 by researchers in Nedergaard's lab. The system consists of a network of plumbing that follows the path of blood vessels and pumps cerebrospinal fluid (CSF) through brain tissue, washing away waste. Research a few years later showed that the glymphatic system primarily functions while we sleep.

Since those initial discoveries, Nedergaard's lab and others have shown the role that blood pressure, heart rate, circadian timing, and depth of sleep play in the glymphatic system's function and the chemical signaling that occurs in the brain to turn the system on and off. They have also shown how disrupted sleep or trauma can cause the system to break down and allow toxic proteins to accumulate in the brain, potentially giving rise to a number of neurodegenerative diseases, such as Alzheimer's.

The link between circadian rhythms and the glymphatic system is the subject of the new paper. Circadian rhythms - a 24-hour internal clock that regulates several important functions, including the sleep-wake cycle - are maintained in a small area of the brain called the suprachiasmatic nucleus.

In the new study, which was conducted in mice, the researchers showed that when the animals were anesthetized all day long, their glymphatic system still functioned only during their typical rest period - mice are nocturnal, so their sleep-wake cycle is the opposite of humans'.

"Circadian rhythms in humans are tuned to a day-wake, night-sleep cycle," said Lauren Hablitz, Ph.D., first author of the new study and a research assistant professor in the URMC Center for Translational Neuromedicine. "Because this timing also influences the glymphatic system, these findings suggest that people who rely on cat naps during the day to catch up on sleep or work the night shift may be at risk for developing neurological disorders. In fact, clinical research shows that individuals who rely on sleeping during daytime hours are at much greater risk for Alzheimer's and dementia along with other health problems."

The study singles out cells called astrocytes, which serve multiple functions in the brain. It is believed that astrocytes in the suprachiasmatic nucleus help regulate circadian rhythms. Astrocytes also serve as gatekeepers that control the flow of CSF throughout the central nervous system. The results of the study suggest that communication between astrocytes in different parts of the brain may share the common goal of optimizing the glymphatic system's function during sleep.

The researchers also found that during wakefulness, the glymphatic system diverts CSF to lymph nodes in the neck. Because the lymph nodes are key waystations in the regulation of the immune system, the research suggests that CSF may represent a "fluid clock" that helps wake up the body's infection fighting capabilities during the day.

"Establishing a role for communication between astrocytes and the significant impacts of circadian timing on glymphatic clearance dynamics represent a major step in understanding the fundamental process of waste clearance regulation in the brain," said Frederick Gregory, Ph.D., program manager for the Army Research Office, which helped fund the research and is an element of the U.S. Army Combat Capabilities Development Command's Army Research Laboratory. "This knowledge is crucial to developing future countermeasures that offset the deleterious effects of sleep deprivation and addresses future multi-domain military operation requirements for Soldiers to sustain performance over longer periods without the ability to rest."

Credit: 
University of Rochester Medical Center

Climate change could increase rice yields

image: Many people across the globe rely on rice as a source of nutrition.

Image: 
Rachel Schutte

Rice is the most consumed staple food in the world. It is especially common in Asia, where hunger concerns are prevalent.

Rice is classified as an annual plant, which means it completes its life cycle within one growing season then dies. However, in some tropical areas, rice can continue to grow year after year when taken care of properly.

Just as grass grows back in a lawn after it is mowed, rice can be cut after it is harvested, and the plant will regrow. The farming practice of cutting the rice above ground and allowing it to regrow is called ratooning.

Although rice ratooning allows farmers to harvest more rice from the same fields, it requires a longer growing season than traditional single-harvest rice farming.

In many areas of the world where rice is grown, a long growing season isn't a problem due to the tropical climates. But in Japan, cooler weather means rice ratooning has been a rare farming practice.

Hiroshi Nakano and a research team set out to learn more about the potential of ratooning to help Japanese rice farmers. Nakano is a researcher at the National Agriculture and Food Research Organization.

Average temperatures in Japan have been higher in recent years. As climate change continues to affect the region, rice farmers may have a longer window for growing rice. "Rice seedlings will be able to be transplanted earlier in the spring, and farmers can harvest rice later into the year," explains Nakano.

"The goal of our research is to determine the effects of harvest time and cutting height of the first harvest on the yield of the first and second rice crops," says Nakano. "Ultimately, we want to propose new farming strategies to increase yield as farmers in southwestern Japan adjust to climate change."

During the study on rice ratooning, researchers compared two harvest times and two cutting heights of the first crop. After the first harvest, they collected the seeds from the cut-off portions of the rice plants. Researchers measured the yield by counting and weighing the seeds. The second harvest of rice was done by hand, and the yield was determined in the same way.

The total grain yield and the yields from the first and second crops were different depending on the harvest times and cutting heights. This wasn't too surprising, since the team already knew harvest time and height affected yield.

Rice plants harvested at the normal time for the first crop yielded more seed than the rice plants harvested earlier. "That's because the plants had more time to fill their spikelets with seed," explains Nakano.

"At both harvest times, rice harvested at the high cutting height had a higher yield than the low cutting height," says Nakano. That's because the plants cut at a higher height had access to more energy and nutrients stored in their leaves and stems.

"Our results suggest that combining the normal harvest time with the high cutting height is important for increasing yield in rice ratooning in southwestern Japan and similar climate regions," says Nakano. "This technology will likely increase rice grain yield in new environments that arise through global climate change."

Credit: 
American Society of Agronomy

Common sunscreen ingredients prove dangerous for freshwater ecosystems

image: Water flea, Daphnia magna, roughly 5 mm across.

Image: 
Photo Aaron Boyd

The active ingredients found in sunscreen have detrimental effects on freshwater ecosystems, according to new research by University of Alberta biologists.

The results show that long-term exposure to ultraviolet (UV) filters--including avobenzone, oxybenzone, and octocrylene--is lethal for some organisms living in freshwater environments. One of the largest sources of UV-filter contamination in both marine and freshwater environments is sunscreen leaching off the skin while swimming.

"We do know that UV-filters are particularly devastating to coral reefs and cause bleaching, but there has been almost no research on what the effects are to freshwater animals," explained Aaron Boyd, graduate student in the Department of Biological Sciences and lead author on the paper. "To address this, we examined the effects of UV-filters in the water flea, Daphnia magna."

The results show that exposure to UV-filters over a 48-hour period prevented the fleas from navigating through their environment. Exposure over a 14-day period--similar to what might occur near popular beach areas--proved lethal for the tiny crustaceans.

"This is particularly bad for a freshwater ecosystem as a whole, as Daphnia are an important part of the food chain for many smaller species of fish," added Boyd, who completed this research in collaboration with graduate student Connor Stewart, under the supervision of Assistant Professor Tamzin Blewett and Professor Keith Tierney. "Losing a Daphnia population would put all of the species that rely on them at risk of starvation, and in certain conditions could cause the local ecosystem to collapse."

The good news, Boyd explained, is that the fleas were able to recover their ability to navigate through the water once the contamination was removed--a good sign for environmental recovery. "These chemicals are short-lived in the environment, so if we remove the sources of pollution, then there is a reasonable chance for the organisms in those environments to recover," he said.

Further research is required to better understand the long-term impact of UV-filters--and research continues in the search for non-toxic UV filters.

Credit: 
University of Alberta

Combining PCR and antibody tests at point of care dramatically increases COVID-19 detection

A Cambridge hospital has piloted the use of combined rapid point-of-care nucleic acid and antibody testing for SARS-CoV-2 infection after researchers at the University of Cambridge showed that this approach was superior to virus detection alone for diagnosing COVID-19 disease.

Point-of-care testing - in other words, testing patients as soon as they arrive at the hospital - is essential for enabling healthcare workers to rapidly diagnose patients and direct those who test positive for infection to dedicated wards. A recent study showed that SAMBA II, a new point-of-care PCR test for SARS-CoV-2 developed by Cambridge researchers, was able to dramatically reduce time spent on COVID-19 'holding' wards - allowing patients to be treated or discharged far quicker than with current lab testing set-ups.

PCR tests involve extracting a minuscule amount of RNA from the virus and copying it millions of times, creating an amount large enough to confirm presence of the virus. The virus is captured through a swab inside the nostrils and at the back of the throat. However, it can take as long as 14 days for an individual to show symptoms of COVID-19, by which time the virus may have moved away from the nose and throat and into the lungs and other tissues and organs, making it harder to detect via a swab test. As a result, studies have shown that PCR tests can miss as many as half of infected patients five days after infection.
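
The "copying" step is exponential: each PCR thermal cycle roughly doubles the number of copies of the target sequence. As standard background (the article does not give cycle counts), starting from $N_0$ captured copies,

$$ N = N_0 \cdot 2^{n}, $$

so about 20 cycles yield a millionfold amplification ($2^{20} \approx 10^6$) and 30 cycles roughly a billionfold ($2^{30} \approx 10^9$).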

Antibody tests provide an alternative way of identifying infected individuals, but antibodies - molecules produced by our immune system in response to infection - generally do not appear until at least six days after infection.

Professor Ravi Gupta from the Cambridge Institute of Therapeutic Immunology and Infectious Disease at the University of Cambridge said: "We still do not have a gold standard test for diagnosing COVID-19. This poses a challenge to healthcare workers who need to make quick and safe decisions about how and where to treat patients.

"The two main types of test - PCR and antibody tests - both have limitations because of the nature of coronavirus infection and how our body responds. But we've shown that if you combine them and carry out both at point of care, their reliability can be hugely increased."

Professor Gupta led a team that used the approach of combining rapid point-of-care PCR and antibody tests to diagnose 45 patients at Addenbrooke's Hospital, Cambridge University Hospitals NHS Foundation Trust. The results of this peer-reviewed study are published in Cell Reports Medicine.

The patients, each of whom had suspected moderate to severe COVID-19 disease, provided nose/throat swabs for the tests detecting nucleic acid (virus genetic material) and blood serum for antibody testing an average (median) of seven days after the onset of illness.

The authors designed a gold standard reference test made of two parts, either of which could be positive to confirm COVID-19. The first part was an in vitro test where artificial SARS-CoV-2 viruses were made and mixed with serum from patients to see whether the serum contained neutralising antibodies. The second part of the gold standard was the standard Public Health England laboratory test looking for genetic viral material in nose/throat swabs. Using this gold standard, 24 of the patients had COVID-19.

Professor Gupta's team used SAMBA II machines, developed by Cambridge spinout company Diagnostics for the Real World, for the nucleic acid tests, and a combination of two finger prick antibody tests, both of which test for antibodies against the spike protein on the surface of the SARS-CoV-2 virus.

Overall, the nucleic acid tests could identify eight out of ten patients with COVID-19, but when combined with the rapid antibody tests, 100% of the COVID-19 patients were correctly identified. Among the 21 patients who did not have COVID-19, there were four false positive results with one antibody test and only one false positive with the second antibody test, demonstrating that one performed better than the other.
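
To make the gain from combining tests concrete, the sketch below recomputes sensitivity and specificity from the counts reported above. This is illustrative arithmetic only, using the approximate figures quoted in this article ("eight out of ten" is treated as 80%), not the study's actual statistical analysis.

```python
# Illustrative arithmetic from the counts reported above; not the study's analysis.

covid_patients = 24          # patients positive by the gold-standard reference
non_covid_patients = 21      # patients negative by the gold-standard reference

pcr_sensitivity = 8 / 10             # "eight out of ten" found by the nucleic acid test alone
combined_sensitivity = 24 / 24       # all COVID-19 patients found once antibody tests were added

# Specificity of the two rapid antibody tests among the 21 non-COVID patients
specificity_test_a = (non_covid_patients - 4) / non_covid_patients   # four false positives
specificity_test_b = (non_covid_patients - 1) / non_covid_patients   # one false positive

print(f"Nucleic acid test alone: sensitivity {pcr_sensitivity:.0%}")
print(f"Combined with antibody:  sensitivity {combined_sensitivity:.0%}")
print(f"Antibody test A specificity: {specificity_test_a:.0%}")
print(f"Antibody test B specificity: {specificity_test_b:.0%}")
```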

"Combining point-of-care PCR and antibody testing could be a game-changer for rapidly identifying those patients with moderate to severe COVID-19 infection," said Professor Gupta. "This could prove extremely useful, particularly in the event of a second wave arising during flu season, when it will not be immediately clear whether the patients had COVID-19 or seasonal flu."

Professor Gupta envisages that hospitals deploying this approach would carry out a finger prick blood test and nose/throat swab at the same time on admission to hospital. The antibody test result is available within 15 minutes, but might benefit from confirmation with a second point-of-care antibody test. Importantly, the study showed that the antibody tests can detect antibodies against a mutated form of SARS-CoV-2 - carrying the D614G mutation in the spike protein - that has now become the dominant strain worldwide.

This approach could be particularly beneficial in low resource settings where centralised virology laboratories are scarce and the pandemic is expanding, said Professor Gupta. In addition, it removes the need for repeated nose/throat swabbing when the first test is negative and suspicion of COVID-19 is high, which may generate aerosols and lead to transmission.

Credit: 
University of Cambridge

A molecular approach to quantum computing

image: An illustration showing a molecule in a state of superposition.

Image: 
Caltech

The technology behind the quantum computers of the future is fast developing, with several different approaches in progress. Many of the strategies, or "blueprints," for quantum computers rely on atoms or artificial atom-like electrical circuits. In a new theoretical study in the journal Physical Review X, a group of physicists at Caltech demonstrates the benefits of a lesser-studied approach that relies not on atoms but on molecules.

"In the quantum world, we have several blueprints on the table and we are simultaneously improving all of them," says lead author Victor Albert, the Lee A. DuBridge Postdoctoral Scholar in Theoretical Physics. "People have been thinking about using molecules to encode information since 2001, but now we are showing how molecules, which are more complex than atoms, could lead to fewer errors in quantum computing."

At the heart of quantum computers are what are known as qubits. These are similar to the bits in classical computers, but unlike classical bits they can experience a bizarre phenomenon known as superposition in which they exist in two states or more at once. Like the famous Schrödinger's cat thought experiment, which describes a cat that is both dead and alive at the same time, particles can exist in multiple states at once. The phenomenon of superposition is at the heart of quantum computing: the fact that qubits can take on many forms simultaneously means that they have exponentially more computing power than classical bits.
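
As standard background (not spelled out in the article), the state of $n$ qubits is a superposition over all $2^n$ classical bit strings,

$$ |\psi\rangle = \sum_{x \in \{0,1\}^n} c_x\,|x\rangle, \qquad \sum_x |c_x|^2 = 1, $$

so describing just 300 qubits already requires more amplitudes than there are atoms in the observable universe - the sense in which qubits carry exponentially more computing power than classical bits.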

But the state of superposition is a delicate one, as qubits are prone to collapsing out of their desired states, and this leads to computing errors.

"In classical computing, you have to worry about the bits flipping, in which a '1' bit goes to a '0' or vice versa, which causes errors," says Albert. "This is like flipping a coin, and it is hard to do. But in quantum computing, the information is stored in fragile superpositions, and even the quantum equivalent of a gust of wind can lead to errors."

However, if a quantum computer platform uses qubits made of molecules, the researchers say, these types of errors are more likely to be prevented than in other quantum platforms. One concept behind the new research comes from work performed nearly 20 years ago by Caltech researchers John Preskill, Richard P. Feynman Professor of Theoretical Physics and director of the Institute of Quantum Information and Matter (IQIM), and Alexei Kitaev, the Ronald and Maxine Linde Professor of Theoretical Physics and Mathematics at Caltech, along with their colleague Daniel Gottesman (PhD '97) of the Perimeter Institute in Ontario, Canada. Back then, the scientists proposed a loophole that would provide a way around a phenomenon called Heisenberg's uncertainty principle, which was introduced in 1927 by German physicist Werner Heisenberg. The principle states that one cannot simultaneously know with very high precision both where a particle is and where it is going.

"There is a joke where Heisenberg gets pulled over by a police officer who says he knows Heisenberg's speed was 90 miles per hour, and Heisenberg replies, 'Now I have no idea where I am,'" says Albert.

The uncertainty principle is a challenge for quantum computers because it implies that the quantum states of the qubits cannot be known well enough to determine whether or not errors have occurred. However, Gottesman, Kitaev, and Preskill figured out that while the exact position and momentum of a particle could not be measured, it was possible to detect very tiny shifts to its position and momentum. These shifts could reveal that an error has occurred, making it possible to push the system back to the correct state. This error-correcting scheme, known as GKP after its discoverers, has recently been implemented in superconducting circuit devices.
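
The article does not spell out the mathematics, but in the standard GKP construction (a textbook summary, not taken from the new paper) the code states are defined to be invariant under two commuting position and momentum displacement operators:

$$ S_q = e^{\,2i\sqrt{\pi}\,\hat{q}}, \qquad S_p = e^{\,-2i\sqrt{\pi}\,\hat{p}}. $$

Measuring these stabilizers reveals the values of position and momentum modulo $\sqrt{\pi}$ without running afoul of the uncertainty principle, so any small shift of magnitude less than $\sqrt{\pi}/2$ in either quadrature can be detected and undone.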

"Errors are okay but only if we know they happen," says Preskill, a co-author on the Physical Review X paper and also the scientific coordinator for a new Department of Energy-funded science center called the Quantum Systems Accelerator. "The whole point of error correction is to maximize the amount of knowledge we have about potential errors."

In the new paper, this concept is applied to rotating molecules in superposition. If the orientation or angular momentum of the molecule shifts by a small amount, those shifts can be simultaneously corrected.

"We want to track the quantum information as it's evolving under the noise," says Albert. "The noise is kicking us around a little bit. But if we have a carefully chosen superposition of the molecules' states, we can measure both orientation and angular momentum as long as they are small enough. And then we can kick the system back to compensate."

Jacob Covey, a co-author on the paper and former Caltech postdoctoral scholar who recently joined the faculty at the University of Illinois, says that it might be possible to eventually individually control molecules for use in quantum information systems such as these. He and his team have made strides in using optical laser beams, or "tweezers," to control single neutral atoms (neutral atoms are another promising platform for quantum-information systems).

"The appeal of molecules is that they are very complex structures that can be very densely packed," says Covey. "If we can figure out how to utilize molecules in quantum computing, we can robustly encode information and improve the efficiency in which qubits are packed."

Albert says that the trio of himself, Preskill, and Covey provided the perfect combination of theoretical and experimental expertise to achieve the latest results. He and Preskill are both theorists while Covey is an experimentalist. "It was really nice to have somebody like John to help me with the framework for all this theory of error-correcting codes, and Jake gave us crucial guidance on what is happening in labs."

Says Preskill, "This is a paper that no one of the three of us could have written on our own. What's really fun about the field of quantum information is that it's encouraging us to interact across some of these divides, and Caltech, with its small size, is the perfect place to get this done."

Credit: 
California Institute of Technology

Attacking tumors directly on identification

Andres Luengo from the research group led by Professor M. Concepción Gimeno at the University of Zaragoza (Spain) carried out part of the work during a research stay in the group headed by Professor Nils Metzler-Nolte at Ruhr-Universität Bochum (RUB). The international research team published its report as the cover story of the journal Chemistry - A European Journal on 31 August 2020.

Visualisation and treatment in one step

Theranostics, the combination of "therapy" and "diagnostics", refers to drugs that are used not only to treat tumours but also to render them visible. The principle is as simple as it is ingenious: for example, in prostate cancer treatment, a prostate-specific antibody is radioactively labelled. Once the antibody has bound the prostate cancer cells, the radioactivity emitted by the theranostic agent is used to visualise the tumour and possible metastases, and at the same time it also has a damaging effect on the cancer cells at the target site.

During his research stay at RUB, Andres Luengo took advantage of the experience of the Bioinorganic Chemistry group regarding the production of small biomolecules and metal building blocks that have a toxic effect on cancer cells. He combined a small biomolecule called enkephalin, which can dock to opioid receptors that are abundant in some cancers, with a luminescent and a toxic metal building block. He thus succeeded in producing a molecule that has the same properties as other advanced theranostic agents, but that can be detected by irradiation with visible light rather than radioactivity.

Promising new system

Gimeno's research team used the molecule's luminescent properties to detect it within cells and demonstrated its toxic effect at the same time, thus paving the way for further research into this promising and innovative theranostic system.

Following the analysis of the new compound, the researchers found that only one of three slightly different compounds had an active effect against cancer cells. In addition, the compound accumulated at a location in the cancer cells where the team had not expected to find it. The damaging effect on tumour cells depended on the stability of the bond between the biomolecule, a peptide, and the cell-damaging metal complex: the cell-damaging complex can reach its cellular target structure and attack the cells only if that bond is less stable and can therefore break up.

Credit: 
Ruhr-University Bochum

Decades-old mystery of lithium-ion battery storage solved

image: Battery testing system in Dr. Yu's Lab for developing advanced electrode materials.

Image: 
The University of Texas at Austin

For years, researchers have aimed to learn more about a group of metal oxides that show promise as key materials for the next generation of lithium-ion batteries because of their mysterious ability to store significantly more energy than should be possible. An international research team, co-led by The University of Texas at Austin, has cracked the code of this scientific anomaly, knocking down a barrier to building ultra-fast battery energy storage systems.

The team found that these metal oxides possess unique ways to store energy beyond classic electrochemical storage mechanisms. The research, published in Nature Materials, found several types of metal compounds with up to three times the energy storage capability compared with materials common in today's commercially available lithium-ion batteries.

By decoding this mystery, the researchers are helping unlock batteries with greater energy capacity. That could mean smaller, more powerful batteries able to rapidly deliver charges for everything from smartphones to electric vehicles.

"For nearly two decades, the research community has been perplexed by these materials' anomalously high capacities beyond their theoretical limits," said Guihua Yu, an associate professor in the Walker Department of Mechanical Engineering at the Cockrell School of Engineering and one of the leaders of the project. "This work demonstrates the very first experimental evidence to show the extra charge is stored physically inside these materials via space charge storage mechanism."

To demonstrate this phenomenon, the team found a way to monitor and measure how the elements change over time. Researchers from UT, the Massachusetts Institute of Technology, the University of Waterloo in Canada, Shandong University of China, Qingdao University in China and the Chinese Academy of Sciences participated in the project.

At the center of the discovery are transition-metal oxides, which are compounds that include oxygen bonded with transition metals such as iron, nickel and zinc. Energy can be stored inside the metal oxides -- as opposed to typical methods that see lithium ions move in and out of these materials or convert their crystal structures for energy storage. And the researchers show that additional charge capacity can also be stored at the surface of iron nanoparticles formed during a series of conventional electrochemical processes.

A broad range of transition metals can unlock this extra capacity, according to the research, and they share a common thread -- the ability to collect a high density of electrons. These materials aren't yet ready for prime time, Yu said, primarily because of a lack of knowledge about them. But the researchers said these new findings should go a long way in shedding light on the potential of these materials.

The key technique employed in this study, named in situ magnetometry, is a real-time magnetic monitoring method to investigate the evolution of a material's internal electronic structure. It is able to quantify the charge capacity by measuring variations in magnetism. This technique can be used to study charge storage at a very small scale that is beyond the capabilities of many conventional characterization tools.

"The most significant results were obtained from a technique commonly used by physicists but very rarely in the battery community," Yu said. "This is a perfect showcase of a beautiful marriage of physics and electrochemistry."

Credit: 
University of Texas at Austin

Partnership leverages evidence-based practices to improve long-term care quality

A study published in the Journal of the American Medical Directors Association demonstrated that a partnership between long-term care organizations in two countries, working in collaboration with researchers and national health care organizations, can generate changes that improve quality of care for residents. Authors of "The Seniors Quality Leap Initiative: An International Collaborative to Improve Quality in Long-term Care" include lead author John Hirdes, Ph.D., Professor, School of Public Health and Health Systems, University of Waterloo; Paul Katz, M.D., Professor, College of Medicine, Florida State University; John Morris, Ph.D., Director Emeritus of Social and Health Policy Research in the Hinda and Arthur Marcus Institute for Aging Research at Hebrew SeniorLife; Tammy Retalic, Chief Nursing Officer and Vice President, Patient Care Services, at Hebrew Rehabilitation Center; and Cyrelle Muskat, Director, Quality, Systems, and Wellness and Manager, Seniors Quality Leap Initiative at Baycrest Health Sciences.

Using evidence to drive quality improvement is often a daunting task for individual organizations providing long-term care. Although these facilities care for some of the most vulnerable patients, they vary widely in their ability to collect and analyze data that would help put evidence-based practices in place to improve patient care. An evidence-based practice is any practice that relies on scientific research for guidance and decision-making.

The study reports on the work of the Seniors Quality Leap Initiative (SQLI). The membership is currently drawn from Canada and the United States. The participating long-term care homes include for-profit and not-for-profit organizations in different geographic regions with two distinctive systems of government, health policy, and funding models.

SQLI leverages the collective expertise of its members to enhance both quality of care and quality of life for its many thousands of residents. By demonstrating that meaningful change in two countries with disparate long-term care environments is possible through sharing evidence-based practices, SQLI offers a potentially new model for system-wide quality improvement.

According to Dr. Hirdes, "This collaborative community of practice is a replicable real-life demonstration that scientifically sound evidence can be used to improve the quality of long term care." Organizations participating in SQLI work together to improve care processes in ways that enhance quality through a shared commitment to:

Identifying needs;

Employing flexible, but practical initiatives; and

Evaluating the impact of those initiatives through a transparent reporting mechanism.

"SQLI created a safe zone that encouraged each participating organization to identify gaps in evidence- based practice," Retalic said. "Using the organizations' internal quality improvement processes, each member organization identified and implemented process improvement strategies designed to improve their own internal results with the goal of improving the overall scores for the SQLI collaborative. The format resulted in open conversations about difficult challenges that ultimately improved practices for all the member sites."

The study relied on ongoing clinical assessment records related to pain management and included long-term care residents and patients in 14 organizations in Canada and the U.S. between 2014 and 2017. The most recent analytic samples involve 11,123 unique residents/patients in 68 facilities associated with 14 different long-term care organizations. Data shows improving care related to pain resulted in notable improvements in quality in specific facilities, as well as within the network as a whole.

"The objective of SQLI is first to improve the targeted quality outcome and second to assess how well these sites measure up against all facilities in the U.S. and Canada," said Dr. Morris. "Standards have been established and the SQLI goal is to be among the best performing facilities in North America."

Credit: 
Hebrew SeniorLife Hinda and Arthur Marcus Institute for Aging Research

These lifestyle choices can reduce the risk of chronic kidney disease

image: Juan Jesus Carrero, professor of epidemiology in the Department of Medical Epidemiology and Biostatistics at Karolinska Institutet.

Image: 
Stefan Zimmerman

Active lifestyle choices such as eating vegetables, exercising and quitting smoking can reduce the risk of chronic kidney disease, reports a new study led by researchers at Karolinska Institutet in Sweden and Griffith University in Australia. The study is published in The Journal of the American Society of Nephrology.

About 10 percent of the world population suffers from some kind of chronic kidney disease. In 2017, more than 1.2 million people were estimated to have died as a direct result of their kidney disease, and another 1.4 million from the cardiovascular complications caused by reduced kidney function.

Despite these alarming figures, there is no evidence-based guidance on what lifestyle changes can help to prevent kidney disease from occurring. Current advice to patients is based on how to prevent other diseases, such as hypertension and cardiovascular disease, which are considered important causes of kidney damage.

The researchers have conducted a systematic review and meta-analysis of more than 100 published research papers to investigate which lifestyle changes can lower the risk of kidney disease.

The study included more than 2.5 million healthy people from 16 countries. Of particular interest to the researchers were the effects of diet, exercise, tobacco smoking and alcohol on the risk of developing kidney problems.

"We discovered that lifestyle plays a big role and identified a number of recommendations that can be conveyed to healthy people wanting to reduce their risk of developing chronic kidney disease," says Dr Jaimon Kelly, a postdoctoral research fellow at Griffith University.

The advice includes a more vegetable-rich diet, a higher potassium intake, more exercise, less alcohol consumption, less salt consumption and quitting smoking. Adherence to these recommendations could reduce the risk of chronic kidney disease by between 14 and 22 percent.

"In the absence of randomised intervention studies in the field, this study is the best evidence we have to date on what lifestyle choices can help for primary prevention of kidney disease," says Juan Jesus Carrero, professor of epidemiology at the Department of Medical Epidemiology and Biostatistics, Karolinska Institutet. "The results can be used in the development of public health recommendations and in discussions with patients on how to lower their risk of kidney disease."

The researchers stress that the advice applies to healthy people at risk of developing kidney problems, and that people who already suffer from kidney disease should follow other lifestyle recommendations to avoid unnecessary strain on their kidneys.

Credit: 
Karolinska Institutet

Electromagnetic chirality: From fundamentals to nontraditional chiroptical phenomena

image: Chiroptical properties of discrete chiroptical scatterers (chiral molecules and particles) and continuous chiroptical media can be quantified. In addition, chiroptical properties of light have been quantified in terms of the local densities of optical chirality and optical helicity, their fluxes, and spin and orbital angular momenta.

Image: 
by Jungho Mun, Minkyung Kim, Younghwan Yang, Trevon Badloe, Jincheng Ni, Yang Chen, Cheng-Wei Qiu, and Junsuk Rho

Recent advancements in artificial nanomaterials and structured optical fields have expanded the concept of chiroptical phenomena. However, chiroptical phenomena originate from complicated processes involving transitions between states with opposite parities, so a grounding in the fundamentals of chiroptical processes is required for a solid interpretation of the phenomena. Here, theoretical frameworks for the chiroptical properties of electromagnetic materials are discussed in the context of microscopic (discrete chiroptical scatterers) and macroscopic (continuous chiroptical media) systems.

"Chiral object refers to a three-dimensional object that cannot be superimposed onto its mirror image using only translations and rotations. Such chiral objects interact differently with left- and right-circularly polarized lights, and absorption difference at these two circular polarizations (circular dichroism) has been widely used to characterize chiroptical properties of the chiral objects. However, (geometric) chirality is a qualitative property; that is, we do not say one's hand is more chiral than another's hand. On the other hand, observed chiroptical effects are measurable quantities. By introducing chiroptical parameters, the chiroptical effects can be described and the degree of electromagnetic chirality can be defined and quantified."

Additionally, chiroptical properties of electromagnetic fields are discussed in the context of local density of field chirality and its flux, which have been defined as the optical chirality and optical helicity. Also, helical beams with intrinsic orbital angular momentum are discussed as another class of chiral light.

"Generally speaking, a chiral phenomenon involves two chiral objects, where one chiral object differently interacts with another chiral object and its enantiomer (mirror image). In chiroptical phenomena, one of the chiral objects is the light itself. By recognizing that light can also be chiral, the degree of chirality of the field can also be quantified."

Several chiroptical phenomena are discussed under a framework that uses the same chiroptical parameters for the fields and the materials. This approach provides a clear understanding of several chiroptical phenomena, including intrinsic and extrinsic chirality, enantioselective scattering, molecular sensing, and optomechanical effects. This review will be helpful for understanding complicated chiroptical phenomena and for designing and optimizing chiroptical systems and fields with well-defined figures of merit.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

Regional variations in freshwater overconsumption

Freshwater -- which falls to the earth as precipitation or exists beneath the surface as groundwater -- is desperately needed to sustain people, plants and animals. With an ever-increasing human population, water shortages already occurring in many areas are only expected to get worse. Now, researchers reporting in ACS' Environmental Science & Technology have estimated the freshwater supply and demand of about 11,000 water basins across the globe, determining that one-fourth of freshwater consumption exceeds regional capacities.

People use freshwater for many essential purposes, including drinking, hygiene and irrigating crops. Studies have estimated that the current global level of freshwater consumption does not exceed the global supply. However, regional variations exist, and the need to grow more food for larger populations will increase freshwater demand. In addition, international trade -- for example, water-scarce countries importing food from water-rich countries -- can influence regional freshwater supply and demand. Masaharu Motoshita and colleagues wanted to conduct a comprehensive analysis of how much freshwater is available for human consumption in many different regions. They also wanted to determine how much of this water is essential to sustain human life, and how much is surplus or "luxury" consumption.

To find out, the researchers calculated the freshwater available for humans in about 11,000 watersheds around the world and compared that amount with the water consumed in that region for basic human needs (drinking water, food production and hygiene), as well as luxury use. They found that about 24% of total freshwater used by humans in these watersheds exceeded regional capacities, often at the expense of ecosystems. About 59% of this overconsumption was to satisfy basic human needs, while the rest was luxury use. In many areas, overconsumption occurred only at certain times of the year. International trade alleviated about 4.8% of global overconsumption. Although options to reduce water overconsumption vary by region, some possibilities include improving irrigation efficiency, shifting to less water-intensive crops or different production sites, increasing water storage in reservoirs, reducing food waste and changing food consumption patterns, the researchers say.
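
As a back-of-the-envelope illustration (derived only from the percentages quoted above, not from the paper's data), the split between essential and luxury overconsumption works out as follows:

```python
# Rough percentage arithmetic from the figures quoted above; illustrative only.

overconsumption_share = 0.24   # fraction of total freshwater use exceeding regional capacity
basic_needs_fraction = 0.59    # fraction of that overconsumption serving basic needs

basic = overconsumption_share * basic_needs_fraction         # ~14% of total freshwater use
luxury = overconsumption_share * (1 - basic_needs_fraction)  # ~10% of total freshwater use

print(f"Overconsumption for basic needs: {basic:.1%} of total freshwater use")
print(f"Luxury overconsumption:          {luxury:.1%} of total freshwater use")
```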

Credit: 
American Chemical Society

Excitable cells

A study led by researchers from Tasmania, Chile and Germany has furthered our understanding of plant evolution by tracking the origins of electrical signalling components that plants developed to communicate and adapt to life on land.

The research team, including University of Tasmania plant scientist Dr Frances Sussmilch, studied DNA sequences of diverse plant species to map the evolutionary origins of important adaptations.

"By looking closely at the sequences for potassium channels in these species, we were excited to see evidence of ancient evolutionary events that have led to the complex cell-to-cell signalling mechanisms that can be seen in modern-day plants," Dr Sussmilch said.

This story began around 500 million years ago when the algal ancestors of land plants emerged from water and conquered dry land.

Plants faced new challenges on land, including difficult and changeable environmental conditions such as drought, heat, UV and wind, and interactions with new biological threats.

"These ancient plant pioneers made the landscape more hospitable for the first terrestrial animals and gave rise to the incredible diversity of land plants we see today, including bryophytes, ferns, gymnosperms and flowering plants," Dr Sussmilch said.

"A feature that likely helped plants to thrive on land is the presence of molecular mechanisms for cell-to-cell signalling. This enables communication between cells in the plant body, facilitating coordinated responses to environmental changes."

One means of rapid cell-to-cell communication that operates in plants is action-potential-based electrical signalling, which is comparable to the electrical impulses that control the synchronised beat of the human heart.

In plants, action potentials can be used for rapid responses to signals including wounding or touch, such as a Venus flytrap snapping shut. For electrical communication, plants need voltage-gated ion channels which control the flow of ions in and out of a cell in response to stressors.

Dr Sussmilch said channels that enable the movement of potassium ions across the cell membrane in plants play a critical role in electrical excitability.

The researchers made use of gene sequence resources for species representing each of the major groups of land plants and green algae.

They examined the sequences of potassium channels from these diverse plants for sequence 'fingerprints' indicative of different structural and functional characteristics.

The study found that the fingerprint characteristic of one group of related efflux channels, which enable potassium ions to exit cells, was preserved across most land plant lineages - liverworts, hornworts, ferns, gymnosperms and flowering plants - and their closest living algal relatives.

"This suggests that this channel type may have already been present in plants when they first emerged from the water and was likely retained as land plants continued to evolve due to its importance for cell-to-cell communication," Dr Sussmilch said.

However, this channel type was absent in all moss species examined, indicating that it may have become unnecessary in the niches that early mosses occupied and been lost from the genome of a common ancestor of the mosses alive today.

In contrast, they found that the fingerprints for four distinct types of related influx channels, which enable potassium ions to enter cells, have diversified more recently, allowing for an increased variety of channel functions in flowering plants.

The research findings, together with a sequence analysis pipeline that can be used to examine the evolution of other gene families in the same way, have recently been published in the journal Trends in Plant Science.

Credit: 
University of Tasmania

Aviation contributes 3.5% to the drivers of climate change that stem from humans

image: Climate forcings from global aviation emissions and cloudiness

Image: 
Manchester Metropolitan University

Aviation accounts for 3.5 per cent of the human-caused drivers of climate change, new research shows.

A new international study provides unprecedented calculations of the impact of aviation on the climate from 2000 to 2018 to produce the most comprehensive insight to date.

The findings show that two-thirds of the impact from aviation is attributed to non-carbon dioxide emissions, with the rest from CO2.

The research was led by the UK's Manchester Metropolitan University, in collaboration with numerous academic and research institutions across the globe, over the past five years.

The analysis - published in the journal Atmospheric Environment - is the first of its kind since 2009 and will be of significant use to stakeholders such as policymakers, industry bodies and non-government organisations.

Researchers evaluated all of the aviation industry's contributing factors to climate change including carbon dioxide (CO2) and nitrogen oxide (NOx) emissions, and the effect of contrails and contrail cirrus - clouds of ice crystals created by aircraft jet engines at high altitude.

This was analysed alongside the water vapour, soot, and aerosols and sulfate aerosols - fine particles suspended in the air - found in the exhaust plumes emitted by aircraft engines.

The study is unique because it is the first complete set of calculations for aviation that uses a new metric introduced in 2013 by the Intergovernmental Panel on Climate Change.

This metric is called 'effective radiative forcing' (ERF) and represents the increase or decrease since pre-industrialisation times in the balance between the energy coming from the sun and the energy emitted from the earth, known as the earth-atmosphere radiation budget.
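
In symbols (the standard definition, not given explicitly in the article), the ERF of a forcing agent is the change it induces in the net downward radiative flux at the top of the atmosphere, evaluated after rapid atmospheric adjustments and measured relative to the pre-industrial baseline:

$$ \mathrm{ERF} = \Delta N = \Delta\left(F_{\text{in}} - F_{\text{out}}\right) \quad [\mathrm{W\,m^{-2}}], $$

with positive values warming the climate and negative values cooling it.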

Using the new ERF metric, the team found that the impact of contrail cirrus is less than half that previously estimated, but it is still the sector's largest contribution to global warming, reflecting and trapping heat that would otherwise escape from the atmosphere.

Carbon dioxide emissions represent the second largest contribution but unlike the effects of contrail cirrus, CO2's effect on climate lasts for many centuries.

Lead author Professor David Lee, Professor of Atmospheric Science at Manchester Metropolitan University and Director of its Centre for Aviation, Transport, and the Environment research group, said: "Given the dependence of aviation on burning fossil fuel, its significant CO2 and non-CO2 effects, and the projected fleet growth, it is vital to understand the scale of aviation's impact on present day climate change, especially in view of the requirements of the Paris Agreement to reach 'net zero' CO2 emissions by around 2050.

"But estimating aviation's non-CO2 effects on atmospheric chemistry and clouds is a complex challenge for contemporary atmospheric modeling systems.

"It is difficult to calculate the contributions caused by a range of atmospheric physical processes, including how air moves, chemical transformations, microphysics, radiation, and transport."

The scientists undertook a comprehensive analysis of individual aviation ERFs to provide an overall ERF for global aviation for the first time.

Similar studies were conducted in 1999, 2005 and 2009, but this is the most current and most extensive, with much of the underlying science having changed and matured.

Professor Lee added: "The new study means that aviation's impact on climate change can be compared with other sectors such as maritime shipping, ground transportation and energy generation as it has a consistent set of ERF measurements."

Dr Laura Wilcox, an atmospheric scientist at the University of Reading and NCAS, contributed the assessment of water vapour impact to the study. She said: "There are many different components of aviation's large impact on climate change, but the positive side of that is it provides us with many ways we can make changes to mitigate it.

"This massive assessment demonstrates the magnitude of the climate change impact of aviation, and confirms that urgent action is needed to reduce the environmental impact of all travel to avoid very serious impacts to our way of life in the future."

Professor Lee and his team calculated that the cumulative CO2 emissions of global aviation throughout the course of the industry's entire history - defined as between 1940 and 2018 - were 32.6 billion tonnes.

Approximately half of those cumulative CO2 emissions were generated in the last 20 years alone. This is attributed largely to the expansion of the number of flights, routes and fleet sizes, particularly in Asia, though partially offset by improvements in aircraft and jet engine technology, larger average aircraft sizes and increasing efficiency in the use of aircraft capacity to fit more passengers in the same space.

The research team estimated that this figure of 32.6 billion tonnes accounted for 1.5 per cent of all CO2 emissions ever produced up to that point.
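
For scale, the arithmetic implied by those two figures (an inference from the numbers above, not a statement in the study) puts cumulative CO2 emissions from all human activity through 2018 at roughly

$$ \frac{32.6\ \text{billion tonnes}}{0.015} \approx 2{,}200\ \text{billion tonnes}. $$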

And when the non-CO2 impacts were factored in, aviation's share of the human-caused drivers of climate change was calculated to be 3.5 per cent.

The researchers noted that while the 2016 Paris Agreement on climate change does include domestic aviation in individual countries' reduction targets, it does not address international aviation, which accounts for 64 per cent of air traffic.

Unlike direct emissions of non-CO2 greenhouse gases, such as nitrous oxide and methane from sources such as the agricultural sector, aviation's non-CO2 effects are not covered by the former Kyoto Protocol.

Professor Lee added: "It is unclear whether future developments of the Paris Agreement or International Civil Aviation Organization negotiations to mitigate climate change, in general, will include short-lived indirect greenhouse gases like nitrogen oxides, contrail cirrus, aerosol-cloud effects, or other aviation non-CO2 effects.

"Aviation is not mentioned explicitly in the text of the Paris Agreement, which says total global greenhouse gas emissions need to be reduced rapidly to achieve a balance between man-made emissions and sinks of greenhouse gases in the second half of this century.

"As the COVID-19 pandemic changes, aviation traffic is likely to recover to meet projected rates on varying timescales, with continued growth, further increasing CO2 emissions and, of course, historical emissions of CO2 take many centuries to be removed.

"Therefore, reducing CO2 aviation emissions will remain a continued focus in reducing future man-made climate change, along with aviation's non-CO2 contribution."

The study suggests solutions that include re-routing flights to avoid creating contrail cirrus but the trade-off is a longer flight path and more fuel burnt, producing more greenhouse gas emissions.

The team also noted how changes to combustion technology to reduce NOx emissions can increase CO2 emissions.

Co-author David Fahey, Director of the Earth System Research Laboratories at the United States' National Oceanic and Atmospheric Administration, and a Visiting Professor at Manchester Metropolitan University, said: "This study is a great example of an international collaboration to clarify how human activities cause climate change.

"Our assessment has strengthened the scientific foundation of the role of aviation in the climate system and established a framework for future assessments.

"Our assessment will aid decision makers and the industry in pursuing any future mitigation actions while protecting this important sector from any inaccurate assertions concerning its role in the climate system."

Credit: 
University of Reading