Tech

Hotter, drier, CRISPR: editing for climate change

image: Dr Karen Massel from the University of Queensland has authored a review supporting the integration of genome and gene editing into plant breeding, to combat major challenges facing the agricultural industries, such as climate change.

Image: 
The University of Queensland

Gene editing technology will play a vital role in climate-proofing future crops to protect global food supplies, according to scientists at The University of Queensland.

Biotechnologist Dr Karen Massel from UQ's Centre for Crop Science has published a review of gene editing technologies such as CRISPR-Cas9 to safeguard food security in farming systems under stress from extreme and variable climate conditions.

"Farmers have been manipulating the DNA of plants using conventional breeding technologies for millennia, and now with new gene-editing technologies, we can do this with unprecedented safety, precision and speed," Dr Massel said.

"This type of gene editing mimics the way cells repair in nature."

Her review recommended integrating CRISPR-Cas9 genome editing into modern breeding programs for crop improvement in cereals.

Energy-rich cereal crops such as wheat, rice, maize and sorghum provide two-thirds of the world's food energy intake.

"Just 15 plant crops provide 90 per cent of the world's food calories," Dr Massel said.

"It's a race between a changing climate and plant breeders' ability to produce crops with genetic resilience that grow well in adverse conditions and have enriched nutritional qualities.

"The problem is that it takes too long for breeders to detect and make that genetic diversity available to farmers, with a breeding cycle averaging about 15 years for cereal crops.

"Plus CRISPR allows us to do things we can't do through conventional breeding in terms of generating novel diversity and improving breeding for desirable traits."

In proof-of-concept studies, Dr Massel and colleagues at the Queensland Alliance for Agriculture and Food Innovation (QAAFI) applied gene editing technology to sorghum and barley pre-breeding programs.

"In sorghum, we edited the plant's genes to unlock the digestibility level of the available protein and to boost its nutritional value for humans and livestock," she said.

"We've also used gene-editing to modify the canopy architecture and root architecture of both sorghum and barley, to improve water use efficiency."

Dr Massel's research also compared the genome sequences of cereals - including wild variants and ancestors of modern cereals - with differences in crop performance across climates and under different kinds of stress.

"Wild varieties of production crops serve as a reservoir of genetic diversity, which is especially valuable when it comes to climate resilience," she said.

"We are looking for genes or gene networks that will improve resilience in adverse growing climates.

"Once a viable gene variant is identified, the trick is to re-create it directly in high-performing cultivated crops without disrupting the delicate balance of genetics related to production traits.

"These kinds of changes can be so subtle that they are indistinguishable from the naturally occurring variants that inspired them."

In 2019, Australia's Office of the Gene Technology Regulator deregulated gene-editing, differentiating it from genetically modified organism (GMO) technology.

Gene edited crops are not yet grown in Australia, but biosecurity and safety risk assessments of the technology are currently being undertaken.

Credit: 
University of Queensland

Assessing hemp-containing foodstuff

To avoid adverse effects from THC in hemp-containing foods, the Federal Institute for Health Protection of Consumers and Veterinary Medicine (BgVV) recommended guidance values for maximum THC levels in various food groups in 2000: 0.005 mg/kg for beverages, 5 mg/kg for edible oils and 0.150 mg/kg for all other foods. In 2018, the BfR came to the conclusion that these values no longer correspond to current scientific knowledge.

Instead, the BfR recommends that the toxicological assessment of hemp-containing foods be carried out on the basis of the acute reference dose (ARfD) of 1 microgram Δ9-THC per kilogram of body weight derived by the European Food Safety Authority (EFSA) in 2015. The ARfD specifies the estimated maximum quantity of a substance that can be consumed with food in the course of one day - either during one meal or during several meals - without a detectable risk to health. In the BfR's view, whether the ARfD might be exceeded should be checked on a case-by-case basis for each product under assessment, using the measured THC levels and the estimated consumption quantities. Information on the latter can be found in the "EFSA Comprehensive European Food Consumption Database".
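The case-by-case check the BfR describes is, at its core, simple arithmetic: the estimated THC intake from one day's consumption of a product must stay below the ARfD. The sketch below illustrates that comparison; the product, THC level, portion size and body weight are invented for illustration and are not BfR assessment data.

```python
# Hypothetical check of a hemp-containing food against the EFSA acute
# reference dose (ARfD) of 1 microgram delta-9-THC per kg body weight.
# All example numbers are illustrative, not BfR assessment data.

ARFD_UG_PER_KG_BW = 1.0  # EFSA ARfD (2015), micrograms per kg body weight

def thc_intake_ug_per_kg(thc_mg_per_kg_food: float,
                         portion_g: float,
                         body_weight_kg: float = 70.0) -> float:
    """Estimated acute THC intake per kg body weight for one portion."""
    thc_mg = thc_mg_per_kg_food * portion_g / 1000.0  # mg THC in the portion
    return thc_mg * 1000.0 / body_weight_kg           # convert mg to micrograms

# Invented example: hemp-seed oil at 4 mg THC/kg, 15 g portion, 70 kg adult
intake = thc_intake_ug_per_kg(4.0, 15.0)
verdict = "exceeds" if intake > ARFD_UG_PER_KG_BW else "stays below"
print(f"Estimated intake of {intake:.2f} ug/kg bw {verdict} the ARfD")
```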

Credit: 
BfR Federal Institute for Risk Assessment

Researchers watch anti-cancer drug release from DNA nanostructures in real time

DNA nanotechnology - the research field using DNA molecules as building material - has developed rapidly during recent years and enabled the construction of increasingly complex nanostructures. DNA nanostructures, such as DNA origami, serve as an excellent foundation for nanocarrier-based drug delivery applications, and examples of their use in medical treatments have already been demonstrated. Although the stability of such DNA nanostructures under physiological conditions can be improved, little is known about their digestion by endonucleases - enzymes found throughout our blood and tissues that are responsible for destroying foreign DNA in our bodies.

To tackle this emerging question, a team of researchers from Aalto University (Finland), the University of Jyväskylä (Finland), Ludwig-Maximilian-Universität München (Germany) and Universität Paderborn (Germany) have found a way to study the endonuclease-driven digestion of drug-loaded DNA nanostructures in real time.

The researchers' previous experiments used high-speed atomic force microscopy to show that the design of DNA origami plays a role in how quickly they break apart in an endonuclease-rich environment. While they could follow the digestion process at a single-structure level, the approach was limited to two-dimensional DNA origami shapes deposited on a microscope substrate.

Now the group has monitored DNA degradation and the subsequent release of the anti-cancer drug doxorubicin (Dox) from the DNA structures. The drug binds (intercalates) between DNA base pairs.

'We observed both the digestion and drug release profiles as the drug is released upon DNA fragmentation by nucleases, and importantly, in the solution phase. With this method we can actually see the collective behaviour of all the nanostructures when they are floating freely in liquid,' says Adjunct Professor Veikko Linko from Aalto University, who led the study.

'It seems the digestion happens differently on substrates and in solution, and by combining these two types of information, we can better understand how the nanostructures are digested by nucleases in the bloodstream. Moreover, we showed that the drug release profiles were closely linked to the digestion profiles, and a wide range of drug doses could be achieved simply by changing the shape or geometry of the DNA nanostructure,' explains doctoral student Heini Ijäs, the main author of the research.
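As a back-of-the-envelope companion to these observations, consider a toy kinetic picture in which nuclease digestion of the origami is first-order and Dox release simply tracks the digested fraction. This is a sketch, not the authors' model or data; the rate constant is invented.

```python
# Toy kinetic model of nuclease digestion of DNA origami with coupled
# doxorubicin release: the intact fraction decays first-order, and the
# released drug fraction tracks the digested fraction. The rate constant
# is hypothetical; this is not the authors' analysis.
import numpy as np

k_digest = 0.05                        # 1/min, hypothetical digestion rate
t = np.linspace(0, 120, 241)           # minutes
intact = np.exp(-k_digest * t)         # fraction of origami still intact
dox_released = 1.0 - intact            # released drug fraction

for minutes in (0, 30, 60, 120):
    i = int(np.argmin(np.abs(t - minutes)))
    print(f"t = {minutes:3d} min  intact = {intact[i]:.2f}  "
          f"Dox released = {dox_released[i]:.2f}")
```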

As the team investigated the binding of Dox to the DNA structures in great detail, they discovered that the majority of previous studies have vastly overestimated the Dox loading capacity of DNA origami.

'The anti-cancer effects of Dox-equipped DNA nanostructures have been reported in many publications, but it seems these effects may have been mainly caused by free or aggregated Dox molecules, not by the drug-loaded DNA motifs. We believe this type of information is crucial for the development of safe and more effective drug delivery systems, and brings us one step closer to real-world DNA-based biomedical applications,' says Ijäs.

Credit: 
Aalto University

Tundra vegetation shows similar patterns along microclimates from Arctic to sub-Antarctic

image: The researchers collected data across four distinct tundra regions.

Image: 
Photos: Julia Kemppinen and Peter C. le Roux.

Researchers are searching for generalisable rules and patterns in nature. Biogeographer Julia Kemppinen and her colleagues tested whether plant functional traits show similar patterns along microclimatic gradients across far-apart regions, from the high-Arctic Svalbard to the sub-Antarctic Marion Island. Kemppinen and her colleagues found surprisingly consistent patterns.

It is widely known that global vegetation patterns and plant properties follow major differences in climate. Yet it has remained a mystery how well the same rules apply at very local scales. Are responses to the environment similar in plant communities along local temperature gradients in Svalbard, Greenland, Fennoscandia and Marion Island? The results, published in Nature Ecology & Evolution, indicate that such generalisable patterns do exist.

The researchers collected field data on 217 species from nearly 7000 study plots. The results revealed strong, consistent plant functional trait-environment relationships across the four tundra ecosystems.

"This is important because plant functional traits inform us how plants use resources, such as soil moisture, and how plants shape their environments such as carbon cycling. In addition, traits investigations can also give a hint on how plants may react to the ongoing climate change", says Post doctoral researcher Julia Kemppinen from the University of Oulu.

The researchers found patterns that hold despite unique species pools and other site-specific characteristics. This information improves the biological basis for climate change impact predictions for vulnerable tundra ecosystems.

At coarse spatial scales, there are clear global climatic patterns in temperature and precipitation. For instance, the high-Arctic Svalbard is generally much colder than sub-Arctic Fennoscandia. However, ground-dwelling plants experience local climate conditions, the microclimate, which plays an important role in how ecosystems are responding to climate change.

"It is fascinating to find very distinct differences in soil moisture and soil temperatures at a local scale. If you look close enough, the warmest micro spots in Svalbard have higher temperatures than the coldest spots in Fennoscandia. These local hydrological and thermal conditions clearly affect plants and their functional traits", says Post doctoral researcher Pekka Niittynen from the University of Helsinki.

The results indicate that tundra plant communities respond similarly to microclimate. This helps in generalising scientific results from one tundra region to another without drawing overly bold conclusions.

Investigating the connections between plant functional traits and the environment requires a lot of data. The researchers combined their field data with over 76,000 database trait records provided by the Botanical Information and Ecology Network (BIEN), the TRY Plant Trait Database and the Tundra Trait Team.
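To make the kind of trait-environment relationship tested here concrete, the sketch below fits an ordinary least squares line of a community-weighted mean trait against a microclimate variable, separately per region. The trait, units, effect sizes and data are all synthetic placeholders; the published analysis is considerably more involved.

```python
# Minimal sketch of a trait-environment relationship: ordinary least
# squares of a community-weighted mean (CWM) trait against a microclimate
# variable, fitted separately for each region. All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
regions = ["Svalbard", "Greenland", "Fennoscandia", "Marion Island"]

for region in regions:
    soil_temp = rng.uniform(2, 12, 50)                    # degC, synthetic
    cwm_height = 1.5 * soil_temp + rng.normal(0, 3, 50)   # cm, synthetic
    slope, _intercept = np.polyfit(soil_temp, cwm_height, 1)
    print(f"{region:14s} slope = {slope:+.2f} cm per degC")
```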

"Our research groups at the BioGeoClimate Modelling Lab at the University of Helsinki and the le Roux lab at the University of Pretoria collected a lot of data in the field, but we couldn't have done this study without high-quality, open data from global databases", says professor Miska Luoto from the University of Helsinki.

The study is part of Kemppinen's PhD thesis, "Soil moisture and its importance for tundra plants", completed at the University of Helsinki, Finland.

Credit: 
University of Helsinki

Story tips: Quantum building blocks, high-pressure diamonds, wildfire ecology and more

image: Transition metals stitched into graphene with an electron beam form promising quantum building blocks.

Image: 
Ondrej Dyck, Andrew Lupini and Jacob Swett/ORNL, U.S. Dept. of Energy

Materials - Quantum building blocks

Oak Ridge National Laboratory scientists demonstrated that an electron microscope can be used to selectively remove carbon atoms from graphene's atomically thin lattice and stitch transition-metal dopant atoms in their place.

This method could open the door to making quantum building blocks that can interact to produce exotic electronic, magnetic and topological properties.

This is the first precision positioning of transition-metal dopants in graphene. The produced graphene-dopant complexes can exhibit atomic-like behavior, inducing desired properties in the graphene.

"What could you build if you could put any atoms exactly where you want? Just about anything," ORNL's Ondrej Dyck said. He co-led the study with Stephen Jesse at ORNL's Center for Nanophase Materials Sciences.

"If a lot of these quantum building blocks get together, they can start to act in a correlated manner, which is when really exciting properties begin to emerge," Jesse said. The scientists plan to make arrays of interacting quantum building blocks to investigate emergent properties.

Media contact: Dawn Levy, 865.202.9465, levyd@ornl.gov

Image: https://www.ornl.gov/sites/default/files/2021-02/quantum-building-blocks_0.jpg

Caption: Transition metals stitched into graphene with an electron beam form promising quantum building blocks. Credit: Ondrej Dyck, Andrew Lupini and Jacob Swett/ORNL, U.S. Dept. of Energy

Neutrons - Hard diamonds, high pressures

Researchers at Oak Ridge National Laboratory's Spallation Neutron Source have developed a diamond anvil pressure cell that will enable high-pressure science currently not possible at any other neutron source in the world.

Using the SNAP instrument, the team measured high-quality powder diffraction data on a material above 120 gigapascals, shattering the previously held record of 62 GPa for meaningful structural data.

What's more, the tiny submillimeter-sized sample used in the experiment is likely the smallest neutron sample ever measured and yet is also one of the largest powder samples ever held at such a high static pressure.

While scientists have used X-ray powder diffraction at such pressures for decades, it was previously not possible using neutrons.

"This breakthrough enables new studies on the structures of high-pressure super-hydrides that exhibit room-temperature superconductivity. It even enables investigations into materials at earth-core pressure conditions," said ORNL's Bianca Haberl.

Media contact: Jeremy Rumsey, 865.576.2038, rumseyjp@ornl.gov

Image: https://www.ornl.gov/sites/default/files/2021-02/SNAP%20pressure%20cell%202021-P00978RR.jpg

Caption: ORNL researchers Reinhard Boehler, left, and Bianca Haberl demonstrate the improved pressure cell developed by Boehler. The device uses two gem-quality synthetic opposing diamonds to exert extreme pressures on materials. Credit: Genevieve Martin/ORNL, U.S. Dept. of Energy

Image: https://www.ornl.gov/sites/default/files/2021-02/SNAP%20pressure%20cell.jpg

Caption: The pressure cell uses two gem-quality synthetic opposing diamonds to exert extreme pressures on materials, providing fundamental insights into materials that only neutrons can reveal. Credit: Genevieve Martin/ORNL, U.S. Dept. of Energy

Ecology - After the burn

An Oak Ridge National Laboratory research team discovered that aspen saplings emerging after wildfire have less diverse microbiomes and more pathogens in their leaves, providing new insights about how fire affects ecosystem recovery.

This study demonstrated, for the first time, the indirect impacts of fire on microbes throughout plant structures.

"The leaves of these saplings never experienced fire, but we were able to show differences in their microbiome compared to saplings from unburned areas," ORNL's Chris Schadt said. "Since aspen saplings are clonally derived from the surviving roots, we had thought the leaves might be populated by organisms that were drawn up through the common root. That didn't happen."

The ORNL team plans to further study how microbes repopulate plants after fire. Additional analysis could inform ways to speed vegetative regeneration and the host of benefits that come with healthy forests, from clean water to biodiversity to carbon capture.

Media contact: Kim Askey, 865.576.2841, askeyka@ornl.gov

Image: https://www.ornl.gov/sites/default/files/2021-02/2019-09-12_09.10.25_panorama.jpg

Caption: Saplings in an aspen grove recovering from wildfire have more fungal pathogens in their leaves than the original trees. Credit: Chris Schadt/ORNL, U.S. Dept. of Energy

Image: https://www.ornl.gov/sites/default/files/2021-02/2019-09-12_09.27.18_copy.jpg

Caption: Aspen saplings begin to emerge after a fire in June of 2019. Credit: Chris Schadt/ORNL, U.S. Dept. of Energy

Manufacturing - Quick cooling tooling

A team of Oak Ridge National Laboratory researchers demonstrated that an additively manufactured hot stamping die - a tool used to create car body components - cooled faster than those produced by conventional manufacturing methods, which could lead to reduced manufacturing costs and production time.

In collaboration with industry partners Lincoln Electric and DTS, they used a gas metal arc welding-based additive technology to print the die for a B-pillar or vertical roof support structure for a sport utility vehicle. The production method allowed for the entire body of the die to be created as one monolithic part.

"With conventional methods, the dies are manufactured by drilling cooling ports in one-foot-long blocks of steel, then assembling, machining the blocks and sealing, and they take 20 days to produce," ORNL's Andrzej Nycz said. "We machined and tested the additively manufactured die in eight days and showed more uniform temperature distribution and 20% improvement in the cooling rate."

Media contact: Jennifer Burke, 865.414.6835, burkejj@ornl.gov

Image: https://www.ornl.gov/sites/default/files/2021-02/Hot_stamping_die.jpg

Caption: ORNL researchers used gas metal arc welding additive technology to print the die for a B-pillar or vertical roof support structure for a sport utility vehicle, demonstrating a 20% improvement in the cooling rate. Credit: ORNL, U.S. Dept. of Energy

Sensors - Printing on the fly

A method developed at Oak Ridge National Laboratory to print high-fidelity, passive sensors for energy applications can reduce the cost of monitoring critical power grid assets.

The sensors use surface acoustic waves, or SAWs, which can pick up changes in temperature, pressure and the presence of gases. In search of a simpler, cheaper alternative to sensors that require elaborate assembly in a clean room, ORNL researchers developed a method to print SAW sensors on substrates of lithium niobate crystal using nanoparticle inks.

The scientists demonstrated that the sensor features can be printed at a resolution of about 10 micrometers, which increases their operating frequency and sensitivity. Ongoing research aims to reach 1 micrometer resolution and to test the sensors in both a simulated nuclear plant application and on essential grid components such as transformers.
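The link between printing resolution and operating frequency follows from basic SAW physics: for an interdigital transducer with equal electrode width and spacing, the acoustic wavelength is about four times the printed feature width, and the frequency is the acoustic velocity divided by the wavelength. The sketch below assumes a typical lithium niobate SAW velocity of roughly 4,000 m/s; it is an illustration, not a figure from the ORNL work.

```python
# Why finer printed features raise SAW frequency: for an interdigital
# transducer (IDT) with equal finger width and spacing, the acoustic
# wavelength is ~4x the finger width, and f = v / wavelength. The
# velocity is an assumed typical value for lithium niobate.
V_SAW = 4000.0                          # m/s, assumed SAW velocity in LiNbO3

def saw_frequency_mhz(finger_width_um: float) -> float:
    wavelength_m = 4.0 * finger_width_um * 1e-6   # IDT period sets wavelength
    return V_SAW / wavelength_m / 1e6             # frequency in MHz

for width in (10.0, 1.0):               # printed feature sizes, micrometers
    print(f"{width:>4} um features -> ~{saw_frequency_mhz(width):,.0f} MHz")
```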

"The sensors are low cost, easy to deploy and customizable, and you can make them on the fly," said ORNL's Tim McIntyre.

Media contact: Stephanie Seay, 865.576.9894, seaysg@ornl.gov

Image 1: https://www.ornl.gov/sites/default/files/2021-02/SAW%20sensors%202021-P01086.jpg

Image 2: https://www.ornl.gov/sites/default/files/2021-02/SAW%20sensors%202021-P01084.jpg

Caption: ORNL researchers are developing a method to print low-cost, high-fidelity, customizable sensors for monitoring power grid equipment. Credit: Carlos Jones/ORNL, U.S. Dept. of Energy

Credit: 
DOE/Oak Ridge National Laboratory

Assessing a compound's activity, not just its structure, could accelerate drug discovery

Assessing a drug compound by its activity, not simply its structure, is a new approach that could speed the search for COVID-19 therapies and reveal more potential therapies for other diseases.

This action-based focus -- called biological activity-based modeling (BABM) -- forms the core of a new approach developed by National Center for Advancing Translational Sciences (NCATS) researchers and others. NCATS is part of the National Institutes of Health (NIH). Researchers used BABM to look for potential anti-SARS-CoV-2 agents whose actions, not their structures, are similar to those of compounds already shown to be effective.

NCATS scientists Ruili Huang, Ph.D., and Wei Zheng, Ph.D., led the research team that created the approach. Their findings were posted online Feb. 23 by the journal Nature Biotechnology.

"With this new method, you can find completely new chemical structures based on activity profiles and then develop completely new drugs," Huang explained. Thus, using information about a compound's biological activity may expand the pool of promising treatments for a wide range of diseases and conditions.

When researchers seek new compounds or look for existing drugs to repurpose against new diseases, they are increasingly using screening tools to predict which drugs might be good candidates. Virtual screening, or VS, allows scientists to use advanced computer analyses to find potentially effective candidates from among millions of compounds in collections.

Traditional VS techniques look for compounds with structures similar to those known to be effective against a particular target on a pathogen or cell, for example. Those structural similarities are then assumed to deliver similar biological activities.

With BABM, however, researchers don't need to know a compound's chemical structure, according to Huang. Instead, they use a profile of a compound's activity patterns -- how it behaves at multiple concentrations against a panel of targets or tests -- to predict its potential effectiveness against a new target or in a new drug assay.

The now-widespread use of quantitative high-throughput screening (qHTS) makes BABM's predictions more accurate. qHTS assesses a compound's effectiveness at multiple concentrations in thousands of tests over time. That practice provides far more detail about how a compound behaves than traditional high-throughput screening, which tests only a single concentration of the compound. The information generated by qHTS creates a stronger biological activity profile -- also known as a signature -- for each of millions of compounds.
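To make the profile idea concrete, here is a minimal, hypothetical sketch of activity-based ranking: each compound is represented as a vector of assay readouts, and candidates are scored by similarity to the signature of a known active. The synthetic data, the `profiles` layout and the cosine-similarity choice are all assumptions for illustration; the actual NCATS models are far more sophisticated.

```python
# Minimal sketch of the activity-profile idea behind BABM: represent each
# compound as a vector of assay readouts and rank candidates by similarity
# to a known active's signature. Data are synthetic; the real NCATS
# models are much more sophisticated.
import numpy as np

rng = np.random.default_rng(1)
n_compounds, n_assays = 1000, 50
profiles = rng.normal(size=(n_compounds, n_assays))  # synthetic qHTS signatures
# Reference signature: a noisy copy of compound 0 (which will rank first
# by construction in this toy example).
known_active = profiles[0] + rng.normal(0, 0.3, n_assays)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

scores = np.array([cosine(p, known_active) for p in profiles])
top = np.argsort(scores)[::-1][:5]
print("top candidates by profile similarity:", top, scores[top].round(2))
```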

To test the BABM approach, the researchers tapped the vast pool of data generated by hundreds of qHTS analyses run on NCATS' in-house collection of more than 500,000 compounds and drugs. First, they verified BABM's ability to use activity profiles to identify compounds already shown to be effective against the Zika and Ebola viruses. BABM also identified new compounds that showed promise against those viruses.

The scientists then turned to SARS-CoV-2, the virus that causes COVID-19. They applied BABM, a structure-based model and a combined approach to analyze the NCATS library's compounds to find potential anti-SARS-CoV-2 agents. BABM predicted that the activity profiles of 311 compounds might indicate promise against the coronavirus.

The researchers then had an outside laboratory test those 311 compounds against the live SARS-CoV-2 virus. The result: nearly one-third of the BABM-backed compounds (99) showed antiviral activity in the test. The BABM-driven prediction hit rate topped that of the structure-based model -- and combining the activity-based and structure-based models yielded even better predictive results.

A key advantage to BABM is speed. "This method is very fast -- you essentially just run a computer algorithm, and you can identify many new drug leads, even with new chemical structures," Huang noted. In fact, screening the entire NCATS library of half a million compounds for anti-SARS-CoV-2 candidates took only a few minutes.

BABM also is a transferable tool -- it's not limited to use in the NCATS compound libraries. "Anyone can use this method by applying any biological activity profile data, including publicly available NCATS data," Huang emphasized.

The NCATS researchers predict their activity-based model's impact could extend far beyond the search for COVID-19 treatments and small-molecule drug discovery. Given any substance with an available activity profile, scientists can predict its activity against a new target, for a new indication, or against a new disease.

"In addition to small molecules, this approach can be applied to biologics, antibodies, and other therapies," Huang said. "BABM is for all drug discovery projects."

Credit: 
NIH/National Center for Advancing Translational Sciences (NCATS)

Understanding the spatial and temporal dimensions of landscape dynamics

The Earth's surface is subject to continual changes that dynamically shape natural landscapes. Global phenomena like climate change play a role, as do short-term, local events of natural or human origin. The 3D Geospatial Data Processing (3DGeo) research group of Heidelberg University has developed a new analysis method to help improve our understanding of processes shaping the Earth's surface like those observed in coastal or high-mountain landscapes. Unlike conventional methods that usually compare two snapshots of the topography, the Heidelberg approach can determine - fully automatically and over long periods - when and where surface alterations occur and which type of associated changes they represent.

The method, known as spatiotemporal segmentation, was developed under the guidance of Prof. Dr Bernhard Hoefle, whose 3DGeo group is based at the Institute of Geography and the Interdisciplinary Center for Scientific Computing (IWR) of Heidelberg University. "By observing entire surface histories, our new computer-based method allows for more flexible approaches. Unlike with previous methods, we no longer have to specify which individual change processes we want to detect or the points in time the analysis should include," the geoinformation scientist states. "Instead, areas and entire time periods during which similar changes occur are identified fully automatically. The huge three-dimensional datasets from the automatic laser measurements in the landscape thereby reveal various types of changes that the direct comparison of only two measurement points does not."

Among other techniques, Prof. Hoefle's team uses terrestrial laser scanning (TLS) to measure mountain and coastal landscapes. It generates three-dimensional models of a landscape represented as billions of measurement points in so-called 3D point clouds. "Measurement systems are installed on site and capture the terrain in short, regular intervals over several months, thus generating three-dimensional time series," explains Katharina Anders, a PhD student in Bernhard Hoefle's research group and at the IWR of Heidelberg University. These 3D time series are special because they contain both the temporal and spatial - ergo 4D - properties of surface changes, which can then be reviewed as in a time-lapse video.

"Spatiotemporal segmentation allows us to differentiate in detail between various phenomena that conventional methods detect as a single event or sometimes not at all," states Katharina Anders. The Heidelberg geoinformation scientists applied their method to a 3D time series of a stretch of coast in the Netherlands, which was acquired hourly over five months by scientists of the Delft University of Technology. The data analysis of the entire observation period revealed more than 2,000 changes representing temporary accumulation or erosion of sand that occurred in different locations at varying magnitudes and across various time periods. In this case, the dynamic transport of sand recorded by the measurement system was caused by complex interactions of wind, waves, and human influence. As a result, several truckloads of sand were transported on average in an area of 100 square metres over a period of four weeks, without influence from major storm events.

Findings of such analyses provide the basis for further studies of specific phenomena or underlying processes. At the same time, the information obtained on the dynamic evolution of surfaces opens up new possibilities for parameterisation and hence adaptation of computer-based environmental models. "The method we developed therefore makes an overall contribution to improving our geographic understanding of natural landscape dynamics," adds Katharina Anders.

Credit: 
Heidelberg University

Second order optical merons, or light pretending to be a ferromagnet

image: Spin texture of a second-order half-skyrmion (meron) on the surface of a birefringent cavity. (Source: Physics UW, M. Krol)

Image: 
Source: Physics UW, M. Krol

Scientists have demonstrated how to structure light such that its polarization behaves like a collection of spins in a ferromagnet forming half-skyrmions (also known as merons). To achieve this, the light was trapped in a thin liquid crystal layer between two nearly perfect mirrors. Skyrmions are found, e.g., as elementary excitations of magnetization in a two-dimensional ferromagnet, but do not naturally appear in electromagnetic (light) fields.

One of the key concepts in physics, and in science overall, is the notion of a "field", which describes the spatial distribution of a physical quantity. For instance, a weather map shows the distributions of temperature and pressure (these are known as scalar fields), as well as the wind speed and direction (known as a vector field). Almost everyone wears a vector field on their head - every hair has an origin and an end, just like a vector. Over 100 years ago, L.E.J. Brouwer proved the hairy ball theorem, which states that you can't comb a hairy ball flat without creating whorls, whirls (vortices) or cowlicks.

In magnetism, the elementary excitations in a two-dimensional magnetization vector field take the form of such vortices and are called skyrmions. Going clockwise around the center of such a vortex, we can observe that the vectors attached to subsequent points on our path rotate once or many times, clockwise or anticlockwise. The quantity that describes this feature is called the vorticity. Skyrmions and half-skyrmions (merons) of various vorticities can be found in physical systems as different as nuclear matter, Bose-Einstein condensates and thin magnetic layers. They are also used in the description of the quantum Hall effect, cyclones, anticyclones and tornadoes. Especially interesting are experimental setups in which one can create various vector fields on demand and investigate the interactions of their excitations.
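For readers who want the definition behind "rotate once or many times": the vorticity is the winding number of the in-plane vector field around the vortex core. This standard formula matches the vorticities of ±1 and ±2 quoted below.

```latex
% Vorticity (winding number) of a planar unit-vector field
% \hat{n}(\mathbf{r}) = (\cos\phi, \sin\phi) around a closed loop C
% enclosing the vortex core:
\[
  w \;=\; \frac{1}{2\pi} \oint_{C} \nabla\phi \cdot d\boldsymbol{\ell},
  \qquad w \in \mathbb{Z},
\]
% so w = +1, -1, +2, -2 correspond to the first- and second-order
% merons and anti-merons discussed in this work.
```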

Scientists from the University of Warsaw, the Military University of Technology, the University of Southampton, the Skolkovo Institute in Moscow and the Institute of Physics of the Polish Academy of Sciences have demonstrated how to structure light such that its polarization behaves like a half-skyrmion (meron). To achieve this, the light was trapped in a thin liquid crystal layer between two nearly perfect mirrors, known as an optical cavity. By controlling the polarization of incident light and the orientation of the liquid crystal molecules, they were able to observe first-order and second-order merons and anti-merons (vorticities -2, -1, 1 and 2) - the first experimental observation of the second-order structures.

A relatively simple optical cavity filled with a liquid crystal thus enables the scientists to create and investigate exotic states of light polarization. Combined with more exotic optically responsive materials, the device could allow the behavior of these excitations (annihilation, attraction or repulsion of skyrmions and merons) to be tested on an optical table. Understanding the nature of the interactions between these objects can help explain the physics of more complex systems that require more sophisticated experimental methods (e.g. ultra-low temperatures).

Physics and Astronomy first appeared at the University of Warsaw in 1816, under the then Faculty of Philosophy. In 1825 the Astronomical Observatory was established. Currently, the Faculty of Physics' Institutes include Experimental Physics, Theoretical Physics, Geophysics, Department of Mathematical Methods and an Astronomical Observatory. Research covers almost all areas of modern physics, on scales from the quantum to the cosmological. The Faculty's research and teaching staff includes ca. 200 university teachers, of which 87 are employees with the title of professor. The Faculty of Physics, University of Warsaw, is attended by ca. 1000 students and more than 170 doctoral students.

Credit: 
University of Warsaw, Faculty of Physics

Study examines what makes people susceptible to fake health news

LAWRENCE -- A new study from University of Kansas journalism & mass communication researchers examines what influences people to be susceptible to false information about health and argues big tech companies have a responsibility to help prevent the spread of misleading and dangerous information.

Researchers shared a fake news story with more than 750 participants that claimed a deficiency of vitamin B17 could cause cancer. Researchers then measured whether how the article was presented -- including author credentials, writing style and whether the article was labeled as "suspicious" or "unverified" -- affected how participants perceived its credibility and whether they would adhere to the article's recommendations or share it on social media. The findings showed that the presentation of the information did not influence how people perceived its credibility and that only social media efficacy played a role in whether respondents said they would share it.

Hong Tien Vu, assistant professor of journalism & mass communications, and Yvonnes Chen, associate professor of journalism & mass communications at KU, co-wrote the study. They will present their work, funded by a KU General Research Fund grant, at the 2021 International Communication Association Conference.

Vu and Chen shared with respondents eight versions of an article, verified as false, claiming that a lack of vitamin B17 - a substance that does not actually exist - could be a cause of cancer. One version carried a doctor's byline, including a short description of her medical credentials. In another, the author was described as a mother of two with a background in creative writing who worked as a lifestyle blogger. Some versions followed a journalistic style, while others used more casual language.
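For concreteness, eight versions are consistent with fully crossing three two-level factors - author credentials, writing style and flagging. The enumeration below is a reconstruction under that assumption, not the authors' actual materials (the study also mentions "suspicious" as an alternative flag wording).

```python
# Reconstruction of a plausible 2 x 2 x 2 factorial design yielding the
# eight article versions described in the study. This enumeration is an
# assumption for illustration, not the authors' materials.
from itertools import product

authors = ["doctor with medical credentials", "lifestyle blogger"]
styles = ["journalistic style", "casual style"]
flags = ["unflagged", "flagged as unverified"]

for i, (author, style, flag) in enumerate(product(authors, styles, flags), 1):
    print(f"version {i}: {author} / {style} / {flag}")
```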

"We wanted to test two skills that are often employed in media literacy training programs around the world, author credentials and writing style, as well as flagging," Vu said. "The results suggest relying on audience members to do the work to determine fake news may be a long way to go. When people have to evaluate the credibility of information, it requires mental work. When surfing the web in general, we tend to rely on big tech companies to verify information."

Respondents who showed higher levels of social media efficacy, or were more savvy in using the technology, evaluated information more carefully and reported they would be less likely to share the article. Health orientation -- whether respondents were interested in or sought out health information -- did not play a role in discerning the accuracy of information. It is significant, however, because those highly interested in health information are more likely to share news they find, whether credible or not, the authors said.

Results showed that author credentials and writing style did not significantly affect how people perceived the article's credibility, whether they would adhere to its recommendations, or whether they would share it. However, those who saw the article presented with any sort of flag stating it was unverified information were significantly less likely to find it credible, adhere to its recommendations or share it.

While the study took place before the COVID-19 pandemic, its findings are especially relevant as misinformation and politicized information about the pandemic have proliferated. The study also shows that seemingly innocuous misinformation can be dangerous.

"One problem with fake news studies is the topic becomes so politicized," Vu said. "Fake news can be about something that is not politicized or polarizing as well. Talking about vitamin B17 seems to be harmless, but people believed it. People can spend time, money and efforts on trying to find a cure, and that can be very dangerous if you don't follow a doctor's advice and come across false information."

The fact that any sort of flagging significantly affected readers' perceptions and intentions to share shows how important it is for big technology companies such as social media platforms to verify information or label content that is false, unverified or dangerous, the authors wrote.

"Whenever we see information that has been flagged, we immediately raise our skepticism, even if we don't agree with it. Big tech companies have a very important role to play in ensuring a healthy, clean information environment," Vu said.

Credit: 
University of Kansas

Metal whispering: Finding a better way to recover precious metals from electronic waste

image: New technology developed by Iowa State engineers uses heat and oxidation to recover pure and precious metals from electronic waste. It works in two ways -- it can bring the most reactive components to the surface, forming stalagmite-like spikes (left); and it can leave the least reactive components in the core surrounded by metal-oxide spikes, creating a "ship-in-a-bottle" structure (right).

Image: 
Photo courtesy of Martin Thuo/Iowa State University.

AMES, Iowa - Inspired by nature's work to build spiky structures in caves, engineers at Iowa State University have developed technology capable of recovering pure and precious metals from the alloys in our old phones and other electrical waste.

Using controlled applications of oxygen and relatively low temperatures, the engineers say they can dealloy a metal by slowly moving the most reactive components to the surface where they form stalagmite-like spikes of metal oxides.

That leaves the least-reactive components in a purified, liquid core surrounded by brittle metal-oxide spikes "to create a so-called 'ship-in-a-bottle structure,'" said Martin Thuo, the leader of the research project and an associate professor of materials science and engineering at Iowa State University.

"The structure formed when the metal is molten is analogous to filled cave structures such as stalactites or stalagmites," Thuo said. "But instead of water, we are using oxidation to create these structures."

A paper describing the new technology, "Passivation-driven speciation, dealloying and purification," has recently been published by the journal Materials Horizons.

University startup funds and part of a U.S. Department of Energy Small Business Innovation Research grant supported development of the technology.

Thuo noted this project is the exact opposite of his research group's previous work to develop heat-free solder.

"With heat-free solder, we wanted to put things together," he said. "With this, we want to make things fall apart."

But not just fall apart any which way. Thuo and the engineers in his research group want to control exactly how and where alloy components fall apart, or dealloy.

"It's like being a metal whisperer," he said. "We make things go the way we want."

The engineers offered a more precise description in their paper: "This work demonstrates the controlled behavior of surface oxidation in metals and its potential in design of new particle structures or purification/dealloying. By tuning oxidation via temperature, oxidant partial pressure, time and composition, a balance between reactivity and thermal deformation enables unprecedented morphologies."

Those unprecedented forms and structures could be very useful.

"We need new methods to recover precious metals from e-waste or mixed metal materials," Thuo said. "What we demonstrate here is that the traditional electrochemical or high-temperature methods (above 1,832 degrees Fahrenheit) may not be necessary in metal purification as the metal's reactivity can be used to drive separation."

Thuo said the oxidation technology works well at temperatures of 500 to 700 degrees Fahrenheit. ("This is set in an oven and getting metals to separate," he said.)

Besides metal purification and recovery, this new idea could also be applied to metal speciation - the ability to dictate creation and distribution of certain metal components. One use could be production of complex catalysts to drive multi-stage reactions.

Let's say chemists need a tin oxide catalyst followed by a bismuth oxide catalyst. They'll start with an alloy with the bismuth oxide buried beneath the tin oxide. They'll run the reaction with the tin oxide catalyst. Then they'll raise the temperature to the point that the bismuth oxide comes to the surface as spikes. And then they'll run the reaction with the bismuth oxide catalyst.

Thuo credits development of the new technology to working with talented students and two collaborators.

"We built on this big idea very slowly," he said. "And working together, we were able to break into this knowledge gap."

Credit: 
Iowa State University

Mutant gene-targeted immunotherapy approach developed

video: The immunotherapy approach is targeted to alterations in the common cancer-related p53 tumor suppressor gene, the RAS tumor-promoting oncogene or T-cell receptor genes. It is in the form of bispecific antibodies, comprising one component that specifically recognizes cancer cells and another component that recognizes immune cells and brings the cancer cells and immune cells together to destroy tumor cells.

Image: 
Elizabeth Cook

A novel targeted immunotherapy approach developed by researchers at the Ludwig Center, the Lustgarten Laboratory and the Bloomberg~Kimmel Institute for Cancer Immunotherapy at the Johns Hopkins Kimmel Cancer Center employs new antibodies against genetically altered proteins to target cancers.

The researchers targeted their immunotherapy approach to alterations in the common cancer-related p53 tumor suppressor gene, the RAS tumor-promoting oncogene or T-cell receptor genes. They also tested the therapy on cancer cells in the laboratory and in animal tumor models. Their findings are reported in three related studies published March 1 in Science Immunology, Science and Science Translational Medicine.

Two of the three research studies -- led by Jacqueline Douglass, an M.D./Ph.D. candidate at the Johns Hopkins University School of Medicine, and Emily Han-Chung Hsiue, M.D., Ph.D., a postdoctoral fellow at Johns Hopkins -- report on a precision medicine immunotherapy approach that specifically kills cancer cells by targeting mutant protein fragments presented as antigens on the cancer cell surface.

Although common across cancer types, p53 mutations have not been successfully targeted with drugs. Genetic alterations in tumor suppressor genes often result in their functional inactivation.

"Traditional drugs are aimed at inhibiting proteins. Inhibiting an already inactivated tumor suppressor gene protein in cancer cells, therefore, is not a feasible approach, says Hsiue, lead author on the Science paper.

Targeted drug therapies have been most successful against oncogenes, but most RAS gene mutations have been notoriously difficult to target. Instead of drugs, the researchers set out to target these gene alterations with newly-developed antibodies.

Conventional antibodies require an antigen target on the cell surface -- most commonly a protein that looks like a foreign invader to the immune system. But the proteins produced by mutant oncogenes and tumor suppressor genes are inside the cells, out of reach of conventional antibodies. However, proteins are routinely degraded within cells, generating protein fragments called peptides.

"These peptides can be presented on the cell surface when complexed with the human leukocyte antigens (HLA) proteins," says Katharine Wright, postdoctoral fellow at the Johns Hopkins University School of Medicine and a lead author on the Science Immunology paper. "Mutated proteins in cancer cells can also be degraded and generate mutant peptides presented by the HLA molecules. These mutant peptide HLA complexes serve as antigens and mark cancer cells as foreign to the immune system."

The development of potent antibodies that specifically recognize the one-amino-acid difference between mutant and normal proteins bound to HLA molecules is an extremely challenging task. To address this, the researchers used a five-step approach -- combining state-of-the-art mass spectrometric, genetic and X-ray crystallographic technologies that analyze cells at the molecular level -- with immunologic techniques to develop a therapeutic strategy that targets these antigens.

They developed a therapeutic strategy in the form of bispecific antibodies, comprising one component that specifically recognizes cancer cells and another component that recognizes immune cells and brings the cancer cells and immune cells together. In laboratory and animal tumor cell models, it resulted in the destruction of tumor cells.

"This therapeutic strategy is dependent on a cancer containing at least one p53 or RAS alteration and the patient having an HLA type that will bind to the mutant peptide to present it on the cell surface," says senior author Shibin Zhou Ph.D., associate professor of oncology and director of experimental therapeutics for the Ludwig Center at Johns Hopkins and a study leader.

Over the last five years, the researchers worked to overcome a variety of technical obstacles in an effort to develop antibodies that recognize only the mutant cancer gene fragments and not normal cells. To prove their antibody was specific to the mutant antigens, the researchers used CRISPR (clustered regularly interspaced short palindromic repeats) technology on cancer cells to change specific mutations in the target genes or disrupt the HLA type that is responsible for presenting the mutant peptides. The bispecific antibodies did not bring T cells to cancer cells when these genetic manipulations were made.

In the Science Translational Medicine paper, the researchers report that the powerful bispecific antibody approach they developed could also be used for the treatment of T-cell cancers. In animal models of T-cell cancers, the researchers showed that their approach selectively killed the cancerous T cells while sparing the majority of healthy T cells. Oncology fellow at Johns Hopkins University School of Medicine and lead author Suman Paul, M.B.B.S., Ph.D., was inspired to perform this study while treating a patient with this type of cancer.

"The patient had skin lesions so painful that clothing was intolerable, and like other patients with this disease, had a dismal prognosis," says Paul, who emphasizes the need for better therapies.

"Immunotherapies against B cell lymphomas have worked well with therapeutic agents such as CAR T cells and bispecific antibodies that wipe out both healthy and malignant B cells. These B cell-targeting treatments are effective because humans can tolerate the loss of healthy B cells. But a treatment approach that depletes both healthy and cancerous T cells would not work in T-cell cancer patients, because the healthy T cells are necessary for a functioning human immune system. Wiping out the healthy T cells along with the cancerous T cells would essentially result in a disease like AIDS."

By targeting the cancer-associated T-cell receptors, the Science Translational Medicine study described a novel strategy that allowed killing of the cancerous T cells with loss of only a small fraction of healthy T cells.

Another type of immunotherapy, called checkpoint inhibition, works well in patients whose cancer has already drawn the attention of immune cells. Drugs called checkpoint inhibitors can successfully boost this immune response. Many cancers, such as pancreatic and ovarian cancers, do not attract immune cells. However, these cancers very frequently contain RAS and/or p53 mutations, providing the opportunity for new forms of immunotherapy not dependent on natural immune responses, says Zhou.

The researchers say one of the major benefits of this type of immunotherapy is that it has the potential to work broadly across cancer types, as long as the patient has the mutant p53 or RAS gene and a matching HLA type, and the therapeutic agent used should be relatively simple to produce.

"This is an off-the-shelf reagent, not a therapy requiring manipulation of individual patient's own T cells, so it's a much easier product from the manufacturing point of view. It could potentially be used for any patient who has the proper mutation and HLA type," says Sandra Gabelli, Ph.D., associate professor of medicine at the Johns Hopkins University School of Medicine and study co-author.

The researchers say the next steps are to see if the strategy can be applied to other gene alterations in p53, KRAS, and other cancer driver genes.

"We intend to develop a large number of bispecific antibodies that would target such genes," says Alex Pearlman, M.D. Ph.D. student, and co-author of the three studies. "Although any individual bispecific antibody would target a small fraction of cancer patients, a suite of antibodies would allow for the treatment of many patients."

The researchers are also concerned about off-target effects in which antibodies errantly bind to a similar target in a vital tissue or organ, a side effect that has been observed in other types of immunotherapies. Resistance to treatment is another concern the research team will study, as such resistance often occurs in patients treated with any therapy, including immunotherapy.

These findings build upon paradigm-shifting cancer genetics discoveries that emanated from the Ludwig Center laboratory, led by Bert Vogelstein, M.D., Clayton Professor of Oncology and Howard Hughes Medical Institute investigator, and Kenneth Kinzler, Ph.D., professor of oncology at Johns Hopkins University School of Medicine. In 1989, Vogelstein's team revealed the p53 gene as the most commonly mutated gene in cancer. Mutations in the p53 gene are an important step in converting premalignant cells into cancer cells. As the first to reveal the genetic blueprint of cancer, the Ludwig Center team showed that cancers resulted from the gradual accumulation of genetic alterations in specific oncogenes and tumor suppressor genes, starting with colorectal cancer and then expanding their discoveries to a wide range of cancer types.

Credit: 
Johns Hopkins Medicine

Oncotarget: Identification of intermediate-risk subgroups in metastatic clear-cell renal cell carcinoma

image: CART-Tree analysis for overall survival in IMDC intermediate risk group.

Image: 
Correspondence to - Laurence Albiges - Laurence.ALBIGES@gustaveroussy.fr

The cover for issue 49 of Oncotarget features Figure 4, "CART-Tree analysis for overall survival in IMDC intermediate risk group," by Guida, et al., from the recently published paper "Identification of international metastatic renal cell carcinoma database consortium (IMDC) intermediate-risk subgroups in patients with metastatic clear-cell renal cell carcinoma." Because these intermediate-risk (IR) patients have different prognoses, the aim of the study was to characterize them better in order to better tailor treatment.

A multivariable Cox model with a backward selection procedure and a Classification and Regression Tree (CART) analysis were performed to identify which prognostic factors were associated with overall survival (OS) in IR patients.

Median OS for patients with a platelet (PLT) count above the upper normal limit (UNL) was 18 months, versus 29 months for patients with a normal PLT count.
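A minimal sketch of this kind of stratified survival comparison, assuming the Python lifelines library: Kaplan-Meier estimates per platelet group plus a log-rank test. The survival times below are synthetic exponentials matched to the reported medians, with no censoring; this is not the IMDC cohort data.

```python
# Sketch of a survival comparison between elevated and normal platelet
# groups: Kaplan-Meier fits and a log-rank test. Data are synthetic
# exponentials matched to the reported medians (18 vs 29 months).
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)
os_high = rng.exponential(18 / np.log(2), 100)   # median ~18 months
os_norm = rng.exponential(29 / np.log(2), 100)   # median ~29 months
events = np.ones(100)                            # assume no censoring

kmf = KaplanMeierFitter()
kmf.fit(os_high, events, label="PLT > UNL")
print("median OS, elevated PLT:", round(kmf.median_survival_time_, 1))

result = logrank_test(os_high, os_norm, events, events)
print("log-rank p-value:", result.p_value)
```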

PLT count was confirmed as a prognostic factor on bootstrap samples and was also selected for the first split of the CART-tree analysis.

Elevated PLT count seems to identify a subgroup of patients with poor outcome in the IMDC intermediate-risk population with ccRCC.

"Elevated PLT count seems to identify a subgroup of patients with poor outcome in the IMDC intermediate-risk population with ccRCC"

Dr. Laurence Albiges from the Université Paris-Saclay said, "The risk stratification models for metastatic renal cell carcinoma (mRCC) patients were developed as clinical tools to guide counseling, to predict individual patient prognosis and also to design clinical trials."

Patients lacking these negative factors have a good prognosis and may reach a longer survival; patients presenting 1 or 2 factors have an intermediate risk of death, with a median overall survival of about 23 months; patients with 3 or more factors have an expected poor-risk outcome, with a median survival of about 8 months.

Only in the poor-risk group was the decision-making algorithm different: these patients were not candidates for upfront cytoreductive nephrectomy and, in selected cases, could benefit from the mTOR inhibitor temsirolimus in the first-line setting.

In the phase III Checkmate-214 trial, the nivolumab plus ipilimumab immunotherapy combination significantly prolonged OS versus sunitinib in intermediate- and poor-risk untreated patients with mRCC.

The Albiges Research Team concluded in their Oncotarget Research Paper that given the rapidly evolving field of systemic treatment in mRCC, one of the most important challenges in mRCC is how prognostic stratification will guide front-line treatment selection.

Additionally, characterization of the heterogeneous IMDC intermediate-risk group of patients should be sought for optimal clinical trial design and stratification.

A high platelet count reflects the cancer-related inflammatory status and seems to segregate patients with the worst prognosis within the intermediate-risk group.

Further analyses are ongoing to validate these findings in patients receiving first-line checkpoint inhibitor (CPI)-based combinations.


DOI - https://doi.org/10.18632/oncotarget.27762

Full text - https://www.oncotarget.com/article/27762/text/

Correspondence to - Laurence Albiges - Laurence.ALBIGES@gustaveroussy.fr

Keywords -
metastatic clear-cell renal cell carcinoma,
IMDC,
intermediate-risk,
heterogeneous prognostic,
platelets

About Oncotarget

Oncotarget is a biweekly, peer-reviewed, open access biomedical journal covering research on all aspects of oncology.

To learn more about Oncotarget, please visit https://www.oncotarget.com or connect with:

SoundCloud - https://soundcloud.com/oncotarget
Facebook - https://www.facebook.com/Oncotarget/
Twitter - https://twitter.com/oncotarget
LinkedIn - https://www.linkedin.com/company/oncotarget
Pinterest - https://www.pinterest.com/oncotarget/
Reddit - https://www.reddit.com/user/Oncotarget/

Oncotarget is published by Impact Journals, LLC. Please visit http://www.ImpactJournals.com or connect with @ImpactJrnls

Journal

Oncotarget

DOI

10.18632/oncotarget.27762

Credit: 
Impact Journals LLC

Oncotarget: Exploiting the metabolic dependencies of the broad amino acid transporter SLC6A14

image: Evaluation of stress response markers in the absence of amino acid transporters. (A) Immunoblots of inducible MDA-MB-468 KO cell lines upon metabolic stress. Cells were induced with doxycycline for 4 days, plated and cultured for 24 h or 72 h in the indicated media: Full = complete medium; Starv = medium without aromatic and hydrophobic amino acids; - Met = medium without methionine. (B) Immunoblot quantification. Boxplots represent values from three independent experiments. In the boxplots, centerlines mark the medians, box limits indicate the 25th and 75th percentiles, and whiskers extend to 5th and 95th percentiles.

Image: 
Correspondence to - Balca R. Mardin - mardin@bio.mx

Oncotarget recently published "Exploiting the metabolic dependencies of the broad amino acid transporter SLC6A14," which reported that tumor cells typically enhance their metabolic capacity to sustain their higher rate of growth and proliferation.

One way to elevate the nutrient intake into cancer cells is to increase the expression of genes encoding amino acid transporters, which may represent targetable vulnerabilities.

The Oncotarget authors analyzed the pattern of transcriptional changes in a panel of breast cancer cell lines upon metabolic stress and found that SLC6A14 expression levels are increased in the absence of methionine.

Methionine deprivation, which can be achieved via modulation of dietary methionine intake, in turn leads to a heightened activation of the AMP-activated kinase (AMPK) in SLC6A14-deficient tumor cells.

While SLC6A14 genetic deficiency does not have a major impact on cell proliferation, combined depletion of AMPK and SLC6A14 leads to an increase in apoptosis upon methionine starvation, suggesting that combined targeting of SLC6A14 and AMPK can be exploited as a therapeutic approach to starve tumor cells.

"Suggesting that combined targeting of SLC6A14 and AMPK can be exploited as a therapeutic approach to starve tumor cells"

Dr. Balca R. Mardin from the BioMed X Institute said, "One of the hallmarks of tumors is their deregulated metabolism."

Since amino acids are used for the synthesis of the macromolecules required to sustain the accelerated growth of tumor cells, blocking amino acid transporters may be a viable therapeutic option, leading to amino acid starvation selectively in tumor cells.

Consistent with the idea that the function of amino acid transporters can be more critical for the maintenance of tumor cells than of normal cells, several amino acid transporters are reported to be overexpressed in a wide spectrum of tumors.

For instance, in breast cancer, the metabolism of non-essential amino acids is found to be altered and the expression of amino acid transporters correlates with tumor growth and progression.

The amino acid transporter SLC6A14 is a highly concentrative symporter that makes use of the sodium and chloride gradients to take up amino acids.

Although in vitro studies have revealed variable affinities for each substrate, SLC6A14 transports the broadest range of amino acids of all the amino acid transporters.

The Mardin Research Team concluded in their Oncotarget Research Paper that the increased levels of phosphorylated AMPK in SLC6A14 KO cells challenged with amino acid stress prompted them to test whether blocking AMPK activation could be a vulnerability under these conditions.

For this experiment, the authors transfected gRNAs targeting PRKAA1 and PRKAA2, which encode the AMPK α1 and AMPK α2 subunits respectively, into their isogenic cell lines.

Consistent with the fact that SLC6A14 KO alone induces a mild activation of AMPK, they observed a moderate decrease in the cell proliferation rate only in the SLC6A14 KO cell line.

The effect of this combination seems to be most pronounced upon SLC6A14 KO, in line with their previous results demonstrating the highest increase in AMPK activation in SLC6A14 KO cells.

Altogether, the data indicate that AMPK activation is a metabolic vulnerability in SLC6A14-deficient cells that can be exploited as a therapeutic approach to drive unbalanced metabolism in starved tumor cells.

Sign up for free Altmetric alerts about this article

DOI - https://doi.org/10.18632/oncotarget.27758

Full text - https://www.oncotarget.com/article/27758/text/

Correspondence to - Balca R. Mardin - mardin@bio.mx

Keywords -
SLC6A14,
metabolic stress,
transcriptional regulation,
methionine,
AMPK

About Oncotarget

Oncotarget is a biweekly, peer-reviewed, open access biomedical journal covering research on all aspects of oncology.

To learn more about Oncotarget, please visit https://www.oncotarget.com or connect with:

SoundCloud - https://soundcloud.com/oncotarget
Facebook - https://www.facebook.com/Oncotarget/
Twitter - https://twitter.com/oncotarget
LinkedIn - https://www.linkedin.com/company/oncotarget
Pinterest - https://www.pinterest.com/oncotarget/
Reddit - https://www.reddit.com/user/Oncotarget/

Oncotarget is published by Impact Journals, LLC. To learn more, please visit http://www.ImpactJournals.com or connect with @ImpactJrnls.

Journal

Oncotarget

DOI

10.18632/oncotarget.27758

Credit: 
Impact Journals LLC

Mechanistic understanding of oxygen-redox processes in lithium-rich battery cathodes

image: The structure of a Li-rich cathode in a pristine, charged and discharged state showing how molecular O2 takes part in the energy storage mechanism of an O-redox material.

Image: 
Dr Robert House, University of Oxford

HARWELL, UK (1 March 2021) - Scientists at the University of Oxford, researching next-generation cathode materials as part of the Faraday Institution's CATMAT project, have made a significant advance in understanding the oxygen-redox processes involved in lithium-rich cathode materials. The paper, published in Nature Energy, proposes strategies that offer potential routes to increase the energy density of lithium-ion batteries.

"In the ever more difficult quest to make incremental improvements to Li-ion battery energy density, being able to harness the potential of oxygen-redox cathodes and the bigger improvements they offer relative to the nickel rich cathodes in commercial use today is potentially significant," Prof Peter Bruce, University of Oxford and Chief Scientist of the Faraday Institution. "The deeper understanding of the fundamental mechanisms of oxygen-redox is an important step in informing strategies to mitigate the current limitations of such materials, bringing their potential commercial use a step closer to reality."

"Finding pioneering solutions in the UK's race to electrification needs large-scale, focused research effort targeted at industry relevant goals," said Pam Thomas, CEO of the Faraday Institution. This is one example of Faraday Institution researchers achieving a significant scientific milestone, one that unlocks and accelerates multiple new avenues of research in the quest towards battery materials and that could increase the range of future electric vehicles. The breakthrough was facilitated by use of state-of-the-art facilities provided by Diamond Light Source and Royce Institute, demonstrating the importance of maintaining the strength of the UK's research infrastructure."

Increasing the range of electric vehicles demands battery materials that can store more charge at higher voltages in order to achieve a higher "energy density". There are a limited number of ways to increase the energy density of lithium-ion cathode materials. Most current cathode materials are layered transition metal oxides incorporating, for example, cobalt, nickel and manganese. One research route involves storing charge on the oxide ions as well as on the transition metal ions.
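
To make the capacity-voltage trade-off concrete, material-level gravimetric energy density is roughly the specific capacity multiplied by the average voltage (mAh/g × V = Wh/kg). The sketch below uses indicative, literature-scale numbers for illustration only; they are not values from the Nature Energy paper.

```python
# Rough material-level energy density: mAh/g x V = mWh/g = Wh/kg.
def cathode_energy_wh_per_kg(capacity_mah_per_g: float, avg_voltage_v: float) -> float:
    return capacity_mah_per_g * avg_voltage_v

# Indicative, illustrative values only (not from the paper).
nickel_rich = cathode_energy_wh_per_kg(200, 3.8)  # e.g. a conventional Ni-rich layered oxide
li_rich     = cathode_energy_wh_per_kg(280, 3.5)  # e.g. a Li-rich O-redox cathode

print(f"Ni-rich ~{nickel_rich:.0f} Wh/kg vs Li-rich ~{li_rich:.0f} Wh/kg")
```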

Oxygen-redox materials have promised a route to higher cathode energy density for some years, but realising their full potential in commercial batteries has been hampered by the structural changes they experience during their first charge. These changes are predominantly irreversible and give rise to a significant drop in the voltage available on subsequent discharge and future cycles.

A significant research effort has been underway for some time around the world to uncover a mechanism for oxygen-redox that explains these structural changes, but a clear understanding of the nature of oxidised oxygen remains a key part of the puzzle.

Techniques such as RIXS (resonant inelastic X-ray scattering) have been used with success in the past to probe the changes to the oxygen. By collaborating with researchers at the state-of-the-art I21 beamline at Diamond Light Source, Faraday Institution researchers have now resolved RIXS features indicating that the oxidised oxygen species in the bulk of the material is molecular oxygen, rather than peroxide or other species.

"Moreover, computational modelling has demonstrated that the evolution of molecular oxygen explains both the observed electrochemical response - the reduction in voltage on first discharge - and the observed structural changes - explained by the accommodation of the molecular oxygen within the bulk of the material," said Prof Saiful Islam, University of Bath and CATMAT Principal Investigator. "This single unified model tying molecular oxygen and voltage loss together allows researchers to propose practical strategies for avoiding oxygen-redox-induced instability, offering potential routes towards more reversible high energy density Li-ion cathodes."

Six such strategies, of varying novelty, are proposed in the paper; all are promising and are being explored by the CATMAT project. The mechanistic understanding developed will speed up research in each of these areas, offering an alternative to iterative, trial-and-error approaches. In one novel research direction, researchers are investigating the development of a unique "superstructure" in which control is exerted over the ordering of lithium atoms in the transition metal layer, giving more stability to the structure and reducing voltage loss.

Credit: 
The Faraday Institution

'Silent epidemic of grief' leaves bereaved and bereavement care practitioners struggling

Major changes in bereavement care have occurred during the COVID-19 pandemic, amid a flood of demand for help from bereaved people, according to new research from the University of Cambridge. The first major study of pandemic-related changes in bereavement care has found that the switch to remote working has helped some services to reach out, but many practitioners feel they do not have capacity to meet people's needs.

It is estimated that for every death, nine people are affected by bereavement. The scale of the impact of the COVID-19 pandemic on those bereaved is now becoming apparent, whether the death was from COVID or from other causes.

Those whose loved ones have died with COVID-19 have had to cope with sudden and unexpected death, deaths in intensive care units, and with seeing loved ones suffer severe symptoms including breathlessness and agitation at the end of life. Social distancing measures have meant restricted visiting at the end of life, leaving some to die alone. Viewing the deceased person's body and funeral proceedings have been severely curtailed, with major impact on those bereaved from all causes, not only from COVID-19. All these factors mean that the risks of complicated and prolonged grief responses have become higher during the pandemic.

In research published today in BMJ Open, researchers at Cambridge's Department of Public Health and Primary Care report the results of an online survey sent to health and social care staff in August 2020, inviting them to describe their experiences and views about changes in bereavement care. In total, 805 people responded, including those working in community, care home, hospital and hospice settings across the UK and Ireland.

Services faced initial challenges adapting to changing national government guidelines. Some bereavement services were suspended due to staff being furloughed or redeployed, particularly specialist bereavement services. Volunteer support in hospitals and hospices was reduced and some services saw increased waiting lists.

"We had 600% increase in deaths for a 3-week period. Dealing with the backlog of bereavement support was challenging," said one palliative medicine doctor.

Bereavement care fell to a wider range of staff members, including some with limited experience. Some respondents reported that services were under-resourced before the pandemic, and that the pandemic would worsen the situation and add new difficulties due to complex grief reactions.

The biggest change has been the switch to remote methods of providing support - such as telephone and video - which was reported by 90% of respondents. Adapting care to online or telephone formats was particularly challenging, with limited access to the equipment needed and limited staff training in its use.

The move to remote support has been a double-edged sword. On one hand, it increased some opportunities for bereavement support: services supporting children and young people at times reported these groups to be more receptive to online support, and hospices and hospital teams reported widening access to their bereavement support. However, practitioners described the remote work as "draining" and difficult to manage alongside their own emotional strains during the pandemic.

Some practitioners feared being overwhelmed by demand: "We are really only seeing those who have been bereaved in Jan/Feb so far, so there may be many more to come," said one Community Listening Service Coordinator.

The changes to services were reported to have disrupted the ability to offer emotional support: "It has felt as though we are dealing with them at arm's length whereas we would be there to hold their hands, give them a hug as needed," said a palliative medicine doctor.

Many respondents expressed grave concerns over the long-term impacts on bereaved people, highlighting the inability to be with the dying patient, or restrictions on doing so, as having a profound impact in bereavement.

"Many people who died were denied opportunity to die in their preferred place of care / preferred place of death and died in suboptimal environments to receive their care in last days," said a GP.

While those bereaved from COVID-19 and non-COVID conditions were similarly affected by the restrictions, specific challenges related to COVID-19 were reported. Some respondents described relatives' anger at having COVID-19 on the death certificate. One Bereavement Specialist Liaison Nurse said that the disease "seemed to have a 'stigma' for some". This sense of stigma was thought to exacerbate people's feelings of having failed to protect their family member from COVID-19.

Concerns were raised over a large and 'invisible cohort of people' who may not access support or for whom support will be restricted, leading to greater unmet need. "There may be a silent epidemic of grief that we have not yet picked up on," said a Palliative Medicine Doctor.

Dr Caroline Pearce, the lead researcher, said: "Bereavement care has undergone major changes in both acute and community settings affecting bereaved people, clinicians, support workers and the wider health and social care system. The increased need for bereavement care has challenged practitioners as they have taken on new responsibilities and skills and shifted to remote and electronic working. The increased potential for prolonged and complicated grief responses among those bereaved during this period is particularly concerning."

Andy Langford, Clinical Director, CRUSE Bereavement Care, added: "Speaking about grief remains an area of public discomfort, and it is important practitioners encourage bereaved people to view grief as a 'valid' reason to seek help from health and community services, as well as from those they trust in their communities. It was heartening that many respondents reported the development of new and expanded services, but it is imperative that these are made sustainable in the longer-term. The need isn't going away."

Credit: 
University of Cambridge