Earth

Brazilian researchers develop an optical fiber made of gel derived from marine algae

image: Edible, biocompatible and biodegradable, these fibers have potential for various medical applications.

Image: 
Eric Fujiwara

An optical fiber made of agar has been produced at the University of Campinas (UNICAMP) in the state of São Paulo, Brazil. This device is edible, biocompatible and biodegradable. It can be used in vivo for body structure imaging, localized light delivery in phototherapy or optogenetics (e.g., stimulating neurons with light to study neural circuits in a living brain), and localized drug delivery.

Another possible application is the detection of microorganisms in specific organs, in which case the probe would be completely absorbed by the body after performing its function.

The research project, which was supported by São Paulo Research Foundation - FAPESP, was led by Eric Fujiwara, a professor in UNICAMP's School of Mechanical Engineering, and Cristiano Cordeiro, a professor in UNICAMP's Gleb Wataghin Institute of Physics, in collaboration with Hiromasa Oku, a professor at Gunma University in Japan.

An article on the study is published in Scientific Reports, an online journal owned by Springer Nature.

Agar, also called agar-agar, is a natural gelatin obtained from marine algae. It consists of a mixture of two polysaccharides, agarose and agaropectin. "Our optical fiber is an agar cylinder with an external diameter of 2.5 millimeters [mm] and a regular inner arrangement of six 0.5 mm cylindrical airholes around a solid core. Light is confined owing to the difference between the refractive indices of the agar core and the airholes," Fujiwara said.
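As a rough illustration of that confinement condition, the short sketch below treats the airholes as an effective low-index cladding and applies Snell's law; the refractive index of roughly 1.34 for hydrated agar is a typical literature value assumed here, not a figure reported by the researchers.

```python
import math

# Illustrative values only: the release does not give refractive indices.
# ~1.34 is a typical literature value for hydrated agar gel; 1.00 is air.
n_core = 1.34   # agar core (assumed)
n_hole = 1.00   # airholes (air)

# Total internal reflection at the core/airhole boundary occurs beyond
# the critical angle, theta_c = arcsin(n_hole / n_core).
theta_c = math.degrees(math.asin(n_hole / n_core))

# Numerical aperture of an equivalent step-index guide,
# NA = sqrt(n_core^2 - n_clad^2), treating the airholes as "cladding".
numerical_aperture = math.sqrt(n_core**2 - n_hole**2)

print(f"critical angle     ~ {theta_c:.1f} degrees")
print(f"numerical aperture ~ {numerical_aperture:.2f}")
```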

"To produce the fiber, we poured food-grade agar into a mold with six internal rods placed lengthwise around the main axis," he continued. "The gel distributes itself to fill the available space. After cooling, the rods are removed to form airholes, and the solidified waveguide is released from the mold. The refraction index and geometry of the fiber can be adapted by varying the composition of the agar solution and mold design, respectively."

The researchers tested the fiber in different media, from air and water to ethanol and acetone, concluding that it is context-sensitive. "The fact that the gel undergoes structural changes in response to variations in temperature, humidity and pH makes the fiber suitable for optical sensing," Fujiwara said.

Another promising application is its simultaneous use as an optical sensor and a growth medium for microorganisms. "In this case, the waveguide can be designed as a disposable sample unit containing the necessary nutrients. The immobilized cells in the device would be optically sensed, and the signal would be analyzed using a camera or spectrometer," he said.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Plant-based diets shown to lower blood pressure even with limited meat and dairy

Consuming a plant-based diet can lower blood pressure even if small amounts of meat and dairy are consumed too, according to new research from the University of Warwick.

In a paper published online today (25 July) in the Journal of Hypertension, a team from Warwick Medical School argues that any effort to increase plant-based foods in your diet and limit animal products is likely to benefit your blood pressure and reduce your risk of heart attacks, strokes and cardiovascular disease. They conducted a systematic review of previous research from controlled clinical trials to compare seven plant-based diets, several of which included small amounts of animal products, with a standardised control diet, and to assess the impact these had on individuals' blood pressure.

Plant-based diets support high consumption of fruits, vegetables, whole grains, legumes, nuts and seeds, limiting the consumption of most or all animal products (mainly meat and dairy). (See Notes to Editors for further details.)

High blood pressure is the leading risk factor globally for heart attacks, strokes and other cardiovascular diseases. A reduction in blood pressure has important health benefits both for individuals and for populations. Unhealthy diets are responsible for more deaths and disabilities globally than tobacco use, high alcohol intake, drug use and unsafe sex combined. According to previous research, increased consumption of whole grains, vegetables, nuts and seeds, and fruit, as achieved in plant-based diets, could avert up to 1.7, 1.8, 2.5 and 4.9 million deaths globally each year, respectively.

Vegetarian and vegan diets with complete absence of animal products are already known to lower blood pressure compared to omnivorous diets. Their feasibility and sustainability are, however, limited. Until now, it has not been known whether a complete absence of animal products is necessary in plant-based dietary patterns to achieve a significant beneficial effect on blood pressure.

Lead author Joshua Gibbs, a student in the University of Warwick School of Life Sciences, said: "We reviewed 41 studies involving 8,416 participants, in which the effects of seven different plant-based diets (including DASH, Mediterranean, vegetarian, vegan, Nordic, high fibre and high fruit and vegetables) on blood pressure were studied in controlled clinical trials. A systematic review and meta-analysis of these studies showed that most of these diets lowered blood pressure. The DASH diet had the largest effect, reducing blood pressure by 5.53/3.79 mmHg compared to a control diet, and by 8.74/6.05 mmHg when compared to a 'usual' diet.

"A blood pressure reduction of the scale caused by a higher consumption of plant-based diets, even with limited animal products would result in a 14% reduction in strokes, a 9% reduction in heart attacks and a 7% reduction in overall mortality.

"This is a significant finding as it highlights that complete eradication of animal products is not necessary to produce reductions and improvements in blood pressure. Essentially, any shift towards a plant-based diet is a good one."

Senior author Professor Francesco Cappuccio of Warwick Medical School said: "The adoption of plant-based dietary patterns would also play a role in global food sustainability and security. They would contribute to a reduction in land use due to human activities, to global water conservation and to a significant reduction in global greenhouse gas emission.

"The study shows the efficacy of a plant-based diet on blood pressure. However, the translation of this knowledge into real benefits to people, i.e. its effectiveness, depends on a variety of factors related to both individual choices and to governments' policy decisions. For example, for an individual, the ability to adopt a plant-based diet would be influenced by socio-economic factors (costs, availability, access), perceived benefits and difficulties, resistance to change, age, health status, low adherence due to palatability and acceptance.

"To overcome these barriers, we ought to formulate strategies to influence beliefs about plant-based diets, plant food availability and costs, multisectoral actions to foster policy changes focusing on environmental sustainability of food production, science gathering and health consequences."

Credit: 
University of Warwick

Experimental optimal verification of entangled states using local measurements

Quantum information is a field where the information is encoded into quantum states. Taking advantage of the "quantumness" of these states, we can perform more efficient computations and more secure cryptography compared to their classical counterparts.

A team led by Prof. GUO Guangcan from the University of Science and Technology of China (USTC) of CAS modified the original proposal to make it robust to practical imperfections and experimentally implemented a scalable quantum state verification of two-qubit and four-qubit entangled states using nonadaptive local measurements. The results were published in Physical Review Letters on July 17th.

The initialization of a quantum system into a certain state is a crucial aspect of quantum information science. While a variety of measurement strategies have been developed to characterize how well a system is initialized, for any given strategy there is in general a trade-off between its efficiency and the amount of information it reveals about the quantum state. Conventional quantum state tomography can characterize unknown states, but it requires exponentially expensive and time-consuming postprocessing.

Alternatively, recent theoretical breakthroughs show that quantum state verification provides a technique to quantify the prepared state with significantly fewer samples, especially for multipartite entangled states.

In the research led by Prof. GUO Guangcan, for all the tested states the estimated infidelity was inversely proportional to the number of samples, which illustrates the power of the method to characterize a quantum state with a small number of samples. Compared to the globally optimal strategy, which requires nonlocal measurements, the efficiency in their experiment is worse only by a small constant factor. They also compared the performance of quantum state verification and quantum state tomography in an experiment characterizing a four-photon Greenberger-Horne-Zeilinger state, and the results indicate the advantage of quantum state verification in both efficiency and precision.

They experimentally realized an optimal quantum state verification (QSV), which is easy to implement and robust to realistic imperfections. The exhibited 1/n scaling results from the strategy itself without entangled or adaptive measurements.
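As a rough guide to where that 1/n scaling comes from, the sketch below uses the standard sample-complexity estimate from the quantum state verification literature: to reject states whose infidelity to the target exceeds epsilon with confidence 1 - delta, a strategy with spectral gap nu needs roughly ln(1/delta)/(nu * epsilon) copies, so at fixed confidence the certifiable infidelity shrinks as 1/n. The gap value used here is a placeholder, not the one realized in this experiment.

```python
import math

def copies_needed(epsilon: float, delta: float, nu: float) -> int:
    """Approximate number of copies a verification strategy with spectral
    gap ``nu`` needs to reject states at infidelity ``epsilon`` with
    failure probability at most ``delta`` (standard first-order estimate)."""
    return math.ceil(math.log(1.0 / delta) / (nu * epsilon))

def certifiable_infidelity(n: int, delta: float, nu: float) -> float:
    """Invert the estimate: with n copies and confidence 1 - delta, the
    certifiable infidelity shrinks in proportion to 1/n."""
    return math.log(1.0 / delta) / (nu * n)

# Placeholder spectral gap; the optimized local strategy in the paper has
# its own value, which is not quoted in this release.
nu, delta = 0.5, 0.05
for n in (100, 1_000, 10_000):
    print(n, "copies -> certifiable infidelity",
          f"{certifiable_infidelity(n, delta, nu):.2e}")
```

Tomography, by contrast, needs both many more measurement settings and heavy postprocessing to reach a comparable statement about the prepared state, which is the comparison the experiment draws.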

Their results have clear implications for many quantum measurement tasks and may be used as a firm basis for subsequent work on more complex quantum systems.

Credit: 
University of Science and Technology of China

Neurons are genetically programmed to have long lives

image: Photo shows Sika Zheng.

Image: 
Zheng lab, UC Riverside.

RIVERSIDE, Calif. -- When our neurons -- the principal cells of the brain -- die, so do we.

Most neurons are created during embryonic development and have no "backup" after birth. Researchers have generally believed that their survival is determined almost entirely extrinsically, or by outside forces, such as the tissues and cells that neurons innervate.

A research team led by Sika Zheng, a biomedical scientist at the University of California, Riverside, has challenged this notion and reports that the continuous survival of neurons is also intrinsically programmed during development.

The study, published in the journal Neuron, identifies a mechanism the researchers say is triggered at neuron birth to intrinsically decrease a general form of cell death -- or "apoptosis" -- specifically in neurons. When this genetic regulation is stopped, continuous neuronal survival is disrupted and leads to the death of the animal.

An organism's survival, brain function, and fitness are dependent upon the survival of its neurons. In higher organisms, neurons control breathing, feeding, sensation, motion, memory, emotion, and cognition. They can die of many unnatural causes, such as neurodegenerative diseases, injury, infection, and trauma. Neurons are long-lived cells, but the genetic controls that enable their longevity are unknown.

Zheng's team now reports that the central piece of the mechanism is a small piece of genetic sequence in Bak1, a pro-apoptotic gene whose activation leads to apoptosis. Bak1 expression is turned off when this small piece of genetic sequence, termed a microexon, is spliced into the final Bak1 gene product. Exons are sequences that make up messenger RNA.

"Apoptosis is a pathway that controls cell turnover and tissue homeostasis in all metazoans," explained Zheng, an associate professor of biomedical sciences. "Most non-neural cells readily engage in apoptosis in response to intrinsic and extrinsic stress. But this cellular suicidal program needs to be reined in for neurons so that they live for many years. We now show how genetic attenuation of neuronal apoptosis takes place."

Zheng's team identified the Bak1 microexon through a large-scale analysis of expression data from human tissues, mouse tissues, human developing brains, mouse developing forebrains, and mouse developing midbrains. The team first compared neural tissues with non-neural tissues in both humans and mice to identify neural-specific exons. Then, they found cortical neurons reduce their sensitivity to apoptosis as early as neuron birth. They also found apoptosis is gradually reduced during neuronal development before neurons make connections or innervate other cells, suggesting factors other than extrinsic signals can play a role.
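Screens of this kind are commonly quantified with a "percent spliced in" (PSI) value for each candidate exon; the toy calculation below illustrates that generic metric only and is not the pipeline used in the study.

```python
def percent_spliced_in(inclusion_reads: int, exclusion_reads: int) -> float:
    """PSI: fraction of junction reads supporting inclusion of an exon.
    Values near 1 mean the exon (e.g., a microexon) is usually included."""
    total = inclusion_reads + exclusion_reads
    return inclusion_reads / total if total else float("nan")

# Toy read counts (not from the study): a hypothetical microexon that is
# included in neural tissue but mostly skipped in non-neural tissue.
print("neural     PSI:", percent_spliced_in(inclusion_reads=180, exclusion_reads=20))
print("non-neural PSI:", percent_spliced_in(inclusion_reads=15, exclusion_reads=185))
```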

"We show neurons transform how they regulate cell death during development," Zheng said. "This is to ensure neuronal longevity, which is needed to maintain the integrity of neural circuits for brain functions."

Next, Zheng's team will study whether the identified mechanism is activated in neurodegenerative diseases and injury that cause neuronal cell death.

Credit: 
University of California - Riverside

Alaska is getting wetter. That's bad news for permafrost and the climate.

image: Postdoctoral fellow Catherine Dielemen, a member of Merritt Turetsky's research group, uses a frost probe to determine the depth of surface permafrost beneath the ground surface in interior Alaska.

Image: 
Merritt Turetsky

Alaska is getting wetter. A new study spells out what that means for the permafrost that underlies about 85% of the state, and the consequences for Earth's global climate.

The study, published today in Nature Publishing Group journal Climate and Atmospheric Science, is the first to compare how rainfall is affecting permafrost thaw across time, space, and a variety of ecosystems. It shows that increased summer rainfall is degrading permafrost across the state.

As Siberia remains in the headlines for record-setting heat waves and wildfires, Alaska is experiencing the rainiest five years in its century-long meteorological record. Extreme weather on both ends of the spectrum--hot and dry versus cool and wet--is driven by an aspect of climate change called Arctic amplification. As the earth warms, temperatures in the Arctic rise faster than the global average.

While the physical basis of Arctic amplification is well understood, it is less clear how it will affect the permafrost that underlies about a quarter of the Northern Hemisphere, including most of Alaska. Permafrost locks about twice the carbon that is currently in the atmosphere into long-term storage and supports Northern infrastructure like roads and buildings, so understanding how a changing climate will affect it is crucial both for people living in the Arctic and for those in lower latitudes.

"In our research area the winter has lost almost three weeks to summer," says study lead author and Fairbanks resident Thomas A. Douglas, who is a scientist with the U.S. Army Cold Regions Research and Engineering Laboratory. "This, along with more rainstorms, means far more wet precipitation is falling every summer."

Over the course of five years, the research team took 2,750 measurements of how far below the land's surface permafrost had thawed by the end of summer, across a wide range of environments near Fairbanks, Alaska. The five-year period included two summers with average precipitation, one that was a little drier than usual, and the wettest and third-wettest summers on record. Differences in annual rainfall were clearly imprinted in the amount of permafrost thaw.

More rainfall led to deeper thaw across all sites. After the wettest summer in 2014, permafrost didn't freeze back to previous levels even when subsequent summers were drier. Wetlands and disturbed sites, like trail crossings and clearings, showed the most thaw. Tussock tundra, with its deep soils and covering of tufted grasses, was found to give permafrost the most protection of any ecosystem. While permafrost was frozen closest to the surface in tussock tundra, it experienced the greatest relative increase in thaw depth in response to rainfall, possibly because water could pool on the flat surface. Forests, especially spruce forests with thick sphagnum moss layers, were the most resistant to permafrost thaw.

Charlie Koven, an Earth system modeler with the Lawrence Berkeley National Laboratory, used the field measurements to build a heat balance model that allowed the team to better understand how rain was driving heat down into the permafrost ground.
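The heat-balance idea can be sketched in a few lines of code: a one-dimensional conduction model of the ground column with extra surface heating on rain days. This is a deliberately simplified illustration with assumed soil properties, not the model Koven built, and it omits the latent heat of ground ice, so it overstates thaw.

```python
import numpy as np

# Simplified 1-D heat conduction in a soil column, with extra surface
# warming on "rain days". All values are assumptions for illustration.
dz, dt = 0.05, 1800.0            # grid spacing (m), time step (s)
alpha = 1e-7                     # assumed bulk thermal diffusivity (m^2/s)
depth = np.arange(0.0, 2.0, dz)  # 2 m column
T = np.full(depth.size, -1.0)    # start the column frozen at -1 C

rain_days = set(range(30, 60))   # a hypothetical wet mid-summer month
for day in range(90):            # one summer season
    surface_T = 8.0 + (4.0 if day in rain_days else 0.0)  # warm rain adds heat
    for _ in range(int(86400 / dt)):
        T[0] = surface_T
        # explicit finite-difference step of dT/dt = alpha * d2T/dz2
        T[1:-1] += alpha * dt / dz**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

thawed = depth[T > 0.0]          # levels that warmed above 0 C
print(f"end-of-summer thaw depth: {thawed.max():.2f} m" if thawed.size else "no thaw")
```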

The study demonstrates how land cover types govern relationships between summer rainfall and permafrost thaw. As Alaska becomes warmer and wetter, vegetation cover is projected to change and wildfires will disturb larger swathes of the landscape. Those conditions may lead to a feedback loop between more permafrost thaw and wetter summers.

In the meantime, rainfall--and the research--continue. Douglas says, "I was just at one of our field sites and you need hip waders to get to areas that used to be dry or only ankle deep with water. It is extremely wet out there. So far this year we have almost double the precipitation of a typical year."

"This study adds to the growing body of knowledge about how extreme weather--ranging from heat spells to intense summer rains--can disrupt foundational aspects of Arctic ecosystems," says Merritt Turetsky, Director of the University of Colorado Boulder's Institute of Arctic and Alpine Research (INSTAAR) and a coauthor of the study. "These changes are not occurring gradually over decades or lifetimes; we are watching them occur over mere months to years."

Credit: 
University of Colorado at Boulder

Increasing rates of preventable hospitalizations among adults with dementia

BOSTON - Older adults with dementia tend to be hospitalized more often than those without cognitive impairment. Now a team of investigators at Beth Israel Deaconess Medical Center (BIDMC) has found that in recent years, increasing numbers of these hospitalizations were for conditions for which hospitalization can often be avoided with improvements in outpatient care. The findings, published today in the Journal of the American Geriatrics Society, point to the need for improved strategies to safeguard the health of individuals in the community who have dementia and to avoid the need for hospital care.

For the study, researchers examined nationally representative hospital discharge data from 2012 to 2016 pertaining to 1.8 million hospitalizations of older U.S. adults (aged 65 years and older) with dementia.

The analysis revealed that 40 percent of hospitalizations of older adults with dementia were for potentially preventable conditions, including those like pneumonia and heart failure that can possibly be avoided with access to high quality outpatient care. Although the national incidence of all hospitalizations for individuals with dementia declined between 2012 and 2016, the incidence of hospitalizations for potentially preventable conditions increased. Specifically, between 2012 and 2016, the incidence of hospitalizations for any cause declined from 1.87 million to 1.85 million per year, while the incidence of potentially preventable hospitalizations increased from 0.75 million to 0.87 million per year, driven by an increased number of hospitalizations for sepsis, injuries, and dehydration of older adults with dementia living in the community.

"Care for older adults with dementia is often mischaracterized as exclusively a nursing home issue, but our study shows that over 80 percent of hospitalizations occur in older adults who reside in the community," said Timothy Anderson, MD, the study's lead author and a general internist and health services researcher in the Division of General Medicine at BIDMC and Instructor in Medicine at Harvard Medical School. "Thus, initiatives to reduce preventable hospitalizations must encompass outpatient care."

Anderson noted that addressing sepsis, a serious condition caused by the body's response to an infection, should take high priority. "Our findings suggest that infections are a particularly important driver of potentially preventable hospitalizations, indicating that we need better strategies to detect infections in older adults with dementia early enough to treat them before they become life-threatening."

The study also found that among patients with dementia who were hospitalized for potentially preventable conditions, inpatient deaths declined from 6.4 percent in 2012 to 6.1 percent in 2016, inflation-adjusted median costs increased from $7,319 to $7,543, and total annual costs increased from $7.4 billion to $9.3 billion. Although 86 percent of hospitalized patients were admitted from the community, only 33 percent were discharged to the community.

"These preventable hospitalizations have important effects that stretch beyond the hospital stay, both in terms of outcomes for the patients--as the majority are discharged to skilled nursing facilities rather than returning home--and in terms of costs to the health system," said Anderson.

He stressed that in-depth studies are needed to implement and evaluate the impact of patient-centered programs to improve outpatient care for older adults with dementia towards the goal of preventing hospitalizations.

Credit: 
Beth Israel Deaconess Medical Center

New technique to capture carbon dioxide could greatly reduce power plant greenhouse gases

A big advance in carbon capture technology could provide an efficient and inexpensive way for natural gas power plants to remove carbon dioxide from their flue emissions, a necessary step in reducing greenhouse gas emissions to slow global warming and climate change.

Developed by researchers at the University of California, Berkeley, Lawrence Berkeley National Laboratory and ExxonMobil, the new technique uses a highly porous material called a metal-organic framework, or MOF, modified with nitrogen-containing amine molecules, to capture the CO2, and then uses low-temperature steam to flush out the CO2 for other uses or for sequestration underground.

In experiments, the technique showed a six times greater capacity for removing CO2 from flue gas than current amine-based technology, and it was highly selective, capturing more than 90% of the CO2 emitted. The process uses low temperature steam to regenerate the MOF for repeated use, meaning less energy is required for carbon capture.

"For CO2 capture, steam stripping -- where you use direct contact with steam to take off the CO2 -- has been a sort of holy grail for the field. It is rightly seen as the cheapest way to do it," said senior researcher Jeffrey Long, UC Berkeley professor of chemistry and of chemical and biomolecular engineering and senior faculty scientist at Berkeley Lab. "These materials, at least from the experiments we have done so far, look very promising."

Because there's little market for most captured CO2, power plants would likely pump most of it back into the ground, or sequester it, where it would ideally turn into rock. The cost of scrubbing the emissions would have to be supported by government policies, such as carbon trading or a carbon tax, that incentivize CO2 capture and sequestration, something many countries have already implemented.

The work was funded by ExxonMobil, which is working with both the Berkeley group and Long's start-up, Mosaic Materials Inc., to develop, scale up and test processes for stripping CO2 from emissions.

Long is the senior author of a paper describing the new technique that will appear in the July 24 issue of the journal Science.

"We were able to take the initial discovery and, through research and testing, derive a material that in lab experiments has shown the potential to not only capture CO2 under the extreme conditions present in flue gas emissions from natural gas power plants, but to do so with no loss in selectivity," said co-author Simon Weston, senior research associate and the project lead at ExxonMobil Research and Engineering Co. "We have shown that these new materials can then be regenerated with low-grade steam for repeated use, providing a pathway for a viable solution for carbon capture at scale."

Carbon dioxide emissions by fossil fuel-burning vehicles, electricity generating plants and industry account for an estimated 65% of the greenhouse gases driving climate change, which has already increased Earth's average temperature by 1.8 degrees Fahrenheit (1 degree Celsius) since the 19th century. Without a decrease in these emissions, climate scientists predict ever hotter temperatures, more erratic and violent storms, several feet of sea level rise and resulting droughts, floods, fires, famine and conflict.

"In reality, of the kinds of things that the Intergovernmental Panel on Climate Change says we need to do to control global warming, CO2 capture is a huge part," Long said. "We don't have a use for most of the CO2 that we need to stop emitting, but we have to do it."

Stripping

Power plants strip CO2 from flue emissions today by bubbling flue gases through organic amines in water, which bind and extract the carbon dioxide. The liquid is then heated to 120-150 C (250-300 F) to release the CO2 gas, after which the liquids are reused. The entire process consumes about 30% of the power generated. Sequestering the captured CO2 underground costs an additional, though small, fraction of that.

Six years ago, Long and his group in UC Berkeley's Center for Gas Separations, which is funded by the U.S. Department of Energy, discovered a chemically modified MOF that readily captures CO2 from concentrated power plant flue emissions, potentially reducing the capture cost by half. They added diamine molecules to a magnesium-based MOF to catalyze the formation of polymer chains of CO2 that could then be purged by flushing with a humid stream of carbon dioxide.

MOFs are very porous, in this case like a honeycomb: an amount the weight of a paper clip has an internal surface area equal to that of a football field, all of it available for adsorbing gases.

A major advantage of the amine-appended MOFs is that the amines can be tweaked to capture CO2 at different concentrations, ranging from the 12% to 15% typical of coal plant emissions to the 4% typical of natural gas plants, or even the much lower concentrations in ambient air. Mosaic Materials, which Long co-founded and directs, was created to make this technique available widely to power and industrial plants.

But the 180 C stream of water and CO2 needed to flush out the captured CO2 eventually drives off the diamine molecules, shortening the life of the material. The new version uses a molecule with four amine groups -- a tetraamine -- that is much more stable at high temperatures and in the presence of steam.

"The tetraamines are so strongly bound within the MOF that we can use a very concentrated stream of water vapor with zero CO2, and if you tried that with the previous adsorbents, the steam would start destroying the material," Long said.

They showed that direct contact with steam at 110-120 C -- a bit above the boiling point of water -- works well to flush out the CO2. Steam at that temperature is readily available in natural gas power plants, whereas the 180 C CO2-water mix required to regenerate the earlier modified MOF necessitated heating, which wastes energy.

When Long, Weston and their colleagues first thought about replacing diamines with tougher tetraamines, it seemed like a long shot. But crystal structures of the diamine-containing MOFs suggested that there could be ways of connecting two diamines to form a tetraamine while preserving the ability of the material to polymerize CO2. When UC Berkeley graduate student Eugene Kim, the paper's first author, chemically created the tetraamine-appended MOF, it outperformed the diamine-appended MOF on the first try.

The researchers subsequently studied the structure of the modified MOF using Berkeley Lab's Advanced Light Source, revealing that the CO2 polymers that line the pores of the MOF are actually linked by the tetraamines, like a ladder with tetraamines as the rungs. First-principles density functional theory calculations using the Cori supercomputer in Berkeley Lab's National Energy Research Scientific Computing Center (NERSC), computing resources at the Molecular Foundry and resources provided by the campus's Berkeley Research Computing program confirmed this remarkable structure that Long's team had initially envisioned.

"I have been doing research at Cal for 23 years now, and this is one of those times where you have what seemed like a crazy idea, and it just worked right away," Long said.

Co-authors with Long, Kim and Weston are Joseph Falkowski from ExxonMobil; Rebecca Siegelman, Henry Jiang, Alexander Forse, Jeffrey Martell, Phillip Milner, Jeffrey Reimer and Jeffrey Neaton from UC Berkeley; and Jung-Hoon Lee from Berkeley Lab. Neaton and Reimer also are faculty senior scientists at Berkeley Lab.

Credit: 
University of California - Berkeley

Comparing hyperthyroidism treatments with risk of cancer death

What The Study Did: Researchers compared long-term risk of death from a solid cancer in patients treated with radioactive iodine, anti-thyroid drugs or surgery for hyperthyroidism.

Authors: Cari M. Kitahara, Ph.D., M.H.S., of the National Cancer Institute in Bethesda, Maryland, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2020.9660)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Gene-controlling mechanisms play key role in cancer progression

image: MIT researchers have analyzed how epigenomic modifications change as tumors evolve. This image shows a lung with tumors that researchers collected with multiplexed immunohistochemistry.

Image: 
Isabella Del Priore and Lindsay LaFave

CAMBRIDGE, MA -- As cancer cells evolve, many of their genes become overactive while others are turned down. These genetic changes can help tumors grow out of control and become more aggressive, adapt to changing conditions, and eventually lead the tumor to metastasize and spread elsewhere in the body.

MIT and Harvard University researchers have now mapped out an additional layer of control that guides this evolution -- an array of structural changes to "chromatin," the mix of proteins, DNA, and RNA that makes up cells' chromosomes. In a study of mouse lung tumors, the researchers identified 11 chromatin states, also called epigenomic states, that cancer cells can pass through as they become more aggressive.

"This work provides one of the first examples of using single-cell epigenomic data to comprehensively characterize genes that regulate tumor evolution in cancer," says Lindsay LaFave, an MIT postdoc and the lead author of the study.

In addition, the researchers showed that a key molecule they found in the more aggressive tumor cell states is also linked to more advanced forms of lung cancer in humans, and could be used as a biomarker to predict patient outcomes.

Tyler Jacks, director of MIT's Koch Institute for Integrative Cancer Research, and Jason Buenrostro, an assistant professor of stem cell and regenerative biology at Harvard University, are the senior authors of the study, which appears today in Cancer Cell.

Epigenomic control

While a cell's genome contains all of its genetic material, the epigenome plays a critical role in determining which of these genes will be expressed. Every cell's genome has epigenomic modifications -- proteins and chemical compounds that attach to DNA but do not alter its sequence. These modifications, which vary by cell type, influence the accessibility of genes and help to make a lung cell different from a neuron, for example.

Epigenomic changes are also believed to influence cancer progression. In this study, the MIT/Harvard team set out to analyze the epigenomic changes that occur as lung tumors develop in mice. They studied a mouse model of lung adenocarcinoma, which results from two specific genetic mutations and closely recapitulates the development of human lung tumors.

Using a new technology for single-cell epigenome analysis that Buenrostro had previously developed, the researchers analyzed the epigenomic changes that occur as tumor cells evolve from early stages to later, more aggressive stages. They also examined tumor cells that had metastasized beyond the lungs.

This analysis revealed 11 different chromatin states, based on the locations of epigenomic alterations and density of the chromatin. Within a single tumor, there could be cells from all 11 of the states, suggesting that cancer cells can follow different evolutionary pathways.

For each state, the researchers also identified corresponding changes in where gene regulators called transcription factors bind to chromosomes. When transcription factors bind to the promoter region of a gene, they initiate the copying of that gene into messenger RNA, essentially controlling which genes are active. Chromatin modifications can make gene promoters more or less accessible to transcription factors.

"If the chromatin is open, a transcription factor can bind and activate a specific gene program," LaFave says. "We were trying to understand those transcription factor networks and then what their downstream targets were."

As the structure of tumor cells' chromatin changed, transcription factors tended to target genes that would help the cells to lose their original identity as lung cells and become less differentiated. Eventually many of the cells also gained the ability to leave their original locations and seed new tumors.

Much of this process was controlled by a transcription factor called RUNX2. In more aggressive cancer cells, RUNX2 promotes the transcription of genes for proteins that are secreted by cells. These proteins help remodel the environment surrounding the tumor to make it easier for cancer cells to escape.

The researchers also found that these aggressive, premetastatic tumor cells were very similar to tumor cells that had already metastasized.

"That suggests that when these cells were in the primary tumor, they actually changed their chromatin state to look like a metastatic cell before they migrated out into the environment," LaFave says. "We believe they undergo an epigenetic change in the primary tumor that allows them to become migratory and then seed in a distal location like the lymph nodes or the liver."

A new biomarker

The researchers also compared the chromatin states they identified in mouse tumor cells to chromatin states seen in human lung tumors. They found that RUNX2 was also elevated in more aggressive human tumors, suggesting that it could serve as a biomarker for predicting patient outcomes.

"The RUNX positive state was very highly predictive of poor survival in human lung cancer patients," LaFave says. "We've also shown the inverse, where we have signatures of early states, and they predict better prognosis for patients. This suggests that you can use these single-cell gene regulatory networks as predictive modules in patients."

RUNX could also be a potential drug target, although it traditionally has been difficult to design drugs that target transcription factors because they usually lack well-defined structures that could act as drug docking sites. The researchers are also seeking other potential targets among the epigenomic changes that they identified in more aggressive tumor cell states. These targets could include proteins known as chromatin regulators, which are responsible for controlling the chemical modifications of chromatin.

"Chromatin regulators are more easily targeted because they tend to be enzymes," LaFave says. "We're using this framework to try to understand what are the important targets that are driving these state transitions, and then which ones are therapeutically targetable."

Credit: 
Massachusetts Institute of Technology

'Self-eating' process of stem cells may be the key to new regenerative therapies

image: Translucently colored embryonic stem (ES) cell (upper right) and its differentiating derivatives (left and lower right). The small round bodies inside cells represent lysosomes, with the pink color indicating ones that are undergoing chaperone-mediated autophagy (CMA), a selective form of autophagy that is demonstrated only in mammals. CMA governs the balance between self-renewal and differentiation of ES cells. It is kept at low levels in undifferentiated ES cells to maintain the pluripotent state. Upon induction of differentiation, CMA flux increases due to the reduction of pluripotency factors, leading to changes in cellular metabolism and epigenetic landscape that favor differentiation.

Image: 
Alex Tokarev

PHILADELPHIA--The self-eating process in embryonic stem cells known as chaperone-mediated autophagy (CMA) and a related metabolite may serve as promising new therapeutic targets to repair or regenerate damaged cells and organs, Penn Medicine researchers show in a new study published online in Science.

Human bodies contain over 200 different types of specialized cells. All of them can be derived from embryonic stem (ES) cells, which relentlessly self-renew while retaining the ability to differentiate into any cell type in adult animals, a state known as pluripotency. Researchers have known that the cells' metabolism plays a role in this process; however, it wasn't clear exactly how the cells' internal wiring works to keep that state and ultimately decide stem cell fate.

The new preclinical study shows, for the first time, how stem cells keep CMA at low levels to promote self-renewal and how, when the stem cell is ready, they switch that suppression off to enhance CMA, among other activities, and differentiate into specialized cells.

"It's an intriguing discovery in the field of stem cell biology and for researchers looking to develop therapies for tissue or organ regeneration," said senior author Xiaolu Yang, PhD, a professor of Cancer Biology at the Abramson Family Cancer Research Institute in the Perelman School of Medicine at the University of Pennsylvania. "We reveal two novel ways to potentially manipulate the self-renewal and differentiation of stem cells: CMA and a metabolite, known as alpha-ketoglutarate, that is regulated by CMA. Rationally intervening or guiding these functions could be a powerful way to increase the efficiency of regenerative medicine approaches."

Autophagy is a cellular "self-eating" mechanism necessary for the survival and function of most living organisms. When cells self-eat, the intracellular materials are delivered to lysosomes, which are organelles that help break down these materials. There are a few forms of autophagy. However, unlike the other forms, which are present in all eukaryotic cells, CMA is unique to mammals. To date, the physiological role of CMA remains unclear.

Using metabolomic and genetic laboratory techniques on the embryonic stem cells of mice, the researchers sought to better understand significant changes that took place during their pluripotent state and subsequent differentiation.

They found that CMA activity is kept at a minimum by two cellular factors critical for pluripotency--Oct4 and Sox2--which suppress a gene known as LAMP2A. This gene provides instructions for making a protein called lysosomal associated membrane protein-2 that is necessary for CMA. The minimal CMA activity allows stem cells to maintain high levels of alpha-ketoglutarate, a metabolite that is crucial to reinforcing a cell's pluripotent state, the researchers found.

When it's time for differentiation, the cells begin to upregulate CMA due to the reduction in Oct4 and Sox2. Augmented CMA activity leads to the degradation of key enzymes responsible for the production of alpha-ketoglutarate. This leads to a reduction in alpha-ketoglutarate levels as well as increases in other cellular activities that promote differentiation. These findings reveal that CMA and alpha-ketoglutarate dictate the fate of embryonic stem cells.

Embryonic stem cells are often called pluripotent due to their remarkable ability to give rise to every cell type in the body, except the placenta and umbilical cord. Embryonic stem cells not only provide a superb system to study early mammalian development, but also hold great promise for regenerative therapies to treat various human disorders. The development of stem-cell based regenerative medicine therapies has rapidly increased in the last decade, with several approaches in studies shown to repair damaged heart tissue, replace cells in solid organ transplantation, and in some cases address neurological disorders.

"This newly discovered role of autophagy in the stem cell is the beginning of further investigations that could lead to researchers and physician-scientists to better therapies to treat various disorders," Yang said.

Credit: 
University of Pennsylvania School of Medicine

Excellent research results for CAR-T Cell therapy against Hodgkin lymphoma

image: Barbara Savoldo, MD, PhD, and colleagues at UNC Lineberger Comprehensive Cancer Center and Baylor College of Medicine report that results from an early-phase clinical trial demonstrated CAR-T cell therapy was highly active in patients with relapsed/refractory Hodgkin lymphoma. The treatment led to the complete disappearance of tumors in the majority of patients treated at the highest dose level, with almost all patients having clinical benefit after treatment.

Image: 
UNC Lineberger Comprehensive Cancer Center

CHAPEL HILL, N.C.--CAR-T cell therapy, which attacks cancer cells using a person's reprogrammed immune cells, has been used to treat Hodgkin lymphoma with remarkable success for the first time, according to the results of an early phase clinical trial led by researchers at UNC Lineberger Comprehensive Cancer Center and Baylor College of Medicine in Houston.

The clinical trial, whose results are published in the Journal of Clinical Oncology, was designed to determine the treatment's safety and efficacy for patients with relapsed Hodgkin lymphoma. Researchers demonstrated that the treatment was safe, but perhaps more importantly, that the treatment was highly active in patients with relapsed/refractory Hodgkin lymphoma. The treatment led to the complete disappearance of tumor in the majority of patients treated at the highest dose level of therapy with almost all patients having clinical benefit after treatment.

"This is particularly exciting because the majority of these patients had lymphomas that had not responded well to other powerful new therapies," said study senior author Barbara Savoldo, MD, PhD, professor in the UNC Department of Microbiology and Immunology at the UNC School of Medicine and a UNC Lineberger member.

"Everyone worked tirelessly on the study and I am proud of the collaborative work it fueled between UNC Lineberger and Baylor," Savoldo said.

Chimeric antigen receptor (CAR) T cells are human T cells - a powerful type of immune cell - that have been harvested from a patient and genetically re-engineered to recognize proteins found on the patient's cancer cells. They are reinfused into the patient to circulate in the blood for months as a "living drug" to attack the patient's cancer cells. In some cases, patients are infused with CAR-T cells made from T cells provided by other donors.

CAR-T cell therapies in the past decade have had striking successes in some clinical trials, and so far have been approved by the U.S. Food and Drug Administration for treating two blood cancers, acute lymphoblastic leukemia and diffuse large B-cell lymphoma. These CAR-T cell therapies are designed to target the protein CD19, which is found on malignant cells in these cancers. Inspired by the success of CAR-T cell therapies against these cancers, researchers have been developing the technology for use against cancers that express other cancer-associated proteins.

Savoldo and her colleagues in recent years have been exploring the use of CAR-T cells against Hodgkin lymphoma, a blood cancer that afflicts more than 200,000 people in the United States. While about 85 percent of Hodgkin lymphoma patients are cured or have many cancer-free years following standard chemotherapy and/or radiation regimens, the rest either don't respond to standard therapy or do respond but experience a cancer relapse within a few years. Many of these "refractory/relapsing" patients go through years of further treatments without success, and end up with no good options.

In a pilot study in seven refractory/relapsing Hodgkin lymphoma patients, published in 2017, Savoldo and Baylor colleagues found that a CAR-T cell therapy targeting Hodgkin cell-associated protein CD30 appeared safe but brought about only modest responses.

In the new study, which included 41 patients treated at Baylor and UNC, the researchers used the same anti-CD30 CAR-T cell strategy, but added a preconditioning regimen in which patients' existing lymphocytes - a broad family of white blood cells including T cells - were greatly depleted with chemotherapy drugs prior to the addition of the CAR-T cells.

"Lymphodepletion prior to CAR-T cell infusion seems to produce a more favorable environment for the CAR-T cells to proliferate and attack their cancerous targets," said study co-first author Natalie Grover MD, assistant professor in the UNC Department of Medicine and a UNC Lineberger member.

Carlos Ramos, MD, of Baylor College of Medicine is the paper's other co-first author.

Side effects of the lymphodepletion plus CAR-T cell treatment were common and included flu-like symptoms due to an immune chemical storm called cytokine release syndrome, but these events were generally modest. None of the patients experienced the more serious, life-threatening complications, such as brain swelling, that have been seen in CAR-T cell trials against other blood cancers.

Even more promising, the study showed that this anti-CD30 CAR-T cell therapy appeared to be very active even against refractory/relapsing Hodgkin lymphoma.

As the trial progressed, the researchers settled on fludarabine as a key element of the pre-therapy lymphodepletion regimen, since patient outcomes seemed better when it was used. The researchers found that among the 32 patients with active cancer who received fludarabine for lymphodepletion before their CAR-T cells, 19 patients (59 percent) had a complete response.

Of the patients in the study who had a complete response, 61 percent still had no evidence of recurrence a year later. Overall, 94 percent of the treated patients were still alive a year after their treatment.

"This treatment showed remarkable antitumor activity without significant toxicity, and we think it should be considered for patients in earlier stages of refractory/relapsing Hodgkin lymphoma," Savoldo said.

"The activity of this new therapy is quite remarkable and while we need to confirm these findings in a larger study, this treatment potentially offers a new approach for patients who currently have very limited options to treat their cancer," said Jonathan Serody, MD, the Elizabeth Thomas Professor of Medicine, Microbiology and Immunology at UNC School of Medicine, director of the bone marrow transplant and cellular therapy program at UNC, and a UNC Lineberger member. "Additionally, unlike other CAR T cell therapies, clinical success was not associated with significant complications from therapy. This means this treatment should be available to patients in a clinic setting and would not require patients to be hospitalized, which is critical in our current environment."

Credit: 
UNC Lineberger Comprehensive Cancer Center

The spatial consistency of summer rainfall variability between the Mongolian Plateau and North China

image: Schematic diagram of the physical mechanism responsible for the consistency of precipitation variability in the MP and Northeast and North China

Image: 
©Science China Press

The Mongolian Plateau (MP) is located in the eastern part of arid Central Asia (ACA). Climatically, much of the MP is dominated by the westerly circulation and has an arid and semi-arid climate; however, the eastern part of the MP is also influenced by the East Asian summer monsoon (EASM) and has a humid and semi-humid climate. Recently, several studies have shown that precipitation variability in the MP differs from that in western ACA but is consistent with that in the EASM region.

To determine the precise spatial extent of this precipitation consistency, Professor Huang Wei and Dr. Jie Chen of the Key Laboratory of Western China's Environmental Systems at Lanzhou University used reanalysis data to characterize the spatial consistency of summer precipitation variability.

They found that this consistent pattern appears in the MP and in North and Northeast China.

The Eurasian mid-latitude teleconnection wave train, with negative (positive) anomalies in the North Atlantic and ACA and positive (negative) anomalies in the MP and mid-latitude EASM region, is largely responsible for the consistency of precipitation variability in the MP and North and Northeast China. The positive anomalies over the North Atlantic and ACA and negative anomalies over Europe and the MP would enhance the transport of westerly and monsoon moisture to the MP and North and Northeast China. They could also strengthen the Northeast Asian low, enhance the EASM, and trigger the anomalous ascending motion over the MP, which promotes precipitation in the MP and Northeast and North China.

Additionally, they also found the decadal variability of the precipitation in the MP and ACA exhibits an out-of-phase pattern, due to the Eurasian mid-latitude teleconnection wave train. These results would help explain the spatial variations of paleo-precipitation/humidity reconstructions in East Asia and clarify the reasons for the consistency of the regional climate.

Credit: 
Science China Press

Novel theory of climate dynamics: Three-pattern decomposition of global atmospheric circulation

image: Climatological mean of vertical vorticity (d) and its decomposition: (a) vertical vorticity of the horizontal circulation, (b) vertical vorticity of the meridional and zonal circulations, (c) vertical vorticity of the horizontal, meridional and zonal circulations, at 850 hPa (1979-2013) for DJF. Panels e-h, i-l and m-p are the same as a-d, but for 500 hPa, 200 hPa and the vertical mean, respectively.

Image: 
©Science China Press

Climate change is already affecting the world on an unprecedented scale. However, a complete basic theoretical system for climate prediction has long been lacking, which fundamentally limits the accuracy of climate prediction. A recent development, the theory of three-pattern decomposition of global atmospheric circulation, introduces pioneering results in the basic theory of climate prediction. The paper, entitled "Theory of three-pattern decomposition of global atmospheric circulation", was published in SCIENCE CHINA Earth Sciences, with Prof. Shujuan Hu of Lanzhou University as the first and corresponding author. The paper presents the achievements of Prof. Shujuan Hu's team in basic theoretical research on climate dynamics over the years, and also summarizes the progress of others in representing the global atmospheric circulation, together forming a systematic basic theory for climate prediction.

Against the background of global warming, major natural disasters are on the rise, causing serious economic losses and human casualties. Owing to its particular geographical environment, China has long been seriously affected by meteorological disasters, especially droughts and floods. As a result, month-to-season-scale climate prediction, with drought and flood forecasting as its focus, has always been a major demand for national development. However, the accuracy of existing climate prediction models is far from meeting actual needs. The fundamental reason for the inaccuracy of climate prediction lies in the lack of basic theories of climate dynamics.

Shujuan Hu said she has been working on the dynamics of the global atmospheric circulation since 2001, when she studied for her doctorate with Prof. Jifan Chou, a famous Chinese meteorologist and academician of the Chinese Academy of Sciences. Prof. Jifan Chou is one of the few scholars in the field of atmospheric science, in China or abroad, who specializes in atmospheric theory and has made important contributions to it. He has devoted his life to the fundamental theories and methods of long-term numerical weather forecasting and to the related atmospheric and oceanic dynamics, particularly nonlinear dynamics problems. Under the leadership of Prof. Jifan Chou, the novel theory of three-pattern decomposition of global atmospheric circulation (3P-DGAC) was constructed as a systematic basic theory for climate prediction on the month-to-season scale. The phenomena of Rossby waves in the middle-high latitudes and the Hadley and Walker circulations in the low latitudes were first extended to the global circulation, and the definitions of three-dimensional (3D) horizontal, meridional, and zonal circulations were then quantified. The global atmospheric circulation was decomposed into the sum of the 3D horizontal, meridional, and zonal circulations; thus, the 3P-DGAC was established. Furthermore, by combining the 3P-DGAC with the primitive equations of planetary-scale atmospheric motion, a new set of dynamical equations was established to directly describe the evolution mechanisms of the global large-scale horizontal, meridional, and zonal circulations. From a global perspective, the theory of 3P-DGAC unifies the atmospheric motions in the middle-high latitudes with those in the low latitudes, which compensates for the current separation between middle-high-latitude and low-latitude atmospheric dynamics.
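As a reader's schematic only, and not the paper's exact spherical-coordinate formulation, the decomposition can be summarized as splitting the three-dimensional velocity field into three circulation components, each confined to one coordinate plane and each described by its own stream function (denoted R, H and W for the horizontal, meridional and zonal circulations):

```latex
% Schematic of the three-pattern idea (a sketch, not the paper's equations):
% the 3-D velocity field V is split into a horizontal (Rossby-type),
% a meridional (Hadley-type) and a zonal (Walker-type) circulation.
\vec{V} = \vec{V}_{H} + \vec{V}_{M} + \vec{V}_{Z},
\qquad
\vec{V}_{H} = \vec{V}_{H}(R), \quad
\vec{V}_{M} = \vec{V}_{M}(H), \quad
\vec{V}_{Z} = \vec{V}_{Z}(W),
```

with each component nondivergent within its own plane (the horizontal plane, the latitude-height plane and the longitude-height plane, respectively).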

Guoxiong Wu, an academician of the Chinese Academy of Sciences and a famous Chinese climate dynamicist, has followed the progress of the 3P-DGAC. Prof. Guoxiong Wu commented: "3P-DGAC provides a feasible method for studying the complex interactions between the mid-high-latitude and low-latitude atmospheric circulations, as well as between the horizontal and vertical circulations, and it is an original and innovative research achievement of international standing."

Credit: 
Science China Press

Mercury remains a persistent poison in Connecticut's still river

Western Connecticut is known for rolling hills, rich history, and industry, such as hat making. Once called the "Hat City of the World," Danbury thrived. Anyone familiar with Lewis Carroll's Mad Hatter may also be aware of the dangers of hat making, due to the industry's use of the potent toxin mercury. Starting in the late 1700s, Danbury hat factories were a point source of pollution, dumping large quantities of mercury into the nearby Still River.

Fashions change, the use of mercury in hat making was outlawed in 1940, and now all that remains of the once-thriving hatting industry in Danbury is its history - or is it?

A group of researchers from UConn and Wesleyan University spent four years studying a stretch of the Still River, and found that the industrial waste of a century ago is still very much present in 2020.

Kayla Anatone '12 (CAHNR), a current PhD student at Wesleyan University, was interested in the local history but also in learning if "legacy" mercury was impacting the environment and making its way into the food web. She and co-authors from the UConn Marine Sciences department - including PhD student Gunnar Hansen, Professor Robert Mason, Assistant Research Professor Zofia Baumann and Wesleyan University Professor Barry Chernoff - recently published the findings in Chemosphere.

Baumann says there have been studies performed to measure some aspects of mercury pollution in the river; however, the data have not been summarized in a systematic way, and this study is the first comprehensive investigation of the Still River.

Baumann explains that mercury is a global pollutant, with multiple sources. Though the element is naturally occurring at low levels, mercury emissions have tripled since the industrial revolution, when mercury-enriched coal and other fossil fuels were used to power industry. Mercury can be used in various processes and products, from filling thermometers to filling cavities in teeth, but in the case of hat making it was used to soften the felt to make it more pliable for shaping.

To make tracing mercury through the environment even more complex, Anatone explains, mercury can exist in numerous compounds and in either inorganic or organic forms. Inorganic mercury does not move easily through the food web. However, some bacteria can convert it into organic forms, making it more "bioavailable" and allowing it to enter the food web more readily.

"The organic forms are the forms we are most concerned about, because organic mercury can accumulate in organisms such as humans and wildlife, and cause detrimental effects such as neurological damage," says Anatone.

The researchers sampled water, sediments, and tissue from a fish called the Eastern Blacknose Dace at seven sites along the river over the course of four years. Some of the sampling sites were located at former factory sites, while others served as reference sites for comparison. The results were staggering.

"The Still River watershed has significantly high levels of mercury in the fish no matter where the fish are from along the river," says Anatone. "Fish muscle tissue from six out of seven of the sites had concentrations that exceed EPA guidance levels for weekly mercury consumption. That was especially surprising because the fish are only about three inches in size and for them to be accumulating so much mercury, I just didn't expect it."

Anatone explains that they also found very high amounts of mercury persisting in the sediments: "All of the Still River sites which previously had hatting factories and direct point source pollution have concentrations in the sediment that exceeded the background levels of mercury found in sediments in other Connecticut sites."

Baumann says one aspect of the study is somewhat unexpected: "One of the really interesting findings in this study was that, despite the very high concentrations of mercury in the sediments - at least it is my feeling based on the data that we have - a lot of the mercury is not bioavailable. Around one percent is available for further uptake in the food web, and that is what we are worried about essentially. Even though it is a pretty low percentage, it is impressive to see that it resulted in such high levels of bioaccumulation in the fish."

Anatone and Baumann hope that this research will inspire conversations and action.

"Research like this is the only way to find out how things are really moving in the ecosystem," says Baumann. "These studies are what you can use to inform decision-makers. Do we need to remediate? Should we let it be? Should we warn people who angle there regularly? This info is really needed."

Anatone says at the very least, it is important to set fishing guidelines for these areas: "We studied Eastern Blacknose Dace. Humans don't eat dace but humans eat trout and trout eat the dace. I think it is important that guidelines for fishing are put into place, like catch and release, or these areas are made off-limits for fishing."

Though Anatone will be graduating shortly and will not be doing any further sampling, she is hopeful this research will motivate others to continue to study the Still River and the impacts of legacy mercury on the ecosystem and food web.

"This research is not simple, it takes a lot of effort and time. It would be interesting to carry on this work and measure in other organisms such as trout."

Baumann adds that the river presents a unique system for studying how mercury cycles through New England streams, and that this work is especially important now, with the changing climate: "We want people to get curious about this."

Credit: 
University of Connecticut

2 immunotherapies merged into single, more effective treatment

image: Researchers at Washington University School of Medicine in St. Louis have combined two types of immunotherapy into a single treatment that may be more effective and possibly safer than current immunotherapies for blood cancers. Shown is a type of immune cell called a memory-like natural killer cell (right) attacking a leukemia cell (left). In the new study, the researchers modified the natural killer cells to help them find the leukemia cells more effectively.

Image: 
Julia Wagner

Some of the most promising advances in cancer treatment have centered on immunotherapies that rev up a patient's immune system to attack cancer. But immunotherapies don't work in all patients, and researchers have been searching for ways to increase their effectiveness.

Now, researchers at Washington University School of Medicine in St. Louis have combined two immunotherapy strategies into a single therapy and found, in studies in human cells and in mice, that the two together are more effective than either alone in treating certain blood cancers, such as leukemia. Evidence also suggests that the new approach could be safer than one of the most recent cellular immunotherapies to be approved by the FDA, called CAR-T cell therapy, in which the immune system's T cells are engineered to target tumor cells. Cell-based immunotherapies are most commonly used against blood cancers but can be harnessed against some solid tumors as well, such as prostate and lung tumors and melanoma.

The study appears online in the journal Blood.

In the new research, the scientists have harnessed the technology used to engineer CAR-T cells and, instead of modifying specialized immune cells called T cells, they have used similar technology to alter different immune cells called natural killer (NK) cells. The resulting immunotherapy combines the benefits of both strategies and may reduce the side effects that are sometimes seen in CAR-T cell therapy. In some patients, for example, CAR-T cell therapy causes a cytokine storm, a life-threatening overreaction of the immune system.

"Immunotherapies show great promise for cancer therapy, but we need to make them more effective and more safe for more patients," said co-senior author Todd A. Fehniger, MD, PhD, a professor of medicine. "This combined approach builds on the treatment strategy that we developed for leukemia patients using natural killer cells. We can supercharge natural killer cells to enhance their ability to attack cancer cells. And at the same time, we can use the genetic engineering approaches of CAR cell therapy to direct the natural killer cells to a tumor target that would normally be overlooked by NK cells. It fundamentally changes the types of cancer that NK cells could be used to treat, both additional blood cancers and potentially solid tumors as well."

In past work, Fehniger and his colleagues showed that they could collect a patient's own NK cells, expose the cells to a specific recipe of chemical signals that prime the cells to attack tumors, and then return the primed cells to patients for therapy. This chemical exposure is a sort of basic training for the cells, according to the investigators, preparing the NK cells to fight the cancer. When the cells are then returned to the body, they remember their training, so to speak, and are more effective at targeting the tumor cells. Because their training has given the NK cells a memory of what to do when they encounter tumor cells, the researchers dubbed them memory-like NK cells.

In small clinical trials conducted at Siteman Cancer Center at Barnes-Jewish Hospital and Washington University School of Medicine, such cells were effective in putting some patients with leukemia into a lasting remission, but they didn't work for everyone. Some tumor cells still evaded the memory-like NK cells, despite the cells' basic training. To help the cells find the tumor cells, so their basic training can kick in and kill the correct target, the researchers modified the memory-like NK cells with the same CAR (chimeric antigen receptor) molecule that is typically used to target T cells to tumor cells. The CAR molecule is flexible and can be modified to direct the cells to different tumor types, depending on the proteins on the surfaces of the cancer cells.

The resulting hybrid cells were more effective in treating mice with leukemia than memory-like NK cells alone, leading to longer survival for mice treated with CAR memory-like NK cells. The researchers also found the therapy to be effective despite the fact that the mice were given relatively low doses of the cells.

"One aspect of this study I find most exciting is how nicely these hybrid NK cells expand in the mice to respond to their tumors," said co-senior author Melissa Berrien-Elliott, PhD, an instructor in medicine. "We can provide a tiny dose and see an incredible amount of tumor control. To me, this highlights the potency of these cells, as well as their potential to expand once in the body, which is critical for translating these findings to the clinic."

Fehniger also pointed out an advantage of NK cells in general: for biological reasons that the scientists are still working to understand, NK cells don't trigger the dangerous immune response or the long-term side effects that T-cell therapy can cause when it attacks the patient's healthy tissues, a condition called graft-versus-host disease.

"In all of the clinical trials exploring any type of NK cells, we don't see the troublesome side effects of cytokine release syndrome or neurotoxicity that we see with CAR-T cells that can profoundly affect patients," Fehniger said. "These side effects can be life-threatening and require intensive care. We're still working to understand how NK cells are different. But if you can get the benefits of CAR-T cells with few if any of the side effects, that's a reasonable line of research to pursue. Another benefit of this safer therapy is the potential to give these cells to patients at an earlier stage in their disease, rather than using them as a last resort."

Other groups have developed CAR-NK cells, but a major difference is that their NK cells came from donated cord blood or induced stem cells, rather than from adult donors or the patients themselves.

"The other groups have artificially differentiated stem cells into something that resembles an NK cell," Fehniger said. "With that strategy, there's no guarantee that those cells will have all the properties of typical mature NK cells. In contrast, we're starting with adult NK cells, so we're more confident that they will have all the inherent properties and behavior of adult NK cells, which we have already shown to be effective in certain types of cancer patients, especially those with leukemia. Inducing memory properties adds to their persistence and effectiveness against many cancer types."

"Over the next several years, we would like to be able to scale up this process to produce enough cells for a first-in-human clinical trial, and investigate their effectiveness in different types of human blood cancers," he said.

Credit: 
Washington University School of Medicine