
Mathematical modeling for translational research of new CRSD medication

image: Figure 1. Interspecies and Inter-patient Variations in PF-670462 Efficacy

Image: 
© KAIST and Pfizer

New mathematical modeling has identified two major sources of interspecies and inter-individual variation in the clinical efficacy of a clock-modulating drug: photosensitivity and PER2 level. The finding opens the door to precision medicine for circadian disruption.

A KAIST mathematics research team led by Professor Jae Kyoung Kim, in collaboration with Pfizer, applied a combination of mathematical modeling and simulation tools for circadian rhythm sleep disorders (CRSDs) to analyze animal data generated by Pfizer. The study was reported in Molecular Systems Biology as the cover article on July 8.

Pharmaceutical companies conduct extensive animal studies to evaluate new drug candidates. However, results from animal testing do not always translate into the same effects in human trials. Moreover, even among humans, efficacy differs from individual to individual depending on genetic and environmental factors, which calls for different treatment strategies.

To overcome these obstacles, KAIST mathematicians and their collaborators developed adaptive chronotherapeutics to identify precise dosing regimens that could restore normal circadian phase under different conditions.

A circadian rhythm is a 24-hour cycle in the physiological processes of living creatures, including humans. A biological clock in the hypothalamic suprachiasmatic nucleus in the human brain sets the time for various human behaviors such as sleep.

A disruption of this endogenous timekeeping system, caused for example by changes in one's life pattern, advances or delays the sleep-wake cycle and desynchronizes sleep-wake rhythms from the environmental light-dark cycle, resulting in CRSDs. To restore the normal timing of sleep, the timing of the circadian clock can be adjusted pharmacologically.

Pfizer identified PF-670462, which can adjust the timing of the circadian clock by inhibiting its core clock kinases, CK1δ/ε. However, the efficacy of PF-670462 differs significantly between nocturnal mice and diurnal monkeys, whose sleep times are opposite.

The research team discovered the source of such interspecies variations in drug response by performing thousands of virtual experiments using a mathematical model, which describes biochemical interactions among clock molecules and PF-670462. The result suggests that the effect of PF-670462 is reduced by light exposure in diurnal primates more than in nocturnal mice. This indicates that the strong counteracting effect of light must be considered in order to effectively regulate the circadian clock of diurnal humans using PF-670462.
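To give a flavor of what such a virtual experiment looks like, here is a minimal, purely illustrative sketch (not the authors' model, which describes detailed biochemical interactions among clock molecules and the drug): a daily-update phase model in which the drug delays the clock each day while light pulls it back toward its entrained phase, so a more photosensitive, diurnal-like clock retains much less of the drug-induced shift.

```python
# Hypothetical toy model, for illustration only; all parameter values are invented.
def net_phase_shift(drug_delay_h, light_correction, days):
    """Accumulated circadian phase delay (hours) after `days` of once-daily dosing."""
    shift = 0.0
    for _ in range(days):
        shift += drug_delay_h              # delay produced by CK1 delta/epsilon inhibition
        shift -= light_correction * shift  # light pulls the clock back toward its entrained phase
    return round(shift, 1)

# Weak light effect (nocturnal-mouse-like) vs. strong light effect (diurnal-primate-like)
print(net_phase_shift(drug_delay_h=1.0, light_correction=0.1, days=14))  # ~7.7 h net delay
print(net_phase_shift(drug_delay_h=1.0, light_correction=0.6, days=14))  # ~1.7 h net delay
```

In this toy setup, the same dose produces a much smaller net phase shift when the light drive is strong, mirroring the counteracting effect of light described above.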

Furthermore, the team traced the source of inter-patient variation in drug efficacy using virtual patients whose circadian clocks were disrupted by various mutations: the degree of perturbation in the endogenous level of the core clock molecule PER2 determines how well the drug works.

This explains why the clinical outcomes of clock-modulating drugs are highly variable and why certain patient subtypes do not respond to treatment. It also points to the limitations of current treatment strategies, which are tailored only to a patient's sleep and wake times rather than to the molecular cause of the sleep disorder.

PhD candidate Dae Wook Kim, the first author, said the result motivated the team to develop adaptive chronotherapy, which identifies a personalized optimal dosing time of day by tracking a patient's sleep-wake times via a wearable device, enabling a precision medicine approach to CRSDs.

Professor Jae Kyoung Kim said, "As a mathematician, I am excited to help enable the advancement of a new drug candidate, which can improve the lives of so many patients. I hope this result promotes more collaborations in this translational research."

Credit: 
The Korea Advanced Institute of Science and Technology (KAIST)

Improved model could help scientists better predict crop yield, climate change effects

video: A new computer model incorporates how microscopic pores on leaves may open in response to light, an advance that could help scientists create virtual plants to predict how crops will respond to higher temperatures and rising carbon dioxide levels.

Image: 
RIPE Project

CHAMPAIGN, Ill. – A new computer model incorporates how microscopic pores on leaves may open in response to light—an advance that could help scientists create virtual plants to predict how higher temperatures and rising levels of carbon dioxide will affect food crops, according to a study published in a special July 2019 issue of the journal Photosynthesis Research.

“This is an exciting new computer model that could help us make much more accurate predictions across a wide range of conditions,” said Johannes Kromdijk, who led the work as part of an international research project called Realizing Increased Photosynthetic Efficiency (RIPE).

RIPE, which is led by the University of Illinois, is engineering crops to be more productive without using more water by improving photosynthesis, the natural process all plants utilize to convert sunlight into energy to fuel growth and crop yields. RIPE is supported by the Bill & Melinda Gates Foundation, the U.S. Foundation for Food and Agriculture Research (FFAR), and the U.K. Government’s Department for International Development (DFID).

The current work focused on simulating the behavior of what are known as stomata—microscopic pores in leaves that, in response to light, open to allow water, carbon dioxide, and oxygen to enter and exit the plant. In 2018, the RIPE team published a paper in Nature Communications that showed increasing one specific protein could prompt plants to close their stomata partially—to a point where photosynthesis was unaffected, but water loss decreased significantly. This study's experimental data were used to create the newly improved stomatal model introduced today.

“We’ve known for decades that photosynthesis and stomatal opening are closely coordinated, but just how this works has remained uncertain,” said Stephen Long, Ikenberry Endowed University Chair of Crop Sciences and Plant Biology at the University of Illinois. “With this new computer model, we have a much better tool for calculating stomatal movements in response to light.”

The ultimate goal, Long said, is to identify opportunities to control these stomatal gatekeepers to make drought-tolerant crops. “Now we’re closing in on the missing link: How photosynthesis tells stomates when to open.”

Computer modeling has been a major advance in crop breeding. The father of modern genetics, Gregor Mendel, made his breakthrough discovery that pea plants inherit traits from their parents by growing and breeding more than 10,000 pea plants over eight years. Today, plant scientists can virtually grow thousands of crops in a matter of seconds using these complex computer models that simulate plant growth.

Stomatal models are used together with models for photosynthesis to make wide-ranging predictions from future crop yields to crop management, such as how crops respond when there is a water deficit. In addition, these models can give scientists a preview of how crops like wheat, maize, or rice could be affected by rising carbon dioxide levels and higher temperatures.

“The previous version of the stomatal model used a relationship that wasn’t consistent with our current understanding of stomatal movements,” said Kromdijk, now a University Lecturer at the University of Cambridge. “We found that our new version needs far less tuning to make highly accurate predictions.”

Still, there’s a lot of work to be done to show that this modified model functions in a wide variety of applications and to underpin the relationship between stomata and photosynthesis further.

“We have to show that this model works for a diverse range of species and locations,” said former RIPE member Katarzyna Glowacka, now an assistant professor at the University of Nebraska-Lincoln. “Large-scale simulation models string together models for atmospheric turbulence, light interception, soil water availability, and others—so we have to convince several research communities that this is an improvement that is worth making.”

Credit: 
Carl R. Woese Institute for Genomic Biology, University of Illinois at Urbana-Champaign

Scientists reveal close connections between the Northern Hemisphere mid-high latitudes and East Asia

image: Four pathways of influence of the Northern Hemisphere mid-high latitudes on the East Asian monsoon.

Image: 
Jianping Li

Chinese scientists have made significant progress on the influence of the Northern Hemisphere mid-high latitudes on East Asian climate, according to Prof. Jianping Li, from the Key Laboratory of Physical Oceanography-Institute for Advanced Ocean Studies, Ocean University of China and Qingdao National Laboratory for Marine Science and Technology, and the lead/corresponding author of a study recently published in Advances in Atmospheric Sciences. The article is included in a special issue on the national report (2011-2018) to the International Union of Geodesy and Geophysics (IUGG) Centennial by the China National Committee for IAMAS.

China is located in the East Asian monsoon region, and its weather and climate are greatly affected by the East Asian monsoon. Drought and flood disasters caused by monsoons often cause significant economic losses and casualties. Understanding the formation and variation of the East Asian monsoon has important implications for understanding climate change and variability in China, revealing the predictability sources of floods and droughts, proposing new theories and methods for climate prediction, and producing drought and flood prediction products. The related research is therefore of great significance for the development of the national economy, especially industrial and agricultural production, and for safeguarding people's property.

In recent years, a growing body of observational and modeling evidence has shown that mid-high-latitude climate variability has an important impact on the East Asian monsoon climate, an impact as significant as that of tropical climate variability, which has received more attention in previous studies. Chinese scientists have produced systematic research results in this area and have played a crucial role in advancing climate research on East Asia. It is therefore worthwhile to periodically review this progress and provide a reference for young scholars and researchers who are new to the field of East Asian climate research.

Focusing on these issues, Professor Li and colleagues systematically reviewed research findings on the connections between the Northern Hemisphere mid-high latitudes and East Asian climate. The paper first proposes a theoretical framework of multi-sphere coupled bridges (ocean-atmosphere, land-atmosphere, ice-atmosphere, etc.) and chain coupled bridges (e.g. tropical-extratropical, Southern-Northern Hemisphere, troposphere-stratosphere, different ocean basins, and ocean-land interactions), a useful concept for studying and understanding multi-scale interactions in climate systems. Under this framework, existing research findings are then summarized and categorized: the pathways by which the Northern Hemisphere mid-high latitudes influence the East Asian monsoon are divided into four categories (North Atlantic, North Pacific, Arctic, and synergistic mid-high latitudes-tropics), and the impacts and dynamical mechanisms involved in each pathway are discussed in detail. These classifications greatly advance the attribution and understanding of East Asian monsoon climate and its potential predictability sources. Finally, the new concepts of a "synergistic effect" and an "antagonistic effect" are proposed, which objectively explain the combined influence of multiple regional climate variabilities on East Asian climate.

Professor Li points out that, although much work has been carried out on East Asian climate, in practice the current prediction skill remains insufficient to meet the needs of society. The climate variability over the different regions mentioned in the review paper has a very significant "synergistic effect" on the East Asian monsoon climate, including tropical-mid-to-high latitude climate variability, the Southern-Northern Hemisphere, the five oceans, and the synergistic effects between the oceans and land. These synergistic effects may provide more predictability sources for East Asian climate, and how to consider contributions from these climate variabilities in East Asian climate predictions will be the focus of future research.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Study highlights need for tailored skin cancer prevention programs

WASHINGTON (July 9, 2019) -- Sun safety practices of attendees at skin cancer screening events differ from those of the general public, according to findings published by researchers from the George Washington University (GW) Cancer Center. The study was published in the Journal of Drugs in Dermatology.

According to the Centers for Disease Control and Prevention (CDC), skin cancer is the most common cancer in the United States. Proper sun safety practices, like wearing sunscreen, seeking shade, and wearing sun-protective clothing are critical to reduce the risk for skin cancers like squamous cell carcinoma, basal cell carcinoma, and melanoma.

Through a survey administered at random at six locations in Washington, D.C., and to attendees of a free skin cancer screening event at GW, a team led by Emily Murphy, a research fellow in the Department of Dermatology at the GW School of Medicine and Health Sciences, found that respondents from the screening group were significantly more likely to always wear sunscreen, always seek shade, and always or sometimes wear sun-protective clothing than the public group. These data suggest that individuals who do not typically attend free screenings may have greater gaps in sun-protective knowledge and behavior, highlighting the need to reach these populations through other channels. Survey responses were also analyzed by age and race to identify additional disparities. Participants who identified as white were more likely to always or sometimes wear sunscreen and sun-protective clothing than non-white participants. Participants over 61 years were more likely to always seek shade and wear sun-protective clothing than those younger than 31 years.

"These findings highlight the importance of tailoring free skin cancer screening events for non-white and younger populations," said Adam Friedman, MD, interim chair of the Department of Dermatology at the GW School of Medicine and Health Sciences and director of the Supportive Oncodermatology Clinic at the GW Cancer Center, who also served as senior author on the study. "While free screening events are important, we also have to think about comprehensive, community-based solutions that reach broader demographic populations than skin cancer screenings alone."

White participants in the study reported more blistering sunburns than non-white participants, as well as more indoor tanning use. However, indoor tanning use was equal in the screening and general public groups, indicating that all patients need to be educated about the risks of indoor tanning. The Community Preventive Services Task Force, an independent, non-federal panel of public health and prevention experts, recommends multicomponent interventions that combine individually focused strategies, educational campaigns, and environmental or policy changes to influence sun safety behaviors. Other recommendations include education and policy approaches in primary school and outdoor recreation settings.

"This study also highlights the importance of reaching non-white populations with skin cancer prevention messages," said Friedman. "We have to address the myth that skin cancer only affects fair-skinned individuals. Skin cancer does not discriminate and therefore we need to encourage sun safety practices among all individuals."

Credit: 
George Washington University

Fall in GP antibiotic prescribing has been slowest for older patients and those with an unclear diagnosis

GPs in England are prescribing fewer antibiotics, and when they do prescribe them they are increasingly choosing drugs that target a narrow range of organisms rather than broad-spectrum antibiotics, suggests new research from King's College London published online in BMJ Open.

However, falls in GPs' rates of prescribing have been smaller in some groups of patients, in particular patients aged over 55 and those with no clear diagnosis.

There has been a global drive to cut antibiotic use in response to the growing threat of antimicrobial resistance - a situation where bacteria develop resistance to antibiotics so become more difficult to treat and potentially ultimately untreatable.

The mechanism by which resistance develops is complex but the more frequently antibiotics are used, the greater the number of bacteria exposed to them and the more likely it is that those bacteria susceptible to the antibiotics will become resistant to them.

Antibiotic stewardship policies promoting more considered use of antibiotics have been introduced to slow the development of antimicrobial resistance. These policies encourage GPs to reduce prescribing of antibiotics overall and, where antibiotics are needed, to choose one effective against a narrow range of bacteria over broad-spectrum options, which target a wider range of bacteria.

Xiaohui Sun, a PhD candidate in Population and Environmental Health Sciences at King's College London, who led the research, analysed GP antibiotic prescribing at 102 general practices in England from 2014 to 2017 using data extracted from the UK Clinical Practice Research Datalink (CPRD).

The data showed that over that period total antibiotic prescribing declined by 6.9% per year, from 608 prescriptions per 1000 person-years in 2014 to 489 per 1000 person-years in 2017.

The rate of prescribing of broad-spectrum beta-lactam antibiotics, which target a wide range of organisms, fell more rapidly - by 9.3% per year, from 221 prescriptions per 1000 person-years in 2014 to 163 per 1000 person-years in 2017.
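As a rough consistency check (a back-of-the-envelope sketch, not the regression analysis the authors used), the compound annual decline implied by the 2014 and 2017 rates can be computed directly:

```python
def annual_decline_pct(start, end, years):
    """Compound annual percentage decline between two rates `years` apart."""
    return (1 - (end / start) ** (1 / years)) * 100

# Total prescribing: 608 -> 489 prescriptions per 1000 person-years over 2014-2017
print(round(annual_decline_pct(608, 489, 3), 1))  # ~7.0, close to the reported 6.9% per year

# Broad-spectrum beta-lactams: 221 -> 163 per 1000 person-years
print(round(annual_decline_pct(221, 163, 3), 1))  # ~9.6, in line with the reported 9.3% per year
```

The published figures come from model-based estimates over the full period, so this simple two-point calculation only shows that the numbers are mutually consistent.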

Prescribing rates declined at a similar pace for male and female patients, but the rate of decline was lower for older patients (those aged over 55).

When the authors looked at the diagnostic codes associated with the prescriptions, they noted that prescribing rates had declined most for respiratory infections (9.8% per year), followed by genitourinary infections (5.7%), but had fallen by only 3.8% per year in cases where no medical reason for the prescription was recorded.

More than a third of antibiotics (38.8%) prescribed by GPs were associated with medical codes that did not indicate a clinical condition that would require their use, and a further 15.3% of antibiotic prescriptions had no medical codes at all recorded against them. A large proportion of prescriptions not associated with medical codes were repeat prescriptions.

One potential limitation of the study is that not all community antibiotic prescribing may have been fully recorded, the authors point out, as prescribing by out-of-hours services, walk-in and urgent care centres may not have made it into the electronic record. Prescribing data from specialist clinics and hospitals was not included and these services may have issued some community prescriptions.

On the other hand, because the study looked at the number of prescriptions written rather than the number dispensed, the authors could not determine whether GPs used a delayed or deferred antibiotic prescribing strategy. If they did, actual antibiotic consumption would be somewhat lower than the prescription figures suggest, the authors added.

A strength of the study was that it looked at prescribing habits at the same practices over four years. Sun concluded: "Antibiotic prescribing has reduced and become more selective but substantial unnecessary antibiotic use may persist. Improving the quality of diagnostic coding for antibiotic use will help to support antimicrobial stewardship."

Credit: 
BMJ Group

Scientists decode DNA secrets of world's toughest bean

image: The hands of Timothy Close, professor and geneticist at UC Riverside, and Sassoum Lo, a PhD student in the UCR Plant Biology Program and paper co-author from Senegal, holding cowpea seeds with a range of coat colors, patterns, sizes and shapes.

Image: 
TJ Close / UCR

UC Riverside scientists have decoded the genome of black-eyed peas, offering hope for feeding Earth's expanding population, especially as the climate changes.

Understanding the genes responsible for the peas' drought and heat tolerance eventually could help make other crops tougher too.

Black-eyed peas are small beans with dark midsections. They've been a global dietary staple for centuries due to their environmental toughness and exceptional nutritional qualities, such as high protein and low fat. In sub-Saharan Africa they remain the number one source of protein in the human diet.

A genome is the full collection of genetic codes that determine characteristics like color, height, and predisposition to diseases. All genomes contain highly repetitive sequences of DNA that UCR Professor of Computer Science and project co-leader Stefano Lonardi likens to "hundreds of thousands of identical jigsaw puzzle pieces."

Lonardi described the process of figuring out how the jigsaw puzzle sequences fit together as "computationally challenging." In order to do so, Lonardi's team assembled the genome many times with different software tools and parameters. Then they created new software capable of merging these various genome solutions into a single, complete picture.

With the success of this project, the black-eyed pea joins only a handful of other major crops whose genomes have been fully sequenced. The team's work on the project was published in the June issue of The Plant Journal, where it was featured as the cover story, and Lonardi's free software can be downloaded online.

Research on black-eyed peas, a legume also known as cowpea, started at UC Riverside more than 40 years ago. But cowpeas' presence in Riverside predates the university by about 200 years.

"The cowpea has been here supporting people since early colonial times," said project co-leader Timothy Close, a UCR professor of botany and plant sciences. 'It's nice that we've brought this plant with so much local history up to state of the art for scientific research."

This is the first high-quality reference genome for the cowpea. Work on it began three years ago, made possible mainly by a $1.6 million grant from the National Science Foundation, or NSF. An additional $500,000 NSF grant also supported the computational efforts.

A clue to the complexity of the project is the size of the research team. In addition to Close and Lonardi, the many other UCR scientists on the team included María Muñoz-Amatrían, Qihua Liang, Steve Wanamaker, Sassoum Lo, Hind Alhakami, Rachid Ounit, Philip Roberts, Jansen Santos, Arsenio Ndeve, and Abid Md. Hasan. Additional team members inside the U.S. came from UC Davis, the Department of Energy's Joint Genome Institute in California, the National Center for Genome Resources in New Mexico, and the U.S. Department of Agriculture in Iowa. International team members came from Finland, France, Brazil, and the Czech Republic.

As with humans, there are differences between individual cowpeas. Knowing which genes are responsible for qualities in individuals such as color, size, or pathogen resistance will help breeders develop new varieties even better able to withstand external challenges.

"Having the genome sequence helps scientists make decisions about the choice of parent plants to crossbreed in order to produce their desired progeny," Close said.

One of the cowpea traits that scientists are now trying to understand is its remarkable ability to recover from drought stress.

"We're trying to figure out why cowpeas are so resilient to harsh conditions," said Close. "As we move into a world with less water available to agriculture, it will be important to capitalize on this ability and expand on it, taking the lead from cowpeas to guide improvements in other crops that are vulnerable to climate change."

Credit: 
University of California - Riverside

Paris Agreement does not rule out ice-free Arctic

image: Probabilities of Arctic summer sea ice disappearing when crossing certain global warming levels.

Image: 
Elke Zeller and Roman Olson

Research published in this week's issue of Nature Communications reveals a considerable chance for an ice-free Arctic Ocean at global warming limits stipulated in the Paris Agreement. Scientists from South Korea, Australia and the USA used results from climate models and a new statistical approach to calculate the likelihood for Arctic sea ice to disappear at different warming levels.

Future climate projections are usually obtained from global climate computer models. These models are based on several hundred thousand lines of computer code, developed to solve the physical equations of the atmosphere, ocean, sea-ice and other climate components. Applying future greenhouse gas concentrations, each computer model produces a version of what the future of the Earth's climate might look like. Transforming this information into practical decisions is not easy, because of the remaining uncertainties in the magnitude of future climate change on regional scales. Decision making in a warming world requires an understanding of the probabilities of certain climatic events to occur.

Up to now, it has been difficult to extract meaningful probabilities from climate models, because these models sometimes share common computer code or make similar assumptions regarding the implementation of less well understood processes, such as clouds or vegetation. To obtain more accurate probability estimates for future climate change in the Arctic region, the research team has developed a novel statistical method which translates results from a suite of climate computer model simulations to probabilities. This method ranks the models in terms of how well they agree with present-day observations and accounts also for inter-dependencies amongst the models.

"Translating model dependence into mathematical equations has been a long-standing issue in climate science. It is exciting to see that our method can provide a general framework to solve this problem," said coauthor Won Chang, assistant professor in the department of Mathematical Sciences at the University of Cincinnati, USA.

The researchers applied the new statistical method to climate model projections of the 21st century. Using 31 different climate models, which exhibit considerable inter-dependence, the authors find that there is at least a 6% probability that summer sea ice in the Arctic Ocean will disappear at 1.5 °C warming above preindustrial levels - a lower limit recommended by the Paris Agreement of the United Nations Framework Convention on Climate Change (Figure 1). For a 2°C warming, the probability for losing the ice rises to at least 28%. Most likely we will see a sea ice-free summer Arctic Ocean for the first time at 2 to 2.5°C warming.
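The full method is described in the paper; the sketch below illustrates only the basic idea of weighting climate models by their agreement with present-day observations and reading off a weighted probability. It deliberately omits the inter-dependence correction that is central to the actual approach, and all numbers are invented.

```python
import numpy as np

# Invented inputs: each model's mismatch with present-day observations (arbitrary units)
# and whether it loses Arctic summer sea ice at a given warming level (1 = ice-free).
model_error = np.array([0.2, 0.5, 0.1, 0.8, 0.3])
ice_free = np.array([1, 0, 1, 0, 1])

# Better-performing models (smaller error) receive larger weights.
weights = np.exp(-(model_error ** 2) / 0.25)
weights /= weights.sum()

# Weighted probability of an ice-free Arctic at this warming level.
print(round(float(np.sum(weights * ice_free)), 2))
```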

"Our work provides a new statistical and mathematical framework to calculate climate change and impact probabilities," commented Jason Evans, professor at the Climate Change Research Center in UNSW Australia in Sydney.

"Up to now, there was no established mathematical framework to assign probabilities on non-exclusive theories. While we only tested the new approach on climate models, we are eager to see if the technique can be applied to other fields, such as stock market predictions, plane accident investigations, or in medical research", says Roman Olson, the lead author and researcher at the Institute for Basic Science, Center for Climate Physics (ICCP) in South Korea.

Credit: 
Institute for Basic Science

A clearer picture of global ice sheet mass

Fluctuations in the masses of the world's largest ice sheets carry important consequences for future sea level rise, but the complicated interplay of atmospheric conditions, snowfall input and melting processes has never been easy to measure, owing to the sheer size and remoteness of glacial landscapes.

Much has changed for the better in the past decade, according to a new review paper co-authored by researchers at the University of Colorado Boulder, NASA, Utrecht University and Delft University of Technology and recently published in Reviews of Geophysics.

The study outlines improvements in satellite imaging and remote sensing equipment that have allowed scientists to measure ice mass in greater detail than ever before.

"We've come a long way in the last 10 years from an observational perspective," said Jan Lenaerts, lead author of the research and an assistant professor in CU Boulder's Department of Atmospheric and Oceanic Sciences (ATOC). "Knowing what happens to ice sheets in terms of mass in, mass out allows us to better connect climate variations to ice mass and how much the mass has changed over time."

Ice sheets primarily gain mass from precipitation and lose it due to solid ice discharge and runoff of melt water. Precipitation and runoff, along with other surface processes, collectively determine the surface mass balance. The Antarctic Ice Sheet, the world's largest, is cold year-round with only marginal summer melting. A small increase or decrease in yearly snowfall, then, can make a considerable difference in surface mass because the addition or subtraction is compounded over a massive area.
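In simplified bookkeeping terms (a sketch of the mass-budget idea only, with made-up numbers, not the paper's actual estimates), the surface mass balance is snowfall minus surface losses, and the total mass change further subtracts the ice discharged into the ocean:

```python
def annual_mass_change_gt(snowfall, sublimation, meltwater_runoff, ice_discharge):
    """Illustrative ice-sheet mass budget in gigatonnes per year."""
    surface_mass_balance = snowfall - sublimation - meltwater_runoff
    return surface_mass_balance - ice_discharge

# Invented numbers: a small shift in snowfall changes the sign of the budget.
print(annual_mass_change_gt(2300, 200, 100, 2100))  # -100 Gt/yr (losing mass)
print(annual_mass_change_gt(2400, 200, 100, 2100))  #    0 Gt/yr (in balance)
```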

"Snowfall is dominant over Antarctica and will stay that way for the next few decades," Lenaerts said. "And we've seen that as the atmosphere warms due to climate change, that leads to more snowfall, which somewhat mitigates the loss of ice sheet mass there. Greenland, by contrast, experiences abundant summer melt, which controls much of its present and future ice loss."

In years past, climate models would have been unable to render the subtleties of snowfall in such a remote area. Now, thanks to automated weather stations, airborne sensors and Earth-orbiting satellites such as NASA's Gravity Recovery and Climate Experiment (GRACE) mission, these models have improved considerably. They produce realistic ice sheet surface mass balance, allow for greater spatial precision and account for regional variation as well as wind-driven snow redistribution--a degree of detail that would have been unheard of as recently as the early 2000s.

"If you don't have the input variable right, you start off on the wrong foot," Lenaerts said. "We've focused on snowfall because it heavily influences the ice sheet's fate. Airborne observations and satellites have been instrumental in giving a better view of all these processes."

Ground-based radar systems and ice core samples provide a useful historical archive, allowing scientists to go back in time and observe changes in the ice sheet over long periods of time. But while current technologies allow for greater spatial monitoring, they lack the ability to measure snow density, which is a crucial variable to translate these measurements into mass changes.

The biggest opportunity may lie in cosmic ray counters, which measure surface mass balance directly: they detect neutrons produced by cosmic ray collisions in Earth's atmosphere, which linger in water and can be read by a sensor. Over long periods of time, an array of these devices could theoretically provide even greater detail still.

Overall, Lenaerts said, the field of ice sheet observation has come of age in recent years, but still stands to benefit from additional resources.

"The community of researchers studying these issues is still relatively small, but it's already a global community and interest is growing," he said. "We'd like to get to a point where ice sheet mass processes are factored into global climate and Earth system models, to really show that bigger picture."

Credit: 
University of Colorado at Boulder

Brain stimulation enhances motivation to work for food

Electrical stimulation of the brain through the vagus nerve increases the motivation to work for food, according to recent findings from a research group at the University of Tübingen. These findings, presented at the annual meeting of the Society for the Study of Ingestive Behavior this week in Utrecht, Netherlands, demonstrate a novel method for altering the motivation to obtain food.

"Vigorous work is costly and has to be recuperated by energy intake. That makes it vital for us to know when it is worth the effort. The vagus nerve helps set the tone for actions by signaling, for example, if energy is readily available for that action or not," says Dr. Nils B. Kroemer, the Principal Investigator of the study and junior group leader of the University's Neuroscience of Motivation, Action, and Desire Laboratory (neuroMADLAB) . "We knew that vagus nerve stimulation changes dopamine levels in animals and that chronic stimulation improves depressive symptoms in humans, but it was not known if it could acutely improve motivation. We found that it may provide a much-needed technique to rapidly change reward-related behavior such as eating".

The researchers invited 81 hungry participants to their laboratory on two occasions. Everyone was offered a tasty breakfast, but there was a catch. Participants had to exert physical effort to win reward points that could be "cashed in" for their favorite cereals. At one of the two sessions, participants completed the task while receiving electrical stimulation of the vagus nerve; during the other session, they received placebo stimulation. Results showed that stimulation boosted how vigorously participants exerted effort for the rewards at stake compared to the control condition.

"This ambitious experiment is one of the first to study motivational changes during acute vagus nerve stimulation,'' said Monja P. Neuser, a Ph.D. student in the neuroMADLAB and lead study author. "The motivational effects elicited by the stimulation are very promising and encourage us to further unravel the exact neural mechanisms. We think that the stimulation increases dopamine levels in the brain, which is known to enhance vigor. "

By using concurrent functional neuroimaging, researchers of the neuroMADLAB will continue investigating how non-invasive vagus nerve stimulation can be administered to maximize its benefits. Most notably, researchers plan to apply this technique in people suffering from anhedonia--an absence of desire to engage in normally pleasurable activities such as eating--to determine whether it increases their motivation to seek out and consume food.

Credit: 
Society for the Study of Ingestive Behavior

A drier future sets the stage for more wildfires

image: Droughts can create ideal conditions for wildfires. Dry trees and vegetation provide fuel. Low soil and air moisture make it easier for fires to spread quickly. In these conditions, a spark from lightning, electrical failures, human error or planned fires can quickly get out of control. As Earth's climate warms and precipitation patterns change, increasingly severe droughts will leave some areas of the world vulnerable to increasingly severe fires.

Image: 
Earth Observatory

November 8, 2018 was a dry day in Butte County, California. The state was in its sixth consecutive year of drought, and the county had not had a rainfall event producing more than a half inch of rain for seven months. The dry summer had parched the spring vegetation, and the strong northeasterly winds of autumn were gusting at 35 miles per hour and rising, creating red flag conditions: Any planned or unplanned fires could quickly get out of control.

Sure enough, just before daybreak, strong winds whipped a stray spark from a power line into an inferno. The Camp Fire became the most destructive fire in California's history, scorching approximately 240 square miles, destroying nearly 14,000 buildings, causing billions of dollars in damage and killing 88 people. Later the same day, the Woolsey Fire broke out in Los Angeles County, burning 150 square miles and killing three.

Droughts can create ideal conditions for wildfires. Lack of rain and low humidity dry out trees and vegetation, providing fuel. In these conditions, a spark from lightning, electrical failures, human error or planned fires can quickly get out of control.

Global climate change is predicted to change precipitation and evaporation patterns around the world, leading to wetter climate in some areas and drier in others. Areas that face increasingly severe droughts will also be at risk for more and larger fires. Several NASA missions collect valuable data to help scientists and emergency responders monitor droughts and fires. Some instruments monitor water in and below the soil, helping to assess whether areas are moving toward dangerous droughts. Others watch for heat and smoke from fires, supporting both research and active disaster recovery.

Understanding how fires behave in dry conditions can help firefighters, first responders and others prepare for a hotter, drier future.

Climate Change: Not Just Wet

Earth's warming climate is forecasted to make global precipitation patterns more extreme: Wet areas will become wetter, and dry areas will become drier. Areas such as the American Southwest could see both reduced rainfall and increased soil moisture evaporation due to more intense heat, and in some cases, the resulting droughts could be more intense than any drought of the past millennium.

Ben Cook of NASA's Goddard Institute for Space Studies (GISS) in New York City researches "megadroughts" -- droughts lasting more than three decades. Megadroughts have occurred in the past, like the decades-long North American droughts between 1100 and 1300, and the team used tree ring records to compare these droughts with future projections. He and his team examined soil moisture data sets and drought severity indices from 17 different future climate models, and they all predicted that if greenhouse gas emissions continue to increase at their present rate, the risk of a megadrought in the American Southwest could hit 80 percent by the end of the century. Additionally, these droughts will likely be even more severe than those seen in the last millennium.

Such severe droughts will affect the amount and dryness of fuel such as trees and grass, Cook said.

"Fire depends on two things: having enough fuel and drying that fuel out so it can catch fire. So in the short term, more droughts probably mean more fire as the vegetation dries out," said Cook. "If those droughts continue for a long period, like a megadrought, however, it can actually mean less fire, because the vegetation will not grow back as vigorously, and you may run out of fuel to burn. It's definitely complicated."

Current and future NASA measurements of soil moisture and precipitation will help to evaluate climate models' predictions, making them even more accurate and useful for understanding Earth's changing climate.

Cook and his GISS colleague Kate Marvel were the first to provide evidence that human-generated greenhouse gas emissions were influencing observed drought patterns as long ago as the early 1900s. By showing that human activities have already affected drought in the past, their research provides evidence that climate change from human-generated greenhouse gas emissions will likely influence drought in the future.

Staying Ahead of the Fire

If the future does hold megadroughts for the southwestern United States, what might this mean for its fire seasons?

"Once we change the climatology and get drier and drier fuels, we should expect more intense fires and higher fire severity," said Adam Kochanski, an atmospheric scientist at the University of Utah, referring to the size and impact of the fires. If fuels are moist, the fire is more likely to stay close to the ground and be less destructive, he said. Dry trees and plants make it more likely that flames will reach the forest canopy, making the fire more destructive and harder to control.

Kochanski and Jan Mandel of the University of Colorado Denver used data from NASA and other sources to simulate the interactions between wildfires, soil moisture and local weather. They built on previous work by the National Center for Atmospheric Research (NCAR) and others to develop the SFIRE module for the widely used Weather Research and Forecasting model (WRF).

This module uses data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) aboard its Aqua and Terra satellites, and the Visible Infrared Imaging Radiometer Suite (VIIRS) aboard the Suomi National Polar-Orbiting Partnership (Suomi NPP) spacecraft.

Weather influences fires, but fires also influence local weather by producing heat, water vapor and smoke, Kochanski said. The winds from large fires can alter local weather patterns, and in extreme conditions, generate firestorms and fire tornadoes.

"It's not uncommon for people involved in wildland fires to report that although the wind is not very strong, the fires propagate very fast," Kochanski said. "If it isn't that windy, but your fire is intense and releases a lot of heat, it has the potential to generate its own winds. Even if the ambient winds are weak, this fire will start moving as if it were really windy."

Better modeling of these interactions not only helps firefighters better predict where and how a wildfire might spread, but also helps forest managers know whether a planned burn is safe.

A Tale of Fire and Snow

Fires' effects persist long after they are extinguished, and the availability or lack of fresh water plays an important role in vegetation regrowth and recovery. Dry conditions may prevent new seeds from germinating in the burned areas. Vegetation loss can lead to erosion and sediment blocking waterways, and firefighting chemicals may contaminate water sources.

Forest fires can have impacts on future winter snowpacks as well, said Kelly Gleason, a snow hydrologist and assistant professor at Portland State University. "Snowpack" refers to the snow that accumulates over an entire winter, rather than a single snowfall.

Here too, NASA data are key to understanding the processes involved. Gleason and her team used 16 years of data from NASA's MODIS instrument to investigate wildfires' effects on snow melt in forests in the American West. They discovered that soot and debris from fires make snow darker and less reflective for up to 15 years after a fire.

"It's like wearing a black T-shirt on a sunny day," Gleason said. "It primes the snowpack to absorb more sunlight energy. And there's more energy anyway, because the forest canopy was burned, so more sun comes through."

Their survey of roughly 850 fires between 2000 and 2016 showed that snow in burned forests melted, on average, five days earlier than snow in unburned forests. In some areas the snow melted weeks or months earlier than normal, Gleason said.

"Every year we experience earlier snow melt, there are strong relationships with big, hot, long-lasting fires the following summer," she said. "It creates this vicious cycle where snow melts earlier due to climate change, which extends the summer drought period where the soil dries out, and when the fuels dry out, you get these big fires. This further accelerates snowmelt, further extending the summer drought period and fire potential."

Modeling a safer future

Mandel and Kochanski's fire-atmosphere model is already in operational use in Israel and Greece. While the software requires computing expertise to use, it is available for free, consistent with NASA's mission to freely provide its data and other products to the public.

Branko Kosović, program manager for Renewable Energy for the Research Applications Laboratory and director of the Weather Systems and Assessment Program at NCAR, also used WRF to develop the fire prediction system for the state of Colorado's Division of Fire Prevention and Control. This model uses a related module called FIRE and produces a fire, weather and smoke forecast useful for both wildfires and planned fires.

Kosović is also using the WRF system for his research, which uses NASA remote sensing data and machine learning to estimate fuel moisture daily over the contiguous United States.

"Measuring live fuel moisture [currently] has to be done manually," Kosovi? said. "People have to go out, take the live fuel, and essentially cure it in ovens to see how much moisture there is. It's very labor intensive. And you can imagine that, because of that, the data is sparse, both in space and in frequency and time."

Kosović, Mandel and Kochanski hope to build systems that will give forest managers better information to plan controlled fires and help improve resource allocation during wildfires, leading to better risk assessment and recovery.

NASA scientists monitor both freshwater and fires constantly, from space, the air and the ground, collecting short- and long-term data as Earth's climate continues to change. Programs such as the NASA Earth Science Disasters Program use satellite data to track active fires, monitor their effects on air quality and perform research that helps communities be more prepared before disasters strike. And looking to the future, modeling plays a key role in preparing for changing drought and fire seasons around the world.

Credit: 
NASA/Goddard Space Flight Center

Interstellar iron isn't missing, it's just hiding in plain sight

image: Carbon-chain molecules as complex as C60 buckminsterfullerenes -- 'buckyballs' -- may form in space with the help of clustered iron atoms, according to new work by ASU cosmochemists. The work also explains how these iron clusters hide out inside common carbon-chain molecules.

Image: 
NASA/JPL-Caltech

Astrophysicists know that iron (chemical symbol: Fe) is one of the most abundant elements in the universe, after lightweight elements such as hydrogen, carbon, and oxygen. Iron is most commonly found in gaseous form in stars such as the Sun, and in more condensed form in planets such as Earth.

Iron in interstellar environments should also be common, but astrophysicists detect only low levels of the gaseous kind. This implies that the missing iron exists in some kind of solid form or molecular state, yet identifying its hiding place has remained elusive for decades.

A team of cosmochemists at Arizona State University, with support from the W.M. Keck Foundation, now claims that the mystery is simpler than it seems. The iron isn't really missing, they say. Instead it's hiding in plain sight. The iron has combined with carbon molecules to form molecular chains called iron pseudocarbynes, whose spectra are nearly identical to those of the much more common pure-carbon chains long known to be abundant in interstellar space.

The team's work was published late in June in the Astrophysical Journal.

"We are proposing a new class of molecules that are likely to be widespread in the interstellar medium," said Pilarasetty Tarakeshwar, research associate professor in ASU's School of Molecular Sciences. His coauthors, Peter Buseck and Frank Timmes, are both in ASU's School of Earth and Space Exploration; Buseck, an ASU Regents Professor, is also in the School of Molecular Sciences with Tarakeshwar.

The team examined how clusters containing only a few atoms of metallic iron might join with chains of carbon molecules to produce molecules combining both elements.

Recent evidence obtained from stardust and meteorites indicates the widespread occurrence of clusters of iron atoms in the cosmos. In the extremely cold temperatures of interstellar space, these iron clusters act as deep-freeze particles, enabling carbon chains of various lengths to stick to them and thus producing molecules different from those that can form with gaseous iron.

Said Tarakeshwar, "We calculated what the spectra of these molecules would look like, and we found that they have spectroscopic signatures nearly identical to carbon-chain molecules without any iron." He added that because of this, "Previous astrophysical observations could have overlooked these carbon-plus-iron molecules."

That means, the researchers say, the missing iron in the interstellar medium is actually out in plain view but masquerading as common carbon-chain molecules.

The new work may also solve another longstanding puzzle. Carbon chains with more than nine atoms are unstable, the team explains. Yet observations have detected more complex carbon molecules in interstellar space. How nature builds these complex carbon molecules from simpler carbon molecules has been a mystery for many years.

Buseck explained, "Longer carbon chains are stablized by the addition of iron clusters." This opens a new pathway for building more complex molecules in space, such as polyaromatic hydrocarbons, of which naphthalene is a familiar example, being the main ingredient in mothballs.

Said Timmes, "Our work provides new insights into bridging the yawning gap between molecules containing nine or fewer carbon atoms and complex molecules such as C60 buckminsterfullerene, better known as 'buckyballs.'"

Credit: 
Arizona State University

Coral reefs shifting away from equator

image: An experiment in Palmyra Atoll National Wildlife Refuge collects coral larvae, allowing researchers to enumerate the number of baby corals settling on a reef. Recent research shows that corals are establishing new reefs in temperate regions as they retreat from increasingly warmer waters at the equator.

Image: 
Nichole Price/Bigelow Laboratory for Ocean Sciences

Coral reefs are retreating from equatorial waters and establishing new reefs in more temperate regions, according to new research in the journal Marine Ecology Progress Series. The researchers found that the number of young corals on tropical reefs has declined by 85 percent – and doubled on subtropical reefs – during the last four decades.

"Climate change seems to be redistributing coral reefs, the same way it is shifting many other marine species," said Nichole Price, a senior research scientist at Bigelow Laboratory for Ocean Sciences and lead author of the paper. "The clarity in this trend is stunning, but we don’t yet know whether the new reefs can support the incredible diversity of tropical systems."

As climate change warms the ocean, subtropical environments are becoming more favorable for corals than the equatorial waters where they traditionally thrived. This is allowing drifting coral larvae to settle and grow in new regions. These subtropical reefs could provide refuge for other species challenged by climate change and new opportunities to protect these fledgling ecosystems.

The researchers believe that only certain types of coral are able to reach these new locations, based on how far the microscopic larvae can swim and drift on currents before they run out of their limited fat stores. The exact composition of most new reefs is currently unknown, due to the expense of collecting genetic and species diversity data.

"We are seeing ecosystems transition to new blends of species that have never coexisted, and it’s not yet clear how long it takes for these systems to reach equilibrium," said Satoshi Mitarai, an associate professor at Okinawa Institute of Science and Technology Graduate University and an author of the study. "The lines are really starting to blur about what a native species is, and when ecosystems are functioning or falling apart."

New coral reefs grow when larvae settle on suitable seafloor away from the reef where they originated. The research team examined latitudes up to 35 degrees north and south of the equator, and found that the shift of coral reefs is perfectly mirrored on either side. The paper assesses where and when "refugee corals" could settle in the future – potentially bringing new resources and opportunities such as fishing and tourism.

The researchers, an international group from 17 institutions in 6 countries, compiled a global database of studies dating back to 1974, when record-keeping began. They hope that other scientists will add to the database, making it increasingly comprehensive and useful to other research questions.

"The results of this paper highlight the importance of truly long-term studies documenting change in coral reef communities," said Peter Edmunds, a professor at the University of California Northridge and author of the paper. "The trends we identified in this analysis are exceptionally difficult to detect, yet of the greatest importance in understanding how reefs will change in the coming decades. As the coral reef crisis deepens, the international community will need to intensify efforts to combine and synthesize results as we have been able to accomplish with this study."

Coral reefs are intricately interconnected systems, and it is the interplay between species that enables their healthy functioning. It is unclear which other species, such as coralline algae that facilitate the survival of vulnerable coral larvae, are also expanding into new areas – or how successful young corals can be without them. Price wants to investigate the relationships and diversity of species in new reefs to understand the dynamics of these evolving ecosystems.

"So many questions remain about which species are and are not making it to these new locations, and we don’t yet know the fate of these young corals over longer time frames," Price said. "The changes we are seeing in coral reef ecosystems are mind-boggling, and we need to work hard to document how these systems work and learn what we can do to save them before it’s too late."

Some of the research that informed this study was conducted at the National Science Foundation's Moorea Coral Reef Long-Term Ecological Research site in French Polynesia, one of 28 such long-term research sites across the country and around the globe.

"This report addresses the important question of whether warming waters have resulted in increases in coral populations," says David Garrison, a program director in the National Science Foundation’s Division of Ocean Sciences, which funded the research. "Whether this offers hope for the sustainability of coral reefs requires more research and monitoring."

Journal

Marine Ecology Progress Series

DOI

10.3354/meps12980

Credit: 
Bigelow Laboratory for Ocean Sciences

Decades-long butterfly study shows common species on the decline

image: Swallowtail on flower.

Image: 
Rob Liptak, Ohio Lepidopterists

CORVALLIS, Ore. - The most extensive and systematic insect monitoring program ever undertaken in North America shows that butterfly abundance in Ohio declined by 2% per year, resulting in an overall 33% drop over the 21 years of the program.
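A quick compounding check (a sketch only; the published trend is estimated from statistical models of the survey counts) shows how a 2% yearly decline accumulates to roughly a one-third drop across the program's 21 years of monitoring:

```python
annual_decline = 0.02
intervals = 20  # year-to-year steps across a 21-year program
remaining = (1 - annual_decline) ** intervals
print(f"{(1 - remaining) * 100:.0f}% cumulative decline")  # ~33%
```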

Though the study was limited to one group of the insect class and one geographic area, the findings provide an important baseline for what's happening more broadly with insect populations amid climate change and other human-caused disturbances, the study's corresponding author said. The findings also are in line with those of butterfly monitoring programs in multiple European countries.

"These declines in abundance are happening in common species," said Oregon State University researcher Tyson Wepprich, who led the study. "Declines in common species concern me because it shows that there are widespread environmental causes for the declines affecting species we thought were well adapted to share a landscape with humans. Common species are also the ones that contribute the bulk of the pollination or bird food to the ecosystem, so their slow, consistent decline is likely having ripple effects beyond butterfly numbers."

Findings were published today in PLOS ONE.

Wepprich, a postdoctoral scholar in botany and plant pathology in OSU's College of Agricultural Sciences, used more than 24,000 butterfly surveys contributed by trained citizen scientists from 1996 through 2016 to establish his findings.

"Because it's easier to monitor butterflies than other insects - lots of people like butterflies and enjoy keeping track of them - butterflies tend to be the best source of abundance data for tracking insect population declines and increases," Wepprich said. "Environmental assessments use them as an indicator for the general trajectory of biodiversity since they experience the same types of pressures from land-use changes, climate change and habitat degradation as other insect groups."

The data from Ohio allowed population trends to be estimated for 81 butterfly species; three times as many species were trending downward as upward, meaning three out of every four species with a significant positive or negative trend grew less abundant over the course of the monitoring. Forty of the analyzed species showed no significant trend in either direction.

"Species with more northern distributions and fewer annual generations declined the most rapidly," he said, adding that these species are adapted to cooler regions and may do worse in Ohio with warming temperatures.

Wepprich noted that even some invasive species associated with human-dominated landscapes are declining, which suggests the trends are rooted in widespread environmental causes.

"Analyses of insect declines are dominated by European studies about butterflies, but our study is showing that the rate of change in Ohio butterfly abundance is very similar to that found in monitoring programs in the UK, the Netherlands and Spain," Wepprich said. "The rate of total decline and the proportion of species in decline mirror those documented in comparable monitoring programs. What's common among all of the monitoring programs is that they are in areas with a high human impact and made possible by dedicated volunteer recorders."

Even though the common butterfly species aren't yet close to extinction, declines in those species will nevertheless have an outsized, negative impact on ecosystem services provided by insects, he said.

Earlier studies involving intensive, long-term monitoring of individual butterfly species have allowed for rigorous estimates of declines in those species, Wepprich said. Over the past two decades, the migratory eastern North American monarch has declined by more than 85% and the western North American monarch by more than 95%, said Wepprich, adding that some of the rarest butterflies have also fallen off sharply.

"Monarchs and rare species were monitored because people are worried about them going extinct," Wepprich said. "In Ohio, they monitored every species they could and found declines in species previously not on the radar for conservation."

The rate of butterfly decline in Ohio is greater than the global rate of 35% over 40 years, Wepprich said, and is closer to the estimated rate for insects in general: a 45% decline over 40 years.
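To see how a per-year rate compares with decline figures quoted over 40 years, the same compounding approach as in the earlier sketch can be projected over that window (again a rough illustration under a constant-rate assumption; the 2%/year, 35%, and 45% figures come from the text above):

```python
# Project Ohio's ~2%/year butterfly decline over 40 years and compare it with
# the 40-year figures quoted in the text: ~35% (global butterflies) and ~45% (insects overall).
annual_decline = 0.02

ohio_40yr = 1 - (1 - annual_decline) ** 40
print(f"Ohio rate projected over 40 years: {ohio_40yr:.0%}")   # ~55%
print("Quoted 40-year estimates: 35% (butterflies, global), 45% (insects in general)")
```

Under that assumption, the projected Ohio decline exceeds the global butterfly estimate and sits nearer the figure for insects in general, in line with Wepprich's comparison.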

"Our study adds another example of declines in common butterfly species thought to be well suited to human-modified habitat," he said.

Credit: 
Oregon State University

Hinge-like protein may open new doors in cystic fibrosis treatment

image: A potentiator (orange) binds to a protein 'hotspot,' altering the molecule's conformation.

Image: 
Laboratory of Membrane Biology and Biophysics at The Rockefeller University

In recent decades, treatment options for people with cystic fibrosis have improved dramatically. The newest drugs, known as potentiators, target a protein called cystic fibrosis transmembrane conductance regulator, which is mutated in people with the disease. Yet, while these medications can help some people with CF, they are far from perfect. Moreover, researchers haven't been able to figure out how the drugs actually work--until now.

A new study by Rockefeller scientists characterizes, for the first time, the interaction between potentiators and the protein they target at atomic resolution. The research, described in a recent report in Science, shows that two distinct compounds act on the same protein region--a finding that points to strategies for developing more effective drugs.

Finding the hotspot

The cystic fibrosis transmembrane conductance regulator (CFTR) is a channel that, when open, allows chloride ions to move in and out of cells. When CFTR is mutated, ions cannot flow freely, leading to changes in the make-up of mucus lining internal organs. These changes can be particularly dangerous in the lungs where they cause thick mucus to accumulate, often leading to impaired breathing and persistent infections.

Potentiators are used to increase the flow of ions through CFTR, ameliorating some symptoms of cystic fibrosis (CF). Currently, only one such drug, known as ivacaftor, is on the market; another, called GLPG1837, is now in development.

"Ivacaftor can improve lung function by about ten percent. It can help a lot, but it's not a cure and not everybody responds to it," says Jue Chen, the William E. Ford Professor. "So there's a lot of interest in developing new potentiators."

Pursuing this goal, Chen and her colleagues investigated how existing potentiators work. They used cryo-electron microscopy--a technique that beams electrons at a frozen specimen to reveal protein architecture at an atomic level--to study the structure of CFTR attached to either ivacaftor or GLPG1837. Somewhat surprisingly, the researchers found that the two drugs bind to the exact same spot on the protein.

"These compounds are developed by two different companies and have very different chemical properties. But they manage to make their way to the same site," says Chen. "That tells us that this is a very sensitive, very important region of the protein."

Better drugs, more access

Upon analyzing the "hotspot" where the two potentiators bound, the researchers noticed a peculiar feature: this area contained unwound loops inside the membrane that signify a flexible structure. And this flexibility, the researchers realized, serves a practical function.

"The region we identified, it turns out, works as a hinge that swings open to allow ions through the channel--so its structure needs to be flexible," says Chen. "The compounds we studied bind to that very region, locking it into a channel-open conformation to improve ion flow. That's how they work."

With this knowledge, the researchers hope to craft compounds that directly target the hinge and do an even better job at keeping the ion channel open. And as Chen and her colleagues work toward the development of new drugs, she encourages other researchers to do the same. This kind of competition, she hopes, will drive down the cost of potentiators, making the medication available to a much larger portion of patients.

"We put our original data online and welcome anyone to use it," says Chen. "Because if more researchers use it, more treatment options will become available, prices will drop, and more people will be helped."

Reflecting on this breakthrough study, Chen acknowledges the work of David C. Gadsby, who passed away this March. Gadsby, the Patrick A. Gerschel Family Professor Emeritus and head of the Laboratory of Cardiac and Membrane Physiology, conducted early work on CFTR that laid the groundwork for much of Chen's research.

"He did a series of beautiful functional studies of CFTR, and he was a source of inspiration and knowledge," she says. "It's a pity he didn't live to see it. We dedicate this study to him."

Credit: 
Rockefeller University

Scientists identify new virus-killing protein

A new protein called KHNYN has been identified as a missing piece in a natural antiviral system that kills viruses by targeting a specific pattern in viral genomes, according to new findings published today in eLife.

Studying the body's natural defenses to viruses and how viruses evolve to evade them is crucial to developing new vaccines, drugs and anticancer treatments.

The genetic information that makes up the genomes of many viruses is composed of building blocks called RNA nucleotides. Recently, it was discovered that a protein called ZAP binds to a specific sequence of RNA nucleotides: a cytosine followed by a guanosine, or CpG for short.
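To make the CpG pattern concrete, here is a toy sketch (our illustration, not part of the study) that counts CpG dinucleotides, a cytosine immediately followed by a guanosine, in an RNA sequence:

```python
# Count CpG dinucleotides in an RNA sequence.
# The example sequence below is hypothetical and used only for illustration.
def count_cpg(rna: str) -> int:
    rna = rna.upper().replace("T", "U")  # tolerate DNA-style input
    return sum(1 for i in range(len(rna) - 1) if rna[i:i + 2] == "CG")

example = "AUGCGACGGUUCGAACCG"   # made-up fragment
print(count_cpg(example))        # -> 4
```

In these terms, ZAP targets genomes rich in such CG pairs, which is why viruses with few CpGs tend to escape it.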

The human immunodeficiency virus (HIV) normally escapes being inhibited by ZAP because it has evolved to have few CpGs in its genome. However, when CpGs are added back to the virus, ZAP promotes its destruction. This helps us understand why HIV with more CpGs multiplies less successfully, and likely explains why many strains of HIV have evolved to have few CpGs. But a mystery remained because ZAP is unable to break down the viral RNA by itself.

"As ZAP can't degrade RNA on its own, we believed that it must recruit other proteins to the viral RNA to destroy it," says lead author Mattia Ficarelli, a PhD student in Chad Swanson's Lab, Department of Infectious Diseases, King's College London. "So, in the current study, we set out to identify new human proteins that are essential for ZAP to target viral RNAs for destruction."

After discovering that KHNYN interacts with ZAP, the team tested what happens when they increased the amount of KHNYN produced in cells infected with a typical HIV that has few CpGs, or an HIV genetically engineered to have many CpGs. Increasing KHNYN production in the cells reduced the typical HIV's ability to multiply about five-fold and decreased the ability of the CpG-enriched HIV to multiply by about 400-fold.

To figure out if KHNYN and ZAP work together, the team repeated the same experiments in cells without ZAP and found that KHNYN did not inhibit the ability of CpG-enriched HIV to multiply. They then looked at what happened in cells genetically engineered to lack KHNYN, and found that both CpG-enriched HIV and a mouse leukemia virus that has many CpGs were no longer inhibited by ZAP.

"We have identified that KHNYN is required for ZAP to prevent HIV from multiplying when it is enriched for CpGs," explains co-corresponding author Professor Stuart Neil, Department of Infectious Diseases, King's College London. He adds that KHNYN is likely an enzyme that cuts up the viral RNA that ZAP binds to.

"An interesting potential application of this work is to make new vaccines or treat cancer," adds senior author and lecturer Chad Swanson, from the same department. "Since some cancer cells have low levels of ZAP, it may be possible to develop CpG-enriched, cancer-killing viruses that would not harm healthy cells. But much more research is necessary to learn more about how ZAP and KHNYN recognise and destroy viral RNA before we can move on to explore such applications."

Credit: 
eLife