
Drug-resistant hospital bacteria persist even after deep cleaning

Scientists have used genome sequencing to reveal the extent to which a drug-resistant gastrointestinal bacterium can spread within a hospital, highlighting the challenge hospitals face in controlling infections.

Enterococcus faecium is a bacterium commonly found in the gastrointestinal tract, where it usually resides without causing the host problems. However, in immunocompromised patients, it can lead to potentially life-threatening infection.

Over the last three decades, strains have emerged that are resistant to frontline antibiotics including ampicillin and vancomycin, limiting treatment options. Particularly worrying, these resistant strains are often the ones found in hospital-acquired E. faecium infections.

A team of scientists at the University of Cambridge and the London School of Hygiene and Tropical Medicine has pioneered an approach combining epidemiological and genomic information to chart the spread of bacteria within healthcare settings. This has helped hospitals identify sources of infection and inform infection control measures.

In a study published today in Nature Microbiology, the team has applied this technique to the spread of drug-resistant E. faecium in a hospital setting.

Dr Theodore Gouliouris from the Department of Medicine at the University of Cambridge, and joint first author on the study, said: "We've known for over two decades that patients in hospital can catch and spread drug-resistant E. faecium. Preventing its spread requires us to understand where the bacteria lives - its 'reservoirs' - and how it is transmitted.

"Most studies to date have relied on culturing the bacteria from samples. But as we've shown, whole genome sequencing - looking at the DNA of the bacteria - combined with detailed patient and environmental sampling can be a powerful tool to help us chart its spread and inform ways to prevent further outbreaks."

The team followed 149 haematology patients admitted to Addenbrooke's Hospital, Cambridge University Hospitals NHS Foundation Trust, over a six-month period. They took stool samples from the patients and swabs from the hospital environment and cultured them for E. faecium.

Genomic analysis was much more effective at identifying hospital-acquired E. faecium: of the 101 patients who could be followed up, it showed that two thirds acquired the bacterium in hospital, compared with less than half identified by culture methods alone.

Just under half (48%) of the swabs taken from the hospital environment were positive for vancomycin-resistant E. faecium. This included 36% of medical devices, 76% of non-touch areas such as air vents, 41% of bed spaces and 68% of communal bathrooms tested.

The researchers showed that even deep cleaning could not eradicate the bacteria. The hospital undertook deep cleaning on one ward over a three-day period during the study, when patients were moved elsewhere; however, when the team sampled locations prior to patients returning to the ward, they found that 9% of samples still tested positive for the bacteria. Within three days of patients returning to the ward, around half of the sampled sites tested positive.

Three-quarters (74%) of the patients (111/149) were carriers of the A1 clade - a multi-drug resistant strain of E. faecium commonly seen in hospitals that is resistant to the antibiotic ampicillin and which frequently acquires resistance to vancomycin. Of these 111 patients, 67 had strong epidemiological and genomic links with at least one other patient and/or their direct environment.

"The fact that these cases were all linked to another patient or their environment suggests strongly that they had picked up the multi-drug resistant bacteria while in the hospital," said Dr Francesc Coll from the London School of Hygiene and Tropical Medicine, joint first author.

Further genomic analysis showed that this multi-drug resistant strain comprised several subtypes, defined by how genetically similar they were. However, it was not uncommon for a patient to be carrying more than one subtype, which - without detailed genomic analysis - could confound attempts to identify the route of transmission of an infection. Notably, despite the circulation of as many as 115 subtypes, 28% of E. faecium acquisitions were caused by just two superspreading subtypes. The authors found no evidence of resistance or tolerance to common disinfectants that might explain the success of these subtypes.

Six study patients contracted an 'invasive infection', meaning that they had been carrying E. faecium asymptomatically in their gut but subsequently developed a symptomatic infection. By comparing the genomes of the infecting and gut strains, the authors determined that these invasive E. faecium infections originated from the patients' own gut.

"Our study builds on previous observations that drug-resistant strains of E. faecium can persist in the hospital environment despite standard cleaning - we were still surprised to find how short-lasting was the effect of deep cleaning," added Dr Gouliouris.

"We found high levels of hospital-adapted E. faecium despite the use of cleaning products and procedures that have proven effective against the bug. It highlights how challenging it can be to tackle outbreaks in hospitals."

Senior author Professor Sharon Peacock from the Department of Medicine at the University of Cambridge added: "The high rates of infection with drug-resistant E. faecium in specific vulnerable patient groups and its ability to evade cleaning measures pose an important challenge to infection control. Patient screening, adequate provision of isolation and ensuite toilet facilities, improved and more frequent cleaning procedures, and stricter health-care worker hygiene practices will all be needed to curtail this global epidemic.

"But this is also a sign of how urgently we need to tackle inappropriate use of antibiotics worldwide, which is widely recognised as posing a catastrophic threat to our health and our ability to control infections."

Credit: 
University of Cambridge

T-Cells from recovered COVID-19 patients show promise to protect vulnerable patients from infection

T-cells taken from the blood of people who recovered from a COVID-19 infection can be successfully multiplied in the lab and maintain the ability to effectively target proteins that are key to the virus's function, according to a new study published Oct. 26 in Blood.

"We found that many people who recover from COVID-19 have T-cells that recognize and target viral proteins of SARS-CoV-2, giving them immunity from the virus because those T-cells are primed to fight it," says Michael Keller, M.D., a pediatric immunology specialist at Children's National Hospital, who led the study. "This suggests that adoptive immunotherapy using convalescent T-cells to target these regions of the virus may be an effective way to protect vulnerable people, especially those with compromised immune systems due to cancer therapy or transplantation."

Based on evidence from previous phase 1 clinical trials using virus-targeting T-cells "trained" to target viruses such as Epstein-Barr virus, the researchers in the Cellular Therapy Program at Children's National hypothesized that the expanded group of COVID-19 virus-targeting T-cells could be infused into immunocompromised patients, helping them build an immune response before exposure to the virus and therefore protecting the patient from a serious or life-threatening infection.

"We know that patients who have immune deficiencies as a result of pre-existing conditions or following bone marrow or solid organ transplant are extremely vulnerable to viruses like SARS-CoV-2," says Catherine Bollard, M.D., M.B.Ch.B., senior author of the study and director of the novel cell therapies program and the Center for Cancer and Immunology Research at Children's National. "We've seen that these patients are unable to easily clear the virus on their own, and that can prevent or delay needed treatments to fight cancer or other diseases. This approach could serve as a viable option to protect or treat them, especially since their underlying conditions may make vaccines for SARS-CoV-2 unsafe or ineffective."

The T-cells were predominantly grown from the peripheral blood of donors who were seropositive for SARS-CoV-2. The study also identified that SARS-CoV-2 directed T-cells have adapted to predominantly target specific parts of the viral proteins found on the cell membrane, revealing new ways that the immune system responds to COVID-19 infection.

Current vaccine research focuses on specific proteins found mainly on the "spikes" of the coronavirus SARS-CoV-2. The finding that T-cells are successfully targeting a membrane protein instead may add another avenue for vaccine developers to explore when creating new therapeutics to protect against the virus.

"This work provides a powerful example of how both scientific advances and collaborative relationships developed in response to a particular challenge can have broad and unexpected impacts on other areas of human health," says Brad Jones, Ph.D., an associate professor of immunology in medicine in the Division of Infectious Diseases at Weill Cornell Medicine and co-author on the study, whose lab focuses on HIV cure research. "I began working with Dr. Bollard's team several years ago out of our shared interest in translating her T-cell therapy approaches to HIV. This put us in a position to quickly team up to help develop the approach for COVID-19."

Credit: 
Children's National Hospital

3D printing the first ever biomimetic tongue surface

image: The 3D printed negative mould showing holes for the filiform and fungiform papillae

Image: 
University of Leeds

Scientists have created synthetic soft surfaces with tongue-like textures for the first time using 3D printing, opening new possibilities for testing oral processing properties of food, nutritional technologies, pharmaceutics and dry mouth therapies.

UK scientists led by the University of Leeds in collaboration with the University of Edinburgh have replicated the highly sophisticated surface design of a human tongue and demonstrated that their printed synthetic silicone structure mimics the topology, elasticity and wettability of the tongue's surface.

These factors are instrumental to how food or saliva interacts with the tongue, which in turn can affect mouthfeel, swallowing, speech, nutritional intake and quality of life.

A biomimetic tongue will help developers screen newly designed products and accelerate development without the need for early-stage human trials, which are often very expensive and time consuming.

In particular, since the onset of the COVID-19 pandemic, social distancing has made such sensory trials and consumer tests significantly harder to carry out. A biomimetic tongue could be immensely helpful in increasing development productivity and reducing manufacturers' reliance on human trials in the early stages.

A biomimetic tongue could also offer myriad applications in the fight against adulteration of food and other orally administered products where textural attributes are the governing features, potentially averting huge economic losses.

The complex nature of the tongue's biological surface has posed challenges in artificial replication, adding major obstacles to the development and screening of effective long-lasting treatments or therapies for dry mouth syndrome -- roughly 10% of the general population and 30% of older people suffer from dry mouth.

Study lead author, Dr Efren Andablo-Reyes conducted this research while a postdoctoral fellow in the School of Food Science and Nutrition at Leeds. He said: "Recreating the surface of an average human tongue comes with unique architectural challenges. Hundreds of small bud-like structures called papillae give the tongue its characteristic rough texture, which in combination with the soft nature of the tissue creates a complicated landscape from a mechanical perspective.

"We focused our attention on the anterior dorsal section of the tongue where some of these papillae contain taste receptors, while many of them lack such receptors. Both kinds of papillae play a critical role in providing the right mechanical friction to aid food processing in the mouth with the adequate amount saliva, providing pleasurable mouthfeel perception and proper lubrication for swallowing.

"We aimed to replicate these mechanically relevant characteristics of the human tongue in a surface that is easy to use in the lab to replicate oral processing conditions."

The study that brought together unique expertise in food colloid science, soft matter physics, dentistry, mechanical engineering and computer science is published today in the journal ACS Applied Materials & Interfaces.

The team took silicone impressions of tongue surfaces from fifteen adults. The impressions were 3D optically scanned to map papillae dimensions, density and the average roughness of the tongues. The papillae were found to be distributed across the tongue surface in an essentially random layout.

The team used computer simulations and mathematical modelling to create a 3D-printed artificial surface that functions as a mould, containing wells with the shapes and dimensions of the different papillae distributed randomly across the surface at the right density. This was then replica-moulded against elastomers of optimised softness and wettability.
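
To make the idea of a randomized papillae layout concrete, the following is a minimal sketch of how two papilla types could be placed at random on a patch while rejecting overlaps. The sizes, counts and patch dimensions are invented for illustration; this is not the published design pipeline.

```python
# Toy sketch of a randomized papillae layout: place two papilla types at
# assumed densities and reject any position that overlaps an earlier one.
import random

random.seed(1)
SURFACE_MM = 40.0                       # square patch side length (assumed)
papilla_types = [                       # (name, radius in mm, count) -- illustrative values
    ("fungiform", 0.4, 30),
    ("filiform", 0.15, 600),
]

placed = []                             # list of (x, y, radius)
for name, radius, count in papilla_types:
    placed_of_type = 0
    while placed_of_type < count:
        x, y = random.uniform(0, SURFACE_MM), random.uniform(0, SURFACE_MM)
        # Accept only positions that do not overlap an already-placed papilla.
        if all((x - px) ** 2 + (y - py) ** 2 >= (radius + pr) ** 2
               for px, py, pr in placed):
            placed.append((x, y, radius))
            placed_of_type += 1

print(f"placed {len(placed)} papillae on a {SURFACE_MM} x {SURFACE_MM} mm patch")
```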

University of Edinburgh co-author, Rik Sarkar of the School of Informatics said: "The randomness in distribution of papillae appears to play an important sensory role for the tongue.

"We defined a new concept called collision probability to measure mechanosensing that will have large impact in this area. In the future, we will use a combination of machine learning and computational topology to create tongue models of diverse healthy and diseased individuals to address various oral conditions."

The artificial surface was then 3D printed using digital light processing technology based in the School of Mechanical Engineering at Leeds.

The team ran a series of experiments using different complex fluids to ensure that the printed surface's wettability - how a liquid keeps contact and spreads across a surface - and the lubrication performance was the same as the human tongue impressions.

Co-author Dr Michael Bryant from the School of Mechanical Engineering at Leeds said: "The application of bio-tribological principles, the study of friction and lubrication, in the creation of this tongue-like surface is a significant step forward in this field.

"The ability to produce accurate replicas of tongue surfaces with similar structure and mechanical properties will help streamline research and development for oral care, food products and therapeutic technologies."

Principal Investigator Anwesha Sarkar, Professor of Colloids and Surfaces at Leeds, said: "Accurately mapping and replicating the tongue's surface, and combining that with a material that approximates the elasticity of the human tongue, was no small task. Harnessing expertise from multiple STEM disciplines, we've demonstrated the unprecedented capability of a 3D printed silicone surface to mimic the mechanical performance of the human tongue.

"We believe that fabricating a synthetic surface with relevant properties that mimics the intricate architectural features, and more importantly the lubricating performance of the human tongue is paramount to gaining quantitative understanding of how fluids interact within the oral cavity.

"This biomimetic tongue surface could also serve as a unique mechanical tool to help detect counterfeit in food and high-valued beverages based on textural attributes, which is a global concern and can help to ensure food safety.

"Ultimately, our hope is that the surface we have designed can be important in understanding how the biomechanics of the tongue underpin the fundamentals of human feeding and speech.

Credit: 
University of Leeds

CRISPR screen identifies genes, drug targets to protect against SARS-CoV-2 infection

image: Graphical abstract of the Cell study

Image: 
Sanjana Lab of New York Genome Center/New York University

NEW YORK, NY (October 26, 2020) - To identify new potential therapeutic targets for SARS-CoV-2, a team of scientists at the New York Genome Center, New York University, and the Icahn School of Medicine at Mount Sinai performed a genome-scale, loss-of-function CRISPR screen to systematically knock out all genes in the human genome. The team examined which genetic modifications made human lung cells more resistant to SARS-CoV-2 infection. Their findings revealed individual genes and gene regulatory networks in the human genome that are required by SARS-CoV-2 and that confer resistance to viral infection when suppressed. The collaborative study described a wide array of genes that have not previously been considered as therapeutic targets for SARS-CoV-2. Their study was published online by Cell on October 24.

In order to better understand the complex relationships between host and virus genetic dependencies, the team used a broad range of analytical and experimental methods to validate their results. This integrative approach included genome editing, single-cell sequencing, confocal imaging, and computational analyses of gene expression and proteomic datasets. The researchers found that these new gene targets, when inhibited using small molecules (drugs), significantly reduced viral load - with some drugs, by up to 1,000-fold. Their findings offer insight into novel therapies that may be effective in treating COVID-19 and reveal the underlying molecular targets of those therapies.
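
As a rough illustration of how hits emerge from a survival-based knockout screen of this kind, the sketch below scores genes by how strongly their guides are enriched in cells that survive virus challenge. The read counts, the non-targeting control and the simple log2 scoring are invented assumptions for illustration, not the statistical pipeline used in the study; only the gene names ACE2 and RAB7A are taken from the article.

```python
# Conceptual sketch: guides enriched among surviving (resistant) cells point to
# host genes the virus depends on. All counts below are made up.
import math
from statistics import median
from collections import defaultdict

# (gene, guide, reads in unchallenged control cells, reads in surviving cells)
sgrna_counts = [
    ("ACE2",          "g1", 100, 900), ("ACE2",          "g2", 110, 880),
    ("RAB7A",         "g1", 120, 600), ("RAB7A",         "g2", 105, 540),
    ("NON_TARGETING", "g1", 500, 420), ("NON_TARGETING", "g2", 480, 400),
]

ctrl_total = sum(ctrl for _, _, ctrl, _ in sgrna_counts)
surv_total = sum(surv for _, _, _, surv in sgrna_counts)

gene_scores = defaultdict(list)
for gene, _guide, ctrl, surv in sgrna_counts:
    # log2 fold-change of normalized guide abundance, with a pseudocount of 1
    lfc = math.log2(((surv + 1) / surv_total) / ((ctrl + 1) / ctrl_total))
    gene_scores[gene].append(lfc)

# Rank genes by the median enrichment of their guides in the surviving cells.
for gene, lfcs in sorted(gene_scores.items(), key=lambda kv: median(kv[1]), reverse=True):
    print(f"{gene}: median log2 enrichment = {median(lfcs):.2f}")
```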

"Seeing the tragic impact of COVID-19 here in New York and across the world, we felt that we could use the high-throughput CRISPR gene editing tools that we have applied to other diseases to understand what are the key human genes required by the SARS-CoV-2 virus," said the study's co-senior author, Dr. Neville Sanjana, Core Faculty Member at the New York Genome Center, Assistant Professor of Biology, New York University, and Assistant Professor of Neuroscience and Physiology at NYU Grossman School of Medicine. Previously, Dr. Sanjana has applied genome-wide CRISPR screens to identify the genetic drivers of diverse diseases, including drug resistance in melanoma, immunotherapy failure, lung cancer metastasis, innate immunity, inborn metabolic disorders, and muscular dystrophy.

For this project, genome editing was only one-half of the equation. "We previously developed a series of human cell models for coronavirus infection in our work to understand immune responses to the virus. It was great to team up with Neville's group to understand and comprehensively profile host genes from a new angle," said co-senior author Dr. Benjamin tenOever, Fishberg Professor of Medicine, Icahn Scholar and Professor of Microbiology, Icahn School of Medicine at Mount Sinai.

Gene clusters lead the way

The team discovered that the top-ranked genes -- those whose loss reduces viral infection substantially -- clustered into a handful of protein complexes, including vacuolar ATPases, Retromer, Commander, Arp2/3, and PI3K. Many of these protein complexes are involved in trafficking proteins to and from the cell membrane.

"We were very pleased to see multiple genes within the same family as top-ranked hits in our genome-wide screen. This gave us a high degree of confidence that these protein families were crucial to the virus lifecycle, either for getting into human cells or successful viral replication," said Dr. Zharko Daniloski, a postdoctoral fellow in the Sanjana Lab and co-first author of the study.

While researchers performed the CRISPR screen using human lung cells, the team also explored whether the expression of required host genes was lung-specific or more broadly expressed. Among the top-ranked genes, only ACE2, the receptor known to be responsible for binding the SARS-CoV-2 viral protein Spike, showed tissue-specific expression; the rest of the top gene hits were broadly expressed across many tissues, suggesting that these mechanisms may function independently of cell or tissue type. Using proteomic data, they found that several of the top-ranked host genes directly interact with the virus's own proteins, highlighting their central role in the viral lifecycle. The team also analyzed common host genes required for other viral pathogens, such as Zika or H1N1 pandemic influenza.

Mechanistic insights: Cholesterol and viral receptors

After completing the primary screen, the group of researchers used several different techniques to validate the role of many of the top-ranked genes in viral infection. Using human cell lines derived from the lung and other organs susceptible to SARS-CoV-2 infection, they measured viral infection after gene knockout by CRISPR, gene suppression using RNA interference, or drug inhibition. After validating that these manipulations reduced viral infection, they next sought to understand the mechanisms by which loss of these genes blocks coronavirus infection.

Using a recently-developed technology that couples large-scale CRISPR editing with single-cell RNA-sequencing (ECCITE-seq), the team identified that loss of several top-ranked genes results in upregulation of cholesterol biosynthesis pathways and an increase in cellular cholesterol. Using this insight, they studied the effects of amlodipine, a drug that alters cholesterol levels.

"We found that amlodipine, a calcium-channel antagonist, upregulates cellular cholesterol levels and blocks SARS-CoV-2 infection. Since recent clinical studies have also suggested that patients taking calcium-channel blockers have a reduced COVID-19 case fatality rate, an important future research direction will be to further illuminate the relationship between cholesterol synthesis pathways and SARS-CoV-2," said Dr. Tristan Jordan, a postdoctoral fellow in the tenOever Lab and co-first author of the study.

Building on previous work on mutations in the Spike protein and viral entry through the ACE2 receptor, the research team also asked whether loss of some genes might confer resistance to the coronavirus by lowering ACE2 levels. They identified one gene in particular, RAB7A, that has a large impact on ACE2 trafficking to the cell membrane. Using a combination of flow cytometry and confocal microscopy, the team showed that RAB7A loss prevents viral entry by sequestering ACE2 receptors inside cells.

"Current treatments for SARS-CoV-2 infection currently go after the virus itself, but this study offers a better understanding of how host genes influence viral entry and will enable new avenues for therapeutic discovery and hopefully accelerate recovery for susceptible populations," said Dr. Sanjana.

Credit: 
New York Genome Center

Cancer anti-sickness drug offers hope for hallucinations in Parkinson's

Monday 26 October 2020 - A world-first double-blind clinical trial will investigate whether a powerful drug used to treat nausea in chemotherapy patients could alleviate hallucinations in people with Parkinson's.

Parkinson's UK, the largest charitable funder of Parkinson's research in Europe, is partnering with UCL, and investing £1 million in a pioneering phase II clinical trial to explore if the drug ondansetron is safe and effective against hallucinations. There are currently 145,000 people living with Parkinson's in the UK and 75 per cent of them will experience visual hallucinations at some point.

The trial comes at a crucial time as a survey carried out by the charity worryingly found that 1 in 10 people with Parkinson's reported an increase in hallucinations during lockdown, which led to an increase in calls to their helpline.¹

The funding for this four-year project comes via the charity's drug development arm, the Parkinson's Virtual Biotech. Launched in 2017, the innovative programme is plugging the funding gap to fast-track the projects with the greatest scientific potential to transform the lives of people with Parkinson's. This is the second clinical trial to be added to the Virtual Biotech's pipeline of projects in a drive to develop better drug treatments for people with Parkinson's.

Dr Arthur Roach, Director of Research at Parkinson's UK said:

"It's vital we find better treatments for people with Parkinson's who have seen their hallucinations worsen at home and ondansetron offers much hope for them and their families. If successful, positive results from the trial could see this drug, which is already used in the NHS, quickly repurposed to become an available treatment in Parkinson's.

"With the support of Parkinson's UK, UCL has been rapidly adapting the research during the pandemic, to enable us to drive forward and launch this promising trial, which marks another milestone in our thriving Parkinson's Virtual Biotech programme."

Hallucinations are one of the most challenging symptoms to treat in people with Parkinson's, and they can have a big effect on quality of life. They can be extremely distressing for carers as well as for people with Parkinson's, putting stress on relationships. The only medications available to treat them today are anti-psychotic drugs, which can worsen Parkinson's symptoms and potentially cause serious side effects. This urgent need for new treatments is why hallucinations are a top priority for the Parkinson's Virtual Biotech.

The 12-week, double-blind, placebo-controlled trial is set to recruit 216 people over 2 years in 20-25 NHS clinics across the UK. Patients will be randomised to receive either drug or placebo tablets, to take at home for 12 weeks. To accommodate social distancing, researchers will conduct the majority of the study via video or telephone consultations, with face-to-face assessments limited to only three for essential blood tests or ECGs. Visual and other types of hallucinations, as well as delusions (false beliefs), will be assessed after 6 and 12 weeks of treatment, along with Parkinson's related motor and non-motor symptoms.

Lead Researcher, Suzanne Reeves, Professor of Old Age Psychiatry and Psychopharmacology at UCL said:

"Visual hallucinations pose a particular challenge in Parkinson's as the very treatments for motor symptoms in Parkinson's can also trigger and worsen this distressing symptom. Finding treatments for hallucinations that are both effective and safe is an area of great unmet need.

"Ondansetron influences visual processing in the brain and its potential for treating visual hallucinations in Parkinson's was first identified in small studies in the early 1990s.

"This trial will enable us to find out if ondansetron is effective and safe as a treatment and if it is, we could see clinicians prescribing an inexpensive drug with fewer side effects to people with Parkinson's throughout the UK."

Parkinson's UK has their own in-house experts, and partners with clinical and academic institutions, pharmaceutical companies and biotechs worldwide. To partner with Parkinson's UK to drive forward potential new treatments, visit http://www.parkinsonsvirtualbiotech.co.uk

Credit: 
Parkinson's UK

COVID-19 containment shaped by strength, duration of natural, vaccine-induced immunity

"Much of the discussion so far related to the future trajectory of COVID-19 has rightly been focused on the effects of seasonality and non-pharmaceutical interventions [NPIs], such as mask-wearing and physical distancing," said co-first author Chadi Saad-Roy, a Ph.D. candidate in Princeton's Lewis-Sigler Institute for Integrative Genomics. "In the short term, and during the pandemic phase, NPIs are the key determinant of case burdens. However, the role of immunity will become increasingly important as we look into the future."

"Ultimately, we don't know what the strength or duration of natural immunity to SARS-CoV-2 -- or a potential vaccine -- will look like," explained co-first author Caroline Wagner, an Assistant Professor of Bioengineering at McGill University who worked on the study as a postdoctoral research associate in the Princeton Environmental Institute (PEI).

"For instance, if reinfection is possible, what does a person's immune response to their previous infection do?" Wagner asked. "Is that immune response capable of stopping you from transmitting the infection to others? These will all impact the dynamics of future outbreaks."

The current study builds on previous research by the same team, published in Science on May 18, which reported that local variations in climate are unlikely to dominate the first wave of the COVID-19 pandemic.

In the most recent paper, the researchers used a simple model to project the future incidence of COVID-19 cases -- and the degree of immunity in the human population -- under a range of assumptions related to how likely individuals are to transmit the virus in different contexts. For example, the model allows for different durations of immunity after infection, as well as different extents of protection from reinfection. The researchers posted an interactive version of the model's predictions under these different sets of assumptions online.
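
The sketch below shows one simple way to encode waning, partially protective immunity in a susceptible-infected-recovered-susceptible style model; the structure and every parameter value are illustrative assumptions for this article, not the authors' published model.

```python
# Minimal SIRS-style sketch: immunity wanes back to susceptibility at rate
# delta, and previously infected people are reinfected at a reduced rate
# epsilon * beta. All parameter values are assumed for illustration.
import numpy as np

def simulate(beta=0.5, gamma=0.2, delta=1/365, epsilon=0.5, days=3*365, dt=0.1):
    """Euler-integrate the model and return (time, infectious fraction) pairs."""
    S, I, R = 0.999, 0.001, 0.0            # fractions of the population
    t, trajectory = 0.0, []
    while t < days:
        new_inf_S = beta * S * I            # first infections
        new_inf_R = epsilon * beta * R * I  # reinfections of partially protected people
        dS = -new_inf_S + delta * R         # immunity wanes R -> S at rate delta
        dI = new_inf_S + new_inf_R - gamma * I
        dR = gamma * I - delta * R - new_inf_R
        S, I, R = S + dS * dt, I + dI * dt, R + dR * dt
        trajectory.append((t, I))
        t += dt
    return np.array(trajectory)

# Shorter-lived immunity leaves the first peak almost unchanged but makes
# later outbreaks larger -- the qualitative pattern described in the article.
for half_life_days in (90, 365, 730):
    traj = simulate(delta=np.log(2) / half_life_days)
    later = traj[traj[:, 0] > 365]
    print(f"immunity half-life {half_life_days:3d} d: "
          f"first peak {traj[:, 1].max():.3f}, "
          f"largest later peak {later[:, 1].max():.3f}")
```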

As expected, the model found that the initial pandemic peak is largely independent of immunity because most people are susceptible. However, a substantial range of epidemic patterns are possible as SARS-CoV-2 infection -- and thus immunity -- increases in the population.

"If immune responses are only weak, or transiently protective against reinfection, for example, then larger and more frequent outbreaks can be expected in the medium term," said co-author Andrea Graham, professor of ecology and evolutionary biology at Princeton.

The nature of the immune responses also can affect clinical outcomes and the burden of severe cases requiring hospitalization, the researchers found. The key question is the severity of subsequent infections in comparison to primary ones.

Importantly, the study found that in all scenarios a vaccine capable of eliciting a strong immune response could substantially reduce future caseloads. Even a vaccine that only offers partial protection against secondary transmission could generate major benefits if widely deployed, the researchers reported.

Factors such as age and superspreading events are known to influence the spread of SARS-CoV-2 by causing individuals within a population to experience different immune responses or transmit the virus at different rates. The study found that these factors do not affect the qualitative projections about future epidemic dynamics. However, the researchers note that as vaccine candidates emerge and more detailed predictions of future caseloads with vaccination are needed, these additional details will need to be incorporated into more complex models.

The study authors also explored the effect of "vaccine hesitancy" on future infection dynamics. Their model found that people who decline to take part in pharmaceutical and non-pharmaceutical measures to contain the coronavirus could slow containment of the virus even once a vaccine is available.

"Our model indicates that if vaccine refusal is high and correlated with increased transmission and riskier behavior such as refusing to wear a mask, then the necessary vaccination rate needed to reach herd immunity could be much higher," said co-author Simon Levin, the James S. McDonnell Distinguished University Professor in Ecology and Evolutionary Biology and an associated faculty member in PEI. "In this case, the nature of the immune response after infection or vaccination would be very important factors in determining how effective a vaccine would be."

One of the main takeaways of the study is that monitoring population-level immunity to SARS-CoV-2, in addition to active infections, will be critical for accurately predicting future incidence.

"This is not an easy thing to do accurately, particularly when the nature of this immune response is not well understood," said co-author Michael Mina, an assistant professor at the Harvard School of Public Health and Harvard Medical School. "Even if we can measure a clinical quantity like an antibody titer against this virus, we don't necessarily know what that means in terms of protection."

Studying the effects of T-cell immunity and cross-protection from other coronaviruses is an important avenue for future work.

Credit: 
McGill University

Surprised researchers: Number of leopards in northern China on the rise

Leopards are fascinating animals. In addition to being sublime hunters that will eat nearly anything and can survive in varied habitats from forests to deserts, they are able to withstand temperatures ranging from minus 40 degrees Celsius during winter to plus 40 degrees in summer.

Despite their resilience, most leopard subspecies are endangered. Poaching and the clearing of forest habitat for human activities are among the reasons for their global decline.

But in northern China -- and specifically on the Loess Plateau -- something fantastic is occurring.

Numbers of a leopard subspecies called the North Chinese leopard have increased, according to a new study conducted by researchers from the University of Copenhagen and their colleagues in Beijing.

"We were quite surprised that the number of leopards has increased, because their populations are declining in many other places. We knew that there were leopards in this area, but we had no idea how many," says Bing Xie, a PhD student at UCPH's Department of Biology and one of the researchers behind the study.

Together with researchers at Beijing Normal University, she covered 800 square kilometers of the Loess Plateau between 2016 and 2017.

The just-completed count reports that the number of leopards increased from 88 in 2016 to 110 in 2017 -- a 25 percent increase. The researchers suspect that their numbers have continued to increase in the years since.

This is the first time that the status of a local population of North Chinese leopards has been estimated.

Five-year reforestation plan has worked

This spotted golden giant's rebound likely reflects the 13th five-year plan that the Chinese government, in consultation with a range of scientific researchers, implemented in 2015 to restore biodiversity in the area.

"About 20 years ago, much of the Loess Plateau's forest habitat was transformed into agricultural land. Human activity scared away wild boars, toads, frogs and deer -- making it impossible for leopards to find food. Now that much of the forest has been restored, prey have returned, along with the leopards," explains Bing Xie, adding:

"Many locals had no idea there were leopards in the area, so they were wildly enthused and surprised. And, it was a success for the government, which had hoped for greater biodiversity in the area. Suddenly, they could 'house' these big cats on a far greater scale than they had dreamed of."

Leopards are nearly invisible in nature

The research team deployed camera equipment to map how many leopards were in this area of northern China. But even though the footage captured more cats than expected on film, none of the researchers saw any of the big stealthy felines with their own eyes:

"Leopards are extremely shy of humans and sneak about silently. That's why it's not at all uncommon to study them for 10 years without physically observing one," she explains.

Even though Bing Xie has never seen leopards in the wild, she will continue to fight for their survival.

"That 98 percent of leopard habitat has been lost over the years makes me so sad. I have a great love for these gorgeous cats and I will continue to research on how best to protect them," she concludes.

Credit: 
University of Copenhagen

New lead screening method zooms in on highest-risk areas in Georgia

image: Screening index scores for low-level lead exposure in Georgia. Click here to see an interactive version of the map: http://scholarblogs.emory.edu/esaikawa/files/2020/10/GA_map.html

Image: 
Saikawa Lab, Emory University

While many people think of lead poisoning as a problem of the past, chronic exposure still occurs in some communities that may be missed in limited screening programs for children's blood lead levels. Now researchers at Emory University have developed a more precise screening index, illustrated with a map, which provides a fine-grain view of areas where children are most at risk for low-level lead exposure in the city of Atlanta and throughout the state of Georgia.

Scientific Reports published their new method, including analyses that tested and showed its efficacy, using historical data.

The new screening index is based on established risk factors for lead exposure, including poverty and housing built before 1950. The index pinpointed 18 highest-priority census tracts in metro Atlanta, encompassing 2,715 children under the age of six -- or 1.7 percent of all children that age in greater Atlanta.

These highest-priority areas include the historically black neighborhoods of English Avenue and Vine City, where Emory researchers had previously identified elevated levels of lead in the soil of some yards and vacant lots.

"As we move forward into an age when acute lead poisoning is rare, we need better tools to monitor for chronic, long-term exposure to lead," says Emory graduate Samantha Distler, first author of the paper. "We developed an interactive map that can be used by physicians and other health officials, and even by individuals who want to check their own children's risk levels. You can easily zoom in to find an exact location, so there's less guess work involved in assessing what is a high-risk area."

The method could be applied to any area in the United States, she adds.

Distler led the work as an Emory undergraduate majoring in quantitative sciences on the neuroscience and behavioral biology track. She is now a graduate student of epidemiology at the University of Michigan School of Public Health.

"Lead is a toxicant that is particularly dangerous to children and their developing brains," Distler says. "Even low blood lead levels are associated with neurological deficits in children."

"One of the biggest problems concerning lead is that many people don't know if their children are being exposed," says Eri Saikawa, senior author of the study and associate professor in Emory's Department of Environmental Sciences and Rollins School of Public Health. "Detecting lead exposure as early as possible is very important so preventative measures can be taken. The easiest way to do that is to screen the blood."

In 2018, the Saikawa lab collaborated with members of Atlanta's Historic Westside Gardens to test urban soil on Atlanta's Westside for contaminants. That project uncovered high levels of heavy metals and metalloids in some yards, and even some industrial waste known as slag. The project led to an investigation by the U.S. Environmental Protection Agency, which in 2019 began decontaminating properties in the area by removing and replacing soil.

In addition to neurological deficits, lead exposure is associated with immunological and endocrine effects and cardiovascular disease. Decades ago, federal regulations reduced lead in paint and gasoline and other common exposure sources. The resulting drop in children's blood lead levels in the United States is considered one of the greatest public health achievements in the country's history.

Many people remain unaware, however, that lead persists in the environment. "It can linger for a really long time in everything from soil to water," Distler says. "That puts some people at risk for chronic exposures to low levels over a long time."

The Centers for Disease Control and Prevention (CDC) estimates that at least four million households in the United States have children living in them who are being exposed to high levels of lead. And about half a million of those children aged one to five years have blood lead levels above five micrograms per deciliter, the level at which the CDC recommends initiating public health action.

Despite this alarming statistic, many children in higher-risk areas are not screened for blood lead levels. In Georgia, data from 2011 to 2018 show that the proportion of children tested in various ZIP code tabulation areas ranged from 1 percent to 67 percent, with a median of 13 percent.

The Emory researchers realized that one problem may be that health officials focus screening efforts on a county-wide basis, rather than zeroing in on the highest-risk neighborhoods within those counties.

In 2009, a team led by researchers at the CDC developed and published a priority screening index for Atlanta neighborhoods based on housing age and the percentage of residents enrolled in Georgia's Special Supplemental Nutrition Program for Women, Infants and Children (WIC), a proxy for poverty.

For the current paper, the Emory researchers built on the efforts of the 2009 paper, drilling down from neighborhoods to more precise U.S. Census Bureau tracts. Data from the American Community Survey was used to assess the relative level of poverty and proportion of homes built before 1950.

A priority screening index, ranging from two to eight, was applied to the census tracts. The areas of highest relative poverty and proportion of homes built before 1950 received the highest score. The researchers applied this index to census tracts across the state of Georgia and to the entire United States to identify tracts that consistently have the highest priority screening index values.
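
One plausible way to produce an index spanning two to eight from these two risk factors, shown purely as an assumption rather than the authors' actual scoring, is to score each tract from one to four on each factor by quartile and sum the two scores, as sketched below. The column names are illustrative, not a real American Community Survey schema.

```python
# Hypothetical two-factor priority screening index: quartile scores (1-4) for
# relative poverty and pre-1950 housing are summed, giving values from 2 to 8.
import pandas as pd

def screening_index(tracts: pd.DataFrame) -> pd.Series:
    """Score each census tract from 2 (lowest priority) to 8 (highest)."""
    poverty_score = pd.qcut(tracts["poverty_rate"], 4, labels=[1, 2, 3, 4]).astype(int)
    housing_score = pd.qcut(tracts["pre1950_housing"], 4, labels=[1, 2, 3, 4]).astype(int)
    return poverty_score + housing_score

# Toy example with four made-up tracts.
demo = pd.DataFrame({
    "tract": ["A", "B", "C", "D"],
    "poverty_rate": [0.05, 0.12, 0.28, 0.41],       # share of residents in poverty
    "pre1950_housing": [0.10, 0.35, 0.22, 0.60],    # share of homes built before 1950
}).set_index("tract")

print(screening_index(demo).sort_values(ascending=False))
```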

"The visualizations of our priority screening index that we've created using interactive maps can empower physicians and health officials to better target children at high risk for lead exposure," Distler says. "We hope our work will help lead to improved policies and actions to reach children who are most at risk for lead exposure and to improve their lives -- not just in Georgia but throughout the United States."

Credit: 
Emory Health Sciences

Modern computational tools may open a new era for fossil pollen research

image: Researcher Ingrid Romero looking for fossil pollen in a light microscope.

Image: 
Doug Peterson

One of the best sources of information on the evolution of terrestrial ecosystems and plant diversity over millions of years is fossil pollen. For palynologists--the scientists who study ancient pollen--a common challenge in the field is the identification of plant species based on these fossil grains. By integrating machine-learning technology with high-resolution imaging, a team from the Smithsonian Tropical Research Institute (STRI), the University of Illinois at Urbana-Champaign, the University of California, Irvine and collaborating institutions was able to advance this goal.

Pollen is resistant to disintegration, even when exposed to elevated temperatures or strong acids. This allows it to be preserved in sediments for millions of years, making it an invaluable record of how different groups of plants have evolved and of the environmental factors that have played a role in this evolution. To tap this record, however, researchers must first be able to identify what they are observing under the microscope.

To help improve the efficiency and accuracy of fossil pollen identifications, scientists developed and trained three machine-learning models to differentiate among several existing Amherstieae legume genera and tested them against fossil specimens from western Africa and northern South America dating back to the Paleocene (66-56 million years ago), Eocene (56-34 million years ago) and Miocene (23-5.3 million years ago).

The models classified existing pollen accurately over 80% of the time and also showed high consensus on the identification of fossil pollen specimens. These results support previous hypotheses suggesting that Amherstieae originated in Africa and later dispersed to South America, revealing an evolutionary history of nearly 65 million years.
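
As a hedged illustration of the consensus idea only, rather than the study's actual models or image features, the sketch below trains three generic classifiers on made-up feature vectors, then flags fossil specimens on which the classifiers fail to agree.

```python
# Illustrative consensus classification of fossil pollen using generic
# scikit-learn models on invented image-derived features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Fake training data: 300 modern grains, 64 texture/shape features, 5 genera (0-4).
X_modern = rng.normal(size=(300, 64)) + np.repeat(np.arange(5), 60)[:, None] * 0.5
y_modern = np.repeat(np.arange(5), 60)
# Fake fossil grains drawn from the same feature space.
X_fossil = rng.normal(size=(40, 64)) + rng.integers(0, 5, 40)[:, None] * 0.5

models = [RandomForestClassifier(random_state=0), KNeighborsClassifier(), SVC()]
for m in models:
    m.fit(X_modern, y_modern)

preds = np.stack([m.predict(X_fossil) for m in models])   # shape (3, n_fossils)
for i, votes in enumerate(preds.T):
    labels, counts = np.unique(votes, return_counts=True)
    if counts.max() >= 2:          # at least two of the three models agree
        print(f"fossil {i}: consensus genus {labels[counts.argmax()]}")
    else:
        print(f"fossil {i}: no consensus among the three models")
```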

"We do not know the biological affinity of the majority of types of deep-time fossil pollen," said STRI paleontologist Carlos Jaramillo, co-author of the study. "This study shows that with the right tools, we are able to taxonomically classify fossil pollen beyond what has been previously possible."

However, over a third of the fossil specimens did not show biological affinity with any existing genus, suggesting that part of this ancient diversity may have gone extinct at some point during the evolutionary process.

"These new tools reveal the vast amount of taxonomic information that pollen can offer and that has been hidden from researchers until now," said Ingrid Romero, doctoral student at the University of Illinois-Urbana Champaign and lead author of the study.

This new approach improves the taxonomic resolution of fossil pollen identifications and greatly enhances the use of pollen data in ecological and evolutionary research. It also narrows down the range of options for experts in fossil pollen identification, allowing them to save time and invest their energy on the most challenging specimens.

"Fossil pollen analysis is a very visual science and computer vision algorithms augment expert identifications and can assist palynologists with challenging visual classification problems," said Surangi Punyasena, associate professor at the University of Illinois-Urbana Champaign and senior author of the study. "The machine-learning models that our colleagues Shu Kong and Charless Fowlkes developed are truly remarkable. Our hope is that other researchers apply these techniques and that as a community we expand the ecological and evolutionary questions that are addressed by the fossil pollen record."

Credit: 
Smithsonian Tropical Research Institute

Localized vaccination surveillance could help prevent measles outbreaks

Access to more localized data on childhood vaccination coverage, such as at the school or neighborhood levels, could help better predict and prevent measles outbreaks in the United States, according to a new University of Michigan study.

The research also shows that when people who are not vaccinated are geographically clustered, the probability and size of an outbreak increases. The study is scheduled for publication Oct. 26 in the journal Proceedings of the National Academy of Sciences.

"We found that even at 99% overall vaccination coverage, clustering of non-vaccinators allowed outbreaks to occur. This means that we really must rethink whether or not herd immunity is meaningful when applied at large spatial scales like the state or country level," said lead author Nina Masters, a doctoral student in epidemiology at the U-M School of Public Health.

Because measles is a highly contagious disease, vaccination coverage of at least 95% is required to maintain herd immunity. But despite having reached that level nationally, the U.S. saw 1,282 cases of measles in 31 states last year--the most cases since 1992.

Masters and colleagues sought to present a simple, easy-to-understand model that shows what happens when people who are not vaccinated become clustered within neighborhoods and other small communities, even within a population that has achieved 95% vaccination coverage overall against measles.

They also analyzed how the aggregation of vaccination data to the levels it is typically reported at, such as cities and states, may inadvertently obscure important fine-scale clustering, making outbreaks appear less likely than they truly are in many cases.

Fine-scale vaccination data at the level the researchers would like to be able to analyze is not widely accessible or available in the U.S., so the researchers instead built a computational model representing a mid-sized city: a 16x16 square grid with 256 cells of 1,000 people each. The model had four levels: blocks (cells) of 1,000 people that approximate census block groups; tracts of 4,000 people (four cells, approximating census tracts); neighborhoods of 16,000 people (16 cells); and quadrants of 64,000 people (64 cells, approximating towns or districts).

Holding the overall vaccination rate constant at the typical herd immunity threshold of 95%, the researchers used an array of different clustering patterns to distribute the 5% of non-vaccinators across the environment, and examined the impact of different clustering motifs on outbreak size and frequency.

The researchers also looked at how aggregating the data up to these four coarser levels would impact predicted outbreak size. At the block level, across all the clustering scenarios, the average predicted outbreak size was 3,886 cases; however, as they 'zoomed out', the details of the outbreak were lost. Aggregating data to the tract level predicted 45% fewer cases, to the neighborhood level 76.5% fewer cases, and to the quadrant level 99% fewer.
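
The toy calculation below, a sketch under assumed numbers rather than the authors' transmission model, shows how re-reporting coverage at the coarser levels described above can hide a clustered pocket of non-vaccinators even when overall coverage is held at 95%.

```python
# A 16x16 grid of 1,000-person blocks with overall coverage fixed at 95%, but
# with 80% of the non-vaccinators packed into one 4x4 corner (assumed layout).
import numpy as np

people_per_cell = 1_000
grid = np.full((16, 16), float(people_per_cell))        # vaccinated count per block
unvaccinated_total = int(0.05 * 256 * people_per_cell)  # 12,800 of 256,000 people

clustered = int(0.8 * unvaccinated_total)
grid[:4, :4] -= clustered / 16                          # corner blocks drop to 36% coverage
mask = np.ones((16, 16), dtype=bool)
mask[:4, :4] = False
grid[mask] -= (unvaccinated_total - clustered) / mask.sum()   # spread the rest evenly

def min_coverage(block_counts, cells_per_side):
    """Lowest coverage seen when blocks are aggregated into square reporting units."""
    n = 16 // cells_per_side
    units = block_counts.reshape(n, cells_per_side, n, cells_per_side).sum(axis=(1, 3))
    return (units / (cells_per_side ** 2 * people_per_cell)).min()

for name, side in [("block", 1), ("tract", 2), ("neighborhood", 4), ("quadrant", 8)]:
    print(f"{name:12s} lowest reported coverage: {min_coverage(grid, side):.0%}")
print(f"{'overall':12s} coverage: {grid.sum() / (256 * people_per_cell):.0%}")
# The 36%-coverage pocket is visible at block, tract and neighborhood scale,
# but reads as roughly 83% at the quadrant level and 95% overall.
```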

"Holding everything else constant, we saw that the more you increase fine-scale clustering, the higher the outbreak potential," Masters said. "And then as you aggregate up, you lose a lot of the ability to predict those outbreaks. This shows that if we can't measure that fine-scale clustering, we don't know what landscape of susceptibility we're dealing with, and then can't effectively assess outbreak risk."

In the U.S., public health surveillance systems typically report vaccination coverage at the county and state level, obscuring fine-scale clustering and susceptibility to potential outbreaks.

For example, in Michigan, only 4.5% of kindergarteners statewide had vaccination waivers for the 2018-19 school year, keeping statewide coverage above the herd immunity threshold of 95%. However, that same year, a large measles outbreak occurred in Oakland County, where the waiver rate was about 7%. Within that county, school-district waiver rates ranged from 0 to about 23%, and two schools reported more than half of kindergarteners with vaccine waivers.

"Vaccination data is collected at the school level, and in an ideal world, it would be great to have that data available at the school level so local and state health departments can be aware of regions that are very susceptible to an outbreak and respond accordingly," Masters said, adding this is especially important now, as the number of children receiving vaccinations has decreased due to the pandemic.

"I think it's safe to say that the existing clusters of non-vaccinators have gotten worse, because more people who normally would get their vaccines are not getting vaccines right now for their children."

Masters said it's important for policymakers to think critically about how assumptions in science are made and how they should be applied at the population level. For example, while the critical vaccination fraction to reach herd immunity is often used as a tool for setting disease elimination targets--the World Health Organization uses a target of 95% vaccination against measles at the national level--this calculation assumes that populations are homogeneous: both in terms of vaccination distribution and human contact.

However, societies are actually very heterogeneous and humans are very clustered in terms of their vaccination behavior, which calls into question how useful thresholds like these really are, Masters said.

"Transmission happens on city blocks, not in countries," she said. "We have to think more about how these definitions make sense in terms of matching the scale (of the policies) to the scale of actual transmission."

The researchers said that globally, there were 791,143 suspected cases in 2019, compared to 484,077 in 2018, a 63% increase. In all, 187 out of 194 WHO member states reported measles cases in 2019. To meet global measles elimination goals and prevent a worldwide resurgence, finer-scale measles data should be used to better predict and prevent outbreaks.

Credit: 
University of Michigan

A molecular brake for root growth

image: Prof. Caroline Gutjahr in her laboratory.

Image: 
U. Benz / TUM

Roots are essential not only for reaching water and nutrients and for anchoring the plant in the ground, but also for interacting and communicating with microorganisms in the soil. A long root enables the plant to reach deeper, more humid layers of soil, for example during drought. A shallower root with many root hairs is good for phosphate uptake, as phosphate is mostly found in the upper soil layers.

Caroline Gutjahr, Professor of Plant Genetics at the TUM School of Life Sciences in Weihenstephan, and her team discovered new hormone interactions which influence the growth of plant roots.

Why some plant roots have long and others have short hairs

"We found that the protein SMAX1 acts as molecular break for ethylene production", says Caroline Gutjahr. Ethylene is a plant hormone that is considered to trigger or accelerate the ripening of many fruits and vegetables, but it can also trigger other processes in plants. If less of the gaseous hormone is produced by the plant, the plant is stimulated to grow long roots and short root hairs.

The suppressor "SMAX1" can be removed by activating the so-called karrikin signaling pathway, which is triggered by another hormone. This switches on the production of ethylene, resulting in short primary roots and elongated root hairs.

This is the first time that scientists have succeeded in identifying and understanding a molecular process that is switched on by the karrikin signaling pathway, and in showing a molecular mechanism by which this signaling pathway regulates a developmental process in plants.

Plant diversity is also reflected in molecular mechanisms

"Surprisingly this mechanism has a significant impact on the roots of the legume Lotus japonicus, the model plant for peas, beans and lentils, on which we conducted our research," says Gutjahr.

In contrast, the research team observed a much weaker influence in the roots of another model plant, Arabidopsis thaliana or thale cress, which is related to cabbage plants.

"This shows that the diversity of plants is not only reflected in their appearance, but also in the effect of their molecular triggers on growth," the researcher concludes.

The relevance of improving root growth for plant breeding

"If we understand more precisely how root growth is regulated at the molecular level and in coordination with environmental stimuli, we can cultivate crops that are better able to cope with unfavorable environmental conditions and thus produce yield even under stress," explains the scientist.

This is why her research group is now investigating how the identified hormone signaling pathways (karrikin and ethylene signaling) react to different environmental conditions. They hope to discover how these two signaling pathways collaborate with the sensors that allow plants to perceive various environmental influences to adjust root growth to benefit plant survival and yield.

Credit: 
Technical University of Munich (TUM)

State gun laws may help curb violence across state lines: study

Columbia University Mailman School of Public Health researchers find that strong state firearm laws are associated with fewer firearm homicides--both within the state where the laws are enacted and across state lines. Conversely, weak firearm laws in one state are linked to higher rates of homicides in neighboring states.

Results of the study appear in the journal Epidemiology, and are based on an analysis of county-level data on firearm homicides as they relate to state gun laws between 2000 and 2014.

Around 14,000 American civilians die and over 75,000 people are treated in emergency departments because of gun violence every year. This study adds evidence that the number and composition of a state's gun laws affect within-state homicide incidence. The associations are strongest for background checks compared with other gun laws, including child access prevention laws, dealer registration laws, and licensing laws. As well as examining within-state effects, the research is one of the few studies to explore the potential effects of gun laws on homicides in neighboring states, with the results emphasizing that the effects of firearm laws spill over state lines.

The study found that homicide incidence was greatest in counties with weak within-state laws and where the largest nearby population centers were in other states that also had weak laws. As an example, the researchers contrast New Hampshire and Alabama, which both had 10 gun laws in 2014. The most populous urban center near New Hampshire is Boston, which had 100 gun laws, whereas the major city nearest to Alabama is Atlanta, where there were 6 laws. The weak gun laws in Alabama and Georgia both contribute to higher homicide incidence in Alabama, but the stronger gun laws in Massachusetts temper the effect of the weak laws in New Hampshire. To explain these results, the researchers suggest it may be easier for guns to flow undetected into places where laws are already weak.

"Gun violence is a public health crisis in the United States," says first author Christopher Morrison, PhD, Assistant Professor of epidemiology at the Columbia Mailman School. "Research has demonstrated that strong gun laws can reduce this burden. It's now becoming clear that weak gun laws don't only drive up gun violence within their own borders, they also affect gun violence in neighboring states."

Credit: 
Columbia University's Mailman School of Public Health

Illinois study tracks evolution of SARS-CoV-2 virus mutations

URBANA, Ill. - Since COVID-19 began its menacing march across Wuhan, China, in December 2019, and then across the world, the SARS-CoV-2 virus has taken a "whatever works" strategy to ensure its replication and spread. But in a new study published in Evolutionary Bioinformatics, University of Illinois researchers and students show the virus is honing the tactics that may make it more successful and more stable.

A group of graduate students in a spring-semester Bioinformatics and Systems Biology class at Illinois tracked the mutation rate in the virus's proteome - the collection of proteins encoded by genetic material - through time, starting with the first SARS-CoV-2 genome published in January and ending more than 15,300 genomes later in May.
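For a concrete sense of what this kind of tracking involves, the Python sketch below estimates, for each month, how often each site in an aligned protein differs from a reference sequence. It is a simplified illustration, not the team's actual pipeline; the input format, function name, and toy sequences are assumptions.

# Simplified illustration, not the study's pipeline: given date-stamped protein
# sequences aligned to a reference, estimate the fraction of sequences that
# differ from the reference at each site in each month.
from collections import defaultdict
from datetime import date

def monthly_site_variability(records, reference):
    """records: iterable of (collection_date, aligned_sequence) pairs, where each
    sequence has the same length as `reference`.
    Returns {(year, month): {site_index: fraction_of_sequences_mutated}}."""
    mutated = defaultdict(lambda: defaultdict(int))  # month -> site -> mutated count
    totals = defaultdict(int)                        # month -> number of sequences
    for sampled_on, seq in records:
        month = (sampled_on.year, sampled_on.month)
        totals[month] += 1
        for site, (ref_aa, obs_aa) in enumerate(zip(reference, seq)):
            if obs_aa not in (ref_aa, "-", "X"):     # skip gaps and ambiguous calls
                mutated[month][site] += 1
    return {month: {site: n / total for site, n in mutated[month].items()}
            for month, total in totals.items()}

# Toy usage with made-up six-residue fragments; the substitution at index 3
# stands in for the kind of takeover described below for spike site 614.
reference = "QTQDNS"
records = [
    (date(2020, 2, 10), "QTQDNS"),   # matches the reference
    (date(2020, 4, 5),  "QTQGNS"),   # aspartic acid (D) replaced by glycine (G)
    (date(2020, 4, 20), "QTQGNS"),
]
print(monthly_site_variability(records, reference))  # {(2020, 2): {}, (2020, 4): {3: 1.0}}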

The team found some regions still actively spinning off new mutations, indicating continuing adaptation to the host environment. But the mutation rate in other regions showed signs of slowing, coalescing around single versions of key proteins.

"That is bad news. The virus is changing and changing, but it is keeping the things that are most useful or interesting for itself," says Gustavo Caetano-Anolles, professor of bioinformatics in the Department of Crop Sciences at Illinois and senior author on the study.

Importantly, however, the stabilization of certain proteins could be good news for the treatment of COVID-19.

According to first author Tre Tomaszewski, a doctoral student in the School of Information Sciences at Illinois, "In vaccine development, for example, you need to know what the antibodies are attaching to. New mutations could change everything, including the way proteins are constructed, their shape. An antibody target could go from the surface of a protein to being folded inside of it, and you can't get to it anymore. Knowing which proteins and structures are sticking around will provide important insights for vaccines and other therapies."

The research team documented a general slowdown in the virus's mutation rate starting in April, after an initial period of rapid change. This included stabilization within the spike protein, those pokey appendages that give coronaviruses their crowned appearance.

Within the spike, the researchers found that an amino acid at site 614 was replaced with another (aspartic acid to glycine), a mutation that took over the entire virus population during March and April.

"The spike was a completely different protein at the very beginning than it is now. You can barely find that initial version now," Tomaszewski says.

The spike protein, which is organized into two main domains, is responsible for attaching to human cells and helping inject the virus's genetic material, RNA, inside to be replicated. The 614 mutation breaks an important bond between distinct domains and protein subunits in the spike.

"For some reason, this must help the virus increase its spread and infectivity in entering the host. Or else the mutation wouldn't be kept," Caetano-Anolles says.

The 614 mutation was associated with increased viral loads and higher infectivity in a previous study, with no effect on disease severity. Yet, in another study, the mutation was linked with higher case fatality rates. Tomaszewski says although its role in virulence needs confirmation, the mutation clearly mediates entry into host cells and therefore is critical for understanding virus transmission and spread.

Remarkably, sites within two other notable proteins also became more stable starting in April, including the NSP12 polymerase protein, which duplicates RNA, and the NSP13 helicase protein, which proofreads the duplicated RNA strands.

"All three mutations seem to be coordinated with each other," Caetano-Anolles says. "They are in different molecules, but they are following the same evolutionary process."

The researchers also noted regions of the virus proteome becoming more variable through time, which they say may give us an indication of what to expect next with COVID-19. Specifically, they found increasing mutations in the nucleocapsid protein, which packages the virus's RNA after entering a host cell, and the 3a viroporin protein, which creates pores in host cells to facilitate viral release, replication, and virulence.

The research team says these are regions to watch, because increasing non-random variability in these proteins suggests the virus is actively seeking ways to improve its spread. Caetano-Anolles explains that these two proteins interfere with how our bodies combat the virus: they are the main blockers of the beta-interferon pathway that makes up our antiviral defenses. Their mutation could explain the uncontrolled immune responses responsible for so many COVID-19 deaths.

"Considering this virus will be in our midst for some time, we hope the exploration of mutational pathways can anticipate moving targets for speedy therapeutics and vaccine development as we prepare for the next wave," Tomaszewski says. "We, along with thousands of other researchers sequencing, uploading, and curating genome samples through the GISAID Initiative, will continue to keep track of this virus."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

From sea to shining sea: new survey reveals state-level opinions on climate change

North and South, rural and urban--the United States is a complex mix of cultures, mindsets, and life experiences. And, as a new report by researchers at Stanford University, Resources for the Future, and ReconMR illustrates, those state-by-state differences affect climate attitudes and opinions.

The report is the latest installment of Climate Insights 2020, a seven-part series that reveals American beliefs, attitudes, and opinions on climate change and mitigation policies. The latest installment combines data from 27,661 respondents into a single dataset and then separates those data by state. Through this approach, the report reveals state and regional opinions on climate change and their potential impacts on voting just days ahead of the presidential election.
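The pooling described above amounts to combining survey waves into one table and then summarizing each item by state. The Python sketch below illustrates that step with invented data; the column names and values are assumptions, not the report's actual dataset.

# Illustrative sketch only: invented survey data, not the report's dataset.
import pandas as pd

waves = [
    pd.DataFrame({"state": ["RI", "SD", "RI"], "climate_has_occurred": [1, 0, 1]}),
    pd.DataFrame({"state": ["SD", "RI", "SD"], "climate_has_occurred": [1, 1, 0]}),
]

pooled = pd.concat(waves, ignore_index=True)           # single combined dataset
by_state = (pooled.groupby("state")["climate_has_occurred"]
                  .mean()                              # share answering "yes"
                  .mul(100)
                  .round(1))
print(by_state)                                        # percentage by state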

As well as assessing where each state stands on a variety of climate-related issues, the report explains variation between states using indicators ranging from state-level politics to typical temperatures to residential energy prices.

Top Findings

The majority of residents in all analyzed states hold "green" opinions--for example, more than 70% of residents in all states believe that climate change has occurred.

At least 60% of people in all analyzed states believe that climate change will be a serious problem for the United States and world.

The "issue public"--the people who consider climate change extremely personally important and vote based on the issue--varies from state to state. In Rhode Island, 33% of people care deeply about climate change, while in South Dakota only 9% do so.

People in states that gave a larger share of their votes to President Trump in the 2016 election showed lower acceptance of the fundamentals of climate change and less support for specific policies to mitigate it.

The larger the majority expressing "green" opinions on climate change, the more likely a state's US congressional representatives were to vote for "green" policies. Likewise, the larger the share of a state's population that is passionate about the issue, the more likely its representatives are to vote for those policies.

"These data provide strong signals to many policymakers about how their constituents would like them to vote on legislation related to global warming," report coauthor Jon Krosnick said. "With just over a week until the presidential election, these findings document the likely role that climate will play in voting decisions from coast to coast."

To learn more about these findings, read Climate Insights 2020: Opinion in the States by Jared McDonald, postdoctoral research fellow at Stanford University; Bo MacInnis, lecturer at Stanford University and PhD economist; and Jon Krosnick, social psychologist, professor at Stanford University, and RFF university fellow. The Climate Insights 2020 interactive data tool also allows users to explore the data in greater depth.

The final report installment in this series will be a synthesis of the six installments. Previous ones have focused on overall trends, natural disasters, climate policies, partisan breakdowns of those policies, and electric vehicles.

Credit: 
Resources for the Future (RFF)

MFS is a strong surrogate endpoint for OS in men receiving salvage RT for recurrent prostate cancer

An analysis of the phase III NRG Oncology clinical trial RTOG 9601, which studied men receiving salvage radiotherapy (SRT) following prostatectomy for recurrent prostate cancer, indicated that metastasis-free survival (MFS) is a strong surrogate endpoint for overall survival (OS) in this patient population, whereas biochemical failure (BF) is not. These findings are consistent with data from the primary radiotherapy setting for intact prostate cancer. The analysis results were presented at the virtual edition of the American Society for Radiation Oncology (ASTRO) Annual Meeting in October 2020.

Previously, the Intermediate Clinical Endpoints in Cancer of the Prostate (ICECaP) working group identified MFS as a valid surrogate for OS in men treated for localized prostate cancer; however, only 8% of the men in the ICECaP analysis were treated with prostatectomy, and the analysis included no trials that employed SRT. Prior to the analysis of NRG-RTOG 9601, the performance of intermediate clinical endpoints (ICEs) as surrogate endpoints was unknown in the SRT setting. The NRG-RTOG 9601 analysis assessed two definitions of BF: PSA nadir + 0.3-0.5 ng/mL or initiation of salvage hormone therapy, as defined by the NRG-RTOG 9601 study, and PSA nadir + 2 ng/mL, as defined by the NRG-RTOG 0534 study. Researchers also assessed distant metastasis (DM) and MFS endpoints. All endpoints were assessed for surrogacy using two approaches: the Prentice criteria, and a two-stage meta-analytic approach that requires both that the ICE be correlated with OS and that the treatment effects on the ICE and on OS be correlated.

Although BF, MFS, and DM each satisfied the four Prentice criteria for OS, there were differences under the two-stage meta-analytic approach. In that analysis, MFS was strongly correlated with OS (τ = 0.86), while the other endpoints were not: DM was only moderately correlated with OS (τ = 0.66), and BF was only weakly correlated with OS under both the RTOG 9601 definition (τ = 0.25) and the RTOG 0534 definition (τ = 0.40). Biochemical failure was prognostic in this analysis; however, the PSA-based ICEs, as well as the treatment effect of antiandrogen therapy, were only weakly correlated with OS and would therefore be considered poor surrogate endpoints.
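For readers unfamiliar with the correlation measure reported here, the Python sketch below shows how Kendall's τ quantifies agreement between a candidate surrogate endpoint and OS, corresponding to the first condition of the two-stage approach. It uses invented, uncensored survival times; the actual analysis accounts for censoring and also estimates correlations between treatment effects, which are not shown.

# Illustrative only: invented, uncensored survival times in months.
from scipy.stats import kendalltau

mfs_months = [14, 22, 30, 41, 55, 63, 72, 80, 95, 110]   # candidate surrogate (e.g. MFS)
os_months = [20, 28, 50, 33, 60, 70, 75, 88, 104, 118]   # true endpoint (OS)

# Condition 1: the intermediate endpoint should be strongly correlated with OS.
tau, p_value = kendalltau(mfs_months, os_months)
print(f"Kendall's tau between surrogate and OS: {tau:.2f} (p = {p_value:.3g})")

# Condition 2 (not computed here) asks whether the treatment effect on the
# surrogate tracks the treatment effect on OS, typically estimated across
# trials or randomized comparisons rather than across individual patients.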

"From this analysis, we believe that researchers should be cautious when inferring clinical benefit from studies utilizing biochemical failure as a surrogate for overall survival," stated William Jackson, MD, of the University of Michigan and lead author of the NRG-RTOG 9601 analysis abstract. "These findings highlight that the two-stage meta analytic approach should be the preferred method when assessing surrogacy."

These findings can be further validated as data is collected from current, ongoing salvage radiotherapy trials in this patient population.

Credit: 
NRG Oncology