Tech

BU finds alternative to 'revolving door' of opioid detox and relapse

In a first-ever randomized trial, patients at a short-term inpatient program began long-term outpatient treatment with buprenorphine before discharge, with better outcomes than detox patients.

Three out of four people who complete an inpatient opioid withdrawal management program--commonly known as "detox"--relapse within a month, leading to a "revolving door" effect. Few successfully transition from the inpatient setting to long-term treatment with proven medications such as buprenorphine, methadone, or naltrexone to prevent overdose.

But patients who start long-term buprenorphine treatment at a detox program, instead of going through detox and getting a referral for such treatment at discharge, are less likely to use opioids illicitly over the following six months, and more likely to keep up treatment, according to a first-of-its-kind study led by a Boston University School of Public Health (BUSPH) researcher and published in the journal Addiction.

"The idea of detox--getting inpatient treatment for a few days and expecting to quit opioids--has always been magical thinking," says study lead author Dr. Michael Stein, professor and chair of health law, policy & management at BUSPH. "We've quantified here for the first time how successful we can be if we use short-term inpatient programs as starting grounds for long-term treatment."

In the randomized trial, 59 patients at the Stanley Street Treatment and Resources program (SSTAR) in Fall River, Mass., went through a standard buprenorphine-assisted detox program (including a subsequent taper off buprenorphine). Another 56 patients received the typical first-day buprenorphine treatment, then continued on a daily dose of buprenorphine and were discharged as already-established patients at SSTAR's nearby primary healthcare center, with an outpatient appointment for the following week and a prescription to continue their daily buprenorphine dose until then.

The researchers found that these participants were more likely to be taking buprenorphine up to six months after discharge than the patients who had gone through standard detox. They were also less likely to use illicit opioids, "thereby lowering the overdose risk that comes from use of fentanyl and other lethal opioids," Stein says.

Credit: 
Boston University School of Medicine

TGen and Ohio State collaborate on landmark precision medicine canine cancer study

PHOENIX, Ariz. -- Aug. 20, 2019 -- Despite those velvet paintings of poker-playing dogs smoking pipes, cigars and cigarettes, our canine friends really don't use tobacco. But like many humans who have never smoked, dogs still get lung cancer.

And, like many women who develop a particular type of breast cancer, the same gene -- HER2 -- also appears to be the cause of lung cancer in many dogs, according to a promising new study of pet dogs led by the Translational Genomics Research Institute (TGen), an affiliate of the City of Hope, and The Ohio State University.

Published today in the journal Clinical Cancer Research, this study could have significant implications for people who have never smoked.

TGen and Ohio State found that neratinib -- a drug that has successfully been used to battle human breast cancer -- might also work for many of the nearly 40,000 dogs in the U.S. that annually develop the most common type of canine lung cancer, known as canine pulmonary adenocarcinoma, or CPAC.

Neratinib inhibits a mutant cancer-causing form of the gene HER2, which is common to both CPAC and HER2-positive human breast cancer patients.

"With colleagues at Ohio State, we found a novel HER2 mutation in nearly half of dogs with CPAC. We now have a candidate therapeutic opportunity for a large proportion of dogs with lung cancer," said Dr. Will Hendricks, an Assistant Professor in TGen's Integrated Cancer Genomics Division, Director of Institutional Research Initiatives, and the study's senior author.

Based on the results from this study, a clinical trial using neratinib is planned for dogs with naturally occurring lung cancer that have the HER2 mutation.

"This is the first precision medicine clinical trial for dogs with lung cancer. That is, the selection of cancer therapy for a particular patient is based on the genomic profile of the patient's tumor and matched with agents that are known to specially target the identified mutation," said Dr. Wendy Lorch, an Associate Professor in the Department of Veterinary Clinical Sciences at The Ohio State University College of Veterinary Medicine, who also will run the study's clinical trial.

"Our team at The Ohio State University has worked for years to find treatments for canine lung cancer. This breakthrough shows the value of these studies for dogs, as well as humans with lung cancer who never smoked," said Dr. Lorch, who also is the study's lead author.

CPAC is an aggressive disease that clinically resembles human lung cancer among never-smokers. There is no standard-of-care treatment for CPAC and -- prior to the work performed by the TGen-Ohio State team -- little was known of the disease's genetic underpinnings.

"These results are the first example of our efforts to adapt genomics tools from the human world, such as gene sequencing and liquid biopsies, to generate novel insights in canine cancers, with mutual benefit for both," said Dr. Muhammed Murtaza, Assistant Professor and Co-Director of TGen's Center for Noninvasive Diagnostics, and one of the study's contributing authors.

While the sequencing of hundreds of thousands of human cancer genomes has driven the transformational development of precise targeted cancer treatments for humans over the past decade, relatively few canine cancer genomes have undergone similar profiling. The canine cancer genomic discovery and drug development efforts of the TGen-Ohio State team are pieces of a larger puzzle that could similarly transform veterinary oncology, while creating bridges between canine and human cancer drug development.

"This study is groundbreaking because it not only identified a recurring mutation in a canine cancer that had never been found before, but it actually led directly to a clinical trial," said Dr. Jeffrey Trent, TGen President and Research Director, and one of the study's contributing authors. "This clinical translation from dog to human and back is the holy grail of comparative cancer research."

Lung cancer is the leading cause of cancer death in the U.S., annually taking the lives of more than 154,000 Americans.

"This study is really exciting to us because, not only have we found a recurrent hot-spot mutation in a canine cancer that had never been found before, but it actually has direct clinical translational relevance. For humans, we already have drugs that can inhibit many dysregulated proteins. We hope to show that we can provide the same benefit for dogs with canine cancers," Dr. Hendricks added.

No dogs were harmed in this study. Only pet dogs with naturally occurring cancer were examined.

This study -- Identification of recurrent activating HER2 mutations in primary canine pulmonary adenocarcinoma -- lays the foundation for potential rapid translational development. Follow-up clinical and genomic studies have been funded in part by a $300,000 grant investment from the Petco Foundation made possible through their 10-year Pet Cancer Campaign in partnership with Blue Buffalo. Susanne Kogut, President of The Petco Foundation, said her organization's investment in the next phase of TGen-Ohio State studies is part of a larger effort to improve the health and welfare of pets everywhere.

"We are so excited to be a part of this study of canine lung cancer, which we hope will rapidly benefit our pet, and pet-parent, communities worldwide," said Kogut, who in 2016 was named one of 25 "women of influence" by Pet Age magazine.

Credit: 
The Translational Genomics Research Institute

Machine learning models help clinicians identify people who need advanced depression care

image: Regenstrief Institute research scientist Suranga N. Kasthurirathne, PhD, and colleagues have created decision models capable of predicting which patients might need more treatment for their depression than what their primary care provider can offer. The algorithms were specifically designed to provide information the clinician can act on and fit into existing clinical workflows.

Image: 
Regenstrief Institute

Researchers at Regenstrief Institute and Indiana University created decision models capable of predicting which patients might need more treatment for their depression than what their primary care provider can offer. The algorithms were specifically designed to provide information the clinician can act on and fit into existing clinical workflows.

Depression is the most commonly occurring mental illness in the world. The World Health Organization estimates that it affects about 350 million people. Some people may be able to manage their depression on their own or with guidance from a primary care provider. However, others may have more severe depression that requires advanced care from mental health care providers.

The Regenstrief and IU researchers created algorithms to identify those patients so that primary care doctors and providers can refer them to mental health specialists.

"Our goal was to build reproducible models that fit into clinical workflows," said Suranga N. Kasthurirathne, PhD, first author of the paper and research scientist at Regenstrief Institute. "This algorithm is unique because it provides actionable information to clinicians, helping them to identify which patients may be more at risk for adverse events from depression."

The algorithms combined a wide variety of behavioral and clinical information from the Indiana Network for Patient Care, a statewide health information exchange, for patients at Eskenazi Health. Dr. Kasthurirathne and his team developed algorithms for the entire patient population, as well as several different high-risk groups.

"By creating models for different patient populations, we offer health system leaders the option of selecting the best screening approach for their needs," said Dr. Kasthurirathne. "Perhaps they don't have the computational or human resources to run models on every single patient. This gives them the option to screen select high-risk patients." Dr. Kasthurirathne is also a visiting research assistant professor at the Indiana University Richard M. Fairbanks School of Public Health at IUPUI.

"Primary care doctors often have limited time, and identifying patients with more severe forms of depression can be challenging and time consuming. Our model helps them help their patients more efficiently and improve quality of care simultaneously," said Shaun Grannis, M.D., M.S., co-author on the paper and director of the Clem McDonald Center for Biomedical Informatics at Regenstrief Institute. "Our approach is also well suited to leverage increasing health information technology adoption and interoperability to enable preventive care and improve access to wraparound health services." Dr. Grannis is the Clem McDonald Professor of Biomedical Informatics at Indiana University School of Medicine.

Researchers are now working to integrate social determinants of health into these models.

This research was conducted as part of Dr. Kasthurirathne's doctoral dissertation.

Credit: 
Regenstrief Institute

Helping skin cells differentiate could be key to treating common skin cancer

image: Brian C. Capell, M.D., Ph.D.

Image: 
Penn Medicine

PHILADELPHIA -- The outer layer of the skin completely replaces itself every two to four weeks, but when this process is blocked, cancer can grow. A new study from researchers in the Perelman School of Medicine at the University of Pennsylvania has now identified a key regulator of that block, known as LSD1, as well as a way to genetically influence the skin to grow in a manner that prevents the block from occurring. This is the first study to show that LSD1 -- a regulator involved in telling parent cells what specific cell types their lineage should become as they reproduce -- plays a role in the growth of non-melanoma skin cancers, and that blocking LSD1 could be an effective, targeted treatment for those cancers, which are the most common in the world. The journal Cell Reports published the findings today.

Cutaneous squamous cell carcinoma (cSCC) is a skin cancer caused by the abnormal growth of skin cells. Together with a similar cancer known as basal cell carcinoma (BCC), these cancers outnumber all other human cancers combined. While many patients can have these cancers surgically removed, others are not candidates for surgery and need alternative treatment options, which can include chemotherapy. Previous research has shown that these types of cancers thrive when skin cells, which are constantly renewing, fail to differentiate as they reproduce.

"Our study shows that targeting LSD1 can force the skin cells down a differentiation path, which could open the door to new topical therapies that can ultimately turn tumor cells into healthier, more normal cells," said the study's senior author Brian C. Capell, MD, PhD, an assistant professor of Dermatology and a member of Penn's Epigenetics Institute and Abramson Cancer Center. The co-lead authors on the study are Shaun Egolf, a graduate student, and Yann Aubert, PhD, a postdoctoral fellow, both in Capell's lab.

LSD1 is typically elevated in many types of cancer, and there are several inhibitors that attempt to target it. But until now, no one has shown its role in repressing the genes the skin needs for healthy turnover. That knowledge could open the door to a new treatment method that blocks LSD1 with a skin cream or other topical therapy.

"By knocking out LSD1, we can essentially turn the switch back on that would tell the skin to differentiate in a healthy way," Capell said.

Researchers say work is already underway to prove the concept can work, which would pave the way for human clinical trials.

Credit: 
University of Pennsylvania School of Medicine

Kidney transplants covered by Medicaid increased in states after Medicaid expansion

PHILADELPHIA -- Medicaid expansion has helped more young, low-income adults with advanced kidney disease to avoid the costs and poor quality-of-life associated with dialysis, reports a study in the Journal of General Internal Medicine from researchers at Drexel University College of Medicine and the Dornsife School of Public Health at Drexel.

The study included 15,775 United States adults age 21-64 who received a pre-emptive kidney transplant (i.e., a transplant before needing dialysis treatment) from 2010-2017.

The team examined the numbers of living- and deceased-donor kidney transplants during the four years leading up to Medicaid expansion and the four years following the date of expansion in states that opted to expand Medicaid under the Affordable Care Act, and compared them with trends in preemptive transplants in states that chose not to expand Medicaid.

Researchers found that the overall number of preemptive kidney transplants covered by Medicaid increased by 37 percent in states that did not expand Medicaid and by 66 percent in states that did expand Medicaid. Medicaid-covered preemptive, living-donor kidney transplants increased by 0.7 percentage points in non-expansion states, and by 2.2 percentage points in expansion states.

The Affordable Care Act became law in March 2010, expanding the nation's Medicaid program to cover almost all non-elderly adults with incomes at or below 138 percent of the federal poverty level. Full federal funding of the expansion -- for states that elected to receive it -- began Jan. 1, 2014, with the federal share declining to 90 percent in 2020.

"More Americans die from chronic kidney disease than from breast cancer, prostate cancer, and many other well-known diseases," said lead author Meera N. Harhay, MD, an associate professor of Medicine at Drexel College of Medicine. "From improving early detection of kidney disease to increasing outreach and educational efforts, there are many steps that we can take to advance care for those with kidney disease. To promote early access to transplants, expanding Medicaid was clearly one of those steps."

Approximately 37 million Americans suffer from chronic kidney disease, a condition in which the kidneys cannot properly pass waste and filter blood. In the advanced form of chronic kidney disease, a living donor transplant is often the best option to avoid dialysis, but health insurance is needed to cover the costs of the procedure. Although transplant before the need for dialysis treatment is the ideal scenario for individuals with advanced kidney disease, Medicare coverage is only available to non-elderly individuals after they begin dialysis. Because of the shortage of kidneys available for transplant, people without a living donor often wait five to 10 years on dialysis before receiving a transplant, and many die on dialysis before they get that opportunity.

The research findings come shortly after President Donald Trump signed an executive order in July aimed at improving kidney care. Its goals include increasing rates of preemptive kidney transplant, identifying and treating at-risk populations in earlier stages of kidney disease, and removing financial barriers to living organ donation, among others. The study also comes at a time when the fate of the ACA, and of Medicaid expansion, is in question.

Last year, there were 36,500 transplants of any organ in the United States. A total of 21,167 of these, 58 percent, were kidney transplants, according to the United Network for Organ Sharing.

A total of 33 states and Washington, D.C., have expanded Medicaid under the Affordable Care Act (ACA), covering millions of previously uninsured Americans -- including those with kidney disease who are not dialysis-dependent. The latest research follows a study Harhay published in the Journal of General Internal Medicine in October 2018 with Ryan M. McKenna, PhD, an assistant professor in Drexel's Dornsife School of Public Health, which found that 30 percent of the lowest-income individuals in the U.S. with kidney disease were uninsured in 2015 and 2016, despite coverage gains made by Medicaid.

Credit: 
Drexel University

A single change at telomeres controls the ability of cells to generate a complete organism

image: In normal iPS cells (induced pluripotent stem cells), TRF1 is highly expressed, the Polycomb (PRC2) complex (encompassing EED, EZH2 and SUZ12) is weakly bound to the genome, and pluripotency genes are expressed. After TRF1 depletion, TERRA expression increases; this results in PRC2 recruitment to genes involved in the control of pluripotency and differentiation and in establishment of the H3K27me3 epigenetic mark, altering their expression.

Image: 
CNIO

Pluripotent cells can give rise to all cells of the body, a power that researchers are eager to control because it opens the door to regenerative medicine and organ culture for transplants. But pluripotency is still a black box for science, controlled by unknown genetic (expression of genes) and epigenetic signals (biochemical marks that control gene expression like on/off switches). The Telomeres and Telomerase Group, led by Maria Blasco at the Spanish National Cancer Research Centre (CNIO), now uncovers one of those epigenetic signals, after a detective quest that started almost a decade ago.

It is a piece of the puzzle that explains the observed powerful connection between the phenomenon of pluripotency and telomeres -- protective structures at the ends of chromosomes -- a kind of butterfly effect in which a protein that is only present in telomeres shows a global action on the genome. This butterfly effect is essential to initiate and maintain pluripotency.

The DNA of telomeres directs the production of long RNA molecules called TERRAs. What the CNIO researchers found is that TERRAs act on key genes for pluripotency through the Polycomb proteins, which control the programs that determine the fate of cells in the early embryo by depositing a biochemical mark on the genes. The on/off switch that regulates TERRAs, in turn, is a protein that is only present in telomeres; this protein is TRF1, one of the components of the telomere-protecting complex called shelterin. The new result is published this week in the journal eLife.

Why is a telomere gene required for pluripotency?

It has been known for about fifteen years how to return the power of pluripotency to cells by acting on certain genes. However, the researchers noticed that this recipe did not work if the TRF1 gene was turned off. Moreover, TRF1 was one of the most activated genes when pluripotency was induced. These facts intrigued the researchers. Why was TRF1, a gene whose product is only found in telomeres, activated so much, and how could this be essential for pluripotency?

"We could not understand how a gene that deals with telomere maintenance has such a profound effect on a global process like pluripotency," says Maria Blasco, Head of the Telomeres and Telomerase Group at CNIO.

To find an explanation, they decided to carry out a random search by analyzing the changes in the expression of the entire genome when the expression of TRF1 was prevented - something like blindly casting a large net into the sea to see what is in it. "We saw that TRF1 had an enormous, but very organized, effect," explains Blasco.

The expression of many genes was altered, and more than 80% of them were directly related to the phenomenon of pluripotency. The researchers also noted that many of these genes were regulated by Polycomb, a protein complex that is very important in the early stages of embryonic development and that directs cells to specialize into the different cell types of the adult body.

The link is TERRA

But they still did not understand what the link between Polycomb and TRF1 was. Last year, however, Blasco's Group discovered that the TERRA molecules that are produced in telomeres communicate with Polycomb and that together they are involved in building the telomere structure.

The researchers decided to analyze the interaction between TERRA and the entire genome, and sure enough, they found that TERRA stuck to the same genes that were regulated by Polycomb. This suggested that TERRA was the link between TRF1 and pluripotency.

TRF1 "exerts a butterfly effect on the transcription of pluripotent cells, by altering the epigenetic landscape of these cells through a novel mechanism, which involves TERRA-mediated changes in the action of Polycomb," the researchers write in eLife.

As Rosa Marión, first author of the study, explains, "these findings tell us that TRF1 is essential for reprogramming specialized cells and for maintaining pluripotency."

Credit: 
Centro Nacional de Investigaciones Oncológicas (CNIO)

Multi-tasking protein at the root of neuropathic pain

image: This is the mechanism of FLRT3-induced pain.

Image: 
Osaka University

Osaka, Japan - Researchers from Japan's Osaka University have made an important leap in our understanding of how chronic pain conditions develop. In a study published on July 25 in Journal of Neuroscience, the team explains how a protein previously implicated in neuron growth and cell adhesion is also critical for the development of pain sensitization.

Neuropathic pain is a chronic condition arising from previous nerve injury or certain diseases, including diabetes, cancer, and multiple sclerosis. Affected patients often display hypersensitivity to normally non-painful stimuli such as touch or repetitive movement, with pain commonly manifesting as shooting or burning sensations, numbness, or pins and needles. In many cases, the pain cannot be relieved with analgesics.

In humans, the spinal cord dorsal horn acts as a sorting station for pain stimuli. Signals coming in from peripheral areas of the body are processed and then transmitted via secondary neurons to the brain. Importantly, this is a key region in the development of neuropathic pain; studies have linked the condition to abnormal neuronal excitability in the spinal cord dorsal horn. However, what causes these neurons to become overly excited remains a mystery.

FLRT3, or fibronectin leucine-rich transmembrane protein-3, is a protein commonly found in both embryonic and adult nervous systems. And while researchers don't know exactly what role it plays in adult tissues, FLRT3 has been implicated in synapse formation and cell adhesion in the developing brain.

But it was reports of FLRT3 expression in the dorsal horn following nerve injury that led the researchers from Osaka University to investigate the possibility that FLRT3 could be involved in neuropathic pain.

"We examined FLRT3 expression in the dorsal horns of adult rats after peripheral nerve injury," explains lead author of the study Moe Yamada. "Interestingly, even though Flrt3 gene expression was only observed in the dorsal root ganglion, high levels of FLRT3 protein were found in the dorsal horn.

"When we then injected purified FLRT3 into the subarachnoid space so that it reaches the cerebrospinal fluid or overexpressed the protein in the dorsal root ganglion using a viral vector, the treated rats developed touch sensitivity, called mechanical allodynia," adds Yamada.

Encouragingly, if they then blocked the activity of FLRT3 using antibodies or by gene silencing, the mechanical allodynia symptoms arising after nerve damage all but disappeared.

Explains senior author Toshihide Yamashita, "Our results suggest that FLRT3 is produced by injured neurons in the dorsal root ganglion, causing neuronal excitability in the entire dorsal horn and subsequent pain sensitization. This is a novel role for FLRT3, and provides new avenues to explore in the search for effective treatments for neuropathic pain."

Credit: 
Osaka University

Stardust in the Antarctic snow

image: The Kohnen Station is a container settlement in the Antarctic, from whose vicinity the snow samples in which iron-60 was found originate.

Image: 
Martin Leonhardt / Alfred-Wegener-Institut (AWI)

The quantity of cosmic dust that trickles down to Earth each year ranges between several thousand and ten thousand tons. Most of the tiny particles come from asteroids or comets within our solar system. However, a small percentage comes from distant stars. There are no natural terrestrial sources for the iron-60 isotope contained therein; it originates exclusively as a result of supernova explosions or through the reactions of cosmic radiation with cosmic dust.

Antarctic Snow Travels around the World

The first evidence of the occurrence of iron-60 on Earth was discovered in deep-sea deposits by a TUM research team 20 years ago. Among the scientists on the team was Dr. Gunther Korschinek, who hypothesized that traces of stellar explosions could also be found in the pure, untouched Antarctic snow. In order to verify this assumption, Dr. Sepp Kipfstuhl from the Alfred Wegener Institute collected 500 kg of snow at the Kohnen Station, a container settlement in the Antarctic, and had it transported to Munich for analysis. There, a TUM team melted the snow and separated the meltwater from the solid components, which were processed at the Helmholtz-Zentrum Dresden-Rossendorf (HZDR) using various chemical methods until the iron needed for the subsequent analysis was present in the milligram range; the samples were then returned to Munich.

Korschinek and Dominik Koll from the research area Nuclear, Particle and Astrophysics at TUM found five iron-60 atoms in the samples using the accelerator laboratory in Garching near Munich. "Our analyses allowed us to rule out cosmic radiation, nuclear weapons tests or reactor accidents as sources of the iron-60," states Koll. "As there are no natural sources for this radioactive isotope on Earth, we knew that the iron-60 must have come from a supernova."

Stardust Comes from the Interstellar Neighborhood

The research team was able to make a relatively precise determination as to when the iron-60 had been deposited on Earth: the snow layer that was analyzed was no more than 20 years old. Moreover, the iron isotope that was discovered did not seem to come from particularly distant stellar explosions, as the iron-60 dust would have dissipated too much throughout the universe if this had been the case. Based on the half-life of iron-60, any atoms originating from the formation of the Earth would have completely decayed by now. Koll therefore assumes that the iron-60 in the Antarctic snow originates from the interstellar neighborhood, for example from an accumulation of gas clouds in which our solar system is currently located.
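To see why any primordial iron-60 must be long gone, a rough decay calculation helps (a sketch using the commonly cited iron-60 half-life of about 2.6 million years and an Earth age of about 4.5 billion years; neither figure appears in the text above):

$$\frac{N}{N_0} = 2^{-t/t_{1/2}} = 2^{-(4.5\times10^{9}\,\mathrm{yr})/(2.6\times10^{6}\,\mathrm{yr})} \approx 2^{-1730} \approx 10^{-521}$$

In other words, essentially no atoms from the Earth's formation survive, so any iron-60 found in recent snow must have been delivered recently.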

"Our solar system entered one of these clouds approximately 40,000 years ago," says Korschinek, "and will exit it in a few thousand years. If the gas cloud hypothesis is correct, then material from ice cores older than 40,000 years would not contain interstellar iron-60," adds Koll. "This would enable us to verify the transition of the solar system into the gas cloud - that would be a groundbreaking discovery for researchers working on the environment of the solar system.

Credit: 
Technical University of Munich (TUM)

Queen bees face increased chance of execution if they mate with two males rather than one

image: Stingless bee mating

Image: 
Ayrton Vollet

Queen stingless bees face an increased risk of being executed by worker bees if they mate with two males rather than one, according to new research by the University of Sussex and the University of São Paulo.

A colony may kill its queen because of the quality of her offspring, according to the paper by Professor Francis Ratnieks, from the University of Sussex, along with colleagues Ayrton Vollet-Neto and Vera Imperatriz-Fonseca from the University of São Paulo, published in The American Naturalist, a leading evolutionary journal.

Professor of Apiculture Francis Ratnieks said: "By studying test colonies, we found that queen stingless bees will have an increased chance of being executed by the workers in their colony if they mate with two males instead of the one male they normally mate with."

"The reasons for this are fairly complex but, in short, it's due to the genetics of sex determination in bees and the risk of what's known as 'matched mating'."

Stingless bees are closely related to honeybees and bumblebees but are found only in tropical countries like Brazil.

While a queen honeybee might mate with ten to twenty males, queen stingless bees normally only mate with one male. According to this new paper, that may be to reduce the chance of execution.

In bees, whether an individual egg becomes a male or a female depends on a single genetic locus, known as the sex determination locus. Normal males arise from an unfertilized egg and have only one set of chromosomes, from the mother, and so only one sex allele.

If the egg is fertilized, it will have two sets of chromosomes, one from the mother and one from the father. The two sex alleles can be different, in which case the egg becomes a female, or the same, in which case it becomes a diploid male -- a genetic dead end, as diploid males cannot reproduce and serve no useful function to the colony. What should have become a female worker, who would benefit the colony, is instead a useless diploid male.

When diploid males are produced, the worker bees in the colony can tell that things are not right and they generally execute the queen soon after adult diploid males emerge from their cells.

Diploid males are produced by 'matched mating', where the sex allele of a male the queen mates with is the same as one of the queen's own two (different) alleles. In a matched mating, 50% of the fertilized eggs from that male's sperm will be diploid males.

If a queen bee mates with two males, although her chances of making a matched mating are doubled, the number of diploid males that could be produced decreases from 50% to 25%.

It turns out, however, that worker bees are just as likely to execute a queen who produces 25% diploid males as one who produces 50%.

Professor Ratnieks said: "If a queen mates with two males instead of one, her chance of being executed doubles. As a result, natural selection favours queens that mate with a single male in stingless bees."

Interestingly, the researchers found that if a queen were to mate with four males, this would actually reduce her chance of being executed.

If a queen were to mate with four males and there was a matched mating, only 12.5% of the offspring would be diploid males. This low proportion is not enough to cause the workers to execute the queen.
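A minimal sketch of the arithmetic behind these percentages, assuming (as the article implies) that each of the queen's mates fathers an equal share of the fertilized eggs and that exactly one mate is matched:

```python
# Hypothetical illustration of the matched-mating arithmetic described above.
# Assumes each of the queen's mates fertilizes an equal share of her eggs.

def diploid_male_fraction(n_mates: int, n_matched: int = 1) -> float:
    """Fraction of fertilized eggs that become diploid males when n_matched
    of the queen's n_mates carry a sex allele matching one of her own.
    Half the eggs fathered by each matched male are diploid males."""
    return 0.5 * n_matched / n_mates

for n in (1, 2, 4):
    print(f"{n} mate(s), one matched: {diploid_male_fraction(n):.1%} diploid males")
# 1 mate(s), one matched: 50.0% diploid males
# 2 mate(s), one matched: 25.0% diploid males
# 4 mate(s), one matched: 12.5% diploid males
```

The 12.5% figure for four mates is the proportion the article says falls below the workers' execution threshold.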

The researchers point out that for stingless bees to evolve from single mating to multiple mating, with four or more males, there would need to be an intermediate stage of double mating. As double mating causes higher queen execution, natural selection does not allow this first stage to occur. Stingless bee queens seem to be stuck on single mating.

Credit: 
University of Sussex

Treatment for sexual and domestic violence offenders does work

A first-of-its-kind study has found that specialised psychological programmes for sexual and domestic violence offenders lead to major reductions in reoffending, but that the best results are achieved with consistent input from a qualified psychologist.

For the study, which was led by Professor Theresa Gannon at the University of Kent, a team of psychologists from the UK and Canada reviewed 70 previous studies covering 55,000 individual offenders from five countries (UK, Canada, USA, Australia, New Zealand) to examine whether specialised psychological offence treatments were associated with reductions in recidivism.

Three specialised treatments were examined: sexual offence, domestic violence and general violence programmes, with the first two comprising the majority of specialised psychological programmes offered in correctional and community settings.

The study showed that, across all programmes, offence-specific reoffending was 13.4% for treated individuals and 19.4% for untreated comparisons over an average follow-up of 66 months. Relative reductions in offence-specific reoffending were 32.6% for sexual offence programmes, 36% for domestic violence programmes, and 24.3% for general violence programmes. All programmes were also associated with significant reductions in non-offence-specific reoffending.
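For readers wondering how the relative-reduction figures relate to the raw rates, the pooled numbers above are consistent with the usual definition (an assumed formula, not quoted from the paper):

$$\text{relative reduction} = \frac{r_{\text{untreated}} - r_{\text{treated}}}{r_{\text{untreated}}} = \frac{19.4\% - 13.4\%}{19.4\%} \approx 31\%$$

which sits, as expected, between the programme-specific figures of 24.3% and 36%.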

However, overall, treatment effectiveness appeared improved when programmes received consistent hands-on input from a qualified registered psychologist and facilitating staff were provided with clinical supervision. For sexual offenders, group-based treatment, rather than mixed group and individual treatment, produced the greatest reductions in sexual reoffending, as did treatment that focussed specifically on reducing inappropriate sexual arousal. All sexual offence treatment in these studies was Cognitive Behavioural Therapy.

Amongst its recommendations, the study suggests that policy makers and offender programme providers might optimise programme outcomes by providing qualified psychologists who are consistently present in hands-on treatment. It also suggests that programme providers consider methods for tightly controlling programme implementation, given that the researchers found single-site treatments seemed to fare better than multisite treatments.

Professor Gannon, a chartered forensic psychologist and Director of Kent's Centre of Research and Education in Forensic Psychology, said: 'The results of this study are good news. They suggest that treatment can be effective; particularly if care and attention is paid to who delivers the treatment as well as how treatment is implemented.'

Credit: 
University of Kent

Plants could remove six years of carbon dioxide emissions -- if we protect them

By analysing 138 experiments, researchers have mapped the potential of today's plants and trees to store extra carbon by the end of the century.

The results show trees and plants could remove six years of current emissions by 2100, but only if no further deforestation occurs.

The study, led by Stanford University and the Autonomous University of Barcelona, and including Imperial College London researchers, is published in Nature Climate Change.

As plants grow they take in carbon dioxide (CO2) from the air. As CO2 concentrations in the air rise due to human-caused emissions, researchers have suggested that plants will be able to grow larger, and therefore take in more CO2.

However, plant growth does not depend on CO2 concentrations alone; it also relies on the availability of nutrients in the soil, particularly nitrogen and phosphorus. If plants can't get enough nutrients, they will not grow more despite higher CO2 concentrations.

Hundreds of experiments over the past few decades have tried to determine how much extra CO2 plants can take in before the availability of nutrients becomes limiting, but many have come up with different answers.

Now, a group of 32 scientists from 13 countries have analysed all the previous experiments to come up with a global estimate of plants' ability to take in CO2.

Their results show that globally plants can increase their biomass (organic material) by 12 percent when exposed to concentrations of CO2 predicted for the year 2100.

This extra growth would draw enough CO2 from the atmosphere to cancel out six years of current human-induced emissions.
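As a rough consistency check on that headline figure (using round numbers that do not appear in the article: roughly 450 gigatonnes of carbon stored in plant biomass today, and roughly 10 GtC per year of human-caused emissions):

$$0.12 \times 450\,\mathrm{GtC} \approx 54\,\mathrm{GtC}, \qquad \frac{54\,\mathrm{GtC}}{10\,\mathrm{GtC\,yr^{-1}}} \approx 5.4\ \text{years}$$

which is in line with the roughly six years of emissions the study reports.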

However, the result is based on plant and forest cover remaining at current levels - so no further deforestation occurs.

Lead author Dr César Terrer, now at Stanford University's School of Earth, Energy & Environmental Sciences, initiated the project while at Imperial College London. He said: "Keeping fossil fuels in the ground is the best way to limit further warming. But stopping deforestation and preserving forests so they can grow more is our next-best solution."

Several individual experiments, such as fumigating forests with elevated levels of carbon dioxide and growing plants in gas-filled chambers, have provided critical data but no definitive answer globally.

To more accurately predict the capacity of trees and plants to sequester CO2 in the future, the researchers synthesized data from all elevated carbon dioxide experiments conducted so far, in grassland, shrubland, cropland and forest systems.

Using statistical methods, machine learning, models, and satellite data, they quantified how much soil nutrients and climate factors limit the ability of plants and trees to absorb extra CO2.

They found that tropical forests, such as those in the Amazon, Congo and Indonesia, had the greatest capacity for growth and increased CO2 uptake.

Dr Terrer said: "We have already witnessed indiscriminate logging in pristine tropical forests, which are the largest reservoirs of biomass in the planet. We stand to lose a tremendously important tool to limit global warming."

The study also shows how plants' and trees' ability to absorb extra CO2 relies on their association with different fungi in their roots, which help them get extra soil nutrients.

The results of the study will be valuable to scientists building models of future climate change and the impact of reforestation or deforestation.

Credit: 
Imperial College London

Connected forest networks on oil palm plantations key to protecting endangered species

image: Forested conservation set-aside within a Roundtable on Sustainable Palm Oil (RSPO) certified oil palm plantation in Borneo.

Image: 
Robin Hayward

Connected areas of high-quality forest running through oil palm plantations could help support increased levels of biodiversity, new research suggests.

There is growing pressure to reduce the consumption of palm oil due to concerns over deforestation. However, the research team, led by the University of York, says promoting more sustainable palm oil is a better alternative.

For palm oil to be certified as sustainable, the Roundtable on Sustainable Palm Oil (RSPO) requires oil palm growers to identify and conserve areas within a plantation that support high conservation values.

If these patches contain high-quality forest, they may help protect species like orangutans, as well as various species of insects, birds and bats - many of which are threatened with extinction in areas of Indonesia and Malaysia, where 85% of the world's palm oil is produced.

Connections from forest fragments in oil palm plantations to other areas of forest and remaining natural habitat are essential for species to be able to move freely -- something that is increasingly important as species face growing pressure to seek out alternative habitat due to continued land-use and climate change.

The researchers suggest that current criteria for the sustainable production of palm oil should incorporate clearer guidance for plantation companies to ensure connectivity between set-aside areas of forest.

Lead author of the research, Dr Sarah Scriven, who is working in Professor Jane Hill's lab within the Department of Biology at the University of York, said: "Oil palm is the world's most productive major vegetable oil crop, yielding six to 10 times as much oil per hectare as crops like soy or rapeseed. Switching to alternative sources of vegetable oil wouldn't enable producers to provide enough oil for the world's growing population and has the potential to do even more environmental damage.

"With demand for crop land set to increase, coming up with new ways to conserve biodiversity within agricultural landscapes is of critical importance."

However, the researchers found that even large areas of set-aside forest provide few benefits to forest species movement if they are isolated from other forested areas in the wider landscape.

In addition, set-aside areas frequently contain degraded forest. If plantation companies were to reforest these patches, the researchers calculate that set-asides within plantations in the lowlands of Borneo would be 16% better connected for forest species.

Dr Scriven added: "There is a pressing need to restore previously forested habitats. Rapid expansion of commodity agriculture has resulted in widespread loss and fragmentation of forest and in many areas of Indonesia and Malaysia, formerly extensive forests now persist as isolated remnants scattered across vast agricultural landscapes.

"Current RSPO guidelines are not prescriptive about strategies for maximising the connectivity of forest set-asides in oil palm landscapes. We therefore recommend that large, isolated areas of forest should be identified and reconnected with forested areas in the wider landscape.

"Future revisions to the RSPO guidelines should also ensure that plantation companies improve the quality of previously forested set-asides so that they can support high levels of biodiversity and contribute to landscape connectivity."

Credit: 
University of York

Moffitt Researchers complete largest genomic analysis of Merkel cell carcinoma patients

TAMPA, Fla. - Merkel cell carcinoma (MCC) is a rare, aggressive skin tumor that is diagnosed in approximately 2,000 people each year in the United States. Since MCC affects so few people, it is difficult to study the genetic factors that lead to its development and how those factors correlate with response to therapy. However, Moffitt Cancer Center researchers have developed the largest descriptive genomic analysis of MCC patients to date, in collaboration with Foundation Medicine and the Dana Farber Cancer Institute. Their analysis, published in Clinical Cancer Research, will provide important information to improve the care and treatment of MCC patients for many years to come.

Researchers are beginning to learn more about how MCC develops and its associated risk factors. Many patients with MCC have mutations within their DNA that are caused by UV radiation exposure, demonstrating that exposure to natural or artificial sunlight increases a person's risk. Additionally, DNA and proteins from the virus Merkel cell polyomavirus (MCPyV) are present in many patients with MCC, and it is now accepted that MCPyV plays an important role in MCC development in some cases.

In the past, patients with MCC had few effective treatment options, resulting in a poor prognosis with a 5-year survival rate of only 20%. However, Todd Knepper, PharmD, assistant member of the Department of Individualized Cancer Management at Moffitt, says that MCC patients now have hope for improved outcomes. "Just a few years ago there were no FDA-approved treatments for patients with MCC, but recently the treatment paradigm for advanced MCC has shifted dramatically with immune checkpoint inhibitors demonstrating remarkable efficacy in this disease," said Knepper. "Indeed, since 2017 several immune checkpoint inhibitors have been approved for the treatment of patients with MCC, and clinical data have demonstrated their ability to improve patient response rates and survival."

With these improvements in the understanding of MCC biology and therapeutic advances in immunotherapy, the Moffitt researchers wanted to generate a more comprehensive analysis of patients with MCC to understand its genetic landscape and how genetic differences affect treatment response. They performed a comprehensive genomic analysis of 317 patients with MCC and also analyzed the outcomes of 57 MCC patients treated at Moffitt. Importantly, they compared these genetic profiles to those of other skin cancers, showing that MCPyV-positive MCC resembles other viral cancers whereas MCPyV-negative MCC resembles other neuroendocrine cancers.

The researchers reported that there were two distinct populations among the 317 MCC patients - patients with a high tumor mutational burden (TMB) and those with a low TMB. Of the patients with a high TMB, 94% had a UV-signature mutation in their tumor DNA and none of these patients had evidence of MCPyV. On the other hand, patients with a low TMB did not have a UV-signature mutation, but rather 63% of these patients had evidence of MCPyV virus within their tumors. Among both TMB-high and TMB-low tumor populations, mutations in the genes TP53 and RB1 were the most prevalent.

In their analysis of treatment outcomes, the researchers discovered that immunotherapies were highly effective for patients with both a high TMB and a low TMB; 50% of patients with TMB-high/UV-driven tumors had a response to therapy, while 41% of patients with TMB-low/MCPyV-positive tumors had a response to therapy. Importantly, the researchers found that the earlier the patients were treated with immunotherapy, the better they responded. The percent of MCC patients who responded to immunotherapy when given as their first treatment was 75%, but the response rate decreased to 39% for those treated with immunotherapy as their second therapy and 18% for those treated as their third or later therapy. The researchers also reported that patients who expressed the biomarker PD-1 had a better response to immunotherapy than patients who did not express PD-1.

Prior to the Moffitt study, the largest analysis of MCC patients included fewer than 50 patients. According to Andrew Brohl, MD, assistant member of Moffitt's Cutaneous Oncology Department, "This study represents the largest description of the genomic landscape of Merkel cell carcinoma. The magnitude of this study provides a more definitive landscape of the disease, demonstrating the distinctive mutational spectra of MCPyV-positive/TMB-low and UV-driven MCC subgroups. While there are two distinct molecular subsets of this disease, interestingly, they exhibit similar response rates to checkpoint inhibitor therapy."

Credit: 
H. Lee Moffitt Cancer Center & Research Institute

Vaping impairs vascular function

image: Neurovascular response to breath hold. (a) Magnitude image intensity of superior sagittal sinus (SSS, box). Insets show velocity maps at different points of the velocity time-course (40 seconds [t40], 50 seconds [t50], and 70 seconds [t70]). (b) Sample SSS blood flow velocity time-course (red line) shown for a representative participant. The thick black line is linear fit during breath holds, the slope of which is the breath-hold index (BHI). ΔVSSS = post-breath hold relative velocity increase.

Image: 
Radiological Society of North America

OAK BROOK, Ill. - Inhaling a vaporized liquid solution through an e-cigarette, otherwise known as vaping, immediately impacts vascular function even when the solution does not include nicotine, according to the results of a new study published in Radiology.

E-cigarette use is on the rise. According to the Centers for Disease Control and Prevention, more than 9 million adults in the U.S. use e-cigarettes, and vaping has become especially popular among teens. The 2018 National Youth Tobacco Survey reported that in 2018 more than 3.6 million middle and high school students were using e-cigarettes.

"The use of e-cigarettes is a current public health issue because of widespread use, especially among teenagers, and the fact that the devices are advertised as safe despite uncertainty about the effects of long-term use," said Alessandra Caporale, Ph.D., a post-doctoral researcher in the Laboratory for Structural, Physiologic and Functional Imaging (LSPFI) directed by senior author and principal investigator of the study, Felix W. Wehrli, Ph.D., at the University of Pennsylvania Perelman School of Medicine in Philadelphia. The research was funded by the National Heart, Lung, and Blood Institute (NHLBI).

According to the authors, e-cigarette inhalants, upon vaporization of the e-cigarette solution, contain potentially harmful toxic substances. Once inhaled, these particles can reach the alveoli of the lung, from where they are taken up by the blood vessels, thereby interfering with vascular function and promoting inflammation.

To study the acute effects of vaping on systemic vascular function, the researchers performed a series of MRI exams on 31 healthy non-smoking young adults (mean age 24; 14 women) before and after nicotine-free e-cigarette inhalation. The e-cigarette liquid contained pharma-grade propylene glycol and glycerol with flavoring, but no nicotine.

Using novel multi-parametric MRI protocols developed by Michael C. Langham, Ph.D., one of the co-authors of the study, scans of the femoral artery in the leg, the aorta, and the brain were performed before and after a single vaping episode equivalent to smoking a single conventional cigarette. For the femoral artery MRI, blood flow in the upper leg was constricted using a cuff and then released; the brain MRI measured flow in the superior sagittal sinus during a series of thirty-second breath holds interspersed with normal breathing.

Comparing the pre- and post-vaping MRI data, the researchers found that a single episode of vaping resulted in reduced blood flow and impaired vascular reactivity in the femoral artery, including a 34 percent reduction in flow-mediated dilation -- the dilation of an artery in response to increased blood flow. There were also a 17.5 percent reduction in peak flow and a 25.8 percent reduction in blood acceleration.

These findings suggest impaired function of the endothelium (inner lining of blood vessels). Moreover, a 20 percent reduction in venous oxygen saturation is indicative of altered microvascular function. The researchers also found a three percent increase in aortic pulse-wave velocity, a measure of arterial stiffness, or the rate at which pressure waves move down the aorta.
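For reference, the two vascular measures named above are conventionally defined as follows (standard definitions, not taken from the paper):

$$\mathrm{FMD} = \frac{D_{\text{peak}} - D_{\text{baseline}}}{D_{\text{baseline}}} \times 100\%, \qquad \mathrm{PWV} = \frac{\Delta x}{\Delta t}$$

where $D$ is the arterial diameter before and after cuff release, and $\Delta x / \Delta t$ is the distance a pressure wave travels along the aorta divided by its transit time; a stiffer aorta transmits the wave faster.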

"These products are advertised as not harmful, and many e-cigarette users are convinced that they are just inhaling water vapor," Dr. Caporale said. "But the solvents, flavorings and additives in the liquid base, after vaporization, expose users to multiple insults to the respiratory tract and blood vessels."

Dr. Caporale said further studies are needed to address the potentially adverse long-term effects of vaping on vascular health.

Credit: 
Radiological Society of North America

Some pregnant women are exposed to gadolinium in early pregnancy

OAK BROOK, Ill. - A small but concerning number of women are exposed to a commonly used MRI contrast agent early in their pregnancy, likely before many of them are aware that they're pregnant, according to a study published in the journal Radiology. The results support adherence to effective pregnancy screening measures to help reduce inadvertent exposures to these contrast agents during early pregnancy.

Gadolinium-based contrast agents (GBCAs) are used in as many as 45 percent of MRI exams in the United States to improve the visualization of organs and tissue. Recent evidence shows that trace levels of gadolinium may be retained in the body after the MRI, although the implications of this are not yet understood.

Gadolinium can cross the placenta and enter the fetal circulation. The safety of GBCAs in pregnant women has not been established, and their use during pregnancy is not recommended unless essential to the health of the woman or fetus. Available data from cohort studies and case reports have revealed inconsistent findings regarding the association between gadolinium and adverse fetal outcomes.

To obtain a more precise idea of the prevalence of GBCA exposure among pregnant women, study lead author Steven Bird, Pharm.D., Ph.D., of the U.S. Food and Drug Administration's (FDA) Division of Epidemiology, and colleagues analyzed data on U.S. pregnancies resulting in live births between 2006 and 2017. The data was collected from 16 partners of the FDA's Sentinel System, a program that allows for active surveillance of healthcare data from multiple sources to monitor the safety of regulated medical products.

The data revealed exposures to GBCAs in 5,457 of 4,692,744 live births, a number corresponding to one in 860 pregnancies. Most of the exposures came from contrast MRI examinations of the head, although the numbers of pelvic and abdominal MRI exams were also noteworthy. Almost three-quarters of exposures occurred during the first trimester. The results strongly suggested that inadvertent exposure to GBCAs may occur before pregnancy is recognized.
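The quoted rate follows directly from the counts (simple division, shown for clarity):

$$\frac{4{,}692{,}744}{5{,}457} \approx 860 \quad\Longrightarrow\quad \text{about one GBCA exposure per 860 live births}$$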

"Unintended fetal exposures to gadolinium can occur during early pregnancy among women who are not yet aware they are pregnant. Increased attention to existing pregnancy screening measures may help reduce inadvertent exposures to gadolinium contrast," Dr. Bird said.

The FDA has advised all MRI centers to provide a medication guide to outpatients during their first gadolinium contrast administration, stating that pregnant women and young children may be at increased risk from gadolinium staying in the body.

The researchers suggested several ways radiology imaging centers might avoid inadvertent administration of GBCAs to pregnant women. Among the tools are a safety screening form asking about the potential for pregnancy, direct questioning of women by radiologic technologists regarding pregnancy, prominently displayed signs asking women to notify radiology staff if they may be pregnant, and pregnancy testing when appropriate.

The FDA is continuing to monitor reports of adverse events associated with gadolinium exposure in utero. While this study was not designed to assess health outcomes of this early exposure for women and/or infants, the FDA is collaborating on and funding a study to evaluate potential risk for stillbirth and other neonatal adverse effects following in utero exposure to gadolinium in a large group of pregnant Medicaid beneficiaries.

Credit: 
Radiological Society of North America