
New treatment program offers hope for controlling wombat mange

Image: Tasmanian wombats. Credit: Scott Carver

New research from the University of Tasmania is offering hope that the deadly mange disease affecting Tasmanian wombats could eventually be brought under control in wild individuals and populations.

Long-term disease control or eradication in wildlife is rare and represents a major challenge to wildlife conservation across the globe.

Control is particularly difficult for pathogens that can be transmitted through the environment, which includes the mite that causes sarcoptic mange in bare-nosed wombats.

In a paper published today in the Journal of Applied Ecology, researchers present a treatment program and lessons learned from it to guide the development of more effective and feasible control of sarcoptic mange disease in wombat populations.

Disease control was attempted during the mange outbreak at Narawntapu National Park in northern Tasmania, where PhD student Alynn Martin showed the disease could be controlled temporarily using a Cydectin treatment, remotely delivered to wombats using flaps over their burrows.

"The logistics of this treatment made long-term disease control extremely challenging," she said. "After three months of trying to treat each wombat in the population every week, the disease returned, and wombats continued to die. It was very disappointing to see after going to so much effort to save these wombats."

Rather than giving up, the researchers used their study to identify practical solutions to the problem.

With the help of University of Tasmania ecological modeller Dr Shane Richards, they discovered that a combination of a longer-lasting treatment and improved delivery of the treatment to the wombats would improve capacity to control mange in wombat populations.

"Slight improvements in multiple aspects of disease control can have dramatic impacts on our capacity to control this disease in wombats," Dr Richards said.

Lead researcher Dr Scott Carver said the team is now investigating a longer-lasting treatment for wombats, called Bravecto.

"We have researched the safety and dose, and are currently determining the effectiveness of the new treatment. Our overarching aim is to make the management of this pathogen much more feasible for individual wild wombats and local at-risk populations," Dr Carver said.

Dr Richards said field results suggest that the frequency with which wombats change the burrows in which they sleep is an important factor in disease persistence in populations.

The Sarcoptes scabiei mite was introduced to Australia by European settlers and their domestic animals.

Credit: 
University of Tasmania

Genes underlie five psychiatric disorders

An international group of researchers has uncovered the genes that contribute to the development of ADHD, autism spectrum disorder, bipolar disorder, major depression and schizophrenia.

A collaborative research project carried out by The University of Queensland and Vrije Universiteit Amsterdam analysed data from more than 400,000 individuals to determine the genes behind these five psychiatric disorders.

UQ psychiatrist Professor Christel Middeldorp said several sets of genes marked all five disorders.

"Before this analysis, we knew a lot of psychiatric disorders were related to each other due to their hereditary nature," Professor Middeldorp said.

"We often see multiple family members with mental illness in one family, but not necessarily with the same disorder.

"We investigated if specific sets of genes were involved in the development of multiple disorders, which genes are not only related to say, ADHD, but also to the other four psychiatric disorders.

"These are genes that play a role in the same biological pathway or are active in the same tissue type.

"Genes that are highly expressed in the brain were shown to affect the different disorders, and some genes were related to all the illnesses we studied.

"It shows that there is a common set of genes that increase your risk for all five disorders."

The study's lead author, Dr Anke Hammerschlag, said the overlap was due to biological pathways that the genes share in the brain.

"We found that there are shared biological mechanisms acting across disorders that all point to functions in brain cells," Dr Hammerschlag said.

"The synapse plays a vital role as this is the connection point between brain cells where the cells communicate with each other.

"We also found that genes especially active in the brain are important, while genes active in other tissues do not play a role."

New pharmaceutical drugs could potentially target these shared pathways.

"Our findings are an important first step towards the development of new drugs which may be effective for a wide range of patients, regardless of their exact diagnosis," she said.

"This knowledge will bring us closer to the development of more effective personalised medicine."

Credit: 
University of Queensland

Antimalarial treatments less effective in severely malnourished children

Image: UNAMID, Albert González Farran

Researchers have found that severe malnutrition is associated with lower exposure to the antimalarial drug lumefantrine in children treated with artemether-lumefantrine, the most common treatment for uncomplicated falciparum malaria. The study, which is the first to specifically address this, has been published in Clinical Pharmacology and Therapeutics. It calls urgently for further research into optimised dosing regimens for undernourished children.

Children under the age of five are particularly vulnerable to malaria infection, with 61% of all malaria deaths worldwide occurring in this group. Malnourished children are at an even higher risk of contracting and dying from malaria, as the disease is strongly associated with poverty.

The altered physiology of malnourished children may change the uptake and distribution of antimalarial drugs in the body; for example, the absorption and elimination of drugs might be reduced in these children. Despite this, there has been little research into how well treatments work in this particular group. Malnourished children and other vulnerable groups of patients, such as pregnant women and very young children, are often excluded from clinical studies as they do not represent the main target group, are difficult to recruit in large numbers, and their participation may raise specific ethical concerns. The data that do exist in malnourished children have been contradictory to date, as pointed out in a recently published systematic review conducted by the WorldWide Antimalarial Resistance Network (WWARN).

Artemether-lumefantrine, an artemisinin-based combination therapy, is the most common antimalarial treatment used worldwide and is recommended by the World Health Organization. Lumefantrine exposure (the level of the drug in the blood after treatment has been administered) has been reported to be lower in children than in adults. Children are currently prescribed the same dosing regimen as adults: artemether-lumefantrine twice daily for three days.

This study, conducted in collaboration with Epicentre, Médecins Sans Frontières, the University of Cape Town, the Malaria Research and Training Center in Mali, the Mahidol-Oxford Tropical Medicine Research Unit (MORU) and the Ministry of Health of Niger, aimed to investigate the pharmacokinetic (PK) and pharmacodynamic (PD) properties of lumefantrine in children with severe malnutrition; previously, these had not been well defined. The analysis included data from a clinical trial across two hospitals in Mali and Niger analysing lumefantrine exposure for the treatment of uncomplicated falciparum malaria in 131 children with severe malnutrition, compared to 226 children without malnutrition. This is the largest cohort of patients in this type of analysis to date.

Key measures of malnutrition were collected, including weight-for-height, mid-upper arm circumference and presence of nutritional oedema. These measures are not routinely collected in clinical trials of malaria, a practice that should change to give deeper insight into the systematic underexposure to antimalarial drugs in children due to malnutrition.

"We developed a pharmacokinetic and pharmacodynamic model for describing the pharmacological properties of lumefantrine in these children. Drug absorption, distribution, metabolism, or elimination might be affected by the nutritional status in children." said Dr. Palang Chotsiri, a researcher at MORU and the first author of the study. "Malnutrition effects on the pharmacological parameters were carefully investigated and evaluated."

Researchers found that:

Severely malnourished children had on average 19.2% less exposure to lumefantrine

All children had significantly lower drug exposure compared to adults

All measures of malnutrition correlated with a reduced absorption of lumefantrine, with low mid-upper arm circumference being the most significant factor associated with poor absorption

Lower exposure to lumefantrine also resulted in an increased risk of therapeutic failure and acquiring a new malaria (P. falciparum) infection during the follow-up period

Professor Joel Tarning, Head of the WWARN Pharmacometrics group who led the study, said: "There are serious knowledge gaps in associations between malnutrition and antimalarial drug efficacy and this study provides key insights. Malnourished children are more severely affected by malaria and they have lower levels of antimalarial drug in their bodies after standard treatment. This needs further study so that we can treat these patients better. Working together with our partners like Epicentre, MSF, MRTC, MORU and the University of Cape Town, we can fill in these research gaps."

Dr Rebecca Grais, Research Director, Epicentre, the epidemiology and research satellite of Médecins Sans Frontières, said: "This study is critical to improve treatment protocols among this highly vulnerable group of children. Children suffering from both severe acute malnutrition and malaria need care the most."

Researchers then used the model to evaluate three alternative dosing regimens to see whether they might improve lumefantrine exposure: an increased dose, an intensified schedule and an extended course. They found that an increased dose would not raise exposure, owing to the limited ability of severely malnourished children to absorb the drug. However, both the intensified regimen (three times daily for three days) and the extended regimen (twice daily for five days) would result in equivalent exposures in the two groups, bringing the level up for severely malnourished children.
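
The arithmetic of these regimen comparisons can be illustrated with a toy exposure calculation. The sketch below is hypothetical and is not the study's population pharmacokinetic model: it assumes total exposure (AUC) is proportional to dose times the fraction absorbed, and it lets that fraction fall as a single dose grows, mimicking the absorption limit described for severely malnourished children. Every parameter value is invented for illustration.

```python
# Toy exposure (AUC) comparison; a hypothetical sketch, not the study's
# population pharmacokinetic model. The fraction absorbed shrinks as a
# single dose grows, mimicking limited lumefantrine absorption in
# severely malnourished children. All numbers are illustrative.
def fraction_absorbed(dose_mg, f_max=0.6, d50_mg=480):
    """Saturable absorption: bioavailability falls as the dose rises."""
    return f_max * d50_mg / (d50_mg + dose_mg)

def relative_auc(dose_mg, n_doses):
    """Total exposure, proportional to drug absorbed over the regimen."""
    return n_doses * dose_mg * fraction_absorbed(dose_mg)

standard = relative_auc(480, 6)  # twice daily for 3 days

alternatives = {
    "increased dose (6 x 720 mg)": relative_auc(720, 6),
    "intensified (9 x 480 mg)": relative_auc(480, 9),
    "extended (10 x 480 mg)": relative_auc(480, 10),
}
for name, auc in alternatives.items():
    # Prints ~1.20, 1.50 and ~1.67: a 50% larger dose adds little,
    # while extra doses raise exposure almost proportionally.
    print(f"{name}: {auc / standard:.2f} x standard exposure")
```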

Researchers recommend that further work is done to investigate optimised dosing regimens to improve antimalarial treatments in malnourished children.

Credit: 
Infectious Diseases Data Observatory

Study finds meal timing strategies appear to lower appetite, improve fat burning

SILVER SPRING, Md.-- Researchers have discovered that meal timing strategies such as intermittent fasting or eating earlier in the daytime appear to help people lose weight by lowering appetite rather than burning more calories, according to a report published online today in the journal Obesity, the flagship journal of The Obesity Society. The study is the first to show how meal timing affects 24-hour energy metabolism when food intake and meal frequency are matched.

"Coordinating meals with circadian rhythms, or your body's internal clock, may be a powerful strategy for reducing appetite and improving metabolic health," said Eric Ravussin, PhD, one of the study's authors and associate executive director for clinical science at Louisiana State University's Pennington Biomedical Research Center in Baton Rouge.

"We suspect that a majority of people may find meal timing strategies helpful for losing weight or to maintain their weight since these strategies naturally appear to curb appetite, which may help people eat less," said Courtney M. Peterson, PhD, lead author of the study and an assistant professor in the Department of Nutrition Sciences at the University of Alabama at Birmingham.

Peterson and her colleagues also report that meal timing strategies may help people burn more fat on average during a 24-hour period. Early Time-Restricted Feeding (eTRF)--a form of daily intermittent fasting where dinner is eaten in the afternoon--helped to improve people's ability to switch from burning carbohydrates for energy to burning fat for energy, an aspect of metabolism known as metabolic flexibility. The study's authors said, however, that the results on fat-burning are preliminary. "Whether these strategies help people lose body fat needs to be tested and confirmed in a much longer study," said Peterson.

For the study, researchers enrolled 11 adult men and women who had excess weight. Participants were recruited between November 2014 and August 2016. Adults in general good health, aged 20 to 45 years, were eligible to participate if they had a body mass index between 25 and 35 kg/m2 (inclusive), a body weight between 68 and 100 kg, a regular bedtime between 9:30 p.m. and midnight, and, for women, a regular menstrual cycle.

Participants tried two different meal timing strategies in random order: a control schedule, in which participants ate three meals during a 12-hour period with breakfast at 8:00 a.m. and dinner at 8:00 p.m., and an eTRF schedule, in which participants ate three meals over a six-hour period with breakfast at 8:00 a.m. and dinner at 2:00 p.m. The same amounts and types of foods were consumed on both schedules. Fasting periods lasted 12 hours per day on the control schedule and 18 hours per day on the eTRF schedule.

Study participants followed the different schedules for four days in a row. On the fourth day, researchers measured the metabolism of participants by placing them in a respiratory chamber--a room-like device--where researchers measured how many calories, carbohydrates, fat and protein were burned. Researchers also measured the appetite levels of participants every three hours while they were awake, as well as hunger hormones in the morning and evening.

Although eTRF did not significantly affect how many calories participants burned, the researchers found that eTRF did lower levels of the hunger hormone ghrelin and improved some aspects of appetite. It also increased fat-burning over the 24-hour day.

"By testing eTRF, we were able to kill two birds with one stone," said Peterson, adding that the researchers were able to gain some insight into daily intermittent fasting (time restricted-feeding), as well as meal timing strategies that involve eating earlier in the daytime to be in sync with circadian rhythms. The researchers believe that these two broader classes of meal timing strategies may have similar benefits to eTRF.

Hollie Raynor, PhD, RD, LDN, who was not associated with the research, said "this study helps provide more information about how patterns of eating, and not just what you eat, may be important for achieving a healthy weight." Raynor is a professor and interim dean of research in the Department of Nutrition, College of Education, Health, and Human Sciences at The University of Tennessee, Knoxville.

Peterson and colleagues said prior research was conflicted on whether meal timing strategies help with weight loss by helping people burn more calories or by lowering appetite. Studies in rodents suggest such strategies burn more calories, but data from human studies were conflicting--some studies suggested meal timing strategies increase calories burned, but other reports showed no difference. The study's authors said, however, that previous studies did not directly measure how many calories people burned or were imperfect in other ways.

Credit: 
The Obesity Society

Scientists pinpoint new mechanism that impacts HIV infection

Image: Assistant Professor Smita Kulkarni, Ph.D. Credit: Texas Biomedical Research Institute

San Antonio, Texas (July 24, 2019) - A team of scientists led by Texas Biomed's Assistant Professor Smita Kulkarni, Ph.D., and Mary Carrington, Ph.D., of the Frederick National Laboratory for Cancer Research, published results of a study that pinpointed a long noncoding RNA molecule that influences a key receptor involved in HIV infection and progression of the disease. This newly identified mechanism could open up a new avenue for control of HIV, the virus that causes AIDS.

The article was published in a recent edition of the journal Nature Immunology.

Most of the genome is made up of sequences that do not code for proteins; in fact, 97% of the human genome is non-protein-coding, and much of it is transcribed into noncoding RNAs. Dr. Kulkarni says that until the last decade or so, scientists thought many of these RNAs were "junk." Now, new research shows they play many roles. Recent developments in technology and genomics have made advances in knowledge in this area possible.

Dr. Kulkarni and her colleagues showed that a specific long noncoding RNA impacts the gene encoding the HIV co-receptor CCR5. Since CCR5 is critical for HIV entry into cells, a polymorphism associated with variation in the expression of this long noncoding RNA impacts the outcome of infection. Genomic DNA from various groups, including Hispanics, African Americans and Japanese, showed that this association is present across many ethnicities. The consistency of this association across populations speaks to a single functional mechanism explaining it.

Nature Immunology also featured a commentary about the study called "A SNP of lncRNA gives HIV-1 a boost" to further underscore the importance of this study to the field. In the article, Sanath Kumar Janaka and David T. Evans write "this study provides new insight into how polymorphisms in regulatory elements may explain genetic variation in pathogenesis." They go on to call Dr. Kulkarni's discovery a "fascinating and intricate mechanism."

"Their comments are encouraging and impel us to explore further," Dr. Kulkarni said."

HIV is still a major public health burden. More than a million people are living with HIV in the U.S. alone. More than 50,000 new cases are reported each year.

"Finding functional mechanisms of the disease-associated gene regions will increase our understanding about how they regulate disease-associated genes and pathways," Dr. Kulkarni explained. "We may be able to find selective targets for therapy."

Dr. Kulkarni said the discoveries outlined in this journal article may have implications for the progression of other infectious diseases. "There are many ways we can use the techniques we have learned through this study - what we have established in our lab," she added. "We can apply it to many other pathogens currently being studied by scientists at Texas Biomed and at many other institutions."

Credit: 
Texas Biomedical Research Institute

Light pollution may be increasing West Nile virus spillover from wild birds

Image: A house sparrow of the kind used in the study, which found that birds exposed to artificial light at night are infectious for two days longer than unexposed birds. Credit: University of South Florida

TAMPA, Fla. (July 24, 2019) - We're in the midst of summertime mosquito bite season, and cities across the country are reporting a heightened number of West Nile virus (WNV) cases. The house sparrow is one of the most common carriers of WNV in urban areas. Mosquitoes feed on the infected birds and spread the virus to humans. New research finds house sparrows exposed to artificial light at night, such as what's used in parking lots, maintain higher burdens of WNV for longer than those who spend their nights in the dark.

The study, published in Proceedings of the Royal Society B, concludes that infected house sparrows living in light-polluted conditions remain infectious for two days longer than those living in darkness, enhancing their host competence, or propensity to generate infection in other hosts or vectors. In turn, mathematical models show this likely increases the potential for a WNV outbreak by about 41 percent.
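
The arithmetic behind a figure like that can be sketched simply. In Ross-Macdonald-style models of mosquito-borne transmission, the host's contribution to R0 scales roughly linearly with how long the host remains infectious, so two extra infectious days on a baseline of about five days raise R0 by roughly 40 percent. The sketch below illustrates only that scaling; it is not the paper's model, and the baseline infectious period is a back-calculated assumption rather than a reported value.

```python
# Minimal sketch, not the study's model: in Ross-Macdonald-style models
# the host's contribution to R0 scales linearly with its infectious
# period, so extra infectious days raise outbreak potential directly.
def relative_outbreak_potential(baseline_days, extra_days):
    """Ratio of R0 with an extended infectious period vs. baseline."""
    return (baseline_days + extra_days) / baseline_days

# An assumed ~4.9-day baseline plus the 2 extra infectious days seen
# under artificial light reproduces roughly the reported ~41% increase.
print(relative_outbreak_potential(4.9, 2.0))  # ~1.41
```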

"The findings may be the first indication that light pollution can affect the spread of zoonotic diseases," said lead author Meredith Kernbach, PhD student in the University of South Florida College of Public Health. "Many hosts and vectors use light cues to coordinate daily and seasonal rhythms, which is among the most reliable environmental cues, and disruption of these rhythms by light exposure at night could affect immune responses, generating the effects we see here."

Researchers studied 45 house sparrows, exposing half to artificial light at night. Following 7-25 days in captivity, the team exposed the birds to WNV and took blood samples 2, 4, 6 and 10 days post-exposure. Researchers found all birds were infected within 2-4 days; after that, however, only the birds exposed to light at night maintained transmissible burdens of WNV.

Kernbach says they picked the little brown birds since they live in close proximity to humans in urban areas, play host to a number of parasites and diseases, and are frequent carriers of WNV. While birds exposed to light pollution remain infected for a longer period of time, this did not increase mortality rates.

These results follow a previous study led by the University of South Florida that found zebra finches with elevated levels of the avian stress hormone corticosterone (CORT) are more susceptible to mosquito bites. Such stress is known to be caused by a number of factors, such as road noise, pesticides and light pollution. Researchers suggest new lighting technologies be developed that are detectable to humans but not to wildlife.

Credit: 
University of South Florida

Risk of death among postmenopausal women with normal weight and high abdominal fat

Bottom Line: Postmenopausal women with normal weight (body mass index 18.5 to 24.9) and central obesity (waist circumference greater than 88 cm) are at higher risk of death compared to women with normal weight and no central obesity. Obesity prevention commonly focuses on BMI, which can't distinguish body shape or body fat distribution. Central obesity, the accumulation of fat around the abdomen, is common in the general population. This observational study examined the associations of normal-weight central obesity with risk of death using data from nearly 157,000 postmenopausal women enrolled in the Women's Health Initiative between 1993 and 1998. Normal-weight central obesity in women was associated with an increased risk of death that was comparable to that of women with obesity (BMI equal to or greater than 30) and central obesity. Limitations of the study include its focus on postmenopausal women, so the findings may not be broadly generalizable to younger women or men. The study authors suggest that normal-weight central obesity in women may be an underrecognized high-risk subpopulation and that the prevention and control of central obesity should be included in clinical and public health guidelines, even for individuals of normal weight.

Authors: Wei Bao, M.D., Ph.D., University of Iowa, Iowa City, and coauthors

(doi:10.1001/jamanetworkopen.2019.7337)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Medical imaging rates during pregnancy

What The Study Did: Researchers looked at rates of medical imaging (CT, MRI, conventional x-rays, angiography, fluoroscopy and nuclear medicine) during pregnancy in this observational study that included nearly 3.5 million pregnant women in the United States and Canada from 1996 to 2016.

Authors: Marilyn L. Kwan, Ph.D., of Kaiser Permanente Northern California in Oakland is the corresponding author.

(doi:10.1001/jamanetworkopen.2019.7249)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Waist size is a forgotten factor in defining obesity

A new study from the University of Iowa finds that some people considered to be a normal weight could unknowingly be at high risk for obesity-related health issues.

The study, published in JAMA Network Open, finds that a subgroup of people who are considered to be normal weight as measured by body mass index (BMI) could actually be at high risk for death because of their waist size.

Wei Bao, professor of epidemiology in the UI College of Public Health and the study's corresponding author, says that according to current clinical guidelines, physicians need only rely on BMI to determine obesity-related health risks. This leaves people who are actually in a high-risk group because of other risk factors, such as percentage of body fat, thinking they're healthy.

"The results suggest we should encourage physicians to look not only at body weight but also body shape when assessing a patient's health risks," says Bao.

The study used data from the Women's Health Initiative, which tracked the health of more than 156,000 women between the ages of 50 and 79 from 1993 to 2017. Bao and his team linked mortality rates to the respondents' BMI as well as their central obesity, which is the excess accumulation of fat around a person's midsection that has been linked to an array of health problems and is measured by waist circumference.

Women who were considered normal weight on the BMI scale but had a high waist circumference were found to be 31 percent more likely to die within the two-decade observation period. That's comparable to the 30 percent increased likelihood that an obese person with central obesity--considered to be in the highest risk group--will die within 20 years of observation.

The study found that the two primary causes of death in people who had normal BMI but high waist size were cardiovascular disease and obesity-related cancer.

This study is so far the largest to identify people with normal weight central obesity as a high-risk subgroup for death. The findings about cardiovascular death and death from any cause largely confirm a similar study published in 2015 by researchers at the Mayo Clinic that was based on a much smaller population.

Bao says the study demonstrates the limitations of BMI when determining a person's risk for health problems. While it's a simple number to understand and easy to determine--it involves only height and weight--it isn't always accurate because it doesn't include other important numbers, such as the percentage of body fat or where that fat has accumulated on the body.

It's not unusual for people with a high BMI score to be in excellent health because the bulk of their weight is muscle--football linemen, for instance. At the same time, as this study indicates, people who are in the normal range on BMI could still have a high percentage of body fat, putting them in a high-risk group.
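
To make the two measures concrete, the sketch below computes BMI from height and weight and flags the normal-weight central obesity subgroup using the thresholds reported for this study (normal weight as a BMI of 18.5 to 24.9, central obesity in women as a waist circumference greater than 88 cm). The example values are hypothetical.

```python
# Hypothetical sketch of the two measures discussed above. Thresholds
# follow the study: normal weight is a BMI of 18.5-24.9, and central
# obesity in women is a waist circumference greater than 88 cm.
def bmi(weight_kg, height_m):
    """Body mass index: weight divided by height squared."""
    return weight_kg / height_m ** 2

def normal_weight_central_obesity(weight_kg, height_m, waist_cm):
    """Flag the high-risk subgroup: normal BMI but a high waist size."""
    return 18.5 <= bmi(weight_kg, height_m) < 25 and waist_cm > 88

# e.g. 64 kg at 1.63 m is a BMI of ~24.1 (normal weight), but a 92 cm
# waist still places this person in the study's high-risk subgroup.
print(normal_weight_central_obesity(64, 1.63, 92))  # True
```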

Despite these limitations, BMI has become the standard for health care providers and health policy makers when measuring obesity-related health risks. Bao says this focus on BMI leaves many people in the dark about the risks from central obesity.

"People with normal weight based on BMI, regardless of their central obesity, were generally considered normal in clinical practice according to current guidelines," Bao says. "This could lead to a missed opportunity for risk evaluation and intervention programs in this high-risk subgroup."

Credit: 
University of Iowa

Findings from CARE Consortium added to global repository for brain injury data

Image: Thomas McAllister, MD, leader of the CARE Consortium's administrative operations center. Credit: IU School of Medicine

Data from the world's most comprehensive concussion study is now publicly available in a repository aimed at providing traumatic brain injury researchers access to a wealth of new knowledge.

The U.S. Department of Defense announced recently that data from the NCAA-DoD Concussion Assessment, Research and Education (CARE) Consortium is now available through the Federal Interagency Traumatic Brain Injury Research (FITBIR) informatics system.

Developed by the National Institutes of Health and the Department of Defense, the goal of the FITBIR informatics system is to help share data across the entire traumatic brain injury research field, and to make collaboration between laboratories easier than before.

With this addition, data from the CARE Consortium comprises nearly two-thirds of the entire FITBIR informatics system.

"An unprecedented amount of data on the short-term effects of concussions has been compiled through the CARE Consortium. Now that knowledge is easily accessible by investigators from around the world," said Dr. Thomas McAllister, chair of the Department of Psychiatry at IU School of Medicine and the leader of the study's administrative operations center. "All of these findings can help researchers start to answer important questions about concussion, and continue to grow the footprint of the CARE Consortium on advancing concussion science."

The CARE Consortium was established as part of the broader NCAA-DoD Grand Alliance in 2014, with the twin goals of understanding how concussions and repetitive head impacts affect the brain while identifying ways to improve diagnosis, treatment and prevention. The DoD support for CARE comes from the Office of the Assistant Secretary of Defense for Health Affairs through the Psychological Health and Traumatic Brain Injury Program under Award No. W81XWH-14-2-0151.

Led by Indiana University School of Medicine, the University of Michigan and the Medical College of Wisconsin, in collaboration with the Uniformed Services University, the study has collected data on more than 39,000 student-athletes and cadets at 30 colleges and military service academies--including more than 3,000 who have experienced concussions. This represents the largest sample of concussions ever researched in a single study.

"The continued partnership between the DoD and the NCAA to better understand the effects of concussion on athletes and service members is of high value to the military to enhance operational readiness," said Dr. Paul Pasquina, chair of the Department of Physical Medicine and Rehabilitation at Uniformed Services University.

The CARE Consortium's contributions, which account for the majority of the data in the FITBIR system, include:

88,286,447 data points

597,474 head impacts recorded from sensors embedded in helmets

1,216 MRI scans of concussed, exposed and control participants

2,719 unique blood samples for genomic and proteomic analyses

Additionally, the consortium researchers record data on demographics, medical family history, cognitive function, psychological health, balance and neurological function on all participants.

"The NCAA-DoD CARE Consortium is breathtaking in its depth and breadth, and has become a model of scientific collaboration, integrity and transparency that will advance the health and safety of student-athletes, service academy members, and the public at large," said Dr. Brian Hainline, NCAA Chief Medical Officer.

Credit: 
Indiana University School of Medicine

Researchers find evidence a cancer drug may be extended to many more patients

Image: Dr. W. Lee Kraus with Dr. Dae-Seok Kim. Credit: UTSW

DALLAS - July 24, 2019 - A new molecular mechanism discovered by UT Southwestern researchers indicates that drugs currently used to treat less than 10 percent of breast cancer patients could have broader effectiveness in treating all cancers where the drugs are used, including ovarian and prostate cancers. The new study also revealed a potential biomarker indicating when these drugs, called PARP inhibitors, can be unleashed in the fight against cancer.

"These findings could increase the patient population benefiting from these drugs by two, three, or four-fold. Up to 70 percent of breast cancer patients could now be good candidates," said Dr. W. Lee Kraus, Director of the Green Center for Reproductive Biology Sciences at UT Southwestern. "We have found that PARP inhibitors can act by a mechanism that is different from those previously identified, which rely on BRCA-dependent DNA repair pathways."

This research helps explain why breast cancer patients can be responsive to PARP inhibitors even if they don't have BRCA gene mutations.

The Kraus team's findings were published in the journal Molecular Cell on July 24.

PARP inhibitors were approved by the FDA in 2014 for the treatment of ovarian cancers containing BRCA mutations, rare genetic mutations that disable a DNA repair pathway in cancer cells. The FDA also approved PARP inhibitors for breast cancer treatment in 2018. In their current use, doctors prescribe PARP inhibitors to disable a second DNA repair pathway, making it difficult for cancer cells to survive.

Dr. Kraus' lab discovered that while this war on DNA repair is being waged, PARP inhibitors are also battling for dominance elsewhere in the cancer cell. It is an important, effective fight previously unknown to science. The PARP inhibitors also attack the machinery that makes proteins, called ribosomes.

"Cancer cells are addicted to ribosomes. Cancer cells grow fast and must make proteins to support cell division and other essential processes going on in the cell. If you can slow down or inhibit the production of ribosomes, then you can slow down the growth of the cancer cell," Dr. Kraus said.

This new understanding changes the way that scientists and clinicians think about PARP inhibitors and their clinical applications, which previously have been focused on DNA repair pathways since the initial discoveries in 2005. It took more than a decade to get PARP inhibitors approved by the FDA. New applications of PARP inhibitors based on Dr. Kraus' discovery could reach patients much quicker because three PARP inhibitor drugs are already approved and in use.

"The historical view is that cancers need the mutated BRCA gene to be sensitive to PARP inhibitors. That's what most scientists and clinicians thought," Dr. Kraus said. "But what the field is now coming to realize is that's just not true."

The realizations Dr. Kraus mentioned come from recent laboratory science, preclinical studies, and clinical trials throughout the nation that show additional signs of PARP inhibitors' effectiveness in the absence of BRCA mutations. But a clear molecular explanation for these effects has been lacking - until now.

The new study maps out this molecular pathway in its entirety and identifies a potential biomarker, a clinical test, that might indicate which patients may benefit from PARP inhibitors. The biomarker is based on a protein called DDX21, which is required for the production of ribosomes in small subcellular compartments called nucleoli. The presence and function of DDX21 in the nucleolus require PARP-1, the target of PARP inhibitors. Treatment with PARP inhibitors blocks DDX21 function and causes it to leak out of the nucleolus and disperse throughout the nucleus, thus inhibiting ribosome production. High levels of DDX21 in the nucleolus indicate cancers that might be the most responsive to PARP inhibitors.

The Kraus team found the new pathway and potential biomarker by examining a wide spectrum of breast cancer cells, some of which naturally have low levels of PARP. The low-PARP-level cells behaved like cells in which PARP activity was reduced by PARP inhibitors. The discovery builds on 15 years of PARP research so intense that Dr. Kraus' laboratory team put a molecular model of PARP-1 on his birthday cake.

"We started by trying to identify new molecular mechanisms and pursued this line of inquiry. We didn't know where the study would lead," he said. "We started as pure basic scientists, but as the study progressed the clinical relevance became more evident."

The next step is clinical trials, which Dr. Kraus is currently developing with UT Southwestern oncologists who treat breast and ovarian cancers.

Credit: 
UT Southwestern Medical Center

High blood pressure treatment and nursing home residents

Although 27 percent of all older adults who live in nursing homes in this country have both high blood pressure and dementia, we don't have enough research yet to inform healthcare providers about the best way to treat their high blood pressure.

Specifically, we don't know when the benefits of taking medication to lower blood pressure outweigh the potential risks, especially in older adults who also have moderate to severe dementia and a poor prognosis (the medical term for the likely course of a disease). That's because clinical trials for high blood pressure treatments typically do not include older adults who have severe chronic illnesses or disabilities.

A team of researchers designed a study to learn more about the best high blood pressure treatments for older adults who live in nursing homes. Their study was published in the Journal of the American Geriatrics Society.

The research team used information from Medicare records. The team identified 255,670 long-term nursing home residents in the United States during 2013 who had high blood pressure. Of these, nearly half had moderate or severe dementia-related difficulties with thinking and decision-making. Slightly more than half of them had no or only mild cognitive impairment.

The study's participants were about 85 years old on average. They had moderate impairment of their physical function and about 3 percent were receiving hospice care or had a life expectancy of six months or less.

At the beginning of the study, participants were receiving high blood pressure treatment.

The most common high blood pressure medicines received were beta blockers, followed by calcium channel blockers, and angiotensin converting enzyme (ACE) inhibitors. Nearly half the participants were taking more than one high blood pressure drug.

After following the participants for 180 days, the researchers learned that residents taking a higher number of high blood pressure medications (more intensive treatment) were slightly more likely to be hospitalized for any cause, or for heart disease specifically, than those taking fewer high blood pressure medications. However, they were slightly less likely to experience a decline in their physical abilities than residents taking fewer high blood pressure medications. This was true whether or not the residents had dementia.

The researchers said that their study's findings suggest that long-term nursing home residents with high blood pressure do not experience significant benefits from more intensive treatment. "Older adults and their caregivers should be aware that intensive treatment for high blood pressure may not be helpful in long-term nursing home residents. It is reasonable to consider reducing the dose of these drugs or discontinuing their use in residents with dementia, if doing so is consistent with their goals of care."

Credit: 
American Geriatrics Society

PrEP use high but wanes after three months among young African women

Image: Single pills (brand name Truvada) containing two antiretroviral drugs, emtricitabine (FTC) and tenofovir disoproxil fumarate (TDF), used for pre-exposure prophylaxis, or PrEP. Credit: NIAID

In a study of open-label Truvada as daily pre-exposure prophylaxis (PrEP) to prevent HIV among 427 young African women and adolescent girls, 95% initiated the HIV prevention strategy, and most used PrEP for the first three months. However, PrEP use fell among participants in this critical population during a year of follow-up clinic visits, although HIV incidence at 12 months was low. The preliminary results suggest that tailored, evidence-based adherence support strategies may be needed to durably engage young African women in consistent PrEP use. The study, known as HPTN 082, was supported by the National Institute of Allergy and Infectious Diseases (NIAID) and the National Institute of Mental Health (NIMH), both parts of the National Institutes of Health. The data were presented at the 10th International AIDS Society Conference on HIV Science.

Young women and girls in sub-Saharan Africa account for 3 million of the 4 million people aged 15-26 with HIV in the region. The NIH-sponsored HIV Prevention Trials Network (HPTN) reports that recent clinical trials had unacceptably high HIV incidence rates of 5-6% per year among young African women in this age group.

"Young women and girls in sub-Saharan Africa must be empowered to make informed choices about HIV prevention methods, including PrEP, that have the potential to protect individuals' health as well as turn the tide of the HIV epidemic," said NIAID Director Anthony S. Fauci, M.D. "The new data suggest that we need to do more to help this population use PrEP every day as prescribed to effectively prevent HIV acquisition."

HPTN investigators conducted the Phase 4 clinical trial, known as "Evaluation of Daily Oral PrEP as a Primary Prevention Strategy for Young African Women: A Vanguard Study." At clinic visits three months after initiating PrEP, 84% of 371 study participants who returned for follow-up had detectable levels of tenofovir diphosphate (TFV-DP)--a metabolite of one of the antiretrovirals in Truvada--in their blood according to dried blood spot analyses. Among these participants, 25% had TFV-DP levels suggesting "high adherence." At six- and 12-month follow-up visits, investigators found that participants with detectable levels of TFV-DP declined to 57% and 31%, respectively. Only 9% of young women had TFV-DP levels associated with high adherence at 12 months. Overall, four women acquired HIV during the study, all of whom had undetectable drug levels in their blood, suggesting limited adherence to PrEP.

"Multiple efficacy studies have shown that taking PrEP consistently every day provides high levels of protection against the acquisition of HIV," said Connie Celum, M.D., M.P.H., HPTN 082 protocol chair at the University of Washington in Seattle. "Unfortunately, as with all routine medication--especially for prevention purposes--daily adherence can be a challenge. Our findings suggest that our participants initially adhered well to PrEP. We need to develop effective ways to support young African women who still desire PrEP after this initial period to maintain their adherence and optimize their protection."

In the study, HIV-negative women and girls--ages 16 to 25--received HIV prevention counseling at an initial visit and were offered daily oral Truvada as PrEP. Participants who accepted PrEP as well as those who initially declined PrEP were followed for up to a year at clinics in Cape Town and Johannesburg, South Africa, and Harare, Zimbabwe. All participants who accepted PrEP received standard adherence support services, including counseling, information about peer clubs and SMS text messaging reminders. Approximately half of participants who accepted PrEP were randomized to also receive counseling that included information about their drug-level results throughout the clinical trial. Investigators saw no difference in drug levels between the two groups at three, six or 12 months, indicating that drug level feedback did not improve PrEP adherence.

The HPTN 082 team plans to conduct further analyses, including qualitative interviews with up to 75 study participants, to assess why some women chose to accept or decline PrEP, and to identify specific adherence challenges. Interviews also may illuminate whether a participant's risk behavior or perception of personal HIV risk changed throughout the study, informing a potential choice to continue or discontinue PrEP use.

"Our data offer useful insights into the lives of our study participants," said Sinead Delany-Moretlwe, M.B.Ch.B., Ph.D., HPTN 082 protocol co-chair at the University of Witwatersrand in Johannesburg, South Africa. "In our future analyses, we have an opportunity to capture more complete information about the lives, sexual behavior and adherence support needs of these young women in order to better provide services and strategies that work for them."

In addition to indicating the need for more effective adherence support services, the findings also suggest that HIV prevention methods that do not rely on daily adherence may be advantageous in this population.

Credit: 
NIH/National Institute of Allergy and Infectious Diseases

Compensatory strategies to disguise autism spectrum disorder may delay diagnosis

First scientific study of compensatory strategies -- techniques used to camouflage autism -- finds that they have both positive and negative outcomes, increasing social integration but possibly also resulting in poor mental health for autistic people, and that they can be a barrier to diagnosis.

For the first time, compensatory strategies used by people with autism have been investigated and collated in a qualitative study using an online survey of 136 adults, published in The Lancet Psychiatry journal. The study finds that the use of compensatory strategies is associated with both positive and negative consequences. Compensation improves social relationships, increases independence and employment, but may also be associated with poor mental health and delayed diagnosis. The preliminary results highlight the need for increased awareness of these strategies among clinicians and for greater support for those who need it.

Autism spectrum disorder is characterised by social communication impairments and by repetitive and restricted behaviours. There is limited understanding of why some autistic people appear neurotypical in their behaviours, despite having autism-related cognitive difficulties or differences.

Compensation is an adaptive process whereby new behaviours are generated in order to avoid negative outcomes. It is different from masking, where presumed undesirable behaviours are hidden or stopped. In the case of people with autism, people may use past experience or logic to respond to social situations to increase opportunities and "fit in" with society. However, they continue to be autistic at a neurocognitive level and this can lead to challenges in diagnosing and supporting individuals.

Lucy Livingston, lead author of the study from King's College London, UK, says: "This study highlights that compensation is an adaptive response to external societal pressures. This finding is in line with research that autistic people are, despite the negative impact on their wellbeing, driven to meet society's expectations of behaviour. Neurotypical society could do more to accommodate people with autism, which we speculate might reduce the need for them to compensate." [1]

The authors recruited participants via social media advertisements and through the National Autistic Society, and 136 adults completed an online study. Of the participants, 58 had a clinical diagnosis of autism, 19 self-identified as autistic without a formal diagnosis, and 59 were neither diagnosed nor self-identified but reported social difficulties. The study looked into what compensatory strategies participants used, whether the strategies were similar in diagnosed and undiagnosed people, and how compensatory strategies affected diagnosis.

The participants were asked to self-report autistic traits by completing a ten-item autism spectrum questionnaire and then a series of open questions about their social compensatory strategies. They also reported how successful and tiring their strategies were, and the likelihood of their recommending them to others with social difficulties.

The team identify several strategies used by people with and without an autism diagnosis (including behavioural masking such as holding back true thoughts or suppressing atypical behaviours, shallow and deep compensation such as planning and rehearsing conversations or learning rules about verbal and non-verbal behaviours, and accommodation strategies such as going out of your way to be helpful - for a full list see [2]), which were used equally by people who were formally diagnosed with autism and those who were not.

Shallow strategies such as laughing after joke cues were common in participants who reported more autistic traits and were linked to negative consequences of compensation. Crucially these shallow strategies are more difficult in stressful situations or when tired.

Participants used their intellectual and planning abilities to regulate social behaviour and follow social norms, such as making eye contact, preplanning social niceties, asking others questions about themselves, and switching between social rules. These strategies were more difficult when distracted or stressed, but crucially, they did not reduce participants' internal social cognitive difficulties.

There were wide-ranging motivations for using these strategies, most notably social motivation and a desire to develop meaningful relationships. One participant says: "With compensation, I have a job in which people respect my work and ask for my help and opinions...I am liked by my colleagues and friends...I haven't lived on the edge, lost and lonely, as I could have. I have been super super lucky."

There was also a perception that neurotypical individuals could "see through" these strategies. One participant reported: "There are obvious flaws, if you are observant - I repeat myself or use tv/film phrases and sometimes say things that are out of place." And another noted: "I feel like I am acting most of the time and when people say that I have a characteristic, I feel like a fraud because I've made that characteristic appear." (See panel in paper for other participant quotes).

The use of these strategies was linked with poor mental health, and autism diagnosis and support appeared to be affected too: 47 of the 58 diagnosed participants received their diagnosis late, in adulthood. The other 11 were diagnosed before the age of 18 years.

External environments were found to affect compensation and it may be the case that people with autism present as neurotypical in certain situations but not in others. Clinicians should be aware of this when measuring compensation and diagnosing autism. Recent evidence suggests that only 40% of UK general practitioners--the first point of contact for individuals seeking diagnosis--are confident in identifying autism spectrum disorder.

Lucy Livingston says: "Until now, no study has directly investigated compensatory strategies used by autistic people in social situations and we provide evidence for their existence and modulation by various factors. Because they present a barrier to diagnosis of autism, increasing awareness of compensatory strategies among clinicians will help detection and the provision of support for autistic people who use them. We hope this study will lead to the refinement of diagnostic manuals which currently contain little guidance on compensatory strategies in autism and co-occurring mental health conditions." [1]

The authors note some limitations, including that their sample included a high proportion of female, late-diagnosed and well-educated participants. Because the study relied on self-reporting, it may not have captured subconscious behaviours. Future research should take both of these limitations into account, using more representative population samples and quantitative measures. Additionally, the role of other factors that affect social pressure, such as multiple intersecting identities, and their part in compensation could be investigated in future.

In a linked Comment article, Dr Julia Parish-Morris of the Center for Autism Research at the Children's Hospital of Philadelphia, USA, says: "Although many people compensate during social interaction, it can be an especially exhausting and distressing exercise for people with autism spectrum disorder. This finding raises the question: should subjective distress be listed in the diagnostic criteria for autism spectrum disorder? For example, DSM-5 could be revised to read: 'Symptoms cause clinically significant impairment in social, occupational, or other important areas of current functioning [including subjective distress].'"

Credit: 
The Lancet

University of Guelph researchers unlock access to pain relief potential of cannabis

Image: Professor Tariq Akhtar. Credit: University of Guelph

University of Guelph researchers are the first to uncover how the cannabis plant creates important pain-relieving molecules that are 30 times more powerful at reducing inflammation than Aspirin.

The discovery unlocks the potential to create a naturally derived pain treatment that would offer potent relief without the addiction risk of other painkillers.

"There's clearly a need to develop alternatives for relief of acute and chronic pain that go beyond opioids," said Prof. Tariq Akhtar, Department of Molecular and Cellular Biology, who worked on the study with MCB professor Steven Rothstein. "These molecules are non-psychoactive and they target the inflammation at the source, making them ideal painkillers."

Using a combination of biochemistry and genomics, the researchers were able to determine how cannabis makes two important molecules called cannflavin A and cannflavin B.

Known as "flavonoids," cannflavins A and B were first identified in 1985, when research verified they provide anti-inflammatory benefits that were nearly 30 times more effective gram-for-gram than acetylsalicylic acid (sold as Aspirin).

However, further investigation into the molecules stalled for decades in part because research on cannabis was highly regulated. With cannabis now legal in Canada and genomics research greatly advanced, Akhtar and Rothstein decided to analyze cannabis to understand how Cannabis sativa biosynthesizes cannflavins.

"Our objective was to better understand how these molecules are made, which is a relatively straightforward exercise these days," said Akhtar. "There are many sequenced genomes that are publicly available, including the genome of Cannabis sativa, which can be mined for information. If you know what you're looking for, one can bring genes to life, so to speak, and piece together how molecules like cannflavins A and B are assembled."

With the genomic information at hand, they applied classical biochemistry techniques to verify which cannabis genes were required to create cannflavins A and B. Their full findings were recently published in the journal Phytochemistry.

These findings provide the opportunity to create natural health products containing these important molecules.

"Being able to offer a new pain relief option is exciting, and we are proud that our work has the potential to become a new tool in the pain relief arsenal," said Rothstein.

Currently, chronic pain sufferers often need to use opioids, which work by blocking the brain's pain receptors but carry the risk of significant side effects and addiction. Cannflavins would target pain with a different approach, by reducing inflammation.

"The problem with these molecules is they are present in cannabis at such low levels, it's not feasible to try to engineer the cannabis plant to create more of these substances," said Rothstein. "We are now working to develop a biological system to create these molecules, which would give us the opportunity to engineer large quantities."

The research team has partnered with a Toronto-based company, Anahit International Corp., which has licensed a patent from the University of Guelph to biosynthesize cannflavin A and B outside of the cannabis plant.

"Anahit looks forward to working closely with University of Guelph researchers to develop effective and safe anti-inflammatory medicines from cannabis phytochemicals that would provide an alternative to non-steroidal anti-inflammatory drugs," said Anahit chief operating officer Darren Carrigan.

"Anahit will commercialize the application of cannflavin A and B to be accessible to consumers through a variety of medical and athletic products such as creams, pills, sports drinks, transdermal patches and other innovative options."

Credit: 
University of Guelph