
Epilepsy drugs linked to increased risk of suicidal behaviour, particularly in young people

Treatment with gabapentinoids - a group of drugs used for epilepsy, nerve pain and anxiety disorders - is associated with an increased risk of suicidal behaviour, unintentional overdose, injuries, and road traffic incidents, finds a study from Sweden published by The BMJ today.

Prescriptions have risen steeply in recent years, and gabapentinoids are among the top 15 drugs globally in terms of revenue.

The risks are strongest among 15 to 24 year-olds, prompting the researchers to suggest that treatment guidelines for young people should be reviewed.

Previous studies have linked gabapentinoids to suicidal behaviour and overdose-related deaths, but findings have been inconsistent and data on longer term harms are lacking.

Concerns that these drugs are also being used as an opioid substitute and for recreational use have led to prescribing restrictions in several countries, including the UK.

To help fill this evidence gap, an international research team examined associations between gabapentinoids and a range of harms including suicidal behaviour, unintentional overdose, injuries, road traffic incidents, and violent crime.

Using national prescription, patient, death, and crime registers, they identified 191,973 people aged 15 years and older who were prescribed pregabalin or gabapentin in Sweden between 2006 and 2013.

Overall, 59% of participants were women, and most were 45 years or older.

The researchers then compared the risk of harms during treatment periods with baseline risk during periods without treatment.
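The logic of that within-individual comparison can be illustrated with a small calculation. The sketch below is illustrative only: the event counts are hypothetical, chosen to reproduce the 26% figure reported for suicidal behaviour, and the study itself used adjusted statistical models rather than this crude ratio.

```python
# Toy illustration of a within-individual (self-controlled) comparison:
# event rates during treated vs untreated periods, pooled across people.
# All counts are hypothetical; the real analysis adjusted for other factors.

def rate(events, person_years):
    return events / person_years

on_treatment = rate(252, 100_000)    # events per person-year while on a gabapentinoid
off_treatment = rate(200, 100_000)   # events per person-year while off treatment

rate_ratio = on_treatment / off_treatment   # 1.26, i.e. 26% higher during treatment
```

Because each person serves as their own control, stable characteristics such as genetics or long-standing health problems are automatically held constant in this kind of design.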

After taking account of potentially influential factors, they found that during treatment periods, participants were at a 26% increased risk of suicidal behaviour or death from suicide, a 24% increased risk of unintentional overdose, a 22% increased risk of head or body injuries, and a 13% increased risk of road traffic incidents or offences.

There were no statistically significant associations between gabapentinoid treatment and violent crime.

When drugs were examined separately, only pregabalin, not gabapentin, was associated with increased risks of harm.

And when the results were analysed by age, risks were greatest among 15 to 24 year-olds. This could be due to impulsivity and risk taking behaviour, or use of alcohol and illicit drugs alongside gabapentinoids, suggest the authors.

This is an observational study, so can't establish cause, and the researchers weren't able to account for drug adherence or any interplay between alcohol and illicit drug use. Nevertheless, this was a large study examining a wide range of outcomes and designed to minimise the effects of unmeasured (confounding) factors.

Further research is needed to better understand the increased risks found in adolescents and young adults prescribed gabapentinoids, particularly for suicidal behaviour and unintentional overdoses, while clinical guidelines may also need review, they conclude.

In a linked editorial, Derek Tracy, a consultant psychiatrist at Queen Mary's Hospital in London, says these findings provide "solid data to inform patients on the risks associated with treatment."

The findings suggest it might be time to uncouple pregabalin and gabapentin for the purposes of legislation and guidelines, he writes. "We also need to understand what is driving the age related differences in risks and how recent legal restrictions will affect the illicit market in diverted drugs."

Despite reasonable concerns, gabapentinoids "remain a valued therapeutic option for many people," he concludes. "Medicines can harm as well as heal, and the best treatment decisions are made in full partnership with patients, after consideration of all available evidence on both."

Peer-reviewed? Yes (research); No (linked editorial)
Evidence type: Observational; Opinion
Subjects: People

Credit: 
BMJ Group

Braces won't always bring happiness

Research undertaken at the University of Adelaide overturns the belief that turning your crooked teeth into a beautiful smile will automatically boost your self-confidence.

The study, carried out by Dr Esma Dogramaci and Professor David Brennan from the University of Adelaide's Dental School, followed 448 13-year-olds from South Australia in 1988 and 1989. By the time they turned 30, in 2005 and 2006, more than a third of them had received orthodontic treatment.

"The study, which is the first of its type undertaken in Australia and only the second in the world, examined whether having braces led to greater happiness or better psychosocial outcomes later in life," says Dr Dogramaci.

"There was a pattern of higher psychosocial scores in people who did not have orthodontic treatment, meaning people who hadn't had braces fitted were significantly more optimistic than those who did have braces.

"Those who didn't have braces had varying levels of crooked teeth, just like those who had braces treatment - ranging from mild through to very severe."

The study looked at four psychosocial aspects: how well people felt they coped with new or difficult situations and associated setbacks; how much they felt they could take care of their own health; the support they believed they received from their personal network; and finally their own level of optimism.

"These indicators were chosen because they are important for psychosocial functioning and are relevant to health behaviours and health outcomes, given that the core research question was the impact of braces treatment on patients' self-confidence and happiness in later life," says Dr Dogramaci.

Fourth year dental student Alex Furlan has never had braces fitted: "My orthodontist recommended that I have braces fitted but I'm quite happy without them. I've never felt the need to straighten my teeth - I can get on in life without having perfectly straight teeth," he says.

"A lot of people are convinced that if they have braces, they will feel more positive about themselves and do well, psychosocially, in later life. This study confirmed that other factors play a role in predicting psychosocial functioning as adults - braces as a youngster was not one of them," says Dr Dogramaci.

"But brushing at least twice a day and seeing a dentist regularly were amongst the factors related to better psychosocial scores."

"On a population level, those who have never had braces were more positive than those who had braces. While experiencing braces treatment won't guarantee happiness later in life, brushing teeth twice a day and seeing a dentist for regular check-ups will help to keep you healthy and happy."

Credit: 
University of Adelaide

Determining risk of recurrence in triple-negative breast cancer

image: Katherine Varley in her lab at Huntsman Cancer Institute at the University of Utah.

Image: 
Huntsman Cancer Institute

SALT LAKE CITY - A personalized prognosis for patients diagnosed with triple-negative breast cancer was the goal of a new study by Katherine Varley, PhD, researcher at Huntsman Cancer Institute (HCI) and assistant professor of oncological sciences at the University of Utah.

Twenty percent of women diagnosed with breast cancer in the United States will learn they have triple-negative breast cancer. That diagnosis means the three most common proteins known to fuel breast cancer growth--estrogen receptor, progesterone receptor, and HER2--are not present in the tumor. Those patients will not respond to any of the targeted therapies developed to treat breast cancer with those characteristics. After surgery, their only treatment option is chemotherapy. Targeted therapy allows healthy cells to survive, but chemotherapy can kill normal cells when eliminating the cancer cells.

Sixty percent of patients with triple-negative breast cancer will survive more than five years without disease, but four out of ten women will have a rapid recurrence of the disease. There are currently no clinical tests to assess an individual patient's prognosis, so all patients receive aggressive chemotherapy that can include up to four chemotherapy drugs and six months of treatment. Varley's new findings, recently published in Cancer Research, could change that. "We could very accurately predict which patients were going to have long-term disease-free survival and which patients were likely to have recurring disease. This is very exciting because it could be the first clinical test to enable personalized prognosis for triple-negative breast cancer patients," said Varley.

Varley previously discovered that triple-negative breast cancer patients whose tumors naturally turned on an immune response were disease-free for much longer than those whose tumors did not. The objective of the new study was to find a way to translate this discovery into a clinical test to determine which patients have an inherently good prognosis and might safely be treated with less aggressive therapy. "That's significant because chemotherapy can lead to long-term heart and nerve problems," Varley noted. "If we can understand which patients need aggressive treatment and which patients will likely do well with less aggressive treatment, we could make a big difference in their lives."

Varley worked closely on the study with Rachel Stewart, DO, PhD, assistant professor of pathology and laboratory medicine at the University of Kentucky. They used specimens from patients treated at HCI. The tumor samples were taken more than five years ago, so the researchers could determine how each patient fared in the long term. The next step was developing a way to test for biomarkers of the immune response. The biomarker test was developed using formalin-fixed, paraffin-embedded tissues. This is important because it means this test can be run on tumor biopsy specimens that are routinely collected for breast cancer diagnosis.

The research team is currently applying the test to triple-negative breast cancer patient samples from clinical trials of chemotherapy and immunotherapy. Their next step is to validate that the test can be used to predict prognosis and choose the most effective and safest treatments. They are also investigating whether this test could be used for patients with HER2 positive breast cancer, lung cancer, ovarian cancer, and melanoma because the immune response is similar in those diseases.

"We're working as fast as possible to validate the test so it can benefit patients," said Varley. "One of my goals is to translate the discoveries we make in basic science and in our genomics research into clinical tests because I know patients are waiting."

Credit: 
Huntsman Cancer Institute

Increasing red meat consumption linked with higher risk of premature death

People who increased their daily servings of red meat over an eight-year period were more likely to die during the subsequent eight years compared to people who did not increase their red meat consumption, according to a new study led by researchers from Harvard T.H. Chan School of Public Health. The study also found that decreasing red meat and simultaneously increasing healthy alternative food choices over time was associated with lower mortality.

The study will appear online June 12, 2019 in The BMJ.

A large body of evidence has shown that higher consumption of red meat, especially processed red meat, is associated with higher risk of type 2 diabetes, cardiovascular disease, certain types of cancers including those of the colon and rectum, and premature death. This is the first longitudinal study to examine how changes in red meat consumption over time may influence risk of early death.

For this study, researchers used health data from 53,553 women in The Nurses' Health Study and 27,916 men in the Health Professionals Follow-up Study who were free of cardiovascular disease and cancer at baseline. They looked at whether changes in red meat consumption from 1986-1994 predicted mortality in 1994-2002, and whether changes from 1994-2002 predicted mortality in 2002-2010.

Increasing processed red meat intake by half a daily serving or more was associated with a 13% higher risk of mortality from all causes, while the same increase in unprocessed red meat was associated with a 9% higher risk. The researchers also found significant associations between increased red meat consumption and increased deaths due to cardiovascular disease, respiratory disease, and neurodegenerative disease.

The association of increases in red meat consumption with increased relative risk of premature mortality was consistent across participants irrespective of age, physical activity level, dietary quality, smoking status, or alcohol consumption, according to the researchers.

Study results also showed that, overall, a decrease in red meat together with an increase in nuts, fish, poultry without skin, dairy, eggs, whole grains, or vegetables over eight years was associated with a lower risk of death in the subsequent eight years.

The researchers suggest that the association between red meat consumption and increased risk of death may be due to a combination of components that promote cardiometabolic disturbances, including saturated fat, cholesterol, heme iron, preservatives, and carcinogenic compounds produced by high temperature cooking. Red meat consumption has also recently been linked to gut microbiota-derived metabolite trimethylamine N-oxide (TMAO) that might promote atherosclerosis.

"This long-term study provides further evidence that reducing red meat intake while eating other protein foods or more whole grains and vegetables may reduce risk of premature death. To improve both human health and environmental sustainability, it is important to adopt a Mediterranean-style or other diet that emphasizes healthy plant foods," said senior author Frank Hu, Fredrick J. Stare Professor of Nutrition and Epidemiology and chair, Department of Nutrition.

Credit: 
Harvard T.H. Chan School of Public Health

Increasing red meat intake linked with heightened risk of death

Increasing red meat intake, particularly processed red meat, is associated with a heightened risk of death, suggests a large US study published in The BMJ today.

However, reducing red meat intake while increasing healthy protein sources, such as eggs and fish, whole grains and vegetables over time may lower the risk, the researchers say.

High intake of red meat, such as beef, pork and lamb, has been previously linked with a higher risk of type 2 diabetes, cardiovascular disease, certain types of cancers, and premature death. But little is known about how changes in red meat intake may influence risk of death.

So to explore this further, a team of researchers based in the US and China looked at the link between changes in red meat consumption over an eight-year period and mortality during the next eight years, starting from 1986 to the end of follow-up in 2010.

They used data for 53,553 US registered female nurses, aged 30 to 55, from the Nurses' Health Study (NHS) and 27,916 US male health professionals, aged 40 to 75, from the Health Professionals Follow-up Study (HPFS), who were free of cardiovascular disease and cancer at the start of the study.

Every four years the participants completed a food frequency questionnaire (FFQ) where they were asked how often, on average, they ate each food of a standard portion size in the past year, ranging from "never or less than once per month" to "6 or more times a day". They were then divided into five categories based on their changes in red meat intake.

During the study period, the total number of deaths from any cause (known as "all cause mortality") reached 14,019 (8,426 women and 5,593 men). The leading causes were cardiovascular disease, cancer, respiratory disease and neurodegenerative disease.

After adjusting for age and other potentially influential factors, increasing total red meat intake (both processed and unprocessed) by 3.5 servings a week or more over an eight-year period was associated with a 10% higher risk of death in the next eight years.

Similarly, increasing processed red meat intake, such as bacon, hot dogs, sausages and salami, by 3.5 servings a week or more was associated with a 13% higher risk of death, whereas increasing intake of unprocessed red meat was associated with a 9% higher risk.

These associations were largely consistent across different age groups, levels of physical activity, dietary quality, smoking and alcohol consumption habits.

Overall, reducing red meat intake while eating more whole grains, vegetables, or other protein foods such as poultry without skin, eggs and fish, was associated with a lower risk of death among both men and women.

For example, swapping out one serving per day of red meat for one serving of fish per day over eight years was linked to a 17% lower risk of death in the subsequent eight years.

Similar findings were seen in the shorter-term (four years) and longer-term (12 years) for the link between changes in red meat intake and mortality, and for replacing red meat with healthier food alternatives.

This is an observational study, and as such, can't establish cause. And the authors point out some limitations, including that they did not look at the reasons for changes in red meat consumption which could have influenced the results.

And the study participants were mainly white registered health professionals, so the findings may not apply more widely.

But the authors say that the data gathered covered a large number of people over a long follow-up period, with repeated assessment of diet and lifestyle factors, and consistent results between the two cohorts. What's more, this is the first study of its kind to examine the association between changes in red meat intake and subsequent risk of mortality.

The findings provide "a practical message to the general public of how dynamic changes in red meat consumption are associated with health," they write.

"A change in protein source or eating healthy plant based foods such as vegetables or whole grains can improve longevity," they conclude.

Credit: 
BMJ Group

Strobe lighting at dance music festivals linked to tripling in epileptic fit risk

Strobe lighting at electronic dance music festivals may be linked to a tripling in the risk of epileptic fits in susceptible individuals, suggests research published in the online journal BMJ Open.

Organisers need to issue warnings and advice on preventive measures, particularly for those who have a history of epilepsy that responds to flashing lights, known as photosensitive epilepsy, argue the researchers, who note that the popularity of dance music events generates revenues of US$ 5.7 billion every year worldwide.

Strobe lighting is known to heighten the risk of epileptic seizures in susceptible individuals. But the risks associated with attending electronic dance music festivals are not widely known, and organisers consequently don't routinely warn visitors about them.

Prompted by the case of a 20 year old who collapsed at one such festival and then experienced an epileptic seizure for the first time, the researchers decided to look in more detail at the potential health impacts of strobe lighting at dance music events.

They drew on data for incidents requiring medical assistance, including for ecstasy use, among 400,343 visitors to 28 daytime and nighttime electronic dance music festivals in The Netherlands throughout 2015.

They used data from one company which provides medical services to nearly all dance music festivals in The Netherlands.

Eyewitness reports of sudden loss of consciousness and muscle twitching combined with physical findings, such as evidence of tongue biting and temporary urinary incontinence, were used to inform a diagnosis of an epileptic seizure.

Some 241,543 people attended nighttime gigs, where strobe lighting was used, and 158,800 attended daytime gigs, where strobe lighting was less intense because of the effects of sunlight.

In all, medical assistance was provided on 2776 occasions. In 39 cases this was for an epileptic seizure, 30 of which occurred during nighttime gigs, meaning that the risk of a seizure associated with a nighttime event was 3.5 times greater than for a daytime event.

Use of ecstasy, which is the most commonly used recreational drug at dance music events, and which has been associated with heightened epileptic seizure risk, was more likely among those attending nighttime events: around one in four compared with one in ten of those attending daytime events.

But the proportion of cases in which the drug had been used was similar in both groups of visitors, suggesting that this alone wasn't responsible for the heightened seizure risk, suggest the researchers.

This is an observational study, and as such, can't establish cause. What's more, the researchers weren't able to account for other potentially influential factors, such as medical history, sleep deprivation, or use of other medication, and they relied on witness reports and on-site medical assessments, all of which may have affected the accuracy of the figures.

But, they write: "We think, however, that our numbers are probably an underestimate of the total number of people who had epileptic seizures."

And they add: "Regardless of whether stroboscopic light effects are solely responsible or whether sleep deprivation and/or substance abuse also play a role, the appropriate interpretation is that large [electronic dance music] festivals, especially during nighttime, probably cause at least a number of people per event to suffer epileptic seizures."

They advise anyone with photosensitive epilepsy to either avoid such events or to take precautionary measures, such as getting enough sleep and not taking drugs, not standing close to the stage, and leaving quickly if they experience any prodromal 'aura' effects.

"Given the large dataset, we believe our findings are externally valid, at least for other [electronic dance music] festivals in other countries which generally attract a similar audience," they conclude.

Credit: 
BMJ Group

Why Noah's ark won't work

video: A purple sea urchin, in a University of Vermont laboratory, part of a new study showing that, for ocean species to survive climate change, large populations will be needed.

Image: 
Joshua Brown/UVM

A Noah's Ark strategy will fail. In the roughest sense, that's the conclusion of a first-of-its-kind study that illuminates which marine species may have the ability to survive in a world where temperatures are rising and oceans are becoming acidic.

Two-by-two, or even moderately sized, remnants may have little chance to persist on a climate-changed planet. Instead, for many species, "we'll need large populations," says Melissa Pespeni, a biologist at the University of Vermont who led the new research examining how hundreds of thousands of sea urchin larvae responded to experiments where their seawater was made either moderately or extremely acidic.

The study was published on June 11, 2019, in the Proceedings of the Royal Society B.

RARE RELIEF

Pespeni and her team were surprised to discover that rare variants in the DNA of a small minority of the urchins were highly useful for survival. These rare genetic variants are "a bit like having one winter coat among fifty lightweight jackets when the weather hits twenty below in Vermont," Pespeni says. "It's that coat that lets you survive." When the water conditions were made extremely acidic, these rare variants increased in frequency in the larvae. These are the genes that let the next generation of urchins alter how various proteins function--like the ones they use to make their hard-but-easily-dissolved shells and manage the acidity in their cells.

But to maintain these rare variants in the population--plus other needed genetic variation that is more common and allows for response to a range of acid levels in the water--requires many individuals.

"The bigger the population, the more rare variation you'll have," says Reid Brennan, a post-doctoral researcher in Pespeni's UVM lab and lead author on the new study. "If we reduce population sizes, then we're going to have less fodder for evolution--and less chance to have the rare genetic variation that might be beneficial."

In other words, some organisms might persist in a climate-changed world because they're able to change their physiology--think of sweating more; some will be able to migrate, perhaps farther north or upslope. But for many others, their only hope is to evolve--rescued by the potential for change that lies waiting in rare stretches of DNA.

RAPID ADAPTATION

The purple sea urchins the UVM team studied in their Vermont lab are part of natural populations that stretch from Baja California to Alaska. Found in rocky reefs and kelp forests, these prickly creatures are a favorite snack of sea otters--and a key species in shaping life in the intertidal and subtidal zones. Because of their huge numbers, geographic range, and the varying conditions they live in, the urchins have high "standing genetic variation," the scientists note. This makes purple urchins likely survivors in the harsh future of an acidified ocean--and good candidates for understanding how marine creatures may adapt to rapidly changing conditions.

It is well understood that rising average global temperatures are a fundamental driver of the imminent extinction faced by a million or more species--as a recent UN biodiversity report notes. But it's not just rising averages that matter. It may be the hottest--or most acidic--moments that test an organism's limits and control its survival. And, as the UVM team writes, "the genetic mechanisms that allow rapid adaptation to extreme conditions have been rarely explored."

CURRENCY IN THE CURRENT SEA

The new study used an innovative "single-generation selection" experiment that began with twenty-five wild-caught adult urchins. Each female produced about 200,000 eggs, from which the scientists were able to extract DNA from pools of about 20,000 surviving larvae that were living in differing water conditions. This very large number of individuals gave the scientists a clear view that purple urchins possess a genetic heritage that lets them adapt to extremely acidic ocean water. "This species of sea urchin is going to be okay in the short term. They can respond to these low pH conditions and have the needed genetic variation to evolve," says UVM's Reid Brennan. "So long as we do our part to protect their habitats and keep their populations large."
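The pooled-sequencing arithmetic behind that design can be sketched as a toy example. All read counts below are hypothetical; the study's actual analysis compared allele frequencies across replicate pools and water treatments.

```python
# Toy sketch of pooled-sequencing allele frequencies in a single-generation
# selection experiment. Counts are invented for illustration: DNA from a pool
# of surviving larvae is sequenced, and a variant's frequency is estimated
# from the fraction of sequencing reads carrying it.

def allele_frequency(variant_reads, total_reads):
    return variant_reads / total_reads

# A rare variant before selection vs. among survivors of extreme-pH water.
f_before = allele_frequency(500, 50_000)    # 1.0% in the starting pool
f_after = allele_frequency(2_500, 50_000)   # 5.0% among extreme-pH survivors

enrichment = f_after / f_before   # a 5-fold increase under selection
```

Pooling is what makes it practical to track rare variants: no individual larva is genotyped, yet frequency shifts across tens of thousands of survivors become measurable.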

But coming through the ferocious challenge of rapid climate change may come at a high cost. "It's hopeful that evolution happens--and it's surprising and exciting that these rare variants play such a powerful role," says Melissa Pespeni, an assistant professor in UVM's biology department and expert on ocean ecosystems. "This discovery has important implications for long-term species persistence. These rare variants are a kind of currency that urchins have to spend," she says. "But they can only spend it once."

Credit: 
University of Vermont

Love songs from paradise take a nosedive

image: Fledgling tree finches may be infested in the nest.

Image: 
Dr Katharina Peters, Flinders University

The Galápagos Islands finches named after Charles Darwin are starting to sing a different tune because of a pest introduced to their once-pristine environment.

International bird ecology experts, including Professor Sonia Kleindorfer and Dr Katharina Peters from Flinders University in South Australia, have found the beaks of Darwin's finches have changed to cope with infestations of the parasite Philornis downsi - and this is now affecting the birds' mating powers.

In a new paper published in Proceedings of the Royal Society B, the researchers show that Darwin's finch males whose beaks and nostrils (nares) have been damaged by the parasitic invasion are producing "sub-par song".

"In our newest research, we show that Darwin's finch males whose nares have been deformed by the parasite had greater vocal deviation - which females didn't like during mate choice - and had songs with lower maximum frequency," says Professor Kleindorfer, adding this also "confused the species identity of the singer".

The researchers, including University of California, Berkeley Adjunct Professor Frank J Sulloway, conclude that the Galápagos investigation specifically showed that critically endangered medium tree finches with enlarged naris size produced song that was indistinguishable from song of other finches.

"Given that small tree finch and medium tree finch are hybridising on Floreana Island, we suspect that this blurred species-signalling function of song may be partly to blame for the observed reverse speciation that is currently occurring," Dr Peters says.

"This research is evidence that parasite-induced morphological deformation can disrupt host mating signal with devastating effects on bird populations.

"The Philornis downsi larvae - an accidentally introduced parasite - feed internally on the beaks of Galápagos birds, causing permanent structural damage and enlarged naris (nostril) size."

The so-called Darwin's finches captivated the British naturalist during his Galápagos research in the 1830s and became the first vertebrate system to provide compelling field-based evidence for evolution by natural selection.

Years after Darwin's investigations, the finches became known as Darwin's finches.

Credit: 
Flinders University

Rescuers often driven by emotion

Scientists from James Cook University and Royal Life Saving Society - Australia have found that reason can go out the window when people's family members, children and pets are in trouble in the water, and that people should be better trained in water rescue skills.

JCU's Associate Professor Richard Franklin was part of a study that examined successful rescues, and drownings where someone had died trying to rescue another in trouble in the water.

He said many drowning prevention organisations emphasise the need for people to think about whether they can safely complete a rescue before they attempt it.

"What we have found is that about half of all those being rescued were close family or friends of the rescuer, and half were aged under 10 years. It's possible that thoughts of self-preservation go out the window when the potential drowning victim is from these groups," Dr Franklin said.

He said that seventeen rescuers drowned between 2002 and 2017 while trying to rescue children, and another six, who died between January 2006 and December 2015, drowned while trying to rescue a dog.

He said another piece of information was also important.

"About 14% of the people who had completed a rescue were either current or former lifeguards, and none of them drowned in the attempt. This was almost certainly due to their training and experience," he said.

Dr Franklin said the scientists were calling for safe rescue and effective resuscitation skills to be taught in high schools and regularly refreshed.

"It's best to use primary prevention methods - targeted interventions such as concentrating on specific age groups, locations or activities - to prevent drownings, but we think secondary prevention measures such as rescues and resuscitation are also important for reducing the drowning toll," he said.

Amy Peden, Senior Research Fellow with Royal Life Saving Society - Australia, and an author of the paper, encouraged people to learn resuscitation and consider participating in a Bronze Medallion course.

"Most of those in our study were unprepared for undertaking rescues and often acted in the heat of the moment, to rescue a loved one. Having the skills to act in an emergency is vital to reducing the risk and avoiding an all too common scenario, the rescuer who drowns," she said.

Credit: 
James Cook University

Genetic marker linked to increased risk of diabetic peripheral neuropathy

image: Alessandro Doria, MD, PhD, MPH, Director of the Molecular Phenotyping and Genotyping Core at Joslin Diabetes Center and Professor of Medicine at Harvard Medical School

Image: 
John Soares

BOSTON - (June 11, 2019) - Researchers from Joslin Diabetes Center, using a genome-wide association study, have identified a genetic factor linked to the development of diabetic peripheral neuropathy. This finding suggests a new target for preventive therapies. The research has been published online and will appear in the August print issue of Diabetes.

While neuropathy, which causes pain or numbness in the legs and an increased risk of foot ulcers, is a major problem for many people with diabetes, there is significant variability in its onset: some people develop this complication, and others do not, says Alessandro Doria, MD, PhD, MPH, a study senior author and Director of the Molecular Phenotyping and Genotyping Core at Joslin Diabetes Center and Professor of Medicine at Harvard Medical School in Boston. "Therefore, we wanted to see if we could discover genetic factors that predispose people with diabetes to developing this complication versus being protected from it."

For this study, researchers used an approach called a genome-wide association study, or GWAS. This analysis is used to find disease-associated variants throughout the genome. A GWAS for diabetic peripheral neuropathy was carried out in 5,168 participants from the Action to Control Cardiovascular Risk in Diabetes (ACCORD) clinical trial --- 4,384 with evidence of peripheral neuropathy and 784 who were spared this complication.
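As a rough illustration of what a GWAS does at each of those millions of variants, the sketch below runs a single-SNP association test on a 2x2 table of risk-allele carriers among cases and controls. The counts are invented for illustration; they are not taken from the ACCORD data.

```python
# Illustrative single-SNP association test, of the kind a GWAS repeats
# millions of times across the genome. All counts below are hypothetical.

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table: cases (a carriers, b non-carriers)
    vs. controls (c carriers, d non-carriers)."""
    return (a * d) / (b * c)

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the same 2x2 table."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical carrier counts among 4,384 neuropathy cases and 784 controls
cases_carrier, cases_noncarrier = 2600, 1784
controls_carrier, controls_noncarrier = 380, 404

print(odds_ratio(cases_carrier, cases_noncarrier,
                 controls_carrier, controls_noncarrier))  # ≈ 1.55
print(chi2_2x2(cases_carrier, cases_noncarrier,
               controls_carrier, controls_noncarrier))    # ≈ 32
```

In a real GWAS, each SNP's statistic is converted to a p-value and judged against a genome-wide significance threshold to account for the millions of tests performed.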

After screening millions of small variations of the genome sequence (genetic variants), the study identified a region on chromosome 2q24 as having a powerful impact on the risk of peripheral neuropathy in type 2 diabetes. While the precise mechanisms are not known, there were some hints that the genetic variants in this region may act by affecting a nearby sodium channel regulating the transmission of sensory signals in peripheral nerves.

"People carrying the less frequent variant at that location were protected from neuropathy and people carrying the more common variant at that same location were predisposed to this complication," says Doria.

The implication is that this could be a target for pharmacological therapy to protect people from diabetic peripheral neuropathy. "We found that people with the protective allele have higher amounts of this sodium channel," says Doria. "This suggests that the sodium channel in the peripheral nerves might be used to protect people from neuropathy, by developing a drug that activates this channel."

This finding was replicated in an independent study, the Bypass Angioplasty Revascularization Investigation 2 Diabetes (BARI 2D) trial.

"The study is important because it's the first real effort to have a genome wide search for genes predisposing to this complication of diabetes. Diabetic peripheral neuropathy is often overlooked," says Hetal Shah, MD, MPH - a study senior author and a Research Associate at the Joslin Diabetes Center and Instructor of Medicine at Harvard Medical School. "Yet nearly one-fourth of the annual US expenditure on diabetes is due to diabetic peripheral neuropathy."

One limitation of this study is that it included only white subjects, so it's not known whether these findings also apply to people of other races.

Credit: 
Joslin Diabetes Center

Superweed resists another class of herbicides, study finds

image: Aaron Hager stands in a soybean field infested with multiple-herbicide-resistant waterhemp, a superweed becoming harder and harder to kill. Hager and his co-authors demonstrate that the weed is now resistant to one more class of popular herbicide products.

Image: 
L. Brian Stauffer, University of Illinois

URBANA, Ill. - We've all heard about bacteria that are becoming resistant to multiple types of antibiotics. These are the so-called superbugs perplexing and panicking medical science. The plant analogue may just be waterhemp, a broadleaf weed common to corn and soybean fields across the Midwest. With resistance to multiple common herbicides, waterhemp is getting much harder to kill.

In a new study from the University of Illinois, scientists document waterhemp's resistance to yet another class of herbicides, known as Group 15s. The study provides the first documentation of a non-grass plant resistant to Group 15 herbicides.

There are many herbicides on the market, but they all fall into one of 16 classes describing their mode of action (MOA), or specific target in the plant that the chemical attacks. Because of various regulations and biological realities, a smaller number of herbicide MOAs can be used on any given crop and the suite of weeds that goes along with it. Historically, about nine have been useful for waterhemp - and now the weed appears to be resistant to at least seven.

"In some areas, we're one or two MOAs away from completely losing chemical control of waterhemp and other multiple-herbicide-resistant weeds," says Adam Davis, head of the Department of Crop Sciences at Illinois and co-author on the study. "And there are no new herbicide MOAs coming out. There haven't been for 30 years."

Illinois weed scientist and co-author Aaron Hager adds, "We don't want to panic people, but farmers need to be aware this is real. It continues on with the challenges we've warned people about for years."

The research team tested the effectiveness of soil-applied Group 15 herbicides in a Champaign County population already resistant to five MOAs. They applied eight Group 15 formulations in the field at their label rates, and chose three - non-encapsulated acetochlor (Harness), S-metolachlor (Dual Magnum), and pyroxasulfone (Zidua) - for a rate-titration experiment in which the herbicides were applied at one-half, one, two, and four times the label rate.

The eight Group 15 products varied in their effectiveness, with encapsulated acetochlor (Warrant), S-metolachlor, metolachlor (Stalwart), and dimethenamid-P (Outlook) performing the worst. These products provided less than 25% control 28 days after application and less than 6% control 14 days later.

Of the rate-titration experiment, Hager says, "We found we could apply significantly higher than the labeled dose and still see resistance." For example, S-metolachlor provided only 10% control at the standard label rate, 20% at 2x the label rate, and 45% at 4x the label rate.
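As a back-of-envelope illustration only (a real analysis would fit a proper dose-response model to the full titration data), a straight-line fit to the three quoted S-metolachlor points on a log2 rate scale suggests how far beyond the label rate even 50% control would sit:

```python
# Rough extrapolation from the three control percentages quoted above.
# Illustrative only: not the study's statistical method.
import math

rates = [1, 2, 4]        # multiples of the label rate
control = [10, 20, 45]   # percent control reported for S-metolachlor

# Least-squares line: control (%) vs. log2(rate multiple)
x = [math.log2(r) for r in rates]
xm, ym = sum(x) / 3, sum(control) / 3
slope = (sum((xi - xm) * (yi - ym) for xi, yi in zip(x, control))
         / sum((xi - xm) ** 2 for xi in x))
intercept = ym - slope * xm

# Rate multiple at which the fitted line reaches 50% control
rate_for_50 = 2 ** ((50 - intercept) / slope)
print(round(rate_for_50, 1))  # over 5x the label rate for even 50% control
```

Extrapolating this crudely, even 50% control would require more than five times the labeled dose, consistent with Hager's point that resistance persists well above label rates.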

Hager says farmers might not notice the poor performance of these soil-applied pre-emergence herbicides because waterhemp germinates continuously throughout the season. When a weed pops up mid-season, it's hard to tell exactly when it emerged and whether it was exposed to residual soil-applied herbicides.

"If you think about how you use these products, rarely do they last the entire year. They're very dependent on environmental conditions to work effectively. It could be too wet or too dry. Generally speaking, you have some weed escape. But many farmers would chalk it up to these weather issues. If you're not thinking about it, you could very easily overlook resistance," Hager says.

To confirm results from the field, the team performed a dose response test in the greenhouse. In that test, four waterhemp populations - three with resistance to multiple herbicides and one that is sensitive to all herbicides - were dosed with increasing levels of S-metolachlor, acetochlor, dimethenamid-P, and pyroxasulfone. Populations from Champaign County and McLean County survived higher levels of the Group 15 herbicides than the other populations.

Hager suspects the plants are breaking the chemicals down before they cause damage, a trick known as metabolic resistance. All organisms can turn on cellular defenses against toxins, but it is rather worrisome when weeds and other undesirable pests use their biology against human interventions.

"As we get into the era of metabolic resistance, our predictability is virtually zero. We have no idea what these populations are resistant to until we get them under controlled conditions," Hager says. "It's just another example of how we need a more integrated system, rather than relying on chemistry only. We can still use the chemistry, but have to do something in addition.

"We want farmers to understand that we have to rethink how we manage waterhemp long term."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Brain activation provides individual-level prediction of bipolar disorder risk

Philadelphia, June 11, 2019 - Patterns of brain activation during reward anticipation may help identify people most at risk for developing bipolar spectrum disorders (BPSD), according to a study in Biological Psychiatry: Cognitive Neuroscience and Neuroimaging, published by Elsevier. Mania in people with BPSD is often accompanied by impulsivity, including impulsive responses to potential rewards. In the study, patterns of neural activation during a reward task predicted the severity of mania symptoms in young adults who had not yet developed the disorder.

"Given that emerging manic symptoms predispose to bipolar disorders, these findings can provide neural biomarkers to aid early identification of bipolar disorder risk in young adults," said first author Leticia de Oliveira, PhD, Federal Fluminense University, Brazil.

Having a family member with BPSD puts a person at risk for the disorder, but the relationship doesn't provide enough information to make decisions about potential interventions to help delay or prevent the disorder. The new study shows for the first time that brain activation patterns could be used to predict BPSD risk on an individual level. "These findings could be potentially used to guide the development and choice of early therapeutic interventions, reducing the significant social costs and deleterious outcomes associated with the disorder in these vulnerable individuals," said Dr. Oliveira.

To be sure that the approach would apply to anyone at risk, Dr. Oliveira and colleagues performed the brain imaging in a transdiagnostic group of young adults--the participants had a variety of psychiatric complications, but none had yet developed BPSD.

Across the whole brain, activation in a region engaged during decision making in reward contexts, the ventrolateral prefrontal cortex (vlPFC), contributed the most to the prediction of symptom severity. This suggests that vlPFC activity in particular may be useful for predicting the severity of mania symptoms associated with BPSD risk in young adults.

"This study shows how the powerful combination of computational image analysis tools and functionally targeted task fMRI (in this case reward processing) can provide insights into the neural systems underlying symptoms that may indicate liability to mania, in a young, non-bipolar transdiagnostic group of psychiatric patients," said Cameron Carter, MD, Editor of Biological Psychiatry: Cognitive Neuroscience and Neuroimaging.

The researchers replicated the results and the role of the vlPFC in a second independent sample of young adults in the same study, further confirming the potential utility of neural activation in this brain region as a biomarker for BPSD risk.

Credit: 
Elsevier

Rare 'superflares' could one day threaten Earth

Astronomers probing the edges of the Milky Way have in recent years observed some of the most brilliant pyrotechnic displays in the galaxy: superflares.

These events occur when stars, for reasons that scientists still don't understand, eject huge bursts of energy that can be seen from hundreds of light years away. Until recently, researchers assumed that such explosions occurred mostly on stars that, unlike Earth's sun, were young and active.

Now, new research shows with more confidence than ever before that superflares can occur on older, quieter stars like our own--albeit more rarely, or about once every few thousand years.

The results should be a wake-up call for life on our planet, said Yuta Notsu, the lead author of the study and a visiting researcher at CU Boulder.

If a superflare erupted from the sun, he said, Earth would likely sit in the path of a wave of high-energy radiation. Such a blast could disrupt electronics across the globe, causing widespread blackouts and shorting out communication satellites in orbit.

Notsu presented his research at a press briefing at the 234th meeting of the American Astronomical Society in St. Louis.

"Our study shows that superflares are rare events," said Notsu, a researcher in CU Boulder's Laboratory for Atmospheric and Space Physics. "But there is some possibility that we could experience such an event in the next 100 years or so."

Scientists first discovered this phenomenon from an unlikely source: the Kepler Space Telescope. The NASA spacecraft, launched in 2009, seeks out planets circling stars far from Earth. But it also found something odd about those stars themselves. In rare events, the light from distant stars seemed to get suddenly, and momentarily, brighter.

Researchers dubbed those humongous bursts of energy "superflares."

Notsu explained that normal-sized flares are common on the sun. But what the Kepler data was showing seemed to be much bigger, on the order of hundreds to thousands of times more powerful than the largest flare ever recorded with modern instruments on Earth.
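For illustration, a flare registers in photometry like Kepler's as a brief brightening above a star's baseline flux. The toy light curve and the median-plus-MAD threshold below are a simplified sketch of that idea, not the mission's actual detection pipeline:

```python
# Toy flare detection on a synthetic light curve. Illustrative only:
# Kepler's real pipeline is far more sophisticated.

def flag_flares(flux, k=5.0):
    """Return indices where flux exceeds median + k * MAD
    (a simple robust test for brightness spikes)."""
    s = sorted(flux)
    median = s[len(s) // 2]
    mad = sorted(abs(f - median) for f in flux)[len(flux) // 2]
    return [i for i, f in enumerate(flux) if f > median + k * mad]

# Synthetic relative flux: flare peak at index 4, decaying at index 5
flux = [1.00, 1.01, 0.99, 1.00, 1.45, 1.20, 1.01, 1.00, 0.99, 1.00]
print(flag_flares(flux))  # → [4, 5]
```

The same principle, a sudden momentary rise above an otherwise steady baseline, is what made the superflares stand out in the Kepler data.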

And that raised an obvious question: Could a superflare also occur on our own sun?

"When our sun was young, it was very active because it rotated very fast and probably generated more powerful flares," said Notsu, also of the National Solar Observatory in Boulder. "But we didn't know if such large flares occur on the modern sun with very low frequency."

To find out, Notsu and an international team of researchers turned to data from the European Space Agency's Gaia spacecraft and from the Apache Point Observatory in New Mexico. Over a series of studies, the group used those instruments to narrow down a list of superflares that had come from 43 stars that resembled our sun. The researchers then subjected those rare events to a rigorous statistical analysis.

The bottom line: age matters. Based on the team's calculations, younger stars tend to produce the most superflares. But older stars like our sun, now a respectable 4.6 billion years old, aren't off the hook.

"Young stars have superflares once every week or so," Notsu said. "For the sun, it's once every few thousand years on average."
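Taking the quoted rate at face value, a simple Poisson estimate (our back-of-envelope assumption, not a calculation from the paper) puts the chance of at least one superflare in the next century at roughly 3%:

```python
# Back-of-envelope Poisson estimate using the article's quoted rate of
# roughly one superflare per few thousand years for a sun-like star.
# The 3,000-year figure is an assumed round number, not from the study.
import math

def p_at_least_one(rate_per_year, years):
    """Poisson probability of at least one event in the given window."""
    return 1.0 - math.exp(-rate_per_year * years)

rate_per_year = 1 / 3000.0
print(round(p_at_least_one(rate_per_year, 100), 4))  # → 0.0328
```

A few percent per century is small but not negligible, which is consistent with Notsu's remark that "there is some possibility that we could experience such an event in the next 100 years or so."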

The group published its latest results in May in The Astrophysical Journal.

Notsu can't be sure when the next big solar light show is due to hit Earth. But he said that it's a matter of when, not if. Still, that could give humans time to prepare, protecting electronics on the ground and in orbit from radiation in space.

"If a superflare occurred 1,000 years ago, it was probably no big problem. People may have seen a large aurora," Notsu said. "Now, it's a much bigger problem because of our electronics."

Credit: 
University of Colorado at Boulder

The sun may have a dual personality, simulations suggest

Researchers at CU Boulder have discovered hints that humanity's favorite star may have a dual personality, with intriguing discrepancies in its magnetic fields that could hold clues to the sun's own "internal clock."

Physicists Loren Matilsky and Juri Toomre developed a computer simulation of the sun's interior as a means of capturing the inner roiling turmoil of the star. In the process, the team spotted something unexpected: On rare occasions, the sun's internal dynamics may jolt out of their normal routines and switch to an alternate state--a bit like a superhero trading the cape and cowl for civilian clothes.

While the findings are only preliminary, Matilsky said, they may line up with real observations of the sun dating back to the 19th century.

He added that the existence of such a solar alter-ego could provide physicists with new clues to the processes that govern the sun's internal clock--a cycle in which the sun switches from periods of high activity to low activity about once every 11 years.

"We don't know what is setting the cycle period for the sun or why some cycles are more violent than others," said Matilsky, a graduate student at JILA. "Our ultimate goal is to map what we're seeing in the model to the sun's surface so that we can then make predictions."

He will present the team's findings at a press briefing today at the 234th meeting of the American Astronomical Society in St. Louis.

The study takes a deep look at a phenomenon that scientists call the solar "dynamo," essentially a concentration of the star's magnetic energy. This dynamo is formed by the spinning and twisting of the hot gases inside the sun and can have big impacts--an especially active solar dynamo can generate large numbers of sunspots and solar flares, or globs of energy that blast out from the surface.

But that dynamo isn't easy to study, Matilsky said. That's because it mainly forms and evolves within the sun's interior, far out of range of most scientific instruments.

"We can't dive into the interior, which makes the sun's internal magnetism a few steps removed from real observations," he said.

To get around that limitation, many solar physicists use massive supercomputers to try to recreate what's occurring inside the sun.

Matilsky and Toomre's simulation examines activity in the outer third of that interior, which Matilsky likens to "a spherical pot of boiling water."

And, he said, this model delivered some interesting results. When the researchers ran their simulation, they first found that the solar dynamo formed to the north and south of the sun's equator. Following a regular cycle, that dynamo moved toward the equator and stopped, then reset in close agreement with actual observations of the sun.

But that regular churn wasn't the whole picture. Roughly twice every 100 years, the simulated sun did something different.

In those strange cases, the solar dynamo didn't follow that same cycle but, instead, clustered in one hemisphere over the other.

"That additional dynamo cycle would kind of wander," Matilsky said. "It would stay in one hemisphere over a few cycles, then move into the other one. Eventually, the solar dynamo would return to its original state."

That pattern could be a fluke of the model, Matilsky said, but it might also point to real, and previously unknown, behavior of the solar dynamo. He added that astronomers have, on rare occasions, seen sunspots congregating in one hemisphere of the sun more than the other, an observation that matches the CU Boulder team's findings.

Matilsky said that the group will need to develop its model further to see if the dual dynamo pans out. But he said that the team's results could, one day, help to explain the cause of the peaks and dips in the sun's activity--patterns that have huge implications for climate and technological societies on Earth.

"It gives us clues to how the sun might shut off its dynamo and turn itself back on again," he said.

Credit: 
University of Colorado at Boulder

Genetics influence how protective childhood vaccines are for individual infants

A genome-wide search in thousands of children in the UK and Netherlands has revealed genetic variants associated with differing levels of protective antibodies produced after routine childhood immunizations. The findings, appearing June 11 in the journal Cell Reports, may inform the development of new vaccine strategies and could lead to personalized vaccination schedules to maximize vaccine effectiveness.

"This study is the first to use a genome-wide genotyping approach, assessing several million genetic variants, to investigate the genetic determinants of immune responses to three routine childhood vaccines," says Daniel O'Connor of the University of Oxford, who is co-first author on the paper along with Eileen Png of the Genome Institute of Singapore. "While this study is a good start, it also clearly demonstrates that more work is needed to fully describe the complex genetics involved in vaccine responses, and to achieve this aim we will need to study many more individuals."

Vaccines have revolutionized public health, preventing millions of deaths each year, particularly in childhood. The maintenance of antibody levels in the blood is essential for continued vaccine-induced protection against pathogens. Yet there is considerable variability in the magnitude and persistence of vaccine-induced immunity. Moreover, antibody levels rapidly wane following immunization with certain vaccines in early infancy, so boosters are required to sustain protection.

"Evoking robust and sustained vaccine-induced immunity from early life is a crucial component of global health initiatives to combat the burden of infectious disease," O'Connor says. "The mechanisms underlying the persistence of antibodies are of major interest, since the effectiveness and acceptability of vaccines would be improved if protection were sustained after infant immunization without the need for repeated boosting through childhood."

Vaccine responses and the persistence of immunity are determined by various factors, including age, sex, ethnicity, microbiota, nutritional status, and infectious diseases. Twin studies have also shown vaccine-induced immunity to be highly heritable, and recent studies have started to unpick the genetic components underlying this complex trait.

To explore genetic factors that determine the persistence of immunity, O'Connor and colleagues carried out a genome-wide association study of 3,602 children in the UK and Netherlands. The researchers focused on three routine childhood vaccines that protect against life-threatening bacterial infections: capsular group C meningococcal (MenC), Haemophilus influenzae type b (Hib), and tetanus toxoid (TT) vaccines. They tested approximately 6.7 million single nucleotide polymorphisms (SNPs), genetic variants affecting a single DNA building block, for association with vaccine-induced antibody levels in the blood.

The researchers identified two genetic loci associated with the persistence of vaccine-induced immunity following childhood immunization. The persistence of MenC immunity is associated with SNPs in a genomic region containing a family of signal-regulatory proteins, which are involved in immunological signaling. Meanwhile, the persistence of TT-specific immunity is associated with SNPs in the human leukocyte antigen (HLA) locus. HLA molecules present peptides to T cells, which in turn induce B cells to produce antibodies.

These variants likely account for only a small portion of the genetic determinants of persistence of vaccine-induced immunity. Moreover, it is unclear whether the findings apply to other ethnic populations besides Caucasians from the UK and Netherlands. But according to the authors, neonatal screening approaches could soon incorporate genetic risk factors that predict the persistence of immunity, paving the way for personalized vaccine regimens.

"We are now carrying out in-depth investigations into the biology of the genetic variants we described in this study," O'Connor says. "We are also planning further research, in larger cohorts of children and other populations that benefit from vaccination, to further our understanding of how our genetic makeup shapes vaccine responses."

Credit: 
Cell Press