Novel approach for the treatment of cannabis use disorder shows promise in phase 2 trial

Experimental drug reduced cannabis use and withdrawal symptoms compared with placebo

Results of a phase 2 randomised trial of 70 men suggest that an experimental drug that boosts the brain's own cannabis-like chemical may help reduce withdrawal symptoms and cannabis use in men with cannabis dependence or cannabis use disorder.

The findings, published in The Lancet Psychiatry, show for the first time that men with cannabis dependence or cannabis use disorder treated with the fatty acid amide hydrolase (FAAH) inhibitor PF-04457845 used less cannabis and experienced fewer withdrawal symptoms, such as sleep disturbance, at 4-week follow-up than those given placebo, and that there were no safety concerns.

PF-04457845 works by blocking FAAH, an enzyme that breaks down anandamide, a principal natural endocannabinoid in the brain that acts on brain cannabinoid receptors much as cannabis does. Less FAAH activity means higher anandamide levels, which may improve mood and reduce anxiety.

"A lot of other drugs have been tested for their ability to reduce cannabis use and withdrawal, but until now none have been consistently shown to work against both withdrawal symptoms and relapse. Furthermore, unlike cannabis or its principal active constituent delta-9 tetrahydrocannabinol (THC), FAAH inhibitors do not appear to have psychoactive or rewarding effects, and are therefore not likely to be abused", says Professor Deepak Cyril D'Souza from Yale University School of Medicine, USA, who led the research.

"PF-04457845 was well tolerated. However, more research is needed to demonstrate that PF-04457845 is safe and effective in a larger sample of treatment-seeking individuals, particularly women, and in other outpatient settings over the long-term." [1]

Cannabis use disorder is characterised by a continued problematic pattern of use despite negative consequences such as social and functional impairment, risky use, tolerance, and withdrawal symptoms. Cannabis withdrawal symptoms include craving for cannabis, irritability, anger, depression, sleep disturbances, and decreases in appetite and weight, which make it difficult to quit. Cannabis use disorder affects around 13 million people worldwide [2]. In the USA, around a third of all current cannabis users meet diagnostic criteria for cannabis use disorder, and more than 250,000 people were admitted for cannabis abuse treatment in 2016 [3]. Long-term recovery is achieved by only a few of those who seek treatment with behavioural interventions such as cognitive behavioural therapy and motivational enhancement therapy.

Currently, there are no approved pharmacological treatments for problematic cannabis use. Almost every class of psychotropic drug has been tested for cannabis withdrawal or dependence, but none has been consistently effective or well tolerated. Substitution therapy with THC, the psychoactive compound in cannabis, has shown some promise in reducing withdrawal symptoms but does not prevent relapse and is limited by its psychoactive effects and abuse potential. In mice dependent on THC, blocking the FAAH enzyme reduced cannabis withdrawal syndrome.

In the study, 70 men (aged 18-55 years) with cannabis use disorder were randomised to receive the FAAH inhibitor PF-04457845 (4 mg daily; 46 men) or matching placebo (24 men) for 4 weeks. All participants were admitted to hospital for about the first week of the treatment phase to achieve abstinence and undergo cannabis withdrawal. Participants were then discharged to continue the remaining 3 weeks of treatment as outpatients.

Adherence to medication was confirmed by video-calling and pill counts, and corroborated by weekly blood concentrations of PF-04457845 and anandamide. Cannabis use was assessed by self-report and urine screening for levels of the THC metabolite THC-COOH. Sleep problems, which feature prominently in cannabis withdrawal, were assessed using questionnaires and polysomnography (a test that records brain waves, blood oxygen level, heart rate, breathing, and eye and leg movements overnight).

At the start of the study, participants were smoking on average more than three cannabis joints a day. Admission to hospital reduced cannabis use to zero in both groups. During the inpatient phase (week 1), men treated with PF-04457845 reported fewer symptoms of cannabis withdrawal including depression, irritability, and anxiety compared with those given placebo (table 2).

At the end of treatment (4 weeks), the PF-04457845 group reported less cannabis use than the placebo group (average 0.40 vs 1.27 joints per day), and also had lower levels of THC-COOH in their urine (average concentrations 266 ng/mL vs 658 ng/mL).

Additionally, the PF-04457845 group showed improvements in overall sleep (longer sleep times, deeper sleep, and feeling more rested) compared with placebo. In contrast, reductions in time spent in deep sleep occurred immediately after abstinence in the placebo group, consistent with the evidence of sleep disturbances in cannabis withdrawal syndrome.

The authors note that withdrawal-induced disturbances of deep sleep could play a key role in relapse, and that treatment via FAAH inhibition might help correct them, which in turn could facilitate maintenance of abstinence from cannabis.

Adherence to the study medication was 88%, and urinary THC-COOH concentrations correlated with self-reported cannabis use over time. PF-04457845 was well tolerated, and adverse events were mild and similar in both groups (20 [43%] of 46 participants in the PF-04457845 group vs 11 [46%] of 24 participants in the placebo group had an adverse event during the 4-week treatment phase). No serious adverse events were reported. Drop-out rates were similar between the PF-04457845 (8 [17%] of 46 men) and placebo groups (4 [17%] of 24 men).
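
The proportions quoted in this paragraph follow directly from the raw counts reported for each arm; a quick check (illustrative only, using the figures given above):

```python
# Recompute the adverse-event and drop-out percentages from the raw counts
# reported for each trial arm.
counts = {
    "adverse events, PF-04457845": (20, 46),
    "adverse events, placebo": (11, 24),
    "drop-outs, PF-04457845": (8, 46),
    "drop-outs, placebo": (4, 24),
}
for label, (n, total) in counts.items():
    print(f"{label}: {100 * n / total:.0f}%")
```

This reproduces the 43%, 46%, 17% and 17% figures in the text.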

The authors note some limitations, including that the study did not include women because of a lack of safety and toxicity data at the time, and did not fully assess motivation to quit cannabis use or the functional consequences of problematic cannabis use. In the future, studies will be needed to compare the advantages and disadvantages of direct agonists like THC with FAAH inhibitors.

Writing in a linked Comment, Dr Tony George from the University of Toronto and Centre for Addiction and Mental Health, Ontario, Canada, says FAAH might prove to be a safe and effective treatment approach but several questions remain to be answered: "No assessments of cannabis related functional impairment...were done, and thus the effect on functional outcomes achieved during this FAAH inhibitor trial is not clear...The population studied seemed not to include adults with psychiatric comorbidity, but it will be important to include these patients in future studies as they seem to be at much higher risk for the initiation and maintenance of cannabis use disorder. Finally, the endurability of FAAH inhibition needs to be rigorously tested with sufficient follow-up assessment periods (eg, 3-6 months after treatment)."

He concludes: "Most pharmacotherapy trials in addiction have sought to develop medications as adjuncts to behavioural interventions. The development of FAAH inhibitors as putative pharmacotherapies for cannabis use disorder should therefore make use of behavioural supports in both abstinence initiation and relapse-prevention designs. In particular, the use of cognitive-behavioural therapy in combination with contingency management could be the optimal approach to testing of putative cannabis pharmacotherapies, because they are most effective in achieving initial abstinence, facilitating the study of relapse-prevention efficacy, which might be the most sensitive test for medications development."

Credit: 
The Lancet

Artificial synapses made from nanowires

image: Image captured by an electron microscope of a single nanowire memristor (highlighted in colour to distinguish it from the other nanowires in the background image). Blue: silver electrode, orange: nanowire, yellow: platinum electrode. The blue bubbles dispersed over the nanowire are made up of silver ions; they form a bridge between the electrodes, which lowers the resistance.

Image: 
Forschungszentrum Jülich

Scientists from Jülich together with colleagues from Aachen and Turin have produced a memristive element made from nanowires that functions in much the same way as a biological nerve cell. The component is able both to store and to process information, as well as to receive numerous signals in parallel. The resistive switching cell made from oxide crystal nanowires is thus proving to be the ideal candidate for use in building bioinspired "neuromorphic" processors, able to take over the diverse functions of biological synapses and neurons.

Computers have learned a lot in recent years. Thanks to rapid progress in artificial intelligence they are now able to drive cars, translate texts, defeat world champions at chess, and much more besides. In doing so, one of the greatest challenges lies in the attempt to artificially reproduce the signal processing in the human brain. In neural networks, data are stored and processed to a high degree in parallel. Traditional computers on the other hand rapidly work through tasks in succession and clearly distinguish between the storing and processing of information. As a rule, neural networks can only be simulated in a very cumbersome and inefficient way using conventional hardware.

Systems with neuromorphic chips that imitate the way the human brain works offer significant advantages. Experts in the field describe this type of bioinspired computer as being able to work in a decentralised way, having at its disposal a multitude of processors, which, like neurons in the brain, are connected to each other by networks. If a processor breaks down, another can take over its function. What is more, just like in the brain, where practice leads to improved signal transfer, a bioinspired processor should have the capacity to learn.

"With today's semiconductor technology, these functions are to some extent already achievable. These systems are, however, only suitable for particular applications and require a lot of space and energy," says Dr. Ilia Valov from Forschungszentrum Jülich. "Our nanowire devices made from zinc oxide crystals can inherently process and even store information, as well as being extremely small and energy efficient," explains the researcher from Jülich's Peter Grünberg Institute.

For years memristive cells have been ascribed the best chances of being capable of taking over the function of neurons and synapses in bioinspired computers. They alter their electrical resistance depending on the intensity and direction of the electric current flowing through them. In contrast to conventional transistors, their last resistance value remains intact even when the electric current is switched off. Memristors are thus fundamentally capable of learning.
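
The switching behaviour described above, a resistance that changes with the intensity and direction of the current and is retained when the current stops, can be sketched with a toy linear-drift model (all constants and the update rule here are illustrative, not parameters of the actual Jülich device):

```python
# Toy "linear drift" memristor model: the resistance drifts with the sign
# and magnitude of the applied current, and the last value persists once
# the current is switched off.
R_ON, R_OFF = 100.0, 16_000.0        # low/high resistance limits (ohms)

def step(w, current, dt=1e-3, k=10.0):
    """Advance the internal state w (0 = high-R, 1 = low-R) by one time step."""
    w += k * current * dt            # state moves with the current...
    return min(1.0, max(0.0, w))     # ...but is bounded by the device

def resistance(w):
    return R_ON * w + R_OFF * (1.0 - w)

w = 0.0                              # start in the high-resistance state
for _ in range(1000):                # positive current drives w towards 1
    w = step(w, current=0.5)
r_written = resistance(w)

for _ in range(1000):                # zero current: the state is retained
    w = step(w, current=0.0)
assert resistance(w) == r_written    # the cell "remembers" its resistance
```

In a real electrochemical metallization cell the state variable corresponds to the growth of the silver filament, but this qualitative behaviour, analogue drift plus non-volatile retention, is what makes memristors candidates for artificial synapses.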

In order to create these properties, scientists at Forschungszentrum Jülich and RWTH Aachen University used a single zinc oxide nanowire, produced by their colleagues from the polytechnic university in Turin. Measuring approximately one ten-thousandth of a millimeter in size, this type of nanowire is over a thousand times thinner than a human hair. The resulting memristive component not only takes up a tiny amount of space, but also is able to switch much faster than flash memory.

Nanowires offer promising novel physical properties compared to other solids and are used among other things in the development of new types of solar cells, sensors, batteries and computer chips. Their manufacture is comparatively simple. Nanowires result from the evaporation deposition of specified materials onto a suitable substrate, where they practically grow of their own accord.

In order to create a functioning cell, both ends of the nanowire must be attached to suitable metals, in this case platinum and silver. The metals function as electrodes and, in addition, release ions when triggered by an appropriate electric current. The metal ions are able to spread over the surface of the wire and build a bridge that alters its conductivity.

Components made from single nanowires are, however, still too isolated to be of practical use in chips. Consequently, the next step being planned by the Jülich and Turin researchers is to produce and study a memristive element, composed of a larger, relatively easy to generate group of several hundred nanowires offering more exciting functionalities.

Credit: 
Forschungszentrum Juelich

Gender gaps in political perspectives among college students

image: Meredith Worthen is an associate professor in the Department of Sociology, OU College of Arts and Sciences

Image: 
University of Oklahoma

NORMAN--A University of Oklahoma sociologist, Meredith Worthen, has published a new study in the journal Sexuality Research and Social Policy on sexuality and gender gaps in political perspectives among lesbian, gay, bisexual, mostly heterosexual and heterosexual college students in the southern United States. Worthen confirms a clear "sexuality gap" between exclusive heterosexuals and all others, as well as gender gaps among mostly heterosexual and lesbian, gay and bisexual students, though some gaps run in the opposite direction from what was expected.

"This study fills the gaps in the research, expands our knowledge about sexuality and gender gaps in political attitudes and contributes to new ways of thinking about the perspectives of mostly heterosexual and lesbian, gay and bisexual people," said Worthen, associate professor in the OU College of Arts and Sciences. "This study works toward a deeper understanding of ways college students can promote political change and advocate for social justice."

Overall, Worthen proposes that social justice perspectives may be more common among lesbian, gay and bisexual people as a group, and especially among lesbian and bisexual women due to their oppressed identities. She suggests that these patterns may lead to more liberal lesbian, gay and bisexual political views and contribute to sexuality and gender gaps in political perspectives. In this study, liberal refers to liberal ideology, feminist identity and attitudes toward the death penalty and abortion.

The study found a distinct "lavender liberalism" among mostly heterosexual, lesbian, gay and bisexual college students. Exclusive heterosexuals, on the other hand, are significantly less liberal. Research indicates mostly heterosexual individuals are a growing and visible group on college campuses, so this study's inclusion of mostly heterosexuals as a distinct group that differs from exclusive heterosexuals helps fill a gap in the existing literature.

Overall, these findings support the stereotype that "all gays are liberal." When Worthen explored other sexuality gaps, among mostly heterosexual and LGB respondents, findings were less consistent. However, among the results, there is evidence of a bisexual woman consciousness that relates to liberalism among bisexual college women. Previous literature on the sexuality gap in political perspectives between lesbian and gay people and bisexual people indicates that lesbian and gay people are more liberal than bisexual people; the present findings do not support this, and instead indicate that bisexual people are more liberal than gay and lesbian people. This finding has important implications for future work that centers bisexual women in conversations about political attitudes and liberal ideology.

Credit: 
University of Oklahoma

Diabetes drug liraglutide linked to lower risk of cardiovascular events

image: This is Björn Pasternak, senior researcher at the Department of Medicine, Solna, Karolinska Institutet, Sweden.

Image: 
Stefan Zimmerman

Real world data from a large Nordic study shows that use of liraglutide, a drug for type 2 diabetes, is associated with a lower risk of myocardial infarction, stroke or cardiovascular death. The study, led by researchers from Karolinska Institutet in Sweden, is published in The Lancet Diabetes & Endocrinology.

The number of patients with type 2 diabetes is increasing rapidly in the world. Cardiovascular disease is a serious complication of diabetes and represents a major cause of mortality in this patient group.

Liraglutide, a diabetes medication, became available for clinical use in 2009. This drug is a glucagon-like peptide 1 receptor agonist that lowers blood sugar and reduces body weight. A large clinical trial published previously showed that liraglutide reduced the risk of major cardiovascular events among patients with diabetes who had established cardiovascular disease or were at high cardiovascular risk. It has been unclear if these findings also translate to cardiovascular benefit in the broad patient population seen in routine clinical practice.

The current study was a collaborative project between researchers at Karolinska Institutet in Sweden, Statens Serum Institut in Denmark, NTNU in Norway and the Swedish National Diabetes Register. The researchers used several nationwide registers with information on prescription drugs, diseases and other data from more than 46,000 patients in Sweden and Denmark, 2010-2016.

Around 23,000 patients initiating treatment with liraglutide were compared with the same number of patients initiating treatment with another diabetes drug, DPP4 inhibitors. The main outcome in the study was major cardiovascular events, defined as myocardial infarction, stroke, or cardiovascular death.

The rate of major cardiovascular events was 14.0 per 1,000 person-years among patients using liraglutide and 15.4 per 1,000 among patients using DPP4 inhibitors, a statistically significant difference. This corresponded to 5 fewer major cardiovascular events per 1,000 patients followed up for 3 years.
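
As a rough check, the absolute difference implied by the two event rates can be scaled to a 3-year horizon (a simplification that ignores censoring and cumulative-incidence adjustment, which is why it lands slightly below the study's reported figure of 5):

```python
# Crude absolute-risk difference: events per 1,000 patients over 3 years,
# approximated as (rate difference per person-year) x 3 years.
rate_dpp4 = 15.4 / 1000        # major cardiovascular events per person-year
rate_liraglutide = 14.0 / 1000
diff = (rate_dpp4 - rate_liraglutide) * 1000 * 3
print(f"{diff:.1f} fewer events per 1,000 patients over 3 years")  # 4.2
```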

Use of liraglutide was also associated with a reduced risk of cardiovascular death and of death from any cause. In a subgroup analysis, patients with a history of major cardiovascular disease appeared to benefit most from treatment with liraglutide, although the difference compared with patients without such a history was not statistically significant.

"Our study provides support for the cardiovascular effectiveness of liraglutide among a broader unselected group of patients, providing important confirmatory evidence from routine clinical practice. We believe it may be of interest to drug regulators, clinical guidelines, physicians, and patients," says last author Björn Pasternak, senior researcher at the Department of Medicine, Solna, Karolinska Institutet, and affiliated with Statens Serum Institut.

Credit: 
Karolinska Institutet

Infections during childhood increase the risk of mental disorders

A new study from iPSYCH shows that the infections children contract during childhood increase the risk of mental disorders in childhood and adolescence. This knowledge expands our understanding of the role of the immune system in the development of mental disorders.

High temperatures, sore throats and infections during childhood can increase the risk of also suffering from a mental disorder as a child or adolescent. This is shown by the first study of its kind, which followed all children born in Denmark between 1 January 1995 and 30 June 2012. The researchers looked at all infections treated from birth onwards and at the subsequent risk of childhood and adolescent psychiatric disorders.

"Hospital admissions with infections are particularly associated with an increased risk of mental disorders, but so too are less severe infections that are treated with medicine from the patient's own general practitioner," says Ole Köhler-Forsberg from Aarhus University and Aarhus University Hospital's Psychoses Research Unit. He is one of the researchers behind the study.

The study showed that children who had been hospitalised with an infection had an 84 per cent increased risk of suffering a mental disorder and a 42 per cent increased risk of being prescribed medicine to treat mental disorders. Furthermore, the risk for a range of specific mental disorders was also higher, including psychotic disorders, OCD, tics, personality disorders, autism and ADHD.

"This knowledge increases our understanding of the fact that there is a close connection between body and brain and that the immune system can play a role in the development of mental disorders. Once again research indicates that physical and mental health are closely connected," says Ole Köhler-Forsberg.

Highest risk following an infection

The study has just been published in JAMA Psychiatry and is part of the Danish iPSYCH psychiatry project.

"We also found that the risk of mental disorders is highest right after the infection, which supports the infection playing some role in the development of the mental disorder," says Ole Köhler-Forsberg.

It therefore appears that infections and the inflammatory reaction that follows can affect the brain and be part of the process of developing severe mental disorders. This could, however, also be explained by other causes, such as some people having a genetically higher risk of suffering more infections and mental disorders.

The new knowledge could be important for further studies of the immune system and the role of infections in the development of the wide range of childhood and adolescent mental disorders for which the researchers have shown a correlation. This is the assessment of the senior researcher on the study, Research Director Michael Eriksen Benrós from the Psychiatric Centre Copenhagen at Copenhagen University Hospital.

"The temporal correlations between the infection and the mental diagnoses were particularly notable, as we observed that the risk of a newly occurring mental disorder was increased by 5.66 times in the first three months after contact with a hospital due to an infection, and was also increased more than twofold within the first year," he explains.

Michael Eriksen Benrós stresses that the study could in the long term lead to an increased focus on the immune system and how infections play a role in childhood and adolescent mental disorders.

"It can have consequences for treatment, and the new knowledge can be used in making the diagnosis when new psychiatric symptoms occur in a young person. But first and foremost it corroborates our increasing understanding of how closely the body and brain are connected," he says.

Credit: 
Aarhus University

Eliminating microglia prevents heightened immune sensitivity after stress

Philadelphia, December 4, 2018 -- Using an animal model of chronic stress, researchers at The Ohio State University have shown that the immune cells of the brain, called microglia, hold unique signatures of chronic stress that leave the animal more sensitive to future stressful experiences, evident by increased anxiety and immune responses. Eliminating microglia so that these “stress memories” could not be maintained did not prevent the increased anxiety in response to later stress but did prevent the hypersensitive immune response.

The study, published in Biological Psychiatry, indicates that eliminating the microglia can reverse some aspects of stress sensitization, which lasts for over 3 weeks after chronic stress ends. The increased anxiety behavior, which was not prevented by elimination of the microglia, may have resulted from stress signatures maintained in neurons, which also persist for weeks after chronic stress.

“It is remarkable that memories of stress are not only stored in nerve cells, but also in the microglia, the immune cells of the brain. It is not the case that these immune cells can generate a representation of the stressful events. However, the microglia appear to be primed to produce a heightened immune response long after the stressful events that sensitized them have passed,” said John Krystal, MD, Editor of Biological Psychiatry.

Co-senior authors of the study, John Sheridan, PhD, and Jonathan Godbout, PhD, study how chronic stress makes a person more vulnerable to events later in life that otherwise might not have caused stress. Using the same mouse model of chronic stress, called repeated social defeat (RSD), they had previously shown that more than 3 weeks after the stress ended, when the anxiety and the inflammatory response had diminished, both the behavioral and inflammatory responses could be recalled with just a brief exposure to the stressor. "This recall response indicated that the initial exposure to repeated social defeat resulted in sensitization of both neural and microglial populations that responded to less intense exposure to the stressor," said Dr. Sheridan.

In this study, when the animals were briefly exposed to a stressful event 24 days after RSD, the sensitized microglia recruited large amounts of inflammatory cells called monocytes to the brain, a process that increases the chance that anxiety will return in previously stressed-out mice. This recruitment process depended on the presence of microglia, as it was prevented when the microglia were missing.

“Overall, microglia-specific priming can be reversed, but the effectiveness of this approach depends on the context in which you are testing,” said Dr. Godbout. Stress sensitization involves hyperactive behavioral and immune responses, but only the immune component was prevented by eliminating and repopulating microglia.

Credit: 
Elsevier

Negative views of flexible working prevalent

Flexible working often attracts negative views from other employees: a third of all UK workers believe those who work flexibly create more work for others, while a similar proportion believe their career will suffer if they use flexible working arrangements, according to new research.

This is the main finding from Dr Heejung Chung of the University of Kent, who set out to analyse data from the government's 2011 Work-Life Balance Survey. Specifically, she wanted to examine whether stigma against flexible workers exists, who is most likely to hold such beliefs, and who is most likely to suffer from it.

The research also found that the majority of respondents who held negative views about flexible workers were male, while women, and especially mothers, were the most likely to suffer from such stereotypes.

Furthermore, nearly one in five workers (18%) said they had experienced direct negative career consequences as a result of working flexibly. This perhaps accounts for the very low uptake of the right to request flexible working since it was made law in 2003 and expanded to cover all workers in 2014.

It was women, especially mothers who worked part-time or reduced hours, rather than full-time workers who work flexibly (i.e. teleworking or flexitime), who reported that their careers were negatively impacted by working flexibly. On the other hand, men, especially fathers (almost half of respondents), were likely to report that their own jobs were negatively impacted by others working flexibly.

Commenting on the research, Dr Chung, from the School of Social Policy, Sociology and Social Research at Kent, said: 'It is clear there are still many people who view flexible working as a negative, and for different reasons. This has major implications for how employers introduce and offer flexible working arrangements in their organisation, especially as the government looks to increase the rights of workers to request flexible working.'

'A simple introduction and expansion of the right to request flexible working will not be enough. We need to challenge our prevalent organisational cultures, which privilege work above everything else, with long hours considered to be synonymous with productivity and commitment. Such change is crucial especially if flexible working is to help reduce the gender wage gap.'

Credit: 
University of Kent

Understanding the rise of the modern far right using Marx and Lacan

As the end of 2018 approaches, a year that marked 200 years since the birth of the German philosopher Karl Marx, new research drawing on core concepts coined by Marx and the French psychiatrist Jacques Lacan offers a fresh perspective on the rise of the far right.

We live in an age of increasing far right groups supported by a variety of media outlets which sympathize with their ideology. Researchers are curious as to how, in this day and age, and in light of recent history, we got to where we are.

New research presented in the article, "Mystified Consciousness: Rethinking the Rise of the Far Right with Marx and Lacan" by Claudia Leeb from Washington State University published in De Gruyter's journal Open Cultural Studies, posits several arguments suggesting that we must turn to thinkers Marx and Lacan and the philosophical concepts they coined to understand the rise of the far right. In the article, Leeb uses the theory of psychoanalysis to explain why white working classes - in the US and throughout the world - seem to have turned to the far right instead of forming an anti-capitalist emancipatory proletariat.

Marx and Lacan's concepts have largely been ignored in literature on the rise of the far right, but Leeb's article draws them together to argue that not only economic, but also psychological factors, brought about its rise.

According to psychoanalytic theorists, our identities are fundamentally incomplete or non-whole, which generates our desire to have whole identities along with fears that we remain incomplete or non-whole. Furthermore, in the ideologies of neo-liberal capitalist societies, we are only considered to be complete and whole if we have achieved economic success. However, since achieving economic success has become so difficult for most people in neo-liberal capitalist societies, such as the United States, the article posits that large groups of citizens may have heightened feelings of being incomplete and thus inadequate. White, male working-class Americans, says Leeb, embrace the ideology of the far right to fulfil the unconscious yearning to be whole again. This ideology provides them with fantasies that compensate for feeling non-whole or inadequate, such as achieving the American Dream of economic success, finding fulfillment in an afterlife through religion, hatred of ethnic minorities and disdain for women.

The far right fantasy is that of being more whole than "the Other". By branding certain groups of people, such as Muslims, immigrants and women, as limited or non-whole, the far right displaces the anxieties of the white male working classes onto others, allowing them to feel superior and therefore whole and "great again".

"The article provides an alternative explanation for the far right that the mystification and division in the working classes has to do with the expression of white, masculine supremacy, more so than economic dislocation," says political scientist Laurie Naranch from Siena College in Albany, New York.

The recent resurgence of far right extremism demonstrates just how much more study and research must occur in this field if it is to be curtailed.

Credit: 
De Gruyter

Ibrutinib plus rituximab superior to standard treatment for patients with chronic leukemia

San Diego - An interim analysis of a large phase 3 clinical trial found that the combination of ibrutinib plus rituximab was superior to standard treatment for patients age 70 and younger with previously untreated chronic lymphocytic leukemia (CLL). The trial met its primary endpoint of an improvement in progression-free survival (the length of time patients live before their disease worsens). The combination also improved overall survival, the trial's secondary endpoint. In general, patients in the ibrutinib-rituximab arm were less likely to experience serious side effects than those in the standard treatment arm. Until now, the standard treatment for previously untreated CLL has been a six-month course of FCR, which combines the chemotherapy drugs fludarabine and cyclophosphamide with rituximab.

The data and safety monitoring board overseeing the trial, known as E1912, recommended that these results be released immediately given their significance to public health. The findings were presented as a late-breaking abstract at the American Society of Hematology (ASH) annual meeting on December 4, 2018. The trial was sponsored by the National Cancer Institute (NCI), part of the National Institutes of Health, and designed by researchers with the ECOG-ACRIN Cancer Research Group.

"These results are practice-changing and immediately establish ibrutinib and rituximab as the new standard of care for the initial treatment of CLL in patients age 70 and younger," said lead investigator Tait Shanafelt, M.D., a professor of hematology at the Stanford University School of Medicine in Palo Alto, California. "The E1912 trial showed that the combination of ibrutinib and rituximab not only provided better leukemia control, it also prolonged life and had fewer side effects."

"These definitive results show why large trials like this, that test new therapies in an effort to achieve clinically meaningful benefit for patients, are so important," said Richard F. Little, M.D., of the Cancer Therapy Evaluation Program at NCI.

The study was conducted through NCI's National Clinical Trials Network. Pharmacyclics LLC provided ibrutinib and clinical trial support funding under a cooperative research and development agreement with NCI and a separate agreement with ECOG-ACRIN.

CLL is one of the most common types of leukemia in adults. It typically occurs during or after middle age and rarely occurs in individuals under the age of 40. Ibrutinib and rituximab are targeted treatments. Ibrutinib interferes with the survival of lymphocytic leukemia cells, and rituximab enhances the ability of the body's immune system to destroy the cells. Ibrutinib is approved by the U.S. Food and Drug Administration for the treatment of some blood cancers, including CLL.

The trial enrolled 529 patients between January 2014 and June 2016. Those enrolled in the trial were adults age 70 and younger who had never received treatment for CLL and required treatment. Patients were randomly assigned to receive either the ibrutinib-rituximab combination or FCR.

The first planned interim analysis for progression-free survival was performed in September 2018. With a median follow-up of 33.4 months, the hazard ratio for progression-free survival favored the ibrutinib group over the FCR group (HR=0.352). This means that, at any given time, the risk of disease progression was reduced by about two-thirds (65 percent) for patients in the ibrutinib group compared with the FCR group. This observed improvement in progression-free survival exceeded the trial design target. Overall survival was also superior for patients in the ibrutinib arm.
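The "about two-thirds" interpretation follows directly from the hazard ratio; a minimal sketch of that arithmetic, assuming the standard reading of a hazard ratio below 1 as a proportional reduction in instantaneous risk (the function name is illustrative):

```python
# Converting a hazard ratio into an approximate percent risk reduction,
# as reported for the E1912 trial (HR = 0.352 for progression-free survival).
def percent_risk_reduction(hazard_ratio: float) -> float:
    """Percent reduction in instantaneous risk implied by a hazard ratio < 1."""
    return (1.0 - hazard_ratio) * 100.0

print(f"{percent_risk_reduction(0.352):.1f}%")  # 64.8%, i.e. roughly two-thirds
```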

According to the data and safety monitoring board's recommendation, the outcome has been disclosed to all patients participating in the study and their physicians. Patients who are receiving ibrutinib in the trial can continue therapy, as long as it remains effective. All patients assigned to FCR have completed treatment and are continuing to be monitored per standard of care. Quality of life was rigorously measured in both arms, and the data are awaiting analysis.

Findings from another NCI-supported trial on ibrutinib in patients with CLL were also presented at the ASH meeting and published in The New England Journal of Medicine. The A041202 trial--an international phase 3 clinical trial coordinated by the Alliance for Clinical Trials in Oncology--demonstrated that ibrutinib produces superior progression-free survival compared with standard chemoimmunotherapy (bendamustine plus rituximab) in previously untreated patients with CLL who are age 65 and older. The study found that adding rituximab to ibrutinib did not improve progression-free survival beyond ibrutinib alone.

"These two NCI-funded trials have collectively established ibrutinib-based therapy as the first line treatment for CLL patients of any age," Dr. Little said.

Credit: 
ECOG-ACRIN Cancer Research Group

Consumption of children's antibiotics varies widely globally

4 December 2018

Researchers analyzing the sales of oral antibiotics for children in 70 high- and middle-income countries found that consumption varies widely from country to country, with little correlation between a country's wealth and the types of antibiotics used. Of concern is the relatively low use of amoxicillin, the antibiotic recommended as first choice for the most common childhood infections. In addition, the review found that in a quarter of all countries, sales of antibiotics that should only be used for specific indications, the 'Watch' antibiotics, accounted for 20% of total antibiotic consumption. This is of concern since there is a higher risk of bacteria developing resistance to 'Watch' antibiotics.

In 2017, the World Health Organization (WHO) grouped antibiotics into three categories - Access, Watch, and Reserve - with recommendations on when each category should be used to ensure antibiotics are available when needed, and that the right antibiotics are prescribed for the right infections. This categorization is designed to enhance treatment outcomes, reduce the development of drug-resistant bacteria, and preserve the effectiveness of 'last-resort' antibiotics when all others fail.

While the report finds the consumption of 'Access' antibiotics made up on average 76% of child-appropriate antibiotic formulations across all countries, the use of amoxicillin in community practice is relatively low (median 31%). Categorized by WHO as an 'Access' antibiotic, amoxicillin should be used as first choice for most common antibiotic treatment indications encountered in community practice.
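As an illustration of the kind of metric the report describes, a minimal sketch computing each AWaRe group's share of sales; the group assignments and sales figures below are hypothetical examples, not the study's data:

```python
# Hypothetical sketch: share of child-appropriate antibiotic sales falling
# into each WHO AWaRe group. Mapping and figures are illustrative only.
from collections import defaultdict

AWARE_GROUP = {              # illustrative mapping; WHO publishes the full list
    "amoxicillin": "Access",
    "cefixime": "Watch",
    "azithromycin": "Watch",
}

def aware_shares(sales_units: dict) -> dict:
    """Return each AWaRe group's percentage of total sales units."""
    totals = defaultdict(float)
    for antibiotic, units in sales_units.items():
        totals[AWARE_GROUP[antibiotic]] += units
    grand_total = sum(totals.values())
    return {group: 100.0 * units / grand_total for group, units in totals.items()}

shares = aware_shares({"amoxicillin": 60.0, "cefixime": 25.0, "azithromycin": 15.0})
print(shares)  # {'Access': 60.0, 'Watch': 40.0}
```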

Dr Julia Bielicki, Senior Lecturer at St George's, University of London, and study lead said: "This is the first attempt at developing simple metrics of global child community antibiotic use based on the WHO's grouping. The data can be used by countries to assess their antibiotic use patterns for young children. Countries with low Access percentages can identify opportunities for greater use of these antibiotics. Unnecessary use of Watch antibiotics is more clearly identifiable."

The research was supported by GARDP, the Global Antibiotic Research and Development Partnership. Dr Manica Balasegaram, Executive Director of GARDP, said: "WHO strongly encourages use of 'Access' antibiotics to treat the majority of infections for children and adults as they are affordable, generally less toxic and less likely to drive future antibiotic resistance. Providing country policymakers with evidence on what antibiotics are being prescribed in their country is an important first step to help countries tackle inappropriate prescribing of antibiotics. This in turn will help countries deliver their National Action Plan on antimicrobial resistance and ensure antibiotics remain available and effective for generations to come."

"Consumption of oral antibiotic formulations for young children according to the WHO AWaRe groups: an analysis of sales data from 70 middle and high-income countries" was published in Lancet Infectious Diseases on 3 December 2018.

Credit: 
Drugs for Neglected Diseases Initiative

Personalised ultrasound scan showing atherosclerosis helps reduce cardiovascular risk

A new randomised trial of over 3000 people in The Lancet finds that sharing pictorial representations of personalised scans showing the extent of atherosclerosis (vascular age and plaque in the arteries) with patients and their doctors results in a decreased risk of cardiovascular disease one year later, compared to people receiving usual information about their risk.

Smoking cessation, physical activity, statins, and antihypertensive medication to prevent cardiovascular disease are among the most evidence-based and cost-effective interventions in health care. However, low adherence to medication and lifestyle changes mean that these types of prevention efforts often fail.

"Cardiovascular disease is the leading cause of death in many countries, and despite a wealth of evidence about effective prevention methods from medication to lifestyle changes, adherence is low," says Professor Ulf Näslund, Umea University (Sweden). "Information alone rarely leads to behaviour change and the recall of advice regarding exercise and diet is poorer than advice about medicines. Risk scores are widely used, but they might be too abstract, and therefore fail to stimulate appropriate behaviours. This trial shows the power of using personalised images of atherosclerosis as a tool to potentially prompt behaviour change and reduce the risk of cardiovascular disease." [1]

3532 individuals who were taking part in the Västerbotten County (Sweden) cardiovascular prevention programme were included in the study and underwent vascular ultrasound investigation of the carotid arteries. Half (1749) were randomly selected to receive the pictorial representation of the carotid ultrasound, and half (1783) did not receive the pictorial information.

Participants aged 40 to 60 years with one or more cardiovascular risk factors were eligible to participate. All participants underwent blood sampling, a survey of clinical risk factors, and ultrasound assessment of carotid intima-media wall thickness and plaque formation. Each person in the intervention group received a pictorial representation of plaque formation in their arteries, and a gauge ranging from green to red to illustrate their biological age compared with their chronological age. They then received a follow-up call from a nurse after 2-4 weeks to answer any questions. The same pictorial presentation of the ultrasound result was also sent to their primary care doctor. The intervention thus targeted both patients and their doctors.

Both groups received information about their cardiovascular risk factors and a motivational health dialogue to promote healthier life style and, if needed according to clinical guidelines, pharmacological treatment.

At one-year follow-up, the cardiovascular risk score was calculated for all participants who completed follow-up (3175), showing differences between the two groups: the Framingham Risk Score decreased in the intervention group but increased in the control group (-0.58 vs +0.35), and SCORE increased by twice as much in the control group as in the intervention group (0.27 vs 0.13).
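The reported score changes can be combined into between-group differences; a minimal sketch of that arithmetic (the dictionaries and function are illustrative, not from the paper):

```python
# Simple arithmetic check on the one-year changes reported above:
# the between-group difference is the intervention group's change
# minus the control group's change (negative favours intervention).
framingham_change = {"intervention": -0.58, "control": +0.35}
score_change = {"intervention": +0.13, "control": +0.27}

def between_group_difference(changes: dict) -> float:
    """Intervention change minus control change."""
    return changes["intervention"] - changes["control"]

print(round(between_group_difference(framingham_change), 2))  # -0.93
print(round(between_group_difference(score_change), 2))       # -0.14
```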

Improvements were also seen for total and LDL cholesterol in both groups, but the reduction was greater in the intervention group than in the control group. A graded effect was also noted, with the strongest effect seen for those with the worst results.

"The differences at a population level were modest, but important, and the effect was largest among those at highest risk of cardiovascular disease, which is encouraging. Imaging technologies such as CT and MRI might allow for a more precise assessment of risk, but these technologies have a higher cost and are not available on an equitable basis for the entire population. Our approach integrated an ultrasound scan, and a follow up call with a nurse, into an already established screening programme, meaning our findings are highly relevant to clinical practice," says Prof Näslund [1].

Importantly, the effect of the intervention did not differ by education level, suggesting that this type of risk communication might contribute to a reduction of the social gap in health. The findings come from a middle-aged population with low to moderate cardiovascular disease risk.

Further research is needed to understand whether the results are sustainable beyond one year, and whether the intervention will lead to a reduction of cardiovascular disease in the long-term. Formal cost-effectiveness analyses will be done after 3-year follow-up.

Writing in a linked Comment, Dr Richard Kones, Umme Rumana, and Alberto Morales Salinas, Cardiometabolic Research Institute (USA), say:

"Despite advances in cardiovascular therapies, coronary heart disease remains the leading cause of death in almost all countries. Two of the most remarkable recent treatments, percutaneous coronary intervention and the availability of proprotein convertase subtilisin/kexin type 9 inhibitor drugs, have revolutionised cardiology practice. Although life-saving and now essential therapies, whether they will be able to reduce the incidence and associated morbidity and mortality of coronary heart disease remains unlikely since the increase in prevalence of obesity and diabetes is raising the background level of cardiovascular risk... Although there are proven methods of lowering cardiovascular risk and these are generally being better used in high-income countries, poor adherence and uneven availability and access in low income and middle-income countries still pose serious challenges... Less than half of all patients taking medications are adherent, which substantially increases morbidity and mortality. Non-adherence to medication accounts for 33-69% of all hospital admissions in the USA, and, among patients with coronary heart disease, the extent of low adherence is related to the number of adverse cardiovascular events. Poor adherence is multifactorial and can broadly be grouped into categories related to patients, physicians and therapies, communication, health-care systems, socioeconomic factors, and unpredictable negative effects of the internet. One of the most pertinent factors is patient-related perceived risk and motivation. Despite the many methods that have been proposed, effectiveness in improving adherence and outcomes has been relatively disappointing. It is in this context that the randomised controlled trial by Ulf Näslund and colleagues in The Lancet is relevant."

Credit: 
The Lancet

Solving 21st-century problems requires skills that few are trained in, scientists find

From companies trying to resolve data security risks to coastal communities preparing for rising sea levels, solving modern problems requires teamwork that draws on a broad range of expertise and life experiences. Yet individuals receive little formal training to develop the skills that are vital to these collaborations.

In a new scientific report published in Psychological Science in the Public Interest, an interdisciplinary team of researchers identifies the essential cognitive and social components of collaborative problem solving (CPS) and shows how integrating existing knowledge from a variety of fields can lead to new ways of assessing and training these abilities.

The report, authored by Arthur C. Graesser (University of Memphis), Stephen M. Fiore (University of Central Florida), Samuel Greiff (University of Luxembourg), Jessica Andrews-Todd (Educational Testing Service), Peter W. Foltz (Pearson and University of Colorado), and Friedrich W. Hesse (Leibniz-Institut fur Wissensmedien and University of Tubingen), is accompanied by a commentary from cognitive development expert Mary Gauvain (University of California, Riverside).

"CPS is an essential skill in the workforce and the community because many of the problems faced in the modern world require teams to integrate group achievements with team members' idiosyncratic knowledge," the authors of the report say.

As societies and technologies become increasingly complex, they generate increasingly complex problems. Devising efficient, effective, and innovative solutions to these complex problems requires CPS skills that most students lack. According to a 2015 assessment of more than 500,000 15-year-old students conducted by the Organisation for Economic Cooperation and Development, only 8% of students around the world showed strong CPS skills.

"The experiences of students in and out of the classroom are not preparing them for these skills that are needed as adults," Graesser and colleagues write.

A unique set of cognitive and social skills supports core aspects of CPS, including:

Shared understanding: Group members share common goals when solving a new problem.

Accountability: The contributions that each member makes are visible to the rest of the group.

Differentiated roles: Group members draw on their specific expertise to complete different tasks.

Interdependency: Group members depend on the contributions of others to solve the problem.

One reason for the lack of CPS training is a deficit of evidence-based standards and curricula. Secondary school curricula typically focus on teaching task- and discipline-specific knowledge, placing little emphasis on developing students' ability to communicate and collaborate effectively.

"Students rarely receive meaningful instruction, modeling, and feedback on collaboration," the researchers note.

When students do receive training relevant to CPS, it is often because they participate in extracurricular activities such as band, sports, student newspapers, and volunteer activities. Even then, the collaborative competencies are not directly relevant to problem solving. The authors argue that it is time to make CPS activities a core part of the curriculum.

Although considerable psychological, educational, and management research has examined factors that contribute to effective learning, teamwork, and decision making, research that directly examines how to improve collaborative problem solving is scarce.

According to the authors, "we are nearly at ground zero in identifying pedagogical approaches to improving CPS skills."

Developing and implementing effective CPS training stands to have significant societal impacts across a wide range of domains, including business, science, education, technology, environment, and public health. In a project funded by the National Science Foundation, for example, Fiore and other research team members are training students to collaborate across a range of disciplines -- including environmental science, ecology, biology, law, and policy -- to identify ways to address social, business, and agricultural effects of rising sea levels in Virginia's Eastern Shore.

"It's exciting to engage in real world testing of methods developed in laboratory studies on teamwork, to see how feedback on collaboration, and reflection on that feedback to improve teamwork strategies, can improve students' problem solving," Fiore explains.

Identifying the necessary components of this kind of training and determining how to translate those components across a variety of real-world settings will, itself, require interdisciplinary cooperation among researchers, educators, and policymakers.

In the commentary, Gauvain emphasizes that achieving a comprehensive understanding of CPS requires taking a developmental perspective and she notes that psychological scientists will be essential in this endeavor. Graesser and colleagues agree:

"When psychological scientists collaborate with educational researchers, computer scientists, psychometricians, and educational experts, we hope to move forward in addressing this global deficit in CPS," they conclude.

Credit: 
Association for Psychological Science

Billions of nanoplastics accumulate in marine organisms within six hours

Image: Some of the scallops used as part of the current research. Credit: University of Plymouth

The research, led by the University of Plymouth, examined the uptake of nanoparticles by a commercially important mollusc, the great scallop (Pecten maximus).

After six hours of exposure in the laboratory, billions of particles measuring 250nm (around 0.00025mm) had accumulated within the scallop's intestines.

However, considerably more of the even smaller particles, measuring 20nm (0.00002mm), had become dispersed throughout the body, including the kidney, gill, muscle and other organs.

The study is the first to quantify the uptake of nanoparticles at predicted environmentally relevant conditions, with previous research having been conducted at far higher concentrations than scientists believe are found in our oceans.

Dr Maya Al Sid Cheikh, Postdoctoral Research Fellow at the University of Plymouth, led the study. She said: "For this experiment, we needed to develop an entirely novel scientific approach. We made nanoparticles of plastic in our laboratories and incorporated a label so that we could trace the particles in the body of the scallop at environmentally relevant concentrations. The results of the study show for the first time that nanoparticles can be rapidly taken up by a marine organism, and that in just a few hours they become distributed across most of the major organs."

Professor Richard Thompson OBE, Head of the University's International Marine Litter Research Unit, added: "This is a ground breaking study, in terms of both the scientific approach and the findings. We only exposed the scallops to nanoparticles for a few hours and, despite them being transferred to clean conditions, traces were still present several weeks later. Understanding the dynamics of nanoparticle uptake and release, as well as their distribution in body tissues, is essential if we are to understand any potential effects on organisms. A key next step will be to use this approach to guide research investigating any potential effects of nanoparticles and in particular to consider the consequences of longer term exposures."

Accepted for publication in the Environmental Science and Technology journal, the study also involved scientists from the Charles River Laboratories in Elphinstone, Scotland; the Institut Maurice-Lamontagne in Canada; and Heriot-Watt University.

It was conducted as part of RealRiskNano, a £1.1million project funded by the Natural Environment Research Council (NERC). Led by Heriot-Watt and Plymouth, it is exploring the effects which microscopic plastic particles can have on the marine environment.

In this study, the scallops were exposed to quantities of carbon-radiolabeled nanopolystyrene and after six hours, autoradiography was used to show the number of particles present in organs and tissue.

It was also used to demonstrate that the 20nm particles were no longer detectable after 14 days, whereas 250nm particles took 48 days to disappear.

Ted Henry, Professor of Environmental Toxicology at Heriot-Watt University, said: "Understanding whether plastic particles are absorbed across biological membranes and accumulate within internal organs is critical for assessing the risk these particles pose to both organism and human health. The novel use of radiolabelled plastic particles pioneered in Plymouth provides the most compelling evidence to date on the level of absorption of plastic particles in a marine organism."

Credit: 
University of Plymouth

Mayo Clinic researchers identify new strategies that may improve CAR-T cell therapy

SAN DIEGO -- Mayo Clinic researchers have developed two new strategies that may improve the performance of chimeric antigen receptor therapy (CAR-T cell therapy) in treating cancer. They are presenting results of their preclinical research at the 2018 annual meeting of the American Society of Hematology in San Diego.

Reducing toxicity in CAR-T cell therapy

"While CAR-T cell therapy has proven successful in treating certain cancers, severe toxicities have limited its widespread application," says Rosalie Sterner, an M.D.-Ph.D. student working in the T Cell Engineering Laboratory of Saad Kenderian, M.B. Ch.B., a Mayo Clinic hematologist. Sterner says toxicities associated with CAR-T cell therapy include cytokine release syndrome, in which patients can experience fever, nausea, headache, rash, rapid heartbeat, low blood pressure, and difficulty breathing, as well as neurotoxicity.

Sterner says some patients undergoing CAR-T cell therapy get sick during treatment and require a stay in an ICU. She also notes that deaths related to the side effects of CAR-T cell therapy have been reported. Sterner and her colleagues developed a strategy to reduce the severe toxicities associated with CAR-T cell therapy.

The strategy involves using a clinical-grade antibody (lenzilumab) to block the GM-CSF protein, which is produced by CAR-T cells and other cells.

"When we blocked the GM-CSF protein, we found that we could reduce toxicities in preclinical models," says Sterner. "We also were able to demonstrate that CAR-T cells worked better after the GM-CSF protein was blocked."

Next, researchers used a gene editing technology called CRISPR to generate CAR-T cells that did not secrete the GM-CSF protein. Sterner says these modified CAR-T cells were more effective than regular CAR-T cells.

Based on their findings, the research team is proceeding with a phase II clinical trial of the GM-CSF blocking antibody during CAR-T cell therapy. If the trial results are consistent with earlier findings, the therapy could become a standard of care during CAR-T cell therapy at Mayo Clinic.

This research also is published in Blood.

Improving response rates for CAR-T cell therapy in B cell lymphoma

"In CAR-T cell therapy, physicians remove and modify a patient's T cells to recognize and fight cancer," says Reona Sakemura, M.D., Ph.D., a hematologist and a postdoctoral fellow in Dr. Kenderian's laboratory. "Once modified, the T cells are reinfused into the patient, where they seek out and ultimately kill cancer cells."

Dr. Sakemura says response rates for CAR-T cell therapy vary by disease. For example, in B cell acute lymphoblastic leukemia, response rates of more than 90 percent have been seen, compared with response rates of 10 to 30 percent for treatment with conventional chemotherapy. In other blood cancers, such as lymphoma and chronic lymphocytic leukemia, the response rates for treatment with CAR-T cell therapy remain low.

To improve the effectiveness of CAR-T cell therapy in these cancers, Dr. Sakemura and his colleagues developed a strategy to combine CAR-T cell therapy with a drug that targets a protein called "AXL." This protein is present on the cancer and within the cancer's environment. The drug, called "TP-0903," not only kills cancer cells but also enhances the potency of CAR-T cells in attacking cancer cells and potentially lowers the toxicity associated with CAR-T cell treatment.

While more research and clinical trials are needed, Dr. Sakemura says, "We believe the latter effect may eventually be utilized as an innovative approach to augment the efficacy of CAR-T cell therapy and extend its use to other B cell cancers."

Credit: 
Mayo Clinic

Rotavirus outsources cellular protein CK1α to assemble virus factories

Image: Dr. Mary Estes. Credit: Baylor College of Medicine

Rotaviruses, like all viruses, reproduce inside living cells. Making new viruses requires assembling replication factories via a complex, little-known process that involves both viral and cellular components. A report in the Proceedings of the National Academy of Sciences by a multidisciplinary team led by researchers at Baylor College of Medicine reveals that the formation of rotavirus factories depends on a cellular protein called CK1α, which chemically modifies the viral component NSP2, thus triggering its localization and assembly into the virus factory, an essential step in the formation of new viruses.

"One of the interests of our labs is to better understand the process of assembling rotavirus factories," said first co-author Dr. Jeanette M. Criglar, staff scientist of molecular virology and microbiology at Baylor College of Medicine and a graduate of the program.

In the process of investigating this, Criglar and her colleagues discovered that a cellular protein called CK1α is required to assemble rotavirus factories.

"When we silenced CK1α in cells before infection with rotavirus, we knocked down the replication of the virus by more than 90 percent, suggesting that CK1α largely controls the formation of rotavirus factories," Criglar said.

CK1α is an enzyme with the ability to chemically modify other proteins and their functions by adding phosphate groups to them. The researchers discovered that CK1α mediates its effect on the formation of rotavirus replication factories by adding a phosphate group to a rotavirus protein called NSP2. This phosphate modification triggers the assembly of NSP2 octameric units into a crystal-like structure and appears to be required for the formation of rotavirus factories.

"CK1α normally takes care of housekeeping tasks within the cell. Rotavirus takes advantage of this protein's activity, 'outsourcing' it to assemble the virus factories," said corresponding author Dr. Mary K. Estes, Cullen Foundation Endowed Professor Chair of Human and Molecular Virology at Baylor College of Medicine and emeritus founding director of the Texas Medical Center Digestive Diseases Center.

In addition, the team discovered that rotavirus protein NSP2 can add phosphate groups to itself, thus modifying its activity and affecting other proteins involved in virus assembly. This is a surprising finding, Estes explains, because this function had not been described before for this viral protein.

"Taken together, our findings suggest that a cascade of phosphate chemical modifications, which is mediated in part by CK1α and NSP2, is essential for the formation of rotavirus factories," said co-author Dr. B V Venkataram Prasad, professor and Alvin Romansky Chair in Biochemistry and Molecular Biology, and member of the Dan L Duncan Comprehensive Cancer Center at Baylor. "These findings provide new insights that could lead to previously unsuspected ways to fight the disease in the future."

"It is possible that our findings may also shed light on the assembly of virus factories for other viruses that also require CK1α, such as hepatitis C, or that also form cytoplasmic virus factories like West Nile and dengue virus," Criglar said. "If we can understand how other viruses assemble their factories, perhaps using similar mechanisms to rotavirus, we could advance the understanding of those diseases as well."

Credit: 
Baylor College of Medicine