
Drug overcomes chemotherapy resistance in ovarian cancer

image: Professor John Hooper (Image credit: Mater Research)

An international research team, led by scientists from Mater Research - The University of Queensland, has discovered it can overcome chemotherapy resistance in an ovarian cancer subtype by using low doses of a drug that slows cell growth.

Principal Investigator Professor John Hooper and his team, based at the Translational Research Institute (TRI) in Brisbane, Australia, collaborated with researchers from the Queensland University of Technology (QUT) and the Mayo Clinic in the United States of America.

In a pre-clinical study, they found that 2-deoxy-D-glucose could be used at very low doses to significantly improve the effectiveness of the chemotherapy drug carboplatin in treating laboratory models of clear cell ovarian cancer.

Their work, published in the scientific journal Cancers, provides the rationale for a clinical trial to evaluate the use of low-dose 2-deoxy-D-glucose in treating patients with this type of cancer, according to Professor Hooper.

"Ovarian clear cell carcinoma is associated with poor prognosis and resistance to chemotherapy," he said.

"The key finding from our study is that low levels of 2-deoxy-D-glucose markedly improved the efficacy of carboplatin against preclinical models of this ovarian cancer.

"Our pre-clinical work used cells taken from patient tumours, so we were very encouraged that we could use such a low dose of 2-deoxy-D-glucose to overcome resistance to chemotherapy in this cancer and stop tumour growth.

"This drug has been trialled previously in other cancers, but we were able to use a 10-fold lower dose than previously reported so that it's safer for patients and is less likely to cause side-effects."

The team hopes to begin trialling the treatment combination in patients within the next 12 months, following the announcement that it had received an award to progress the ovarian cancer research.

ANZGOG, the peak national gynaecological cancer clinical trials organisation for Australia and New Zealand, awarded the team its Fund for New Research 2019 - Judith Meschke Memorial Grant to study whether "modulation of metabolism can improve the effectiveness of chemotherapy for clear cell ovarian cancer".

Professor Hooper gratefully acknowledged the generosity of the ANZGOG funding from a bequest of Judith Meschke.

"The involvement in the project of so many talented people, scientists and clinicians, is in the spirit of the creative process fostered by the diverse and talented Australian arts community exemplified by the achievements of Ms Meschke," he said.

Credit: 
Translational Research Institute

SCAI issues recommendations on adult congenital cardiac interventional training

WASHINGTON -- The Society for Cardiovascular Angiography and Interventions (SCAI) has released a position statement on adult congenital cardiac interventional training, competencies and organizational recommendations. The paper was published online in SCAI's official journal Catheterization and Cardiovascular Interventions and addresses the rapidly growing field of catheter-based interventions in adults with congenital heart disease.

Congenital heart disease (CHD) is the most common congenital abnormality and occurs in ~0.8% of all live births. The number of adults with CHD (ACHD) in the United States now exceeds the number of pediatric patients.

The statement, also endorsed by the Adult Congenital Heart Association, focuses on three major areas: eligibility for training, training environment and duration of training, and procedural volume. Key recommendations include identification of four main training backgrounds for candidates and individual determination of training period(s). The writing group specifies that during an ACHD interventional training curriculum, trainees should participate as primary operator or first assistant in at least 150 ACHD catheterization cases, of which 100 should be interventional procedures.

"The writing group included representatives from the areas of adult congenital cardiology, pediatric interventional cardiology, and structural interventional cardiology," said Jamil Aboulhosn, MD, FSCAI, chair of the writing group and director, Ahmanson/UCLA Adult Congenital Heart Disease Center. "The recommendations made have wide applicability to trainees and established interventionalists from a variety of clinical and training backgrounds. In addition, the statement affirms the importance of institutional and team staffing, procedures and processes to ensure delivery of high quality invasive cardiac care," he continued.

The document suggests that training of future specialists for this specific population be delivered by multi-disciplinary teams that combine adult and pediatric expertise in a collaborative approach. In conclusion, members of the writing group recommend multi-society collaboration to help establish and maintain competency standards for physicians.

Credit: 
Society for Cardiovascular Angiography and Interventions

Does primary ovarian insufficiency affect your risks for obesity and diabetes?

CLEVELAND, Ohio (April 15, 2020)--Are overweight women less fertile? Does primary ovarian insufficiency increase risks for obesity and diabetes? For years the controversy regarding the connection between reproductive health and body mass index has continued. A new study assessed the effect of ovarian reserve on obesity and glucose metabolism and found no correlation. Study results are published online today in Menopause, the journal of The North American Menopause Society (NAMS).

Ovarian reserve has been defined as the number and quality of a woman's eggs. A low ovarian reserve means that the number and/or quality of eggs a woman has is low for her age, making it more difficult for her to become pregnant. But low ovarian reserve can have other health ramifications beyond fertility. A number of previous studies have suggested that a lower reserve is linked to an increase in the storage of fat and impaired ability to process insulin, putting a woman at greater risk for diabetes.

However, in this latest study involving more than 1,000 participants and follow-up of 16 years, researchers concluded that a woman's level of ovarian reserve was not associated with her risk of becoming obese or diabetic. The study specifically evaluated changes in a woman's level of antimüllerian hormone (AMH), which is found in the blood and helps to estimate the duration of a woman's reproductive lifespan, ultimately determining that this biomarker does not predict cardiometabolic risk.

Study results appear in the article, "Do trends of adiposity and metabolic parameters vary in women with different ovarian reserve status? A population-based cohort study."

"Although previous research has clearly established a link between early menopause and cardiovascular disease risk, the present study showed that lower ovarian reserve, as measured by a single AMH level, was not associated with greater over time trends in adiposity and markers of glucose metabolism. Additional study is needed to determine how best to predict cardiometabolic risk in women with and without primary ovarian insufficiency in order to initiate appropriate risk reduction strategies," says Dr. Stephanie Faubion, NAMS medical director.

Credit: 
The Menopause Society

Aspirin linked to reduction in risk of several cancers of the digestive tract

Aspirin is associated with a reduction in the risk of developing several cancers of the digestive tract, including some that are almost invariably fatal, such as pancreatic and liver cancers.

The largest and most comprehensive analysis to date of the link between aspirin and digestive tract cancers, published today (Thursday) in the leading cancer journal Annals of Oncology, found reductions in the risk of these cancers of between 22% and 38%.

Aspirin has been linked to a reduction in the risk of bowel cancer for some time, and other, smaller analyses have found associations with cancers of the oesophagus (the food pipe or gullet) and stomach.

This analysis looked at evidence from 113 observational studies investigating cancers in the general population published up to 2019, of which 45 studies were on bowel cancer and included 156,000 cases. In addition to bowel cancer, the cancers investigated included those of the head and neck, oesophagus, stomach, the part of the stomach that connects to the oesophagus (gastric cardia), liver, gallbladder and bile ducts (hepato-biliary) and pancreas.

The researchers, led by Dr Cristina Bosetti (PhD), head of the Unit of Cancer Epidemiology at the Mario Negri Department of Oncology, Milan (Italy), found that regular use of aspirin, defined as taking at least one or two tablets a week, was associated with a significant reduction in the risk of developing all these cancers, apart from head and neck cancer.

Specifically, aspirin use was linked to 27% reduced risk of bowel cancer (45 studies), 33% reduced risk of oesophageal cancer (13 studies), 39% reduced risk of gastric cardia cancer (ten studies), 36% reduced risk of stomach cancer (14 studies), 38% reduced risk of hepato-biliary cancers (five studies), and 22% reduced risk of pancreatic cancer (15 studies). Ten studies of head and neck cancer did not show a significant reduction in risk.

The senior author of the paper, Carlo La Vecchia (MD), Professor of Epidemiology at the School of Medicine, University of Milan, said: "There are about 175,000 deaths from bowel cancer predicted for 2020 in the EU, of which about 100,000 will be in people aged between 50 and 74. If we assume that regular use of aspirin increases from 25% to 50% in this age group, this would mean that between 5,000 to 7,000 deaths from bowel cancer and between 12,000 and 18,000 new cases could be avoided if further studies show that aspirin does indeed cause the reduction in cancer risk.

"Corresponding figures would be approximately 3,000 deaths each for oesophageal, stomach and pancreatic cancer, and 2,000 deaths from cancer of the liver. Given the unfavourable prognoses for these cancers, the number of new cases would be only slightly greater."

The researchers also analysed the effect of aspirin dose and duration on bowel cancer. They looked at low dose (100mg), regular (325mg) and high dose (500mg), combined with how many times a day, week or month it was taken.

Dr Bosetti said: "We found that the risk of cancer was reduced with increased dose; an aspirin dose between 75 and 100mg a day was associated with a 10% reduction in a person's risk of developing cancer compared to people not taking aspirin; a dose of 325mg a day was associated with a 35% reduction, and a dose of 500mg a day was associated with a 50% reduction in risk. However, the estimate for high dose aspirin was based on just a few studies and should be interpreted cautiously.

"Our findings on bowel cancer support the concept that higher aspirin doses are associated with a larger reduction in risk of the disease. However, the choice of dose should also take into consideration the potential risk of stomach bleeds, which increases with higher aspirin doses.

"Compared to people who did not take aspirin regularly, the risk of bowel cancer declined in regular aspirin users up to ten years. The risk was reduced by 4% after one year, 11% after three years, 19% after five years and 29% after ten years."

Prof Carlo La Vecchia said: "These findings suggest there's a beneficial effect of aspirin in the prevention of bowel and other cancers of the digestive tract. The results for bowel, oesophageal and pancreatic cancers are consistent with evidence from clinical trials on aspirin in the prevention of heart and blood vessel diseases.

"The findings for pancreatic and other digestive tract cancers may have implications for the prevention of these highly lethal diseases. For pancreatic cancer, we found that risk of the disease declined by 25% after five years among people who took aspirin regularly compared to those who did not.

"Taking aspirin for the prevention of bowel cancer, or any other cancers, should only be done in consultation with a doctor, who can take account of the person's individual risk. This includes factors such as sex, age, a family history of a first-degree relative with the disease, and other risk factors. People who are at high risk of the disease are most likely to gain the greatest benefits from aspirin."

In addition to stomach bleeds, the side-effects of aspirin include bleeding in other parts of the body and, occasionally, serious haemorrhagic events.

As the study is based on observational studies, it can only show that aspirin is associated with a reduced risk, and biases or confounding factors may partly explain its results. Other limitations include the fact that in some studies information may not reflect changes in aspirin use over time; the people in the studies might not remember or report their aspirin use accurately; and most studies did not have data on other medications that might affect the association between aspirin and the risk of cancer.

Credit: 
European Society for Medical Oncology

Obesity is a critical risk factor for type 2 diabetes, regardless of genetics

Obesity increases the risk of developing type 2 diabetes by at least 6 times, regardless of genetic predisposition to the disease, concludes research published in Diabetologia (the journal of the European Association for the Study of Diabetes [EASD]). The study is by Dr Theresia Schnurr and Hermina Jakupović, Novo Nordisk Foundation Center for Basic Metabolic Research, Faculty of Health and Medical Sciences, University of Copenhagen, Denmark, and colleagues.

Using data from a case-cohort study nested within the Diet, Cancer and Health cohort in Denmark, the authors examined the joint association of obesity, genetic predisposition, and unfavourable lifestyle with incident type 2 diabetes (T2D). The study sample included 4,729 individuals who developed type 2 diabetes during a median 14.7 years of follow-up, and a randomly selected cohort sample of 5,402 individuals (the control group).

The mean age of all participants was 56.1 years (range 50-65) and 49.6% were women. Overall, 21.8% of all participants were classified as obese, 43.0% as overweight and 35.2% as having normal weight; and 40.0% of the participants had a favourable lifestyle, 34.6% had an intermediate lifestyle and 25.4% had an unfavourable lifestyle.

Genetic predisposition was quantified using a genetic risk score (GRS) comprising 193 known type 2 diabetes-associated genetic variants and divided into 5 risk groups of 20% each (quintiles), from lowest (quintile 1) to highest (quintile 5) genetic risk. Lifestyle was assessed by a lifestyle score composed of smoking, alcohol consumption, physical activity and diet. Statistical modelling was used to calculate the individual and combined associations of the GRS, obesity and lifestyle score with developing T2D.
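As a rough illustration of how such a score is constructed, the sketch below computes a GRS as a weighted sum of risk-allele counts across 193 variants and splits it into quintiles. The variant count matches the press release, but the genotypes and effect-size weights are simulated for illustration; they are not the study's data.

```python
import numpy as np

# Hypothetical GRS sketch: weighted sum of risk-allele counts, then quintiles.
rng = np.random.default_rng(0)
n_people, n_variants = 1_000, 193
genotypes = rng.integers(0, 3, size=(n_people, n_variants))  # 0/1/2 risk alleles
weights = rng.uniform(0.01, 0.15, size=n_variants)           # simulated effect sizes

grs = genotypes @ weights                                    # one score per person
# Assign quintiles 1 (lowest genetic risk) .. 5 (highest genetic risk)
cutpoints = np.percentile(grs, [20, 40, 60, 80])
quintile = np.searchsorted(cutpoints, grs) + 1

print(np.bincount(quintile)[1:])  # 200 people in each quintile
```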

Compared with people of normal weight, those with obesity were almost six times more likely to develop T2D, while people who were overweight had a 2.4 times increased risk. For genetic risk, those with the highest GRS were twice as likely to develop T2D as those with the lowest, while those with the unhealthiest lifestyle were 18% more likely to develop T2D than those with the healthiest.

Individuals who ranked high for all three risk factors, with obesity, high GRS and unfavourable lifestyle, had a 14.5 times increased risk of developing T2D, compared with individuals who had a normal body weight, low GRS and favourable lifestyle. Notably, even among individuals with a low GRS and favourable lifestyle, obesity was associated with 8.4 times increased risk of T2D compared with normal weight individuals in the same genetic and lifestyle risk group.
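For intuition, the reported 14.5-fold joint risk is close to what one gets by simply multiplying the individual relative risks from the text, assuming (as a rough approximation, not a finding of the study) that the three factors act independently:

```python
# Naive multiplicative combination of the individual relative risks.
# Values are taken from the text; independence is an assumption made here.
rr_obesity = 5.8      # "almost six times" for obesity vs normal weight
rr_genetics = 2.0     # highest vs lowest GRS quintile
rr_lifestyle = 1.18   # unfavourable vs favourable lifestyle

combined = rr_obesity * rr_genetics * rr_lifestyle
print(round(combined, 1))  # 13.7, close to the reported 14.5-fold joint risk
```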

The authors conclude: "The results suggest that type 2 diabetes prevention by weight management and healthy lifestyle is critical across all genetic risk groups. Furthermore, we found that the effect of obesity on type 2 diabetes risk is dominant over other risk factors, highlighting the importance of weight management in type 2 diabetes prevention."

Credit: 
Diabetologia

Discovering the secrets of the enigmatic caspase-6

image: Senior author Thirumala-Devi Kanneganti, Ph.D., and first author Min Zheng, Ph.D., both of the Department of Immunology, who discovered a key component of cell death processes that may open new ways to fight viruses. (Image credit: St. Jude Children's Research Hospital)

St. Jude Children's Research Hospital scientists have identified previously unknown functions of the enigmatic enzyme caspase-6. The findings show that caspase-6 is a key regulator of innate immunity, inflammasome activation and host defense. Modulation of caspase-6 could be beneficial for treating viral diseases like influenza and other inflammatory diseases including cancer. The work appears as an advance online publication today in Cell.

Caspases are a family of enzymes that regulate programmed cell death (how a cell self-destructs), inflammation and other biological functions. Caspase-6 has previously been characterized as an executioner caspase in a non-inflammatory form of cell death called apoptosis. Caspase-6 has also been linked to neurological disorders like Alzheimer's disease and Huntington disease. However, the full range of the enzyme's function was not well understood. Now, researchers have discovered for the first time how caspase-6 regulates the ZBP1-NLRP3 inflammasome.

"We contributed to the fundamental understanding of caspase-6, which has remained a mystery in the field for decades," said senior author Thirumala-Devi Kanneganti, Ph.D., of the St. Jude Department of Immunology. "Caspase-6 has essential functions in innate immunity, inflammation and in driving PANoptosis."

Cell death and innate immune function

The Kanneganti laboratory was previously the first to identify Z-DNA-binding protein 1 (ZBP1) as an innate immune sensor of influenza, an RNA virus. Their work also revealed that ZBP1 triggers inflammatory cell death in the form of pyroptosis, apoptosis and necroptosis, which together are known as PANoptosis.

PANoptosis is an inflammatory death pathway regulated by components of a structure termed the PANoptosome, which mediates cell death that cannot be assigned to any of the single cell death pathways described previously. In this study, the scientists found that caspase-6 played a critical role in this process.

The researchers found that caspase-6 interacts with RIPK3 to facilitate the recruitment of RIPK3 to the ZBP1-PANoptosome. This makes caspase-6 crucial for assembly of this ZBP1-mediated inflammatory cell death-inducing complex. In line with these findings, the researchers demonstrated that caspase-6 is required for ZBP1-mediated PANoptosis during viral infection.

"Caspase-6 deficiency in mice leads to increased susceptibility to influenza virus infection and higher levels of viral replication in the lungs," said first author Min Zheng, Ph.D., of the St. Jude Department of Immunology. "It is likely that the caspase-6-mediated inflammatory cell death pathway is essential to fighting other viruses that activate similar innate immune pathways, potentially including other respiratory viruses."

The discovery that caspase-6 is a key component in cell death processes has diverse implications for human health, suggesting that modulation of caspase-6 could be a beneficial approach to infectious and inflammatory disease treatment.

The other authors are Rajendra Karki and Peter Vogel, both of St. Jude.

The research at St. Jude was funded in part by grants from the National Institutes of Health (AI101935, AI124346, AR056296 and CA163507) and ALSAC, the fundraising and awareness organization of St. Jude.

Credit: 
St. Jude Children's Research Hospital

Low-cost imaging system poised to provide automatic mosquito tracking

image: A new low-cost imaging system for monitoring mosquitoes could be used to transmit images of mosquitoes inside traps like the one seen here, making it easier to track mosquito species that carry disease. (Image credit: Adam Goodwin, Johns Hopkins University)

WASHINGTON -- Mosquito-transmitted diseases such as malaria, dengue and yellow fever are responsible for hundreds of thousands of deaths every year, according to the World Health Organization (WHO). A new low-cost imaging system could make it easier to track mosquito species that carry disease, enabling a more timely and targeted response.

"A remote system like ours can dramatically reduce the labor needed to monitor mosquitos in a given area, thus greatly increasing the capability to do more monitoring," said research team leader Adam Goodwin from Johns Hopkins University, USA. "If you can provide more mosquito data, then you will more quickly catch outbreaks and save more lives."

Goodwin and colleagues' paper appears in The Optical Society (OSA) journal Biomedical Optics Express as part of a feature issue on Optical Technologies for Improving Healthcare in Low-Resource Settings. In it, they describe the new system, which is designed to transmit images from inside a mosquito trap that are detailed enough for entomologists to distinguish mosquito wing patterns and the color of scales, features that indicate whether a mosquito is a species that carries disease. This information can be used to plan interventions that work best against that species.

"The new system is a classic application of an internet of things (IoT) device," said Goodwin. "It could eventually be paired with computer vision algorithms to automatically determine species and provide that information to public health systems."

Developing a remote imaging trap

In the many areas of the world where mosquito-transmitted disease is problematic, understanding which mosquito species are present in what numbers requires continually trapping mosquitoes at multiple locations. A worker must then drive around a county or region to drop off and pick up hundreds of traps per week and bring the specimens back to the lab to be identified under a microscope.

"Our new optical system can be placed inside a traditional mosquito trap to provide remote surveillance of the abundance, diversity and distribution of mosquito species," said Goodwin. "Using imaging is particularly appealing because as long as image quality is high, several mosquitos could be identified from an image at once."

When designing the system, the researchers focused on the ability to accurately identify Aedes aegypti mosquitos, which can spread Zika, dengue, chikungunya and yellow fever. This invasive species is native to Africa but has established itself in many parts of the world, including North America, Europe and Asia. They say the same approach could be applied to other insects, as long as there is a way to capture and reliably image them.

Using optics and camera sensors that are readily commercially available, the researchers optimized their optical setup to achieve a resolution that balanced the need to image many mosquitos at once with the ability to see enough details to identify the mosquito species.

"Our new system would be particularly useful in monitoring Aedes aegypti in hard to reach areas and at commercial ports of entry where invasive species can be brought from other countries," said Goodwin. "It could also expand current surveillance operations for regions already monitoring local populations of Aedes aegypti."

In most cases, public health systems only need to determine if there are changes in the number or type of mosquitos from day to day or hour to hour, not minute to minute. This means a camera sensor would only need to be turned on a few times a day at most. This would keep the power consumption within the range feasible for an internet-connected device.

Testing the system

To test the new system, the researchers compared entomologists' ability to classify specimens from a digital microscopy image and images from the remote imaging system. There was not a significant difference in their capabilities between the image types. Although the entomologists didn't perform well on species classification for either the microscopy images or the remote system images, they did very well on genus classification.

"Entomologists are not used to identifying specimens from an image because they normally have the specimen in person and manipulate it with tweezers under a microscope," said Goodwin. "However, recent work using convolutional neural networks to classify mosquitos from an image does show promise."

The researchers plan to continue optimizing the remote trap and plan to integrate computer vision algorithms as well as internet-connectivity into the system. "This would enable species information to be sent directly to the public health system for decision-making," said Goodwin. "This is where we think the system will really shine."

Credit: 
Optica

Australia's Centre for Digestive Diseases cures Crohn's disease in new study

image: Professor Thomas Borody (Image credit: CDD)

The Centre for Digestive Diseases (CDD), headed by Professor Thomas Borody, has cured Crohn's Disease, as reported today by Dr Gaurav Agrawal in Gut Pathogens.

Professor Borody is internationally recognised for curing stomach ulcers caused by H. pylori, and is currently researching the infection connection associated with heart disease. He is also a leader in Faecal Microbiota Transfer (FMT) and pioneered the innovative treatment process in Australia.

Crohn's Disease was until today an incurable and debilitating gut disease that affects 75,000 people in Australia and almost 3 million globally.

Curing Crohn's Disease has been a global priority with 1,455 Crohn's Disease clinical research studies currently listed on ClinicalTrials.Gov.

According to Gut Pathogens:

"Prolonged remission has been achieved for 3-23 years with individualised treatments," patients being off all Crohn's therapies.

Professor Borody and his team devised a treatment of specific antibiotic combinations and doses, and/or FMT.

FMT is a procedure in which gut microbiome bacteria from a healthy donor are transferred to the gut of a patient with a damaged gut ecosystem, to repopulate the gut with a healthy and balanced microbiome.

Each year, Crohn's Disease results in frequent hospitalisations and surgical procedures, and it can be life threatening.

The research study was funded by the CDD and involved 10 Australian patients. The team was led by Professor Borody and included Dr Gaurav Agrawal, Dr Annabel Clancy and Dr Roy Huynh.

Professor Borody has overseen more than 37,000 FMT processes at the CDD, making him the most experienced FMT specialist in the world. He and his world-class team use FMT to treat and manage a range of gut health conditions.

According to the report in Gut Pathogens:

"Crohn's disease (CD) is a chronic inflammatory process of the digestive tract characterized by deep ulcerations, skip lesions, transmural inflammation, fistulae and granulomas, with no known cure. It has a negative impact on many aspects of quality of life, including physical, social, psychological, and sexual functioning."

"Crohn's disease (CD) is rising in incidence and has a high morbidity and increased mortality. Current treatment use immunosuppressives but e cacy is suboptimal, and relapse is common. It has been shown that there is an imbalance present in the gut microbiome (dysbiosis) in CD with a possible infective aetiology--Mycobacterium avium subsp. paratuberculosis (MAP) being the most proposed. Antibacterial therapy and Faecal Microbiota Transplan- tation (FMT) are emerging treatments which can result in clinical and endoscopic remission, if employed correctly. The objective of this study was to report on the treatment and clinical outcomes of patients with CD in prolonged remission. "

Professor Borody said this breakthrough opens the way for Crohn's treatments using the antibiotic combination and a "crapsule" - an oral capsule of freeze-dried donor faecal microbiota for FMT.

Credit: 
Digital Mantra Group

Antiviral drug baloxavir reduces transmission of flu virus among ferrets

Baloxavir treatment reduced transmission of the flu virus from infected ferrets to healthy ferrets, suggesting that the antiviral drug could contribute to the early control of influenza outbreaks by limiting community-based viral spread, according to a study published April 15 in the open-access journal PLOS Pathogens by Aeron Hurt of F. Hoffmann-La Roche Ltd and Wendy Barclay of Imperial College London, and colleagues. As noted by the authors, this is the first evidence that the rapid reduction in infectious viral particles associated with baloxavir treatment translates into a reduced risk of transmitting influenza to exposed contacts.

Influenza viruses cause seasonal outbreaks and pose a continuous pandemic threat. Although vaccines are available for influenza control, their efficacy varies each season and a vaccine for a novel pandemic virus manufactured using current technology will not be available fast enough to mitigate the effect of the first pandemic wave. Antivirals can be effective against many different influenza viruses, but have not been used extensively for outbreak control. A recently licensed antiviral drug called baloxavir has been shown to reduce the amount of virus particles produced by infected people more effectively than the widely used drug oseltamivir. In the new study, the researchers tested whether baloxavir treatment might also interrupt onward virus transmission.

They found that baloxavir treatment reduced infectious viral shedding in the upper respiratory tract of ferrets infected with A(H1N1)pdm09 influenza viruses compared to placebo, and reduced the frequency of transmission, even when treatment was delayed until two days after infection. By contrast, oseltamivir treatment did not substantially affect viral shedding or transmission compared to placebo. Importantly, the researchers did not detect the emergence of baloxavir-resistant variants in the animals. The results support the idea that antivirals which decrease viral shedding could also reduce influenza transmission in the community. According to the authors, such an effect has the potential to dramatically change how we manage influenza outbreaks, including pandemic influenza.

The authors add, "Our study shows that baloxavir can have a dual effect in influenza: a single dose reduces the symptoms and reduces the risk of passing it on to others as well".

Credit: 
PLOS

High blood glucose levels may explain why some flu patients experience severe symptoms

Influenza A (a highly contagious virus that causes annual flu epidemics worldwide) may trigger an inflammatory "cytokine storm" - an excessive immune response that can lead to hospitalization or even death - by increasing glucose metabolism, according to a new study. As the novel coronavirus pandemic grips the globe, Qiming Wang and colleagues have separately, and fully apart from this study, begun investigating how glucose metabolism may affect patients with COVID-19. "We believe that glucose metabolism contributes to various COVID-19 outcomes since both influenza and COVID-19 can induce a cytokine storm, and since COVID-19 patients with diabetes have shown higher mortality," says Shi Liu, a researcher on the study.

In general, the mechanisms that promote cytokine storms, causing some individuals to suffer more from influenza A (and perhaps from COVID-19) than others, remain mysterious. Although glucose metabolism and inflammatory cytokine signaling networks are known to have evolved together, it has not been clear whether they interact during flu infection. To learn whether glucose metabolism is related to the runaway immune response brought on by influenza A, Wang et al. examined blood glucose levels and cytokine production in mice with the flu, finding that those treated with glucosamine produced significantly higher levels of inflammatory cytokines and chemokines than mice that did not receive glucosamine.

Additionally, the researchers analyzed glucose levels in blood samples from patients diagnosed with influenza A and from healthy individuals, collected from volunteers during physical examinations at two Wuhan University hospitals between 2017 and 2019. They determined that the hexosamine biosynthesis pathway, which metabolizes a small portion of glucose, plays an essential role in cytokine storms triggered by the flu virus.

Credit: 
American Association for the Advancement of Science (AAAS)

An antibody treatment combats life-threatening sepsis in rodents

video: This video describes the results of a recent study by Weiqiang Chen and colleagues that links the protein tetranectin to the potentially lethal outcomes of sepsis - with potential relevance for secondary bacterial infections in COVID-19 patients. This material relates to a paper that appeared in the Apr. 15, 2020, issue of Science Translational Medicine, published by AAAS. The paper, by W. Chen at Northwell Health in Manhasset, NY, and colleagues, was titled "Identification of tetranectin-targeting monoclonal antibodies to treat potentially lethal sepsis."

Image: 
Dr. Haichao Wang with technical assistance from Zoe Wang and Echo Wang

Sepsis - the body's extreme and organ-damaging response to severe infections - is a major contributor to death in patients battling infectious disease. By studying sepsis patients in intensive care, scientists have developed a new antibody treatment that improved survival when tested in a mouse model of sepsis. The new approach "has marked translational potential and overcomes a limitation in many murine sepsis studies," say Cameron Paterson and colleagues in a related Focus. Sepsis poses a massive medical and economic burden; it accounts for at least $62 billion in costs in the U.S. every year and is estimated to cause 20% of all deaths worldwide. Advances in treatments for sepsis have remained elusive, and most clinical trials of new therapies have failed to show a durable survival benefit. While studying a small group of septic patients, Weiqiang Chen and colleagues discovered that the patients had much lower levels of a protein called tetranectin in their blood than healthy controls. The authors saw that knocking out tetranectin in mice exacerbated severe inflammation, lung damage and other features of lethal sepsis, whereas supplementing the mice with tetranectin reduced organ damage and extended survival. Through screening experiments, the research team designed an antibody that reacts with a tetranectin peptide named P5 and blocks the protein's degradation. The antibody boosted survival in a mouse model of sepsis, even when treatment was started a full 24 hours after sepsis onset. Although more work is needed, the authors speculate that their therapy might find use against severe bacterial infections, and may even help some COVID-19 patients - especially those suffering from secondary bacterial infections that can cause life-threatening sepsis.

Credit: 
American Association for the Advancement of Science (AAAS)

'Frailty' from age 40 -- what to look out for

image: Professor Anthony Maeder and Professor Sue Gordon at Flinders University, Tonsley Innovation District, Adelaide, South Australia.

Image: 
Flinders University

With all eyes on avoiding major illness this year, health researchers are urging people as young as 40 to build physical and mental health to reduce or even avoid 'frailty' and higher mortality risk.

A new study published online in BMC Geriatrics found 'pre-frailty' occurs in 45% of people aged 40-49 - about the same percentage as in people aged 70-75.

From the age of 40, or even younger, people in 'pre-frailty' stages now have the opportunity to avoid poor health outcomes and frailty, Flinders University Caring Futures Institute and international colleagues found.

"You don't have to be in your 70s or 80s to be heading down the path to frailty. Age doesn't matter," says Flinders University Strategic Professor Sue Gordon, Chair of Restorative Care in Ageing.

"Successful healthy ageing interventions and self-management should commence in at least the fourth decade of life focusing on these factors which contribute to pre-frailty and frailty."

People can take matters into their own hands by improving a range of factors, including:

Pre-frailty indicators: Poor dynamic trunk stability and lower limb strength, poor balance, poor foot sensation, being underweight, pelvic floor problems and poor nutrition.

Pre-frailty to frailty factors: Poor mental state (e.g. living alone, high psychological distress), poor lung function and poor sleep quality.

Many options for improving health outcomes are available online, adds Professor Anthony Maeder, from the Digital Health Research Centre at Flinders University.

"People working from home during the self-isolation period can take the opportunity to reassess their health, habits and routines to seek ways to make their daily routines and homes better places to live, and live longer in the process," he says.

Credit: 
Flinders University

Satellite galaxies of the Milky Way help test dark matter theory

video: The cyan dots collectively represent the satellite galaxy. The Milky Way is at the intersection of the pink dashed lines (center of the animation), and the elapsed time in gigayears is shown in the top left corner. The satellite, under the gravitational influence of the host (the Milky Way), revolves around the host's center of mass and loses most of its mass after a few passages - a process called tidal stripping. If the satellite is completely destroyed through this process, it is called tidal disruption. The whole simulation runs for 10 gigayears; the animation is composed of 100 snapshots.

Image: 
Omid Sameie

RIVERSIDE, Calif. -- A research team led by physicists at the University of California, Riverside, reports tiny satellite galaxies of the Milky Way can be used to test fundamental properties of "dark matter" -- nonluminous material thought to constitute 85% of matter in the universe.

Using sophisticated simulations, the researchers show a theory called self-interacting dark matter, or SIDM, can compellingly explain diverse dark matter distributions in Draco and Fornax, two of the Milky Way's more than 50 discovered satellite galaxies.

The prevailing dark matter theory, called Cold Dark Matter, or CDM, explains much of the universe, including how structures emerge in it. But a long-standing challenge for CDM has been to explain the diverse dark matter distributions in galaxies.

The researchers, led by UC Riverside's Hai-Bo Yu and Laura V. Sales, studied the evolution of SIDM "subhalos" in the Milky Way "tidal field" -- the gradient in the gravitational field of the Milky Way that a satellite galaxy feels in the form of a tidal force. Subhalos are dark matter clumps that host the satellite galaxies.

"We found SIDM can produce diverse dark matter distributions in the halos of Draco and Fornax, in agreement with observations," said Yu, an associate professor of physics and astronomy and a theoretical physicist with expertise in particle properties of dark matter. "In SIDM, the interaction between the subhalos and the Milky Way's tides leads to more diverse dark matter distributions in the inner regions of subhalos, compared to their CDM counterparts."

Draco and Fornax have opposite extremes in their inner dark matter contents. Draco has the highest dark matter density among the nine bright Milky Way satellite galaxies; Fornax has the lowest. Using advanced astronomical measurements, astrophysicists recently reconstructed their orbital trajectories in the Milky Way's tidal field.

"Our challenge was to understand the origin of Draco and Fornax's diverse dark matter distributions in light of these newly measured orbital trajectories," Yu said. "We found SIDM can provide an explanation after taking into account both tidal effects and dark matter self-interactions."

Study results appear in Physical Review Letters.

Dark matter's nature remains largely unknown. Unlike normal matter, it does not absorb, reflect, or emit light, making it difficult to detect. Identifying the nature of dark matter is a central task in particle physics and astrophysics.

In CDM, dark matter particles are assumed to be collisionless, and every galaxy sits within a dark matter halo that forms the gravitational scaffolding holding it together. In SIDM, dark matter is proposed to self-interact through a new dark force. Dark matter particles are assumed to strongly collide with one another in the inner halo, close to the galaxy's center -- a process called dark matter self-interaction.

"Our work shows satellite galaxies of the Milky Way may provide important tests of different dark matter theories," said Sales, an assistant professor of physics and astronomy and an astrophysicist with expertise in numerical simulations of galaxy formation. "We show the interplay between dark matter self-interactions and tidal interactions can produce novel signatures in SIDM that are not expected in the prevailing CDM theory."

In their work, the researchers mainly used numerical simulations, called "N-body simulations," and obtained valuable intuition through analytical modeling before running their simulations.
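
The idea behind an N-body simulation of tidal stripping can be illustrated with a toy sketch. This is not the authors' code: all masses, units, orbital parameters, and particle counts below are invented for illustration, the host and satellite are bare point masses, and there is no dark matter self-interaction step. A ring of tracer particles bound to a small satellite is integrated with a leapfrog scheme as the satellite falls toward a massive host; particles outside the shrinking tidal radius become unbound near pericenter.

```python
import math

G = 1.0
M_HOST = 1000.0   # host (Milky Way analog) point mass, fixed at the origin
M_SAT = 1.0       # satellite point mass
N = 20            # tracer particles standing in for satellite material
DT = 0.001
STEPS = 10000     # integrates past one pericentric passage
EPS = 0.05        # softening length for the satellite's potential

def acc(pos, sat_pos):
    """Acceleration on a tracer from the fixed host plus the moving satellite."""
    x, y = pos
    r3 = (x * x + y * y) ** 1.5
    ax, ay = -G * M_HOST * x / r3, -G * M_HOST * y / r3
    dx, dy = x - sat_pos[0], y - sat_pos[1]
    d3 = (dx * dx + dy * dy + EPS * EPS) ** 1.5
    return ax - G * M_SAT * dx / d3, ay - G * M_SAT * dy / d3

def sat_acc(sat_pos):
    """The satellite center feels only the host."""
    x, y = sat_pos
    r3 = (x * x + y * y) ** 1.5
    return -G * M_HOST * x / r3, -G * M_HOST * y / r3

def bound_count(parts, vels, sat_pos, sat_vel):
    """Count tracers with negative energy relative to the satellite."""
    n = 0
    for (x, y), (vx, vy) in zip(parts, vels):
        dx, dy = x - sat_pos[0], y - sat_pos[1]
        dvx, dvy = vx - sat_vel[0], vy - sat_vel[1]
        r = math.hypot(dx, dy)
        if 0.5 * (dvx * dvx + dvy * dvy) - G * M_SAT / max(r, EPS) < 0:
            n += 1
    return n

def run():
    # Start at apocenter (r = 20) on an eccentric orbit; pericenter is ~3.8,
    # where the tidal radius shrinks well inside the tracer ring.
    sat_pos, sat_vel = [20.0, 0.0], [0.0, 4.0]
    parts, vels = [], []
    for i in range(N):
        th = 2 * math.pi * i / N
        parts.append([sat_pos[0] + math.cos(th), sat_pos[1] + math.sin(th)])
        # circular speed around the satellite, plus the satellite's own motion
        vels.append([sat_vel[0] - math.sin(th), sat_vel[1] + math.cos(th)])
    n0 = bound_count(parts, vels, sat_pos, sat_vel)
    for _ in range(STEPS):
        # leapfrog (kick-drift-kick)
        sax, say = sat_acc(sat_pos)
        accs = [acc(p, sat_pos) for p in parts]
        sat_vel[0] += 0.5 * DT * sax; sat_vel[1] += 0.5 * DT * say
        for v, (ax, ay) in zip(vels, accs):
            v[0] += 0.5 * DT * ax; v[1] += 0.5 * DT * ay
        sat_pos[0] += DT * sat_vel[0]; sat_pos[1] += DT * sat_vel[1]
        for p, v in zip(parts, vels):
            p[0] += DT * v[0]; p[1] += DT * v[1]
        sax, say = sat_acc(sat_pos)
        accs = [acc(p, sat_pos) for p in parts]
        sat_vel[0] += 0.5 * DT * sax; sat_vel[1] += 0.5 * DT * say
        for v, (ax, ay) in zip(vels, accs):
            v[0] += 0.5 * DT * ax; v[1] += 0.5 * DT * ay
    return n0, bound_count(parts, vels, sat_pos, sat_vel)

if __name__ == "__main__":
    before, after = run()
    print(f"tracers bound to satellite: {before} before, {after} after the passage")
```

Production simulations of the kind described here differ in every respect that matters quantitatively - millions of self-gravitating particles, realistic halo profiles, and (for SIDM) an explicit particle-scattering step - but the stripping mechanism in the animation above is the same.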

"Our simulations reveal novel dynamics when an SIDM subhalo evolves in the tidal field," said Omid Sameie, a former UCR graduate student who worked with Yu and Sales and is now a postdoctoral researcher at the University of Texas at Austin working on numerical simulations of galaxy formation. "It was thought observations of Draco were inconsistent with SIDM predictions. But we found a subhalo in SIDM can produce a high dark matter density to explain Draco."

Sales explained SIDM predicts a unique phenomenon named "core collapse." In certain circumstances, the inner part of the halo collapses under the influence of gravity and produces a high density. This is contrary to the usual expectation that dark matter self-interactions lead to a low-density halo. Sales said the team's simulations identify conditions for the core collapse to occur in subhalos.

"To explain Draco's high dark matter density, its initial halo concentration needs to be high," she said. "More dark matter mass needs to be distributed in the inner halo. While this is true for both CDM and SIDM, for SIDM the core-collapse phenomenon can only occur if the concentration is high so that the collapse timescale is less than the age of the universe. On the other hand, Fornax has a low-concentrated subhalo, and hence its density remains low."

The researchers stressed that their current work focuses mainly on SIDM and does not critically assess how well CDM can explain both Draco and Fornax.

After the team used numerical simulations to properly take into account the dynamical interplay between dark matter self-interactions and tidal interactions, the researchers observed a striking result.

"The central dark matter of an SIDM subhalo could be increasing, contrary to usual expectations," Sameie said. "Importantly, our simulations identify conditions for this phenomenon to occur in SIDM, and we show it can explain observations of Draco."

The research team plans to extend the study to other satellite galaxies, including ultrafaint galaxies.

Credit: 
University of California - Riverside

Impact of donor lymphocyte infusion and intensified conditioning for relapsed/refractory leukemia

image: Leukemia-free survival (LFS; A and B) and overall survival (OS; C and D) according to treatment in relapsed/refractory AML (A and C) and ALL (B and D).

Image: 
©Science China Press

Patients with acute leukemia that is refractory to initial or re-induction chemotherapy (refractory/relapsed acute leukemia, RRAL) have a poor prognosis and limited therapeutic options, with hematopoietic cell transplantation (HCT) being the only opportunity for cure. Current strategies to decrease post-transplant relapse in patients with RRAL include more effective pre-transplant conditioning, safer donor lymphocyte infusion (DLI), and improved donor selection (including haploidentical donors), though relevant real-world data are scarce. In the two largest series from international registries, most patients underwent HCT from matched sibling donors (MSDs) or unrelated donors (URDs), while haploidentical donors (HIDs) accounted for less than 10 percent of the population. Furthermore, only the impact of donor source and clinical factors on transplant outcomes was evaluated; treatment-related factors such as conditioning intensity or DLI were not analyzed. It is therefore imperative to understand the influence of treatment-related variables in patients with active disease when making therapeutic decisions.

"To understand the influence of treatment-related variables in patients with RRAL in practice, we analyzed the outcomes of 932 consecutive patients who underwent transplantation during relapse or primary induction failure to evaluate the impact of conditioning or DLI in RRAL," said Professor Xiao-Jun Huang of the Peking University Institute of Hematology, the corresponding author of the study. The results indicated that patients with RRAL can tolerate both interventions (prophylactic/preemptive donor lymphocyte infusion (p/pDLI) and intensified myeloablative conditioning (intenseMAC)) and achieve reasonable outcomes. In a landmark analysis, the 3-year leukemia-free survival (LFS) rates were 56% for patients receiving both interventions and 30% for those receiving neither.

"Furthermore, to evaluate whether conditioning, DLI, or the combined treatment have varying extents of impact among acute myeloid leukemia (AML) and acute lymphoblastic leukemia (ALL) patients, as well as among MSD and HID recipients, we performed multivariable analyses separately," said Prof. Huang. p/pDLI was associated with significantly higher LFS than no DLI for both AML and ALL patients, without increasing nonrelapse mortality (NRM). IntenseMAC was linked to significantly lower relapse and higher LFS than nonintensified MAC despite higher NRM rates in ALL; intenseMAC had no impact in AML. Moreover, p/pDLI was associated with superior outcomes in both MSD and HID transplants, while intenseMAC influenced only MSD outcomes. "RRAL patients receiving 'total therapy' by way of p/pDLI and intenseMAC have an improved chance of LFS, with p/pDLI being safer and having a more extensive impact than intenseMAC," said Prof. Huang.

These results not only provide novel data on transplant outcomes in RRAL from a large, unselected cohort, but also highlight the need to incorporate these intensive approaches into a "total therapy" strategy to improve the prognosis of patients with RRAL.

Credit: 
Science China Press

Study: Frequent cannabis users are way too high ... in their estimates of cannabinoids

BUFFALO, N.Y. -- One would think that cannabis enthusiasts attending a marijuana advocacy event would be knowledgeable about cannabinoids.

Not necessarily, according to the findings of a study by researchers from the University at Buffalo and the University of Michigan, who surveyed frequent cannabis users at an annual marijuana advocacy event held on the University of Michigan campus.

The surprisingly low level of knowledge about tetrahydrocannabinol (THC) and cannabidiol (CBD) content, and effective dosages, demonstrated by Hash Bash participants highlights the need for additional public health education and research, according to Daniel Kruger, PhD, the lead author of the study, published online ahead of print today in the journal Drugs: Education, Prevention and Policy.

"Even the people who are most enthusiastic have very poor knowledge of cannabinoid content. They greatly overestimated how much THC and how much CBD was in various strains, and what the effective dosages were," said Kruger, a research associate professor of community health and health behavior in UB's School of Public Health and Health Professions. He is also a research investigator with the Population Studies Center at the University of Michigan.

Researchers surveyed nearly 500 Hash Bash attendees, asking them to fill out a 24-item questionnaire. Two-thirds of participants reported using cannabis every day, and most said it was for health or medical purposes. More than three-quarters of survey-takers said their knowledge of cannabis came from their own experiences.

The study survey asked participants to fill in, in milligrams, the amounts they considered to be effective doses of THC and CBD. (THC is the principal psychoactive compound and the one largely responsible for the high experienced by users. CBD does not have the same psychoactivity, but has other effects, such as reduction of anxiety.) Participants could also check the box for "I don't know."

The majority reported they didn't know. Those who did answer gave estimates averaging 91 milligrams for THC and 177 milligrams for CBD. In other words, they were way off.
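
For a sense of scale, the gap between those survey averages and commonly cited single doses can be worked out directly. The "typical" doses below are assumptions for illustration - roughly a standard 10 mg THC edible serving and a 25 mg CBD dose - not figures from the study:

```python
# Survey averages reported in the study (milligrams).
SURVEY_AVERAGE = {"THC": 91.0, "CBD": 177.0}

# Commonly cited single doses, assumed here for illustration (not from the
# study): ~10 mg THC is a standard edible serving in regulated markets, and
# CBD is often taken in ~25 mg doses.
ASSUMED_TYPICAL = {"THC": 10.0, "CBD": 25.0}

def overestimate_factor(compound: str) -> float:
    """How many times larger the survey average is than the assumed dose."""
    return SURVEY_AVERAGE[compound] / ASSUMED_TYPICAL[compound]

for c in ("THC", "CBD"):
    print(f"{c}: {overestimate_factor(c):.1f}x the assumed typical dose")
```

Under these assumptions, the survey averages come out roughly nine times (THC) and seven times (CBD) higher than a typical single dose.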

"The average estimate for an effective dose of THC would actually be fatal in humans," Kruger said.

One participant even said 1 million milligrams was the effective dose for THC. "That's a kilogram of THC. That's enough to fill an entire football stadium full of people and get them all high," Kruger said.

Participants also were asked to fill in what they thought were the percentages for high and low THC strains, and high and low CBD strains. The majority (58%) believed that a low-THC strain of cannabis was 20% THC or higher -- a level that would actually be considered a high-THC strain. In addition, 22% believed that a low-THC strain of cannabis was 40% THC or higher, which exceeds the levels of anything available now.

For CBD, 86% felt that a low strain of cannabis was 10% CBD or higher, a level considered representative of a high-CBD strain of cannabis. Nearly half believed that a low strain was 30% CBD or higher, which exceeds the CBD level of any existing strain.

"Our results suggest the need for broad-based cannabis education programs to help advocates and the general public to better understand and manage their use of the drug," said study co-author, R. Lorraine Collins, PhD, associate dean for research in UB's School of Public Health and Health Professions.

The current paper is the latest in a series of studies Kruger and his UB colleagues have published in recent years, based on data collected at Hash Bash. Their findings have shown how little many cannabis users know about the drug. The researchers also have highlighted the lackluster public health efforts to promote an effective harm reduction approach to marijuana use, especially during an era when cannabis is being deregulated in many states.

The stakes are rising, researchers say, with an increasing percentage of Americans using cannabis for a variety of recreational and medical reasons, and with cannabis potency increasing as well.

"Cannabis strains are 20 times as potent today as they were during the Summer of Love," said study co-author Jessica Kruger, PhD, clinical assistant professor of community health and health behavior in UB's School of Public Health and Health Professions.

The main message: "We really have to educate people. This has very real consequences, because these compounds have differential effects," Daniel Kruger said.

"Most Americans now live in a state where cannabis is legal, at least for medical purposes, but the information channels aren't there regarding safe and effective cannabis use."

Credit: 
University at Buffalo