When macrophages are deprived of oxygen

Infected tissue has a low concentration of oxygen. The body's standard immune mechanisms, which rely on oxygen, can then only function to a limited extent. How does the immune system nevertheless manage to control bacteria under such conditions? The research groups led by PD Dr. Anja Lührmann at the Institute of Microbiology - Clinical Microbiology, Immunology and Hygiene (Director: Prof. Dr. Christian Bogdan) at Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) and Prof. Dr. Jonathan Jantsch at the Institute for Medical Microbiology and Hygiene (Director: Prof. Dr. Dr. André Gessner) at University Hospital Regensburg have investigated this question in collaboration with other groups from Erlangen, Regensburg and Jena. The researchers discovered that fewer metabolites are produced in the citric acid cycle under hypoxic conditions, leading to a reduced rate of reproduction among bacteria in macrophages.

Macrophages are a type of phagocyte and belong to the innate immune system, where they play a key role in defending against infection by intracellular pathogens such as those that cause tuberculosis, Legionnaires' disease or Q fever. The team of researchers observed changes in the mitochondrial metabolism of the macrophages caused by signalling pathways initiated by the lack of oxygen (hypoxia). These changes lead to a reduction in various citric acid cycle metabolites, especially citrate. This in turn prevents bacterial reproduction, as citrate is an essential growth factor for certain bacteria. 'Our results describe a method of pathogen control which does not depend on oxygen and which we were not aware of until now,' explains Prof. Jantsch from Universität Regensburg. FAU scientist PD Dr. Lührmann adds: 'Pharmacological manipulation of these signalling pathways opens up new opportunities for fighting infectious diseases.'

Credit: 
Friedrich-Alexander-Universität Erlangen-Nürnberg

Technology better than tape measure for identifying lymphedema risk

Bioimpedance spectroscopy (BIS) is better than a tape measure for assessing a woman's risk for developing lymphedema after breast cancer surgery, according to interim results of a study led by Sheila Ridner, PhD, RN, Martha Ingram Professor and director of the PhD in Nursing Science Program at Vanderbilt University School of Nursing.

The multisite international study compares the two methods for identifying women who should be prescribed compression sleeves and gauntlets to reduce lymphatic fluid in the arm and prevent progression to lymphedema.

BIS surveillance reduced rates of progression by approximately 10%, a clinically meaningful improvement. Interim findings from the study were published May 3 in Annals of Surgical Oncology and Ridner presented the analysis during the annual meeting of The American Society of Breast Surgeons in Dallas.

"The bioimpedance device measures lymphatic fluid, and the tape measures everything," said Ridner, a researcher with Vanderbilt-Ingram Cancer Center. "It takes more lymphatic fluid to make your whole arm volume change than it does to make the device pick up changes. The device is just more sensitive to changes in lymphatic fluid."

Breast cancer-related lymphedema affects between 20% and 30% of women due to damage to the lymph nodes from surgery, radiation and some medicines, Ridner said. Lymphedema causes swelling in the arm, can be physically debilitating and puts women at greater risk for infections as well as psychological stress.

The results are an interim analysis of an ongoing controlled trial called PREVENT, launched in 2014 and led by Ridner. The analysis involved 508 participants who had been monitored for a year or longer. Participants identified as at risk for lymphedema received compression sleeves and gauntlets and were instructed to wear them 12 hours daily for 28 days to prevent progression to lymphedema. Patients who developed lymphedema reached their endpoint in the trial and were referred to clinicians for complex decongestive physiotherapy (CDP).

"CDP is resource intensive and costly," Ridner said. "Lymphedema therapists are not accessible everywhere and mostly are in metropolitan areas. You go an hour-and-a-half in any direction outside of Nashville, for example, and we can't find people to treat these patients."

Clinicians have traditionally used tape measures to monitor breast cancer patients for lymphedema, but measurements taken that way can vary greatly depending on how the clinician performs them.

"Tape measure is the most commonly used method around the world even though it is fraught with error," Ridner said. "To get accurate measurements for a research study, there is an incredible amount of training to teach all the sites in this international study how to measure the same way. I do annual fidelity oversight visits to every single site to make sure there has not been any slippage in the protocol."

BIS is a painless and noninvasive procedure that entails running an electronic signal through the body. The technology is similar to electronic monitors for body mass index, but much more refined.

Although the study showed that participants in the BIS group experienced reduced rates of progression to lymphedema requiring CDP, the tape measure group triggered an intervention more often and earlier. The median time to triggering an intervention was 2.8 months in the tape measure group versus 9.5 months in the BIS group.

"It is possible that at three months post-surgery in some patients there remains a generalized, whole-arm inflammatory response that is identified by tape measure," the analysis states. "Increased extracellular fluid may not be a major factor in that volume change."

Ridner and the research team will evaluate the factors associated with triggering for both groups going forward.

"We had statistically significant more people trigger an intervention that were in the tape group than in the BIS group, which was contrary to what many people thought would have happened in the study. One of the concerns about BIS in general was that it might generate false positives and we might psychologically distress people," Ridner said. "That was never my experience in the 15 to 16 years I've been working with the technology."

The PREVENT trial has enrolled a total of 1,201 participants, with 325 of them being patients of the Vanderbilt Breast Center. The findings released at the annual meeting of the American Society of Breast Surgeons involved the first 508 participants to have been monitored for 12 months or longer. The trial is anticipated to continue through December 2020. Other sites involved in the trial include Allegheny General Hospital, Columbia University Medical Center, Mayo Clinic (Jacksonville, Fla.), University of Louisville, Macquarie University (New South Wales, Australia), MD Anderson Cancer Center, University of Kansas Medical Center, Cleveland Clinic and Southeast Health Southeast Cancer Center (Cape Girardeau, Mo.).

Co-investigators of the study from Vanderbilt include Mary Dietrich, PhD, MS, VUSN, and Vandana Abramson, MD, Vanderbilt-Ingram Cancer Center. Other co-investigators are Chirag Shah, MD, Cleveland Clinic, and Frank Vicini, MD, 21st Century Oncology.

Credit: 
Vanderbilt University

'Neural Lander' uses AI to land drones smoothly

image: The Neural Lander system is tested in the Aerodrome, a three-story drone arena at Caltech's Center for Autonomous Systems and Technologies.

Image: 
Caltech

Landing multi-rotor drones smoothly is difficult. Complex turbulence is created by the airflow from each rotor bouncing off the ground as the ground grows ever closer during a descent. This turbulence is neither well understood nor easy to compensate for, particularly for autonomous drones. That is why takeoff and landing are often the two trickiest parts of a drone flight. Drones typically wobble and inch slowly toward a landing until power is finally cut, and they drop the remaining distance to the ground.

At Caltech's Center for Autonomous Systems and Technologies (CAST), artificial intelligence experts have teamed up with control experts to develop a system that uses a deep neural network to help autonomous drones "learn" how to land more safely and quickly, while gobbling up less power. The system they have created, dubbed the "Neural Lander," is a learning-based controller that tracks the position and speed of the drone, and modifies its landing trajectory and rotor speed accordingly to achieve the smoothest possible landing.

"This project has the potential to help drones fly more smoothly and safely, especially in the presence of unpredictable wind gusts, and eat up less battery power as drones can land more quickly," says Soon-Jo Chung, Bren Professor of Aerospace in the Division of Engineering and Applied Science (EAS) and research scientist at JPL, which Caltech manages for NASA. The project is a collaboration between Chung and Caltech artificial intelligence (AI) experts Anima Anandkumar, Bren Professor of Computing and Mathematical Sciences, and Yisong Yue, assistant professor of computing and mathematical sciences.

A paper describing the Neural Lander will be presented at the Institute of Electrical and Electronics Engineers (IEEE) International Conference on Robotics and Automation on May 22. Co-lead authors of the paper are Caltech graduate students Guanya Shi, whose PhD research is jointly supervised by Chung and Yue, as well as Xichen Shi and Michael O'Connell, who are PhD students in Chung's Aerospace Robotics and Control Group.

Deep neural networks (DNNs) are AI systems that are inspired by biological systems like the brain. The "deep" part of the name refers to the fact that data inputs are churned through multiple layers, each of which processes incoming information in a different way to tease out increasingly complex details. DNNs are capable of automatic learning, which makes them ideally suited for repetitive tasks.

To make sure that the drone flies smoothly under the guidance of the DNN, the team employed a technique known as spectral normalization, which smooths out the neural net's outputs so that it doesn't make wildly varying predictions as inputs/conditions shift. Improvements in landing were measured by examining deviation from an idealized trajectory in 3D space. Three types of tests were conducted: a straight vertical landing; a descending arc landing; and flight in which the drone skims across a broken surface--such as over the edge of a table--where the effect of turbulence from the ground would vary sharply.
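The core idea of spectral normalization can be illustrated with a small numerical sketch (a toy with made-up values, not the Neural Lander's actual training code): divide a layer's weight matrix by an estimate of its largest singular value, so that the layer can never amplify a change in its inputs by more than a factor of one.

```python
import numpy as np

def spectral_normalize(W, n_iters=50):
    """Scale weight matrix W so its largest singular value (its
    spectral norm) is at most 1. Power iteration estimates that
    singular value; dividing by it bounds how sharply the layer's
    output can swing as its inputs change."""
    rng = np.random.default_rng(0)
    u = rng.standard_normal(W.shape[0])
    for _ in range(n_iters):  # power iteration for the top singular pair
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = u @ W @ v  # estimated largest singular value
    return W / max(sigma, 1.0)  # leave already-small matrices unchanged

W = np.array([[3.0, 0.0],
              [0.5, 0.5]])
W_sn = spectral_normalize(W)
print(round(np.linalg.svd(W_sn, compute_uv=False)[0], 6))  # prints 1.0
```

In a deep network the same scaling is applied to every layer's weights, which is what keeps the controller's predictions from varying wildly as flight conditions shift.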

The new system decreases vertical error by 100 percent, allowing for controlled landings, and reduces lateral drift by up to 90 percent. In their experiments, the new system achieves an actual landing rather than getting stuck about 10 to 15 centimeters above the ground, as unmodified conventional flight controllers often do. Further, during the skimming test, the Neural Lander produced a much smoother transition as the drone moved from skimming across the table to flying in the free space beyond the edge.

"With less error, the Neural Lander is capable of a speedier, smoother landing and of gliding smoothly over the ground surface," Yue says. The new system was tested at CAST's three-story-tall aerodrome, which can simulate a nearly limitless variety of outdoor wind conditions. Opened in 2018, CAST is a 10,000-square-foot facility where researchers from EAS, JPL, and Caltech's Division of Geological and Planetary Sciences are uniting to create the next generation of autonomous systems, while advancing the fields of drone research, autonomous exploration, and bioinspired systems.

"This interdisciplinary effort brings together experts from machine learning and control systems. We have barely started to explore the rich connections between the two areas," Anandkumar says.

Besides its obvious commercial applications--Chung and his colleagues have filed a patent on the new system--it could prove crucial to projects currently under development at CAST, including an autonomous medical transport that could land in difficult-to-reach locations (such as gridlocked traffic). "The importance of being able to land swiftly and smoothly when transporting an injured individual cannot be overstated," says Morteza Gharib, Hans W. Liepmann Professor of Aeronautics and Bioinspired Engineering; director of CAST; and one of the lead researchers of the air ambulance project.

Credit: 
California Institute of Technology

Virtual reality can spot navigation problems in early Alzheimer's disease

image: Footage from virtual reality headset.

Image: 
University of Cambridge

Virtual reality (VR) can identify early Alzheimer's disease more accurately than 'gold standard' cognitive tests currently in use, suggests new research from the University of Cambridge.

The study highlights the potential of new technologies to help diagnose and monitor conditions such as Alzheimer's disease, which affects more than 525,000 people in the UK.

In 2014, Professor John O'Keefe of UCL was jointly awarded the Nobel Prize in Physiology or Medicine for 'discoveries of cells that constitute a positioning system in the brain'. Essentially, this means that the brain contains a mental 'satnav' of where we are, where we have been, and how to find our way around.

A key component of this internal satnav is a region of the brain known as the entorhinal cortex. This is one of the first regions to be damaged in Alzheimer's disease, which may explain why 'getting lost' is one of the first symptoms of the disease. However, the pen-and-paper cognitive tests used in clinic to diagnose the condition are unable to test for navigation difficulties.

In collaboration with Professor Neil Burgess at UCL, a team of scientists at the Department of Clinical Neurosciences at the University of Cambridge led by Dr Dennis Chan, previously Professor O'Keefe's PhD student, developed and trialled a VR navigation test in patients at risk of developing dementia. The results of their study are published today in the journal Brain.

In the test, a patient dons a VR headset and undertakes a test of navigation while walking within a simulated environment. Successful completion of the task requires intact functioning of the entorhinal cortex, so Dr Chan's team hypothesised that patients with early Alzheimer's disease would be disproportionately affected on the test.

The team recruited 45 patients with mild cognitive impairment (MCI) from the Cambridge University Hospitals NHS Trust Mild Cognitive Impairment and Memory Clinics. Patients with MCI typically exhibit memory impairment, but while MCI can indicate early Alzheimer's, it can also be caused by other conditions such as anxiety and even normal aging. As such, establishing the cause of MCI is crucial for determining whether affected individuals are at risk of developing dementia in the future.

The researchers took samples of cerebrospinal fluid (CSF) to look for biomarkers of underlying Alzheimer's disease in their MCI patients, with 12 testing positive. The researchers also recruited 41 age-matched healthy controls for comparison.

All of the patients with MCI performed worse on the navigation task than the healthy controls. However, the study yielded two crucial additional observations. First, MCI patients with positive CSF markers - indicating the presence of Alzheimer's disease, thus placing them at risk of developing dementia - performed worse than those with negative CSF markers at low risk of future dementia.

Second, the VR navigation task was better at differentiating between these low and high risk MCI patients than a battery of currently used tests considered to be the gold standard for the diagnosis of early Alzheimer's.

"These results suggest a VR test of navigation may be better at identifying early Alzheimer's disease than tests we use at present in clinic and in research studies," says Dr Chan.

VR could also help clinical trials of future drugs aimed at slowing down, or even halting, progression of Alzheimer's disease. Currently, the first stage of drug trials involves testing in animals, typically mouse models of the disease. To determine whether treatments are effective, scientists study their effect on navigation using tests such as a water maze, where mice have to learn the location of hidden platforms beneath the surface of opaque pools of water. If new drugs are found to improve memory on this task, they proceed to trials in human subjects, but using word and picture memory tests. This lack of comparability of memory tests between animal models and human participants represents a major problem for current clinical trials.

"The brain cells underpinning navigation are similar in rodents and humans, so testing navigation may allow us to overcome this roadblock in Alzheimer's drug trials and help translate basic science discoveries into clinical use," says Dr Chan. "We've wanted to do this for years, but it's only now that VR technology has evolved to the point that we can readily undertake this research in patients."

In fact, Dr Chan believes technology could play a crucial role in diagnosing and monitoring Alzheimer's disease. He is working with Professor Cecilia Mascolo at Cambridge's Centre for Mobile, Wearable Systems and Augmented Intelligence to develop apps for detecting the disease and monitoring its progression. These apps would run on smartphones and smartwatches. As well as looking for changes in how we navigate, the apps will track changes in other everyday activities such as sleep and communication.

"We know that Alzheimer's affects the brain long before symptoms become apparent," says Dr Chan. "We're getting to the point where everyday tech can be used to spot the warning signs of the disease well before we become aware of them.

"We live in a world where mobile devices are almost ubiquitous, and so app-based approaches have the potential to diagnose Alzheimer's disease at minimal extra cost and at a scale way beyond that of brain scanning and other current diagnostic approaches."

Credit: 
University of Cambridge

Lifestyle explains part of the protective effect of education on heart disease

Lifestyle factors, such as weight, blood pressure and smoking, explain around 40% of the protective effect of education on heart disease risk in later life, finds a study published by The BMJ today.

The results suggest that intervening on these "modifiable" risk factors would lead to reductions in the cases of heart disease that occur as a result of lower educational achievement.

But the researchers point out that more than half of this protective effect still remains unexplained and requires further investigation.

We already know that lower levels of education are directly related to higher cardiovascular risk in later life. But educational opportunities aren't the same for everyone, so the key to improving heart health in later life may lie in tackling the risk factors that drive these poorer outcomes.

To test this theory, an international team of researchers set out to investigate the role of body mass index (BMI), systolic blood pressure and smoking in explaining the protective effect of education on cardiovascular risk.

They carried out observational and genetic analyses of data from over 200,000 adults in the UK Biobank - a large population-based study of more than half a million British men and women - in addition to a two-sample Mendelian randomisation approach using data from predominantly European studies.

This technique uses genetic information to avoid some of the problems that afflict observational studies, making the results less prone to unmeasured (confounding) factors, and therefore more likely to be reliable in understanding cause and effect.

In both observational and Mendelian randomisation analyses, the researchers found consistent evidence that BMI, blood pressure and smoking mediated the effect of education, explaining up to 18%, 27% and 34% of the effect, respectively.

When all three risk factors were combined, they explained around 40% of the relationship between education and cardiovascular disease. And similar results were found for risk of stroke, heart attack, and all other types of cardiovascular disease.

As such, the researchers suggest that intervening on these risk factors "would lead to reductions in cardiovascular disease attributable to lower levels of education." But they say it is important to note that over half of the overall effect of education remains unexplained.

They point to some study limitations, for example the main analysis did not consider factors such as exercise, diet, cholesterol and blood sugar levels, and as participants were mostly white Europeans, findings may not be applicable to other populations.

Nevertheless, they stress that results were consistent across the two approaches and in additional sensitivity analyses, suggesting that the findings are robust.

These findings have "notable implications for policymakers as they identify potential strategies for reducing education inequalities in health," write the authors.

Further research identifying other related factors and the interplay between them - and in more diverse populations - will be key to reducing inequalities in cardiovascular disease, they conclude.

Credit: 
BMJ Group

Leaving school earlier could increase the risk of heart disease

Although it has long been known that education and socioeconomic position affect health, particularly in later life, there was limited knowledge as to why. New research has found that increased levels of BMI, blood pressure and smoking partly explain why people who left school at an earlier age could be at an increased risk of heart disease.

The study led by the University of Bristol and Imperial College London, and published in the BMJ today [Wednesday 22 May], investigated the role of body mass index (BMI), systolic blood pressure (SBP) and smoking in European populations to explain the effect of education on the risk of cardiovascular disease, which affects the heart or blood vessels and includes heart disease, heart attack and stroke.

Building on previous heart disease studies, the researchers looked at the effect of education on all cardiovascular disease subtypes combined, as well as on heart attack and stroke. Using a method called mediation analysis, which aims to identify the mechanism between an exposure and an outcome, they investigated how much of the association between education and heart disease could be explained by BMI, blood pressure, smoking and all three factors together.

The research team found consistent evidence that BMI, blood pressure and smoking mediate up to 18 per cent, 27 per cent and 34 per cent of the effect of education on heart disease, respectively. Considering all three together explains around 40 per cent of the effect. Future interventions on these risk factors could lead to reductions in cardiovascular disease attributable to lower levels of education.

Alice Carter, PhD student in Population Health Sciences in the MRC Integrative Epidemiology Unit at the University of Bristol and co-first author, said: "Past policies that increase the duration of compulsory education have improved health and such endeavours must continue. However, intervening on education is difficult to achieve and requires large amounts of both societal and political change.

"Our work shows that there may be opportunities to intervene, after education is completed, to reduce the potential risk of heart disease. By lowering BMI, blood pressure or rates of smoking in individuals who left school at an earlier age, we could reduce their overall risk of heart disease. However, it is important to note this work looks at the effect of education on a population level risk of heart disease - and leaving school earlier does not necessarily mean an individual will go on to develop heart disease."

Dr Dipender Gill, co-first author of the work from Imperial's School of Public Health, explained: "Although we know from previous research that someone who spends more time in education has a lower risk of heart disease and stroke, we didn't know why. Surprisingly, our research suggests only half of this protective effect seems to come from lower weight, blood pressure and less time smoking.

"We now need to investigate what other reasons may link education and lower cardiovascular risk. One possibility is that people who spend more time in education tend to engage more with healthcare services, and see their doctor sooner with any health complaints."

Analyses were carried out using self-reported or measured observational data in 217,013 participants in the UK Biobank. These analyses were then replicated using Mendelian randomisation, a method which uses genetic variants (single point changes in an individual's DNA sequence) associated with the risk factors, including education, BMI, blood pressure and smoking. Additionally, genetic data from large consortia were utilised to replicate these Mendelian randomisation analyses.

Although the study, funded by the Medical Research Council (MRC) and the Wellcome Trust, has been able to explain around 40 per cent of the effect of education on heart disease with BMI, blood pressure and smoking, over half of the effect remains unexplained.

The team carried out some sensitivity analyses, looking at broad measures of diet and exercise, and these factors did not explain any additional amount of the association.

Understanding what other factors are driving the association will be important. This may be related to medication use, for example being prescribed cholesterol lowering drugs - statins - and then taking them as prescribed. Additionally, this work predominantly looked at white, European individuals. We know there are ethnic differences both in education and heart disease, so repeating these analyses in diverse populations will be necessary to extrapolate these findings.

Finally, researching and identifying what interventions may be successful in lowering these levels of intermediate factors, and then implementing them, will be key to reducing heart disease.

Credit: 
University of Bristol

The top 25 medical lab tests around the world

A recent study can help governments understand which diagnostic laboratory tests are most important when developing universal health coverage systems.

Researchers from five countries found that diagnostic laboratory tests are used similarly around the world, even though the institutions they studied differed in terms of poverty levels, health systems and prevalence of disease.

"Even though poorer countries have many more infectious diseases, while richer ones suffer more from non-communicable diseases like stroke, diabetes and heart conditions, the patterns of tests were surprisingly similar across these countries," said Susan Horton, a public health professor at the University of Waterloo and the study's principal investigator. "The interpretation is that human biology is similar across the globe, which affects the type of tests needed."

The researchers obtained data on the 25 most common tests at five hospitals in countries ranging from lower-middle to high income: Kenya, India, Nigeria, Malaysia and the U.S. Two were private hospitals, while three were public.

They compared the volume of tests, as well as their prices, and found that the same tests were common everywhere. In terms of prices for different types of tests at each hospital, the researchers compared the most common biochemical test (blood glucose), the most common hematologic test (CBC), the most common microbiology test (urine culture), and histopathologic tests overall (surgicals).

They found that even though the relative prices varied, the biochemical test price was lowest in each hospital, the hematology and microbiology test prices were about three times as high, and the histopathology test price about 15 times as high as the price of the biochemical test.

In terms of volume, four of the top five tests were the same across four of the hospitals, and the fifth hospital had three that were the same. Surprisingly, a test for diagnosing tuberculosis appeared in the top 25 at only one hospital, despite the prevalence of the disease in some countries. The researchers hypothesize that such testing takes place outside the hospital system, for example at dedicated TB facilities.

"As countries around the world attempt to implement universal health coverage, their health care systems will require better, affordable and effective laboratory tests to guide diagnosis and treatment," said Horton. "One important implication is that the new Essential Diagnostics List issued by the World Health Organization (WHO) in 2018 is likely to be a powerful tool -- and given that most of the tests on our top 25 list were also included on the WHO list, these findings have the potential to inform plans for universal health coverage."

Horton added that more data needs to be collected from a more representative set of health facilities in each country, especially as they develop their own essential diagnostics lists.

Credit: 
University of Waterloo

More years spent in education associated with lower weight and blood pressure

Scientists have helped unravel the link between higher levels of education and reduced risk of heart attack and stroke.

Previous research showed every 3.6 additional years spent in education can reduce a person's risk of heart disease by a third.

However, scientists did not know exactly why spending more time in education reduced a person's risk of cardiovascular disease (a general term for any condition affecting the heart or blood vessels, including heart disease, heart attack and strokes).

In the latest study, led by Imperial College London, University of Bristol, University of Cambridge and University of Oxford, scientists used statistical and genetic analyses to show only 40 per cent of the effect of education on cardiovascular disease risk is explained through body mass index (a measure of body fat based on height and weight), blood pressure or how much a person has smoked.

Analysis in the study also suggested each 3.6 additional years in education was linked to a reduction in BMI of 1 kg/m2, and a reduction in systolic blood pressure of 3 mmHg. A BMI between 18.5 and 24.9 is generally considered healthy, while systolic blood pressure should be between 90 and 120 mmHg.

Dr Dipender Gill, co-first author of the work from Imperial's School of Public Health, said: "Although we know from previous research that someone who spends more time in education has a lower risk of heart disease and stroke, we didn't know why. Surprisingly, our research suggests only half of this protective effect comes from lower weight, blood pressure and less time smoking. We now need to investigate what other reasons may link education and lower cardiovascular disease risk. One possibility is that people who spend more time in education tend to engage more with healthcare services, and see their doctor sooner with any health complaints."

Alice Carter, co-first author from the University of Bristol explained: "Past policies that increase the duration of compulsory education have improved health and such endeavours must continue. However, intervening on education is difficult to achieve and requires large amounts of both societal and political change. Our work shows that there are opportunities to intervene, after education is completed, to reduce the potential risk of heart disease. By lowering BMI, blood pressure or rates of smoking in individuals who left school at an earlier age, we could reduce their overall risk of heart disease. However, it is important to note this work looks at the effect of education on a population level risk of heart disease - and leaving school earlier does not necessarily mean an individual will go on to develop heart disease."

In the research, published in The BMJ, the scientists used two types of analysis to investigate the link between education and cardiovascular risk.

In the first approach, they analysed data from over 200,000 people in the UK, and compared the number of years individuals spent in education with their body mass index (BMI), blood pressure, the lifetime amount they have smoked, and consequent cardiovascular disease events such as heart attack or stroke.

In the second approach, the research team used a type of analysis called Mendelian randomisation. Using genetic data from public databases, the team searched through data from more than one million people to investigate the link between education and cardiovascular disease risk. They focused on points in the genome where a single 'letter' difference in the DNA - called a single nucleotide polymorphism (SNP) - has been linked to years in schooling.

The research team, who were funded by the Medical Research Council and Wellcome Trust, then assessed the link between these genetic markers for years in schooling with genetic markers for BMI, blood pressure and lifetime smoking (the researchers only assessed years in education and did not analyse intelligence in any way).
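The core logic of Mendelian randomisation can be made concrete with a small sketch: each SNP's estimated effect on years of schooling and on disease risk yields a "Wald ratio" estimate of the causal effect, and ratios from several SNPs are pooled with inverse-variance weighting. All the numbers below are invented for illustration and do not come from the study.

```python
import numpy as np

# Per-SNP summary statistics (all values invented for illustration):
# each SNP's estimated effect on years of schooling, its estimated
# effect on disease log-odds, and the standard error of the latter.
beta_edu = np.array([0.05, 0.08, 0.03])
beta_dis = np.array([-0.010, -0.018, -0.005])
se_dis = np.array([0.004, 0.006, 0.003])

# Wald ratio per SNP: the causal effect of one extra year of education
# on disease log-odds, as implied by that SNP alone.
wald = beta_dis / beta_edu

# Inverse-variance weighted (IVW) pooling across SNPs.
weights = (beta_edu / se_dis) ** 2
ivw = np.sum(wald * weights) / np.sum(weights)
print(f"pooled causal estimate: {ivw:.3f} log-odds per year of education")
```

The key assumption, as in the study, is that the SNPs affect disease risk only through schooling, which is why genetic variants can stand in for a randomised intervention.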

Using these two methods, they found that body mass index, blood pressure and smoking mediate the effect of education on cardiovascular risk, explaining up to 18 per cent, 27 per cent and 34 per cent of it, respectively. Combined, these factors accounted for 40 per cent of the effect.

Dr Gill said this total is less than would be expected by simply adding the individual percentages for BMI, blood pressure and smoking, suggesting that the effects of the three factors overlap.
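That overlap can be illustrated with a toy simulation: when mediators are correlated (as BMI, blood pressure and smoking plausibly are), the share of the effect each one "explains" on its own adds up to more than the share they explain jointly. The linear model and every coefficient below are assumptions for illustration only, not the study's model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Toy linear model (all coefficients assumed for illustration):
# education E lowers outcome Y partly through two correlated
# mediators M1 and M2 (think BMI and smoking).
E = rng.normal(size=n)
shared = rng.normal(size=n)                 # makes the mediators correlated
M1 = 0.5 * E + shared + rng.normal(size=n)
M2 = 0.5 * E + shared + rng.normal(size=n)
Y = -0.3 * E - 0.4 * M1 - 0.4 * M2 + rng.normal(size=n)

def coef_of_E(*controls):
    """OLS coefficient on E in a regression of Y on E plus controls."""
    X = np.column_stack([np.ones(n), E, *controls])
    return np.linalg.lstsq(X, Y, rcond=None)[0][1]

total = coef_of_E()                         # total effect of E on Y
prop1 = 1 - coef_of_E(M1) / total           # share "explained" by M1 alone
prop2 = 1 - coef_of_E(M2) / total           # share "explained" by M2 alone
joint = 1 - coef_of_E(M1, M2) / total       # share explained jointly

print(f"M1 alone: {prop1:.2f}  M2 alone: {prop2:.2f}  "
      f"sum: {prop1 + prop2:.2f}  joint: {joint:.2f}")
```

Because the mediators share variance, adjusting for either one partially blocks the other's pathway too, so under these assumed coefficients the individual shares sum to noticeably more than the joint share.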

He added that most of the data analysed was from individuals of European heritage, and more work is now needed to investigate the link between education and cardiovascular risk in other ethnic groups.

Credit: 
Imperial College London

Children with cancer wait an average of 6.5 years longer than adults to access new drugs

image: The median time lag between a cancer drug's first clinical trial in adults and its first clinical trial in children was 6.5 years (range, 0 to 27.7 years). Each line on this chart represents a different drug.

Image: 
Patrick Bibbins / Boston Children's Hospital

Cancer drugs approved by the U.S. Food and Drug Administration (FDA) took a median of 6.5 years to go from the first clinical trial in adults to the first trial in children, according to a study at the Dana-Farber/Boston Children's Cancer and Blood Disorders Center. The study was published in the May issue of the European Journal of Cancer.

"Despite knowing that these agents are effective anticancer drugs, it's taking too long to even start studying these therapies in children," says Steven G. DuBois, MD, Dana-Farber/Boston Children's Cancer and Blood Disorders Center, corresponding author on the study. "As a doctor taking care of young cancer patients, this is tremendously frustrating. If I were a parent of a child with cancer, I wouldn't stand for this."

The team at Dana-Farber/Boston Children's conducted a systematic analysis of the time from first-in-human trials to first-in-child trials of agents first approved by the FDA for any oncology indication from 1997 to 2017. The investigators utilized clinical trials registry data, published literature and oncology abstracts to identify relevant trials and start dates.

Delayed pediatric trials

In that timeframe, 126 drugs received initial FDA approval for an oncology indication. After excluding hormonal modulators (not relevant to children's cancers), 117 agents remained for analysis. Fifteen of 117 drugs (12.8%) had not yet had a pediatric trial, while 6 of 117 drugs (5.1%) included children in the initial FDA approval.

The data showed a median 6.5-year lag between first-in-human and first-in-child clinical trials, with a range of 0 to 27.7 years.
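The paper's headline numbers are simple summary statistics over per-drug lags; a minimal sketch, using invented lag values rather than the paper's data:

```python
from statistics import median

# Invented lag times (years) between a drug's first adult trial and
# its first pediatric trial; the paper reports one such lag per drug.
lags = [0.0, 2.3, 5.1, 6.5, 8.0, 12.4, 27.7]

print(f"median lag: {median(lags)} years "
      f"(range, {min(lags)} to {max(lags)})")
```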

"Some may argue that this lag is appropriate to ensure safety of a vulnerable pediatric population and to only study agents in children that are on a path to FDA approval, based upon activity in adults with cancer," says DuBois. "Others may argue that this lag is too long for children with life-threatening diseases and that some agents that fail in adult indications may nevertheless prove to be important drugs for pediatric indications."

In the U.S., the recent RACE for Children Act strengthens the requirement that new cancer therapies with potential biological relevance to pediatric cancers be evaluated in children. This study can serve as a benchmark as this new policy is enacted, says DuBois.

Credit: 
Boston Children's Hospital

Soy foods linked to fewer fractures in younger breast cancer survivors

A new paper in JNCI Cancer Spectrum, published by Oxford University Press, is the first study to find that diets high in soy foods are associated with a decreased risk of osteoporotic bone fractures in pre-menopausal breast cancer survivors.

Breast cancer is the second most common cancer among women in the United States, with 1 in 8 women diagnosed with it during their lifetime. Many treatments for breast cancer can cause premature menopause and decrease bone mineral density. This leads to a higher incidence of osteoporosis-related fractures among survivors compared to healthy women in the same age range, and yet many factors connected to this increase in fracture risks are understudied.

The researchers studied the impact that BMI, exercise, and soy food consumption had on bone fracture rates among breast cancer survivors. The study used data from the Shanghai Breast Cancer Survival Study of 5,042 newly diagnosed breast cancer survivors between the ages of 20 and 75. Researchers collected detailed information at enrollment, including cancer diagnosis and treatment history, medication use, dietary habits, exercise and other lifestyle factors. About 52% of women in the study were postmenopausal. Patients then had follow-up visits at 18 months, and 3, 5, and 10 years after their diagnosis to update exposure and outcome information.

Throughout the 10-year study period, 3.6% of survivors reported an osteoporotic bone fracture. Higher soy intake was associated with a 77% reduced risk of osteoporotic fractures in younger women, while exercise was associated with a significantly reduced risk of fractures among older women.

Consistent with prior studies, extended use of tamoxifen, a drug prescribed for breast cancer patients, was associated with a 37% reduced risk of fractures in the overall study population. Tamoxifen is a selective estrogen receptor modulator, or SERM, that increases bone mineral density. Soy-based foods, which are rich in isoflavones, act as a natural SERM.
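Percentage risk reductions like these are typically derived from hazard or risk ratios estimated by the underlying models. A minimal sketch of the conversion; the ratios 0.23 and 0.63 are inferred here from the reported percentages for illustration, not taken from the paper:

```python
def pct_reduction(hazard_ratio):
    """Percent risk reduction implied by a hazard (or risk) ratio."""
    return (1 - hazard_ratio) * 100

# Ratios assumed from the reported percentages, for illustration only.
print(round(pct_reduction(0.23)))  # soy intake, younger women -> 77
print(round(pct_reduction(0.63)))  # extended tamoxifen use -> 37
```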

"The menopausal transition is known to be a period of high risk for bone loss, and given the relative scarcity of data related to fracture risk among younger women with breast cancer, this study marks an important contribution to this body of literature," said the paper's lead author, Evelyn Hsieh. "Our findings, in particular regarding the protective effects of soy food consumption, provide novel insight into how future interventions can be best tailored to different risk groups."

Credit: 
Oxford University Press USA

CBD reduces craving and anxiety in people with heroin use disorder

Cannabidiol (CBD) reduced cue-induced craving and anxiety in individuals with a history of heroin abuse, suggesting a potential role for it in helping to break the cycle of addiction, according to research conducted at the Icahn School of Medicine at Mount Sinai and published May 21 in the American Journal of Psychiatry. The study also revealed that CBD tended to reduce physiological measures of stress reactivity, such as increased heart rate and cortisol levels, that are induced by drug cues.

The wide availability and use of heroin and prescription opioid medications in the United States during the past decade have resulted in an unprecedented epidemic involving more than 300,000 deaths. Despite this staggering toll, limited non-opioid medication options have been developed. Two of the current options, methadone and buprenorphine, are opioid substitution therapies which work on the same opioid receptors (mu receptors) as heroin and other potent opioid agonists. These medications, however, carry a stigma as well as their own addiction risk, are mired in tight governmental regulation, and therefore are underutilized by the millions of people diagnosed with opioid use disorder. Such a treatment gap highlights the urgent need to develop novel therapeutic strategies that do not target the mu opioid receptor.

"To address the critical need for new treatment options for the millions of people and families who are being devastated by this epidemic, we initiated a study to assess the potential of a non-intoxicating cannabinoid on craving and anxiety in heroin-addicted individuals," says Yasmin Hurd, PhD, the Ward-Coleman Chair of Translational Neuroscience at the Icahn School of Medicine at Mount Sinai, Director of the Addiction Institute at Mount Sinai and first author of the study. "The specific effects of CBD on cue-induced drug craving and anxiety are particularly important in the development of addiction therapeutics because environmental cues are one of the strongest triggers for relapse and continued drug use."

Previous preclinical work conducted by Dr. Hurd and her lab team at Mount Sinai, in animals with a history of heroin self-administration, demonstrated that CBD reduced the animals' tendency to use heroin in response to a drug-associated cue. To determine whether the preclinical work could be translated to humans, her lab then conducted a series of clinical studies that demonstrated CBD was safe and tolerable in humans.

The current study used a double-blind, randomized, placebo-controlled design to explore the acute (one hour, two hours, and 24 hours), short-term (three consecutive days), and protracted (seven days after the last of three consecutive daily administrations) effects of CBD administration on drug cue-induced craving and anxiety in drug-abstinent individuals with heroin use disorder. Secondary measures assessed participants' positive and negative affect, cognition, and physiological status.

For the study, 42 drug-abstinent men and women were randomly assigned to receive either 400 mg or 800 mg of an oral CBD solution or a matching placebo. Participants were then exposed to neutral and drug-related cues during the course of three sessions: immediately following administration, 24 hours after CBD or placebo administration, and seven days after the third and final daily CBD or placebo administration. Neutral cues consisted of a three-minute video showing relaxing scenarios, such as scenes of nature, while drug-related cues included a three-minute video showing intravenous or intranasal drug use and exposure to heroin-related paraphernalia like syringes, rubber ties, and packets of powder resembling heroin. Measures of opioid craving, anxiety, positive and negative affect, and vital signs (skin temperature, blood pressure, heart rate, respiratory rate, and oxygen saturation) were obtained at different times during the sessions.

The study team found that CBD, in contrast to placebo, significantly reduced both the craving and anxiety induced by drug cues compared with neutral cues in the acute term. CBD also showed significant protracted effects on these measures seven days after the final short-term exposure. In addition, CBD reduced the drug cue-induced physiological measures of heart rate and salivary cortisol levels. There were no significant effects on cognition, and there were no serious adverse events. The capacity of CBD to reduce craving and anxiety one week after the final administration mirrors the results of the original preclinical animal study, suggesting that the effects of CBD are long-lasting, even when the cannabinoid would not be expected to be present in the body.

"Our findings indicate that CBD holds significant promise for treating individuals with heroin use disorder," says Dr. Hurd. "A successful non-opioid medication would add significantly to the existing addiction medication toolbox to help reduce the growing death toll, enormous health care costs, and treatment limitations imposed by stringent government regulations amid this persistent opioid epidemic."

Dr. Hurd's research team is working on two follow-up studies: one delves into understanding the mechanisms of CBD's effects on the brain; the second paves the way for the development of unique CBD medicinal formulations that are likely to become a significant part of the medical arsenal available to address the opioid epidemic.

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

New recommendations for stroke systems of care to improve patient outcomes

DALLAS, May 20, 2019 - Improvements in stroke systems of care are necessary to ensure scientific advances in the treatment and care of stroke patients improve patient outcomes, according to a policy statement published today by the American Stroke Association, a division of the American Heart Association, in the journal Stroke.

The policy statement, released during National Emergency Medical Services (EMS) Week, comes as stroke systems of care have seen vast improvements in availability of endovascular therapy, neurocritical care and stroke center certification over the past decade. In addition, innovations such as telestroke and mobile stroke units have increased access for stroke patients to alteplase, a lifesaving, clot-busting drug.

"We have seen monumental advancements in acute stroke care over the past 14 years, and our concept of a comprehensive stroke system of care has evolved as a result," said Opeolu Adeoye, M.D., the chair of the writing group for the statement and associate professor of emergency medicine and neurosurgery at the University of Cincinnati. "These recommendations reflect how far we have progressed and what still needs to be accomplished to maximize patient outcomes in acute stroke care."

The statement recommends that when more than one intravenous alteplase-capable hospital is within reach, EMS should consider additional travel time of up to 15 minutes to reach a hospital capable of performing endovascular thrombectomy (mechanical clot removal, often performed with devices called stent retrievers) for patients suspected of having a severe stroke. Both intravenous alteplase, a clot-dissolving therapy, and endovascular thrombectomy must be administered quickly to be effective, but not every hospital is able to deliver these services.

"While it is vitally important for patients suspected of having a large vessel blockage to get to the hospital quickly, getting to the right hospital is equally important," Adeoye said.

The statement also addresses disparities in care among racial and ethnic minorities, who are less likely to use EMS and have the lowest awareness of the causes and symptoms of stroke. Among Hispanic and black populations in particular, lack of knowledge of the risk factors and symptoms of stroke can hamper timely stroke care.

In response, the statement recommends that public health leaders and medical professionals implement public education programs focused on stroke systems and the need to seek emergency care by calling 9-1-1 in response to stroke symptoms.

The statement also includes the following recommendations:

Education: Stroke systems of care should support local and regional public education initiatives to increase awareness of stroke symptoms with an emphasis on at-risk populations.

Triage: EMS leaders, governmental agencies, medical authorities and local experts should work together to adopt consistent, standardized triage protocols to rapidly identify patients with a known or suspected stroke.

Secondary Prevention: Certified stroke centers should help stroke survivors reduce the risk of subsequent strokes, consistent with the national guidelines for secondary prevention.

Rehabilitation and Support: A stroke system should provide comprehensive post-stroke care including ongoing primary care and specialized stroke services such as physical, occupational, speech or other therapies on discharge.

Federal and State Policies: Policies should be enacted to standardize the organization of stroke care, lower barriers to seeking emergency care for stroke, ensure stroke patients receive care at appropriate hospitals in a timely manner, and facilitate access to secondary prevention and rehabilitation and recovery resources after stroke.

A stroke occurs every 40 seconds in the U.S., and someone dies of a stroke every four minutes. An estimated 7.2 million Americans aged 20 years or older have had a stroke, and approximately 800,000 people in the U.S. have a new or recurrent stroke each year.
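These frequency figures are mutually consistent, as a rough back-of-envelope calculation shows; the arithmetic below is ours, not part of the statement:

```python
# Rough consistency check of the quoted stroke frequencies
# (our arithmetic, not from the AHA statement).
SECONDS_PER_YEAR = 365 * 24 * 3600
MINUTES_PER_YEAR = 365 * 24 * 60

strokes_per_year = 800_000                 # new or recurrent strokes, U.S.
seconds_between_strokes = SECONDS_PER_YEAR / strokes_per_year

death_interval_min = 4                     # one stroke death every 4 minutes
implied_deaths_per_year = MINUTES_PER_YEAR / death_interval_min

print(f"one stroke every {seconds_between_strokes:.0f} seconds")
print(f"implies roughly {implied_deaths_per_year:,.0f} stroke deaths a year")
```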

Optimized stroke systems of care that span health care delivery from primordial prevention to rehabilitation and recovery help to ensure patients, caregivers and providers have the tools needed for prevention, treatment and recovery.

Implementation of the American Heart Association's Get With The Guidelines - Stroke at U.S. hospitals has been associated with an 8 percent reduction in mortality at one year and improved functional outcome at hospital discharge.

Credit: 
American Heart Association

Duchenne muscular dystrophy prevalence increases, while incidence remains steady

New York, NY, May 20, 2019--In the first study of its kind involving Duchenne muscular dystrophy (DMD) in the U.S., researchers from the Deerfield Institute found that while the number of new cases has remained stable, there has been an uptick in prevalence--largely attributed to enhanced treatments and longevity. The study, titled "Duchenne Muscular Dystrophy Prevalence in the U.S.: A Novel Incidence-Based Modeling Approach Using System Dynamics", is scheduled for Poster Session II on Monday, May 20 at the ISPOR 2019 annual meeting in New Orleans.

DMD, a genetic disorder characterized by progressive muscle degeneration and weakness, is caused by an absence of dystrophin, a protein that helps keep muscle cells intact. Symptom onset is in early childhood, typically between ages 3 and 5. The disease primarily affects boys, but in rare cases it can affect girls.

Using a triangular distribution of incidence rates identified in the literature, a sensitivity analysis was run to estimate the diagnosed incidence of DMD in the U.S. at 17.24 per 100,000 live male births, corresponding to approximately 362 incident cases in 2019; diagnosed prevalence was found to be 6.09 per 100,000 male population across all age groups, corresponding to about 10,015 prevalent cases in 2019.
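The reported rates and case counts can be cross-checked with simple arithmetic: dividing a case count by its rate per 100,000 recovers the implied size of the underlying population. This rough check is ours, not part of the study:

```python
# Cross-check of the reported DMD figures (our arithmetic, not the
# study's): cases / (rate per 100,000) * 100,000 gives the implied
# denominator population.
incidence_rate = 17.24        # per 100,000 live male births
incident_cases = 362

prevalence_rate = 6.09        # per 100,000 males, all ages
prevalent_cases = 10_015

implied_births = incident_cases / incidence_rate * 100_000
implied_males = prevalent_cases / prevalence_rate * 100_000

print(f"implied live male births: {implied_births:,.0f}")   # ~2.1 million
print(f"implied male population:  {implied_males:,.0f}")    # ~164 million
```

Both implied denominators are plausible for the U.S., which is a quick sanity check on the published rates.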

The Deerfield Institute researchers found that while the majority (64.5%) of DMD patients are under the age of 20, a significant number of older DMD patients, up to 45 years of age, were excluded from previous prevalence estimates. The prevalence of DMD among males aged 45 or younger was found to be 10.0 per 100,000, versus a previous estimate of 1.38 per 10,000 among males 5 to 24 years of age.

"We hypothesized that the prevalence of DMD has increased over the past few decades, due predominantly to improvements in treatment and care," said Emma Giegerich, MPH, an epidemiologist with the Deerfield Institute and co-author of the study. "Our incidence-to-prevalence model was built using system dynamics principles and birth-cohort-specific survival curves to get the most accurate picture of the disease landscape and its current burden. The results indicate that there is a larger than expected patient population that may benefit from novel treatment interventions, such as targeted gene therapies, potentially improving the viability of current or future drug development programs."

Credit: 
Deerfield Management (Deerfield Institute Division)

Walking and strength training may decrease the risk of dying from liver disease

San Diego, Calif. (May 19, 2019) -- Physical activity, including walking and muscle-strengthening activity, was associated with a significantly reduced risk of cirrhosis-related death, according to research presented at Digestive Disease Week® (DDW) 2019. Chronic liver disease is increasing, partly due to the obesity epidemic, and currently there are no guidelines on the optimal type of exercise for the prevention of cirrhosis-related mortality. Researchers hope these findings will help provide specific exercise recommendations for patients at risk for cirrhosis and its complications.

"The benefit of exercise is not a new concept, but the impact of exercise on mortality from cirrhosis and from liver cancer has not yet been explored on this scale," said Tracey Simon, MD, lead researcher on the study and instructor of medicine at Harvard Medical School and Massachusetts General Hospital, Boston. "Our findings show that both walking and strength training contribute to substantial reductions in risk of cirrhosis-related death, which is significant because we know very little about modifiable risk factors."

Dr. Simon and her team prospectively followed 68,449 women from the Nurses' Health Study and 48,748 men from the Health Professionals Follow-up Study, without known liver disease at baseline. Participants provided detailed data on physical activity, including type and intensity, every two years from 1986 through 2012, which allowed researchers to prospectively examine the association between physical activity and cirrhosis-related death.

Researchers observed that adults in the highest quintile of weekly walking activity had 73 percent lower risk for cirrhosis-related death than those in the lowest quintile. Further risk reduction was observed with combined walking and muscle-strengthening exercises.

Previous research has been limited to studies that assessed physical activity at just one point in time, or studies with very short-term follow-up. This was the first prospective study in a large U.S. population to include detailed and updated measurements of physical activity over such a prolonged period, which allowed researchers to more precisely estimate the relationship between physical activity and liver-related outcomes.

"In the U.S., mortality due to cirrhosis is increasing dramatically, with rates expected to triple by the year 2030. In the face of this alarming trend, information on modifiable risk factors that might prevent liver disease is needed," said Dr. Simon. "Our findings support further research to define the optimal type and intensity of physical activity to prevent adverse outcomes in patients at risk for cirrhosis."

Credit: 
Digestive Disease Week

Big data reveals hidden subtypes of sepsis

video: Dr. Christopher Seymour, of UPMC and the University of Pittsburgh School of Medicine, explains why sepsis is not just one disease and how we should treat it differently.

Image: 
UPMC

PITTSBURGH, May 19, 2019 - Much like cancer, sepsis isn't simply one condition but rather many conditions that could benefit from different treatments, according to the results of a University of Pittsburgh School of Medicine study involving more than 60,000 patients.

These findings, announced today in JAMA and presented at the American Thoracic Society's Annual Meeting, could explain why several recent clinical trials of treatments for sepsis, the No. 1 killer of hospitalized patients, have failed. Sepsis is a life-threatening condition that arises when the body's response to an infection injures its own tissues and organs.

"For over a decade, there have been no major breakthroughs in the treatment of sepsis; the largest improvements we've seen involve the enforcing of 'one-size fits all' protocols for prompt treatment," said lead author Christopher Seymour, M.D., M.Sc., associate professor in Pitt's Department of Critical Care Medicine and member of Pitt's Clinical Research Investigation and Systems Modeling of Acute Illness Center. "But these protocols ignore that sepsis patients are not all the same. For a condition that kills more than 6 million people annually, that's unacceptable. Hopefully, by seeing sepsis as several distinct conditions with varying clinical characteristics, we can discover and test therapies precisely tailored to the type of sepsis each patient has."

In the "Sepsis ENdotyping in Emergency Care" (SENECA) project, funded by the National Institutes of Health (NIH), Seymour and his team used computer algorithms to analyze 29 clinical variables found in the electronic health records of more than 20,000 UPMC patients recognized to have sepsis within six hours of hospital arrival from 2010 to 2012.

The algorithm clustered the patients into four distinct sepsis types, described as:

Alpha: most common type (33%), patients with the fewest abnormal laboratory test results, least organ dysfunction and lowest in-hospital death rate at 2%;

Beta: older patients, comprising 27%, with the most chronic illnesses and kidney dysfunction;

Gamma: similar frequency as beta, but with elevated measures of inflammation and primarily pulmonary dysfunction;

Delta: least common (13%), but most deadly type, often with liver dysfunction and shock, and the highest in-hospital death rate at 32%.
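The release does not specify which clustering algorithm the SENECA team used; a k-means-style partition over standardised clinical variables is one common choice and conveys the idea. The toy data below (5 variables instead of 29, four planted groups, invented values) is for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for the study's data: rows are patients, columns are
# standardised clinical variables (the study used 29; we use 5), with
# four planted groups mimicking distinct sepsis types.
true_centers = rng.normal(scale=3.0, size=(4, 5))
X = np.vstack([c + rng.normal(size=(250, 5)) for c in true_centers])

def kmeans(X, k, iters=50):
    """Plain k-means clustering (the study's actual algorithm may differ)."""
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each patient to its nearest centroid
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each centroid to the mean of its assigned patients,
        # keeping the old centroid if a cluster ends up empty
        centroids = np.array([
            X[labels == j].mean(axis=0) if (labels == j).any() else centroids[j]
            for j in range(k)
        ])
    return labels, centroids

labels, centroids = kmeans(X, k=4)
sizes = np.bincount(labels, minlength=4)
print("cluster sizes:", sizes)
```

In the study itself, each resulting cluster was then characterised clinically (organ dysfunction, inflammation, mortality) to yield the Alpha through Delta descriptions above.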

The team then studied the electronic health records of another 43,000 UPMC sepsis patients from 2013 to 2014. The findings held. And they held again when the team studied rich clinical data and immune response biomarkers from nearly 500 pneumonia patients enrolled at 28 hospitals in the U.S.

In the next part of the study, Seymour and his team applied their findings to several recently completed international clinical trials that tested different promising therapies for sepsis--all of which had ended with unremarkable results.

When trial participants were classified by the four sepsis types, some trials might not have been failures. For example, early goal-directed therapy (EGDT), an aggressive resuscitation protocol that involves placing a catheter to monitor blood pressure and oxygen levels and delivering drugs, fluids and blood transfusions, was found in 2014 to have no benefit in a five-year, $8.4 million study. But when Seymour's team re-examined the results, they found that EGDT was beneficial for patients with the Alpha type of sepsis. Conversely, it resulted in worse outcomes for the Delta type.

"Intuitively, this makes sense--you wouldn't give all breast cancer patients the same treatment. Some breast cancers are more invasive and must be treated aggressively. Some are positive or negative for different biomarkers and respond to different medications," said senior author Derek Angus, M.D., M.P.H., professor and chair of Pitt's Department of Critical Care Medicine. "The next step is to do the same for sepsis that we have for cancer--find therapies that apply to the specific types of sepsis and then design new clinical trials to test them."

Credit: 
University of Pittsburgh