
'Good cholesterol' may not always be good


PITTSBURGH, July 19, 2018 - Postmenopausal factors may have an impact on the heart-protective qualities of high-density lipoproteins (HDL) - also known as 'good cholesterol' - according to a study led by researchers in the University of Pittsburgh Graduate School of Public Health.

The findings, published today in Arteriosclerosis, Thrombosis, and Vascular Biology, a journal of the American Heart Association (AHA), indicate that this specific type of blood cholesterol may not translate into a lowered risk of cardiovascular disease in older women--bringing into question the current use of HDL cholesterol in a common equation designed to predict heart disease risk, particularly for women.

HDL is a family of particles found in the blood that vary in size and cholesterol content. HDL has traditionally been measured as the total cholesterol carried by HDL particles, known as HDL cholesterol. HDL cholesterol, however, does not necessarily reflect the overall concentration, the uneven distribution, or the content and function of HDL particles. Previous research has demonstrated the heart-protective features of HDL. This good cholesterol carries fats away from the heart, reducing the build-up of plaque and lowering the potential for cardiovascular disease.

"The results of our study are particularly interesting to both the public and clinicians because total HDL cholesterol is still used to predict cardiovascular disease risk," said lead author Samar R. El Khoudary, Ph.D., M.P.H., F.A.H.A., associate professor in Pitt Public Health's Department of Epidemiology. "This study confirms our previous work on a different group of women and suggests that clinicians need to take a closer look at the type of HDL in middle-aged and older women, because higher HDL cholesterol may not always be as protective in postmenopausal women as we once thought. High total HDL cholesterol in postmenopausal women could mask a significant heart disease risk that we still need to understand."

El Khoudary's team looked at 1,138 women aged 45 through 84 enrolled across the U.S. in the Multi-Ethnic Study of Atherosclerosis (MESA), a medical research study sponsored by the National Heart, Lung and Blood Institute of the National Institutes of Health (NIH). MESA began in 1999 and is still following participants today.

The study points out that the traditional measure of good cholesterol, HDL cholesterol, does not accurately depict heart disease risk for postmenopausal women.

Women are subject to a variety of physiological changes in their sex hormones, lipids, body fat deposition and vascular health as they transition through menopause. The authors hypothesize that the decline in estrogen, a cardio-protective sex hormone, along with other metabolic changes, can trigger chronic inflammation over time, which may alter the quality of HDL particles.

"We have been seeing an unexpected relationship between HDL cholesterol and postmenopausal women in previous studies, but have never deeply explored it," said El Khoudary. Her study looked at two specific measurements of HDL to draw the conclusion that HDL cholesterol is not always cardio-protective for postmenopausal women, or not as 'good' as expected.

The researchers measured the number and size of HDL particles as well as the total cholesterol carried by HDL particles. The study also examined how a woman's age at the menopausal transition, and the amount of time since that transition, may affect the expected cardio-protective associations of HDL measures.

The harmful association between higher HDL cholesterol and atherosclerosis risk was most evident in women who were older at menopause and who were 10 or more years into postmenopause.

In contrast to HDL cholesterol, a higher concentration of total HDL particles was associated with lower risk of atherosclerosis. Additionally, having a high number of small HDL particles was found to be beneficial for postmenopausal women. These findings persisted irrespective of the women's age and how long they had been postmenopausal.

On the other hand, large HDL particles were linked to an increased risk of cardiovascular disease close to menopause. During this time, the quality of HDL may be reduced, increasing the chance that women will develop atherosclerosis or cardiovascular disease. As women move further away from their transition, the quality of the HDL may be restored, making the good cholesterol cardio-protective once again.

"Identifying the proper method to measure active 'good' HDL is critical to understanding the true cardiovascular health of these women," said senior author Matthew Budoff, M.D., of Los Angeles Biomedical Research Institute.

El Khoudary was recently awarded funding from the National Institute on Aging to expand upon this research. Her goal is to continue exploring the link between the quality of good cholesterol over the menopause transition and women's risk of cardiovascular disease later in life. She also seeks to examine the biological mechanisms that alter the quality of good cholesterol, so that its cardio-protective contribution to postmenopausal women's health can be clarified, which would inform guidelines for screening and treatment.

Credit: 
University of Pittsburgh Schools of the Health Sciences

US opioid prescribing rates by congressional district

Congressional districts with the highest opioid prescribing rates are predominantly concentrated in the southeastern U.S., with other hotspots in Appalachia and the rural west, according to a new study led by Harvard T.H. Chan School of Public Health. The study, the first to focus on opioid prescribing rates at the congressional district level, could help policy makers at the federal and state level better target intervention and prevention strategies.

The study will be published online July 19, 2018 in the American Journal of Public Health.

"It is important for public health research to focus on geographical units such as congressional districts as it allows for elected representatives to be more informed about important issues such as the opioid epidemic. Because a congressional district has a named elected representative, unlike say a county, it brings a certain degree of political accountability when it comes to discussing the opioid epidemic," said S V Subramanian, professor of population health and geography.

The study found that Alabama's Fourth Congressional District had 166 opioid prescriptions per 100 people, the highest rate of any district in the nation. Districts in Kentucky, Tennessee, Mississippi, Arkansas, Virginia, and Oklahoma rounded out the top ten areas with the highest prescribing rates. Other high prescribing rates were found in districts located in eastern Arizona, Nevada, northern California, rural Oregon, and rural Washington.

The lowest opioid prescribing rates tended to be concentrated in congressional districts near urban centers, including Washington, D.C., New York, Boston, Atlanta, Los Angeles, and San Francisco. Virginia was the only state that had congressional districts with top- and bottom-ten opioid prescribing rates.

The findings come amid a national opioid epidemic that has claimed tens of thousands of lives. Between 1999 and 2010, prescription opioid-related overdose deaths quadrupled, according to the U.S. Centers for Disease Control and Prevention, and the epidemic was estimated to cost $78.5 billion in 2013--one-third of which was spent on health care and treatment costs. In 2016 overdoses resulted in more than 42,000 deaths, and the following year President Trump officially declared a public health emergency.

"A great deal of variation may exist between state-level opioid prescribing rates and prescribing rates in specific congressional districts within the state," said Lyndsey Rolheiser, a postdoctoral research fellow and lead author of the paper. "Having these data could help representatives advocate more strongly for federal policies aimed at curbing the opioid epidemic and helping their constituents."

Credit: 
Harvard T.H. Chan School of Public Health

Caffeine affects food intake at breakfast, but its effect is limited and transient


Philadelphia, July 19, 2018 - A new study featured in the Journal of the Academy of Nutrition and Dietetics found that after drinking a small amount of caffeine, participants consumed 10 percent less at a breakfast buffet provided by researchers, but this effect did not persist throughout the day and had no impact on participants' perceptions of their appetites. Based on these findings, the investigators have concluded that caffeine is not effective as an appetite suppressant and weight-loss aid.

"Caffeine is frequently added to dietary supplements with claims that it suppresses appetite and facilitates weight loss. Previous research has speculated that caffeine speeds metabolism or affects brain chemicals that suppress appetite. In addition, epidemiological evidence suggests that regular caffeine consumers have a lower body mass index (BMI) than non-consumers. The purpose of our study was to determine whether caffeine can in fact be linked to reduced food intake or suppressed appetite, and if the results vary by BMI," explained lead investigator Leah M. Panek-Shirley, PhD, SUNY University at Buffalo, Department Exercise and Nutrition Sciences, Buffalo, NY, USA.

On average, Americans drink eight ounces of coffee per day. Fifty healthy adults (aged 18-50 years) visited the investigators' laboratory weekly over a month to participate in the study. Each time, they were asked to drink juice with added caffeine that was either equivalent to consumption of four ounces (1 mg/kg) or eight ounces (3 mg/kg) of coffee, or no coffee as a placebo dose. Thirty minutes later, participants were instructed to eat as much or as little as they wanted of a hearty breakfast buffet. The investigators asked participants to record everything they ate throughout each entire study day and sent them hourly reminder emails, linked to an online survey, to document their intake and appetite at each interval.

The study determined that after drinking the juice with 1 mg/kg of caffeine, participants consumed about 70 fewer calories than they did after drinking juice with 3 mg/kg or no added caffeine. After reviewing what the participants ate for the rest of each study day, the investigators found that the small decrease in intake did not persist: participants compensated for the reduced intake at breakfast later in the day. In addition, there were no differences in reported appetite associated with the caffeine doses. Finally, participants' individual BMIs had no effect on their food intake or appetite at any of the three caffeine levels.

"This study, by nature of its rigorous design, reinforces the importance of good eating habits and not relying on unsupported weight loss aids or unhealthy practices," commented Carol DeNysschen, PhD, RD, MPH, CDN, FAND, one of the investigators, Professor and Chair of the Department of Health, Nutrition, and Dietetics, SUNY Buffalo State College, Buffalo, NY, USA. She elaborated on the rigor of the double-blind, randomized, crossover design of the study: the order of the doses was randomized for the 50 participants, both participants and researchers did not know the dose of samples as they were being presented, and all participants received all dose treatments, thereby acting as their own controls to enable comparisons of their individual responses.

Credit: 
Elsevier

TGen-led study shows DNA methylation related to liver disease among obese patients

Image: Dr. Johanna DiStefano, a TGen Professor and head of the institute's Diabetes and Fibrotic Disease Unit. (Credit: TGen)

PHOENIX, Ariz. -- July 18, 2018 -- DNA methylation is a molecular process that helps enable our bodies to repair themselves, fight infection, get rid of environmental toxins, and even to think. But sometimes this process goes awry.

A team of scientists led by the Translational Genomics Research Institute (TGen), an affiliate of City of Hope, has identified how DNA methylation is associated with a condition known as non-alcoholic fatty liver disease (NAFLD), which can lead to liver cirrhosis and death, and is one of the leading indications for liver transplantation.

In one of the most exacting studies of its kind, TGen scientists found evidence that DNA methylation has a role in the initiation of NAFLD-related fibrosis, according to a study published in the journal Clinical Epigenetics.

Obesity and insulin resistance are associated with fat accumulation in the liver, and obesity is a significant risk factor for NAFLD. Using a City of Hope computer algorithm specifically designed for the task, researchers analyzed the biopsied liver tissues of 14 obese patients with advanced fibrosis or cirrhosis of the liver, and 15 obese patients with normal livers.
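The release does not describe the City of Hope algorithm itself, so the following is only a minimal sketch of the general approach: compare per-site methylation levels between the two groups of biopsies and correct for multiple testing. The data, number of sites and thresholds here are purely hypothetical.

```python
# Hypothetical sketch of differential methylation analysis between fibrotic and
# normal liver samples; not the City of Hope algorithm described in the study.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n_sites = 1000                                     # hypothetical number of CpG sites
fibrotic = rng.beta(2, 5, size=(14, n_sites))      # methylation beta values, 14 patients
normal = rng.beta(2, 5, size=(15, n_sites))        # 15 obese controls with normal livers

# Welch t-test at every CpG site, comparing the two groups
_, pvals = stats.ttest_ind(fibrotic, normal, axis=0, equal_var=False)

# Benjamini-Hochberg correction for testing many sites at once
reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")

# Flag sites that are both statistically significant and show a meaningful difference
diff = fibrotic.mean(axis=0) - normal.mean(axis=0)
hits = np.where(reject & (np.abs(diff) > 0.1))[0]
print(f"{len(hits)} differentially methylated sites (q < 0.05, |delta beta| > 0.1)")
```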

"Our findings showed statistically significant evidence for differential DNA methylation between fibrotic and normal tissue samples from obese individuals," said Dr. Johanna DiStefano, a TGen Professor and head of the institute's Diabetes and Fibrotic Disease Unit.

Importantly, the study zeroed in on four genes -- AQP1, FGFR2, RBP5 and MGMT -- that not only were methylated in this study, but also in three previous studies that were similar, but not specifically focused on advanced fibrosis in obese NAFLD patients.

"These genes could represent targets for new therapeutics," said Dr. DiStefano, the study's senior author, who plans to pursue a larger study that would validate the initial findings in this pilot inquiry. "These approaches are yielding new insights into the pathological mechanisms underlying the development of fibrosis and cirrhosis in NAFLD."

Future studies will be needed to determine the extent to which DNA methylation patterns in the liver are represented in other metabolically relevant tissues. Such findings would be critical for the development of non-invasive biomarkers in the creation of an early-warning system for NAFLD, the study concludes.

Credit: 
The Translational Genomics Research Institute

Physicians who visit patients post-hospitalization give more comprehensive discharge plans

(Boston)-- When resident physicians visit the homes of their former hospital patients, they are better able to assess patient needs and understand the important role that community services and agencies play in keeping patients at home and out of the hospital, according to a new study from Boston University School of Medicine (BUSM).

Resident physicians, commonly known as residents, often develop the discharge plans as part of their training programs. Typically, they do not make a home visit after discharge to assess whether the plan worked, and many never see their patients again.

Thirty-nine internal medicine residents from Boston Medical Center (BMC) participated in a post-hospital discharge home visit to older patients. They were able to review their discharge plan and determine the effectiveness of the plan, specifically identifying parts that did and did not work. "After visiting the home, the residents were better able to understand what makes for a good hospital discharge of an older patient," explained corresponding author Megan Young, MD, assistant professor of Medicine at BUSM.

After completing the exercise, residents were asked what they had learned. They reported being better able to assess patient needs, which highlighted the need for more individualized discharge plans with regard to in-home functioning, communication with caregivers and medication reconciliation.

"By being able to go into the patient's home and see what services patients need (home delivered meals, grab bars in the shower, medication delivery systems), we as doctors are able to provide more comprehensive care plans that allow community-dwelling older adults to stay in their home and out of the hospital," said Young, a geriatrician at BMC.

The rate of adverse events in older adult patients following discharge from the hospital is as high as 25 percent. Since the Affordable Care Act and the Hospital Readmissions Reduction Program took effect, many hospitals receive lower payments if they have too many readmissions. "Although this study did not look at re-admissions, the goal was to teach residents how to develop comprehensive discharge plans that involved community agencies and resources in the hopes that future patients will have fewer adverse events and readmissions."

Credit: 
Boston University School of Medicine

Using adrenaline in cardiac arrests results in less than 1 percent more people leaving hospital alive


A clinical trial of the use of adrenaline in cardiac arrests has found that its use results in less than 1% more people leaving hospital alive - but almost doubles the risk of severe brain damage for survivors of cardiac arrest. The research raises important questions about the future use of adrenaline in such cases and will necessitate debate amongst healthcare professionals, patients and the public.

Each year 30,000 people sustain a cardiac arrest in the UK and less than one in ten survive. The best chance of survival comes if the cardiac arrest is recognised quickly, someone starts cardiopulmonary resuscitation (CPR) and defibrillation (electric shock treatment) is applied without delay.

The application of adrenaline is one of the last things tried in attempts to treat cardiac arrest. It increases blood flow to the heart and increases the chance of restoring a heartbeat. However it also reduces blood flow in very small blood vessels in the brain, which may worsen brain damage. Observational studies, involving over 500,000 patients, have reported worse long-term survival and more brain damage among survivors who were treated with adrenaline.

Despite these issues, until now, there have been no definitive studies of the effectiveness of adrenaline as a treatment for cardiac arrest. This led the International Liaison Committee on Resuscitation to call for a placebo-controlled trial to establish if adrenaline was beneficial or harmful in the treatment of cardiac arrest. This "Pre-hospital Assessment of the Role of Adrenaline: Measuring the Effectiveness of Drug administration In Cardiac arrest (PARAMEDIC2)" trial was undertaken to determine if adrenaline is beneficial or harmful as a treatment for out of hospital cardiac arrest.

The trial was funded by the National Institute for Health Research, sponsored by the University of Warwick and led by researchers in the University's Clinical Trials Unit - part of Warwick Medical School. The trial ran from December 2014 through October 2017. It was conducted in five National Health Service Ambulance Trusts in the United Kingdom and included 8,000 patients who were in cardiac arrest. Patients were allocated randomly to be given either adrenaline or a salt-water placebo, and all those involved in the trial, including the ambulance crews and paramedics, were unaware which of the two treatments each patient received.

The results of the trial have now been published in the New England Journal of Medicine (NEJM) on Thursday 19th July 2018 in an article entitled "A Randomized Trial of Epinephrine in Out-of-Hospital Cardiac Arrest".

Of 4012 patients given adrenaline, 130 (3.2%) were alive at 30 days compared with 94 (2.4%) of the 3995 patients who were given placebo. However, of the 128 patients who had been given adrenaline and who survived to hospital discharge, 39 (30.1%) had severe brain damage, compared with 16 (18.7%) of the 91 survivors who had been given a placebo. In this study a poor neurological outcome (severe brain damage) was defined as someone who was in a vegetative state requiring constant nursing care and attention, or unable to walk and look after their own bodily needs without assistance.
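As a quick check of the arithmetic behind these figures (and the "one extra survivor for every 125 patients treated" quoted below), the rounded survival percentages imply an absolute difference of roughly 0.8 percentage points:

```python
# Worked arithmetic using the rounded figures reported in the text.
survival_adrenaline = 0.032          # 3.2% alive at 30 days with adrenaline
survival_placebo = 0.024             # 2.4% alive at 30 days with placebo

absolute_difference = survival_adrenaline - survival_placebo    # 0.008, i.e. 0.8 percentage points
number_needed_to_treat = 1 / absolute_difference                # ~125 patients per extra survivor

brain_damage_adrenaline = 0.301      # 30.1% of adrenaline survivors with severe brain damage
brain_damage_placebo = 0.187         # 18.7% of placebo survivors
relative_risk = brain_damage_adrenaline / brain_damage_placebo  # ~1.6, i.e. "almost doubles"

print(round(number_needed_to_treat), round(relative_risk, 2))
```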

The reasons why more patients survived with adrenaline and yet had an increased chance of severe brain damage are not completely understood. One explanation is that although adrenaline increases blood flow in large blood vessels, it paradoxically impairs blood flow in very small blood vessels, and may worsen brain injury after the heart has been restarted. An alternative explanation is that the brain is more sensitive than the heart to periods without blood and oxygen and although the heart can recover from such an insult, the brain is irreversibly damaged.

Professor Gavin Perkins, Professor of Critical Care Medicine in Warwick Medical School at the University of Warwick and the lead author on the paper, said:

"We have found that the benefits of adrenaline are small - one extra survivor for every 125 patients treated - but the use of adrenaline almost doubles the risk of a severe brain damage amongst survivors."

"Patients may be less willing to accept burdensome treatments if the chances of recovery are small or the risk of survival with severe brain damage is high. Our own work with patients and the public before starting the trial identified survival without brain damage is more important to patients than survival alone. The findings of this trial will require careful consideration by the wider community and those responsible for clinical practice guidelines for cardiac arrest."

Professor Jerry Nolan, from the Royal United Hospital Bath (and a co-author on the paper) said:

"This trial has answered one of the longest standing questions in resuscitation medicine. Taking the results in context of other studies, it highlights the critical importance of the community response to cardiac arrest. Unlike adrenaline, members of the public can make a much bigger difference to survival through learning how to recognise cardiac arrest, perform CPR and deliver an electric shock with a defibrillator. "

Credit: 
University of Warwick

Concentrated wealth in agricultural populations may account for the decline of polygyny

Across small-scale societies, the practice of some males taking multiple wives is thought to be associated with extreme disparities of wealth. But in fact, polygyny has been more common among relatively egalitarian low-tech horticulturalists than in highly unequal, capital-intensive agricultural societies. This surprising fact is known as the polygyny paradox, and a new study from the Santa Fe Institute's Dynamics of Wealth Inequality Project provides a possible resolution of the puzzle.

The team used a new model, developed along with Seung-Yun Oh (Korea Insurance Research Institute), in which both men and women make marital choices to maximize the number of surviving children they raise. The model was tested against data on the wealth, marriage and reproductive success of 11,813 individuals in 29 diverse societies.

Their main finding is that where wealth is highly concentrated, few men are wealthy enough to afford more than a single wife, and the very wealthy, while often polygynous, do not take on wives in proportion to their wealth.

Cody Ross (Max Planck Institute for Evolutionary Anthropology), a former Santa Fe Institute postdoctoral fellow, along with Santa Fe Institute Professor Samuel Bowles and University of California anthropologist Monique Borgerhoff Mulder, led the team of 34 anthropologists and economists conducting the study.

"In many capital-intensive farming societies," Bowles explained, "there are so few sufficiently wealthy men that unless they were to have a truly extraordinary number of wives, only a small fraction of all women will be married polygynously."

Like previous studies, including those by Laura Fortunato (Santa Fe Institute, University of Oxford) and others, the model developed by the team takes account of the fact that taking on additional wives reduces the amount of a male's material wealth -- land, cattle, equipment -- available to each wife.

But the new study showed that the impediments to polygyny even among the very rich went way beyond the need to share the male's wealth among rival wives. "This is what really surprised us," said Borgerhoff Mulder. "Our estimates show that a very rich man with four wives will have far fewer kids than two men with two wives and with the same total wealth divided between them."

Credit: 
Santa Fe Institute

Algorithm identifies patients best suited for antidepressants


Belmont, MA - McLean Hospital researchers have completed a study that sought to determine which individuals with depression are best suited for antidepressant medications. Their findings, published in Psychological Medicine on July 2, 2018, have led to the development of a statistical algorithm that identifies patients who may best respond to antidepressants--before they begin treatment.

Christian A. Webb, PhD, director of the Treatment and Etiology of Depression in Youth Laboratory at McLean Hospital, is one of the study's coauthors, along with Diego A. Pizzagalli, PhD, director of McLean's Center for Depression, Anxiety and Stress Research. Webb explained how their paper, "Personalized Prediction of Antidepressant v. Placebo Response: Evidence from the EMBARC Study," grew from data derived from a large and recently completed multi-site clinical trial of antidepressant medications called Establishing Moderators and Biosignatures of Antidepressant Response in Clinical Care (EMBARC). Demographic and clinical characteristics of individuals who took part in the EMBARC study were collected prior to the start of treatment by the study team across four sites (Columbia University, Massachusetts General Hospital, the University of Michigan, and UT Southwestern Medical Center). Participants were also administered computer-based tasks.

Using this information, Webb and his colleagues developed an algorithm predicting that approximately one-third of individuals would derive a meaningful therapeutic benefit from antidepressant medications relative to placebo. In the study, participants were randomly assigned to a common antidepressant medication or a placebo pill.

The results, Webb said, were like many previous clinical trials in that "we found relatively little difference in average symptom improvement between those individuals randomly assigned to the medication vs. placebo." However, he explained, "for the one-third of individuals predicted to be better suited to antidepressants, they had significantly better outcomes if they happened to be assigned to the medication rather than the placebo." The latter group of patients was characterized by higher depression severity and negative emotionality; these patients were also older, more likely to be employed, and exhibited better cognitive control on a computerized task.
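The release does not specify the EMBARC model itself. One common way to build this kind of predicted-benefit score is to model symptom improvement with treatment-by-baseline interactions and then compare each patient's predicted outcome under drug versus placebo; the sketch below illustrates that idea with hypothetical data and variable names.

```python
# Hypothetical sketch of a predicted-benefit score; not the EMBARC algorithm itself.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "severity": rng.normal(size=n),            # baseline depression severity
    "neg_emotionality": rng.normal(size=n),    # negative emotionality
    "age": rng.normal(size=n),
    "cognitive_control": rng.normal(size=n),   # performance on a computerized task
    "drug": rng.integers(0, 2, size=n),        # 1 = antidepressant, 0 = placebo
})
df["improvement"] = rng.normal(size=n)         # placeholder outcome

covariates = ["severity", "neg_emotionality", "age", "cognitive_control"]

# Fit symptom improvement on baseline covariates, treatment, and their interactions
X = df[covariates + ["drug"]].copy()
for c in covariates:
    X[f"drug_x_{c}"] = df["drug"] * df[c]
model = LinearRegression().fit(X, df["improvement"])

def predicted_improvement(arm):
    """Predicted improvement for every patient if assigned to the given arm."""
    Xc = df[covariates].copy()
    Xc["drug"] = arm
    for c in covariates:
        Xc[f"drug_x_{c}"] = arm * df[c]
    return model.predict(Xc[X.columns])

# Predicted benefit = improvement on drug minus improvement on placebo
predicted_benefit = predicted_improvement(1) - predicted_improvement(0)
likely_responders = predicted_benefit > np.quantile(predicted_benefit, 2 / 3)  # top third
print(f"{likely_responders.mean():.0%} of patients flagged as likely to benefit")
```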

"These results bring us closer to identifying groups of patients very likely to benefit preferentially from an SSRI and could realize the goal of personalizing antidepressant treatment selection," added UT Southwestern Medical Center's Madhukar Trivedi, MD, coordinating principal investigator for the EMBARC study.

Building on these findings, Webb said, his team is now looking to adapt the algorithm for use in "real-world" clinics. Specifically, he reported, the researchers are looking to collaborate with the University of Pennsylvania on a study that would test the algorithm in psychiatric clinics treating individuals suffering from depression by comparing two or more viable treatments--for example, two different classes of antidepressants, or antidepressants vs. psychotherapy.

"Our mission is to use these data-driven algorithms to provide clinicians and patients with useful information about which treatment is expected to yield the best outcome for this specific individual," Webb said. He explained that research like this may further the goal of creating "personalized medicine" in health care. "Rather than using a one-size-fits-all approach, we'd like to optimize our treatment recommendations for individual patients," he said.

Credit: 
McLean Hospital

Early supper associated with lower risk of breast and prostate cancer


Having an early supper or leaving an interval of at least two hours between supper and bedtime are both associated with a lower risk of breast and prostate cancer. Specifically, people who eat their evening meal before 9 pm, or who wait at least two hours after eating before going to sleep, have an approximately 20% lower risk of these cancers than people who have supper after 10 pm or who go to bed soon after eating. These were the main conclusions of a new study by the Barcelona Institute for Global Health (ISGlobal), a centre supported by the "la Caixa" Banking Foundation. The study is the first to analyse the association between cancer risk and the timing of meals and sleep.

Previous studies of the link between food and cancer have focused on dietary patterns--for example, the effects of eating red meat, fruit and vegetables and the associations between food intake and obesity. However, little attention has been paid to other factors surrounding the everyday act of eating: the timing of food intake and the activities people do before and after meals. Recent experimental studies have shown the importance of meal timing and demonstrated the health effects of eating late at night.

The aim of the new study, published in the International Journal of Cancer, was to assess whether meal timing could be associated with risk of breast and prostate cancer, two of the most common cancers worldwide. Breast and prostate cancers are also among those most strongly associated with night-shift work, circadian disruption and alteration of biological rhythms. The study assessed each participant's lifestyle and chronotype (an individual attribute correlating with preference for morning or evening activity).

The study, which formed part of the MCC-Spain project, co-financed by the CIBER of Epidemiology and Public Health (CIBERESP), included data from 621 cases of prostate cancer and 1,205 cases of breast cancer, as well as 872 male and 1,321 female controls selected randomly from primary health centres. The participants, who represented various parts of Spain, were interviewed about their meal timing, sleep habits and chronotype and completed a questionnaire on their eating habits and adherence to cancer prevention recommendations.
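The release does not spell out the statistical model, but in a case-control design like this an approximately 20% lower risk is typically reported as an adjusted odds ratio near 0.8 from a logistic regression of case status on meal timing. The following is only a hypothetical illustration; the variable names and data are invented, not the MCC-Spain data.

```python
# Hypothetical illustration of an adjusted odds ratio from a case-control analysis;
# not the MCC-Spain model or data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "early_supper": rng.integers(0, 2, size=n),   # 1 = supper before 9 pm (exposure)
    "age": rng.normal(60, 8, size=n),
    "night_shift": rng.integers(0, 2, size=n),
})
# Simulate case status so that early supper lowers the odds of being a case
log_odds = -0.5 - 0.22 * df["early_supper"] + 0.01 * (df["age"] - 60) + 0.3 * df["night_shift"]
df["case"] = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

X = sm.add_constant(df[["early_supper", "age", "night_shift"]])
fit = sm.Logit(df["case"], X).fit(disp=0)
print(np.exp(fit.params).round(2))   # an odds ratio near 0.8 for early_supper ~ 20% lower odds
```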

"Our study concludes that adherence to diurnal eating patterns is associated with a lower risk of cancer," explained ISGlobal researcher Manolis Kogevinas, lead author of the study. The findings "highlight the importance of assessing circadian rhythms in studies on diet and cancer", he added.

If the findings are confirmed, Kogevinas noted, "they will have implications for cancer prevention recommendations, which currently do not take meal timing into account". He added: "The impact could be especially important in cultures such as those of southern Europe, where people have supper late."

ISGlobal researcher Dora Romaguera, the last author of the study, commented: "Further research in humans is needed in order to understand the reasons behind these findings, but everything seems to indicate that the timing of sleep affects our capacity to metabolise food."

Animal experimental evidence has shown that the timing of food intake has "profound implications for food metabolism and health", commented Romaguera.

Credit: 
Barcelona Institute for Global Health (ISGlobal)

The pain circuit breaker


In a new study published in the journal PeerJ this week, researchers at UniSA's Body in Mind Research Group found that people suffering from osteoarthritis of the knee reported reduced pain when exposed to visual illusions that altered the size of their knees.

UniSA researcher and NHMRC Career Development Fellow Dr Tasha Stanton says the research combined visual illusions and touch, with participants reporting up to a 40 per cent decrease in pain when presented with an illusion in which the knee and lower leg appeared elongated.

"We also found that the pain reduction was optimal when the illusion was repeated numerous times - that is, its analgesic effect was cumulative," Dr Stanton says.

The small study - 12 participants - focused on people over 50 with knee pain and a clinical diagnosis of osteoarthritis.

Dr Stanton says the research provides "proof of concept" support that visual illusions can play a powerful role in reducing pain.

"We have shown that pain is reduced significantly when a visual stimulus, in this case a smaller or an elongated joint, is provided, but not only that, when exposed to that illusion repeatedly, pain decreases even further," she says.

"It seems that seeing is believing, and by understanding the neurological processes at work we may be able to ease pain more effectively for people with chronic conditions, reduce their reliance on medications and find alternative physical therapies to help manage conditions like osteoarthritis.

"This research adds to a growing body of evidence that the pain experienced in osteoarthritis is not just about damage to the joint.

"There are other factors at play and the more we understand about these natural mechanisms for reducing pain and how they are triggered, the more opportunity we have to develop a range of treatments to manage chronic conditions."

Credit: 
University of South Australia

The scent of coffee appears to boost performance in math

Drinking coffee seems to have its perks. In addition to the physical boost it delivers, coffee may lessen our risk of heart disease, diabetes and dementia. Coffee may even help us live longer. Now, there's more good news: research at Stevens Institute of Technology reveals that the scent of coffee alone may help people perform better on the analytical portion of the Graduate Management Admission Test, or GMAT, a computer adaptive test required by many business schools.

The work, led by Stevens School of Business professor Adriana Madzharov, not only highlights the hidden force of scent and the cognitive boost it may provide on analytical tasks, but also the expectation that students will perform better on those tasks. Madzharov, with colleagues at Temple University and Baruch College, recently published their findings in the Journal of Environmental Psychology.

"It's not just that the coffee-like scent helped people perform better on analytical tasks, which was already interesting," says Madzharov. "But they also thought they would do better, and we demonstrated that this expectation was at least partly responsible for their improved performance." In short, smelling a coffee-like scent, which has no caffeine in it, has an effect similar to that of drinking coffee, suggesting a placebo effect of coffee scent.

In their work, Madzharov and her team administered a 10-question GMAT algebra test in a computer lab to about 100 undergraduate business students, divided into two groups. One group took the test in the presence of an ambient coffee-like scent, while a control group took the same test - but in an unscented room. They found that the group in the coffee-smelling room scored significantly higher on the test.
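The release does not report the underlying test statistics; a between-groups difference like this is commonly assessed with an independent-samples t-test, sketched here with entirely hypothetical scores.

```python
# Hypothetical two-sample comparison; the scores below are invented, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
scented = rng.normal(loc=7.0, scale=1.5, size=50)     # 10-question test scores, coffee-scented room
unscented = rng.normal(loc=6.2, scale=1.5, size=50)   # scores in the unscented control room

t, p = stats.ttest_ind(scented, unscented, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")   # p < 0.05 would support "scored significantly higher"
```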

Madzharov and colleagues wanted to know more. Could the first group's boost in quick thinking be explained, in part, by an expectation that a coffee scent would increase alertness and subsequently improve performance?

The team designed a follow-up survey, conducted among more than 200 new participants, quizzing them on beliefs about various scents and their perceived effects on human performance. Participants believed they would feel more alert and energetic in the presence of a coffee scent, versus a flower scent or no scent; and that exposure to coffee scent would increase their performance on mental tasks. The results suggest that expectations about performance can be explained by beliefs that coffee scent alone makes people more alert and energetic.

Madzharov, whose research focuses on sensory marketing and aesthetics, is looking to explore whether coffee-like scents can have a similar placebo effect on other types of performance, such as verbal reasoning. She also says that the finding - that coffee-like scent acts as a placebo for analytical reasoning performance - has many practical applications, including several for business.

"Olfaction is one of our most powerful senses," says Madzharov. "Employers, architects, building developers, retail space managers and others, can use subtle scents to help shape employees' or occupants' experience with their environment. It's an area of great interest and potential."

Credit: 
Stevens Institute of Technology

Diabetes drug with better side-effect tolerance could improve treatment

Improved medications for Type 2 diabetes are one step closer thanks to a new discovery reported this week by researchers at the University of Pennsylvania and Syracuse University. By modifying the key ingredient in current diabetes drugs, the researchers produced a compound that was effective against hyperglycemia in animal trials, yet without the most problematic side effects of current drugs.

"Drug regimens often have long lists of side effects which negatively impact treatment," said Dr. Bart De Jonghe of University of Pennsylvania School of Nursing, one of the study leaders. "In Type 2 diabetes, nausea and vomiting top that list. It's the main reason people stop taking their diabetes medications, and diminishes quality of life for millions who do take them." De Jonghe and his collaborators presented their findings this week at the annual meeting of the Society for the Study of Ingestive Behavior, a leading international research conference for experts on eating behavior.

The rate of Type 2 diabetes, which is linked to obesity, has increased dramatically in recent decades. One widely prescribed class of drugs mimics the hormone glucagon-like peptide-1 (GLP-1) and is effective for controlling hyperglycemia. Yet all FDA-approved GLP-1-based drugs cause nausea and vomiting in 20-50% of patients. The Penn-Syracuse team modified the active ingredient in current drugs, a compound called exendin-4. By attaching each molecule of exendin-4 to vitamin B-12, they produced a compound that is less readily absorbed into the regions of the brain that trigger nausea and vomiting.

Having recently published evidence that their exendin-4/B-12 conjugate improves blood sugar levels, the team turned its newest research to the issue of side effects. Measuring these in animal trials proved challenging, since lab rats and mice are unable to vomit. The researchers therefore turned to the musk shrew (Suncus murinus), a mouse-sized mammal with a vomiting reflex similar to that of humans. The modifications made a striking difference in the shrews' response: both versions of the drug showed equal benefits for controlling blood sugar, yet vomiting occurred in almost 90% of shrews dosed with ordinary exendin-4 and in only 12% of shrews treated with the modified version.

"The vomiting results are striking and very encouraging," says De Jonghe. "It's rare to see such positive results with a new drug compared to the standard. It's hard to not be optimistic when you observe a complete flip in the side effect prevalence in favor of vastly improved tolerance."

Dr. Tito Borner, a postdoctoral fellow in De Jonghe's group, performed additional experiments that explain the improvement in vomiting. The modified drug shows decreased activation of a brain area called the dorsal vagal complex. This primitive brain region is located in the hindbrain and is thought to coordinate many ingestive behaviors, including responses like vomiting, to ensure survival.

Type 2 diabetes is the most common form of diabetes, affecting more than 25 million Americans. It occurs when the body stops responding properly to insulin, resulting in chronically high blood sugar. It is most common in people who are overweight or obese, and prevalence is highest in people of Native American, African-American, and Hispanic heritage. Left untreated, consequences of diabetes include circulatory damage, kidney disease, high blood pressure, and stroke. In addition to diet and exercise, physicians often recommend medication for its ability to quickly stabilize blood sugar. The Penn-Syracuse team's discovery could improve diabetes management by leading to a next generation of the FDA-approved medications that are better tolerated, reducing the number of people who cease medication due to adverse side effects.

Credit: 
Society for the Study of Ingestive Behavior

Anti-obesity drug derived from chili peppers shows promise in animal trials

A novel drug based on capsaicin, the compound that gives chili peppers their spicy burn, caused long-term weight loss and improved metabolic health in mice eating a high-fat diet, in new studies from the University of Wyoming School of Pharmacy. The drug, Metabocin, was designed to slowly release capsaicin throughout the day so it can exert its anti-obesity effect without producing inflammation or adverse side effects.

"We observed marked improvements in blood sugar and cholesterol levels, insulin response, and symptoms of fatty liver disease," reported Dr. Baskaran Thyagarajan, lead investigator, describing how Metabocin reversed many damaging effects of the high fat diet. He presented the results this week at the annual meeting of the Society for the Study of Ingestive Behavior, the leading international conference of experts on food and fluid intake.

The research team developed Metabocin, which can be taken orally, to target receptors called TRPV1 (transient receptor potential vanilloid subfamily 1) that are found in high numbers in fat cells. Stimulating the TRPV1 receptors causes white fat cells to start burning energy instead of storing it, which, in theory, should cause weight loss. An important question for the researchers was whether the drug remains effective when used long term, and whether adverse effects would outweigh its benefits. The mice in this experiment remained on the drug for 8 months, maintaining the weight loss with no evidence of safety problems. Additional ongoing experiments will see how long that can be maintained.

"It proved safe and was well tolerated by the mice," Thyagarajan concluded. "Developing Metabocin as a potent anti-obesity treatment shows promise as part of a robust strategy for helping people struggling with obesity."

Although these results may give some people the idea to eat more spicy food to lose weight, that would not work as intended. Most of the capsaicin in spicy food is not well absorbed into the body so it would not produce these effects. The researchers specifically modified the capsaicin in Metabocin for proper absorption and sustained release.

Obesity is a growing public health concern, resulting in metabolic diseases including type 2 diabetes, hypertension, atherosclerosis and heart diseases. Currently one in three individuals world-wide is either overweight or obese. Exercise and diet are the standard recommendation, but those are difficult for most people to maintain in the long term, and rebound weight gain usually occurs. The Wyoming researchers advocated for continuing to pursue medical options that stay effective in the long term to counter obesity and its metabolic impacts, to assist people seeking to maintain a healthier weight.

Credit: 
Society for the Study of Ingestive Behavior

Nursing notes can help indicate whether ICU patients will survive

Researchers at the University of Waterloo have found that sentiments in the nursing notes of health care providers are good indicators of whether intensive care unit (ICU) patients will survive.

Hospitals typically use severity of illness scores to predict the 30-day survival of ICU patients. These scores include lab results, vital signs, and physiological and demographic characteristics gathered within 24 hours of admission.

"The physiological information collected in those first 24 hours of a patient's ICU stay is really good at predicting 30-day mortality," said Joel Dubin, an associate professor in the Department of Statistics and Actuarial Science and the School of Public Health and Health Systems. "But maybe we shouldn't just focus on the objective components of a patient's health status. It turns out that there is some added predictive value to including nursing notes as opposed to excluding them."

The researchers used the large, publicly available ICU database Medical Information Mart for Intensive Care III, which contains patient data collected between 2001 and 2012. After applying inclusion and exclusion criteria, such as requiring at least one nursing note for a given patient, the dataset used in the analysis included details on more than 27,000 patients along with their nursing notes. The researchers applied an open-source sentiment analysis algorithm to extract adjectives from the text and establish whether each statement was positive, neutral or negative. A multiple logistic regression model was then fit to the data to relate the measured sentiment to 30-day mortality while controlling for gender, type of ICU, and simplified acute physiology score.
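A minimal sketch of this kind of pipeline, using a generic open-source sentiment scorer (VADER is used here purely as an illustration, not necessarily the authors' tool) and entirely hypothetical data and column names, might look like this:

```python
# Hypothetical sketch: score note sentiment, then model 30-day mortality with
# logistic regression, controlling for gender, ICU type and SAPS.
import nltk
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

# Step 1: score each nursing note (compound score ranges from -1 to 1)
example_notes = ["Patient resting comfortably, in good spirits.",
                 "Agitated overnight, poor response to treatment."]
print([sia.polarity_scores(note)["compound"] for note in example_notes])

# Step 2: logistic regression of 30-day mortality on mean note sentiment,
# adjusting for covariates (all data below are simulated placeholders)
rng = np.random.default_rng(4)
n = 500
patients = pd.DataFrame({
    "sentiment": rng.uniform(-1, 1, size=n),               # mean sentiment per patient
    "female": rng.integers(0, 2, size=n),
    "saps": rng.normal(40, 12, size=n),                     # simplified acute physiology score
    "icu_type": rng.choice(["micu", "sicu", "csru"], size=n),
})
log_odds = -2 - 1.0 * patients["sentiment"] + 0.03 * (patients["saps"] - 40)
patients["died_30d"] = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

fit = smf.logit("died_30d ~ sentiment + female + saps + C(icu_type)", data=patients).fit(disp=0)
print(fit.params.round(2))   # a negative coefficient on sentiment: more positive notes, lower mortality odds
```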

The sentiment analysis provided a noticeable improvement in predicting 30-day mortality in the multiple logistic regression model for this group of patients. There was also a clear difference between the patients with the most positive notes, who experienced the highest survival rates, and the patients with the most negative notes, who experienced the lowest survival rates.

"Mortality is not the only outcome that nursing notes could potentially predict," said Dubin. "They might also be used to predict readmission, or recovery from infection while in the ICU."

The study, Sentiment in nursing notes as an indicator of out-of-hospital mortality in intensive care patients, co-authored by Dubin and his collaborators, Ian Waudby-Smith, Nam Tran, and Joon Lee, all of the University of Waterloo, was published recently in the journal PLOS ONE.

Credit: 
University of Waterloo

Traumatic brain injury biomarker shows promise to support rapid damage evaluation and predict outcomes

Image: Sagittal brain sections showing lipid species, obtained by matrix-assisted laser desorption ionization imaging mass spectrometry from an uninjured control rat and a rat three hours post-injury (HPI), pseudocolored by intensity to demonstrate the brain-wide change in lipid levels after injury. Hematoxylin and eosin (H&E) images from adjacent sections highlight the brain regions used for mean signal intensity analysis. (Credit: The American Journal of Pathology)

Philadelphia, July 16, 2018 - A new study in The American Journal of Pathology found that a brain lipid molecule, lysophosphatidic acid (LPA), was significantly increased after traumatic brain injury (TBI) in a preclinical animal model. The investigators also found that LPA was elevated in areas associated with cell death and axonal injury, both major hallmarks of moderate and severe TBI. This strengthens the evidence that LPA could be used as a biomarker of TBI through blood testing, potentially providing a prognostic indicator of injury and outcome.

TBI is characterized by impairments in cognition, emotion, or physical function caused by a violent blow to the head or direct brain penetration by an object. Upon injury, it is often difficult to evaluate the extent of damage or predict how long the impairment will last or whether it will worsen.

"TBI affects nearly 1.7 million individuals each year. There is a need for non-invasive biomarkers to indicate the degree of injury, predict functional outcomes, and advise how long an injured patient must remain away from sports or work before resuming any activity," explained lead investigator Neil G. Harris, PhD, of the Department of Neurosurgery, David Geffen School of Medicine at UCLA, Los Angeles, CA, USA. "LPA may well be a potential marker for that since we found it to be associated with major regions of brain pathology. It is also present in blood in high concentrations after injury."

Although LPA has been previously suggested as a marker for TBI, this study showed for the first time that levels of LPA change within the area of the injury as well as in regions distant to the injury site and linked these changes to pathological findings in brain cells. Investigators used matrix-assisted laser desorption ionization imaging mass spectrometry (MALDI IMS). This sophisticated technology allowed them to measure fine changes in lipid distribution within brain slices. They linked MALDI IMS findings to cellular pathology, such as axonal injury or cell death, using immunohistochemical (IHC) techniques.

Strong enhancement in LPA and some of its metabolites was seen throughout the brain beginning one hour after compression brain injury. By three hours after injury, high levels of LPA were noted in the cerebellum, corpus callosum, hippocampus, and other areas. "These observations demonstrate that acute injury profoundly alters LPA and LPA metabolite expression throughout the brain, and that this occurs especially in white matter regions at both near and far sites from the injury epicenter," commented lead author Whitney S. McDonald, PhD, of the Brain Injury Research Center, Department of Neurosurgery and Brain Research Center, UCLA, Los Angeles, CA, USA.

Atrophy of the connections between the cortex and thalamus is a common finding in TBI. When the investigators analyzed the thalamus in this animal model, they found intracellular levels of LPA and its precursor phosphatidic acid increased one hour after injury but returned to normal levels three hours after injury. A special stain revealed that cell death was evident after three hours, leading the researchers to suggest that the observed changes in lipid levels are part of the early response of the brain to trauma and act to initiate the later sequence of neurodegenerative changes associated with TBI. The researchers also saw that although LPA levels were elevated in hemorrhagic areas, increases were also seen in brain regions not contaminated with blood.

"These data show that LPA may be a useful biomarker of cellular pathology after TBI," said Dr. Harris. He noted that other investigators have reported that LPA is significantly increased within 24 to 36 hours in some patients with severe TBI, and elevated levels of LPA metabolites were associated with poor patient outcomes. "If LPA can be shown to be a good predictor of outcome, then measuring LPA blood levels has potential as a prognostic indicator of injury and outcome."

LPA is a simple phospholipid involved in many biological functions such as the regulation of cellular proliferation, migration, differentiation, and suppression of cell death. LPA and its receptors are found throughout the nervous system. The rapid onset of pathology and the complexity of the cellular response to TBI suggest that these act as early signaling messengers involved in initiating the cascade of cellular events that promote functional impairment after trauma. LPA has been shown to be significantly involved in the pathology of central nervous system injury.

Credit: 
Elsevier