Body

College students feel the effects of opioids even if they don't take them

PHILADELPHIA -- About one in five college students reported in a survey that they knew someone who was addicted to pain medications, and nearly a third said they knew somebody who overdosed on painkillers or heroin, according to a team of undergraduate Penn State Lehigh Valley researchers.

This secondary exposure to opioid abuse may shine a light on the collateral damage that is often left out of the current debate about the epidemic, said Jennifer Parker, associate professor of sociology, Penn State Lehigh Valley.

"Since the beginning of the opioid epidemic, public debate and prevention strategies have focused on the primary victims, misusers themselves, while surprisingly little attention has been paid to the burdens felt and experienced by those who are intimately or socially tied to them," said Parker, who advised the group of researchers presenting at the American Sociological Association's annual meeting today (Aug. 11) in Philadelphia.

According to the researchers, most of the 118 students who completed a survey admitted that they had been in some way exposed to people who misuse drugs and alcohol. Of those, 20.5 percent said they currently know someone who is addicted to pain medication. About 32.5 percent said they knew somebody who overdosed on either painkillers or heroin.

Erica Hughes, an undergraduate student in Health Policy Administration, added that about 15 percent of the students reported worrying that someone they knew may be misusing pain medication.

"I was surprised by how many students report close ties to people who are addicted to or have overdosed on opioids," Hughes said. "It makes me sad to think that so many are carrying around this worry because being a student in today's world is already hard enough."

Hughes added that dealing with issues connected to their exposure to the effects of opioid abuse may be particularly difficult for college students. Many college students already face increased pressure from rising tuition costs and student debt, along with fears about the job market, she added.

Amanda Borges, a 2018 graduate in Health Policy Administration, said that the findings might raise awareness about the extent of the opioid crisis and offer insight into better ways to address it.

"The general public should know how devastating this crisis has been and how it impacts all communities and social classes including college students," said Borges.

Gathering information on all aspects of the opioid crisis may help better allocate resources to help communities, added Kirsten Mears, also a 2018 graduate in Health Policy Administration.

"The more we know, the better we are able to help and identify how particular communities, especially our poorest, may have certain disadvantages in this epidemic because of lesser resources and lack of health insurance," said Mears.

According to the researchers, gender may also play a role in how college students report their exposures to the opioid problem. For example, women were twice as likely to report having intimate ties to those who misuse or overdose on opioids, the researchers said.

Shanice Clark and a team of 15 undergraduate students in Health Policy Administration also contributed to the study.

The researchers collected data from surveys filled out by students at a university in a region particularly hard hit by the opioid crisis.

Of the approximately 130 surveys passed out, participants completed 122. Of those, the researchers determined that 118 were both complete and valid.
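As a quick check on these counts (the 130 total is described as approximate, so the response rate is likewise approximate), the implied response and validity rates can be computed directly:

```python
handed_out = 130  # approximate number of surveys distributed
completed = 122   # surveys returned completed
valid = 118       # surveys judged both complete and valid

# Response rate: completed surveys as a share of those handed out
print(round(completed / handed_out * 100, 1))  # 93.8 (percent)

# Validity rate among completed surveys
print(round(valid / completed * 100, 1))       # 96.7 (percent)
```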

The researchers said that future research should look at whether secondary exposure to opioids impacts the students' mental and physical health, as well as their academic performance.

Credit: 
Penn State

Rotavirus vaccine cuts infant diarrhea deaths by a third in Malawi

image: These are Malawian children.

Image: 
Dr Carina King

A major new study has shown that rotavirus vaccination reduced infant diarrhoea deaths by 34% in rural Malawi, a region with high levels of child deaths.

The study led by scientists at the University of Liverpool, UCL, Johns Hopkins Bloomberg School of Public Health and partners in Malawi provides the first population-level evidence from a low-income country that rotavirus vaccination saves lives.

The findings, published in The Lancet Global Health, add considerable weight to the World Health Organisation's (WHO) recommendation for rotavirus vaccine to be included in all national immunisation programmes.

Professor Nigel Cunliffe from the University of Liverpool's Centre for Global Vaccine Research, one of the study leads, said: "Rotavirus remains a leading cause of severe diarrhoea and death among infants and young children in many countries in Africa and Asia. Our findings strongly advocate for the incorporation of rotavirus vaccine into the childhood immunisation programmes of countries with high rates of diarrhoea deaths, and support continued use in such countries where a vaccine has been introduced."

Rotavirus is the most common cause of diarrhoeal disease among infants and young children. Despite improvements in sanitation and case management, rotavirus still caused 215,000 child deaths in 2013, with 121,000 of these in Africa. With support from Gavi, the Vaccine Alliance, many countries in Africa with high death rates have added rotavirus vaccine to their routine immunisation programme over the past five years.

To determine the vaccine's impact on infant diarrhoea deaths, researchers carried out a large population-based birth cohort study of 48,672 infants in Malawi, which introduced a monovalent rotavirus vaccine in October 2012.

As low-income African countries often lack birth and death registries - a resource used for similar impact studies in middle-income countries - the investigators and their study team of more than 1,100 people visited the homes of infants in 1,832 villages over the course of four years to collect data, including the infants' vaccination status and whether they survived to age one.

The study findings reveal that children who received the rotavirus vaccine had a 34% lower risk of dying from diarrhoea, which is a similar impact to that observed in middle-income countries.

"This is encouraging because children from the sub-Saharan African region account for more than half of global diarrhoea deaths, and with over 30 African countries thus far introducing rotavirus vaccine, the absolute impact on mortality is likely to be substantial," said one of the report's lead authors Dr Carina King, a senior research associate at UCL's Institute for Global Health.

Co-lead author Dr Naor Bar-Zeev, Associate Professor of International Health at the International Vaccine Access Center of the Johns Hopkins Bloomberg School of Public Health, added: "We already knew that rotavirus vaccine reduces hospital admissions and is highly cost-effective in low-income countries with a high burden of diarrhoeal disease, and now we've been able to demonstrate that it saves lives.

"However not all countries are vaccinating against rotavirus yet, including some very populous countries. The key message of this paper is that to do the best by all our children and to help them survive, all countries should introduce rotavirus vaccination."

The researchers also found a direct link between the proportion of the population vaccinated and the reduction in mortality achieved. Malawi had a strong immunisation programme and was very proactive in planning the introduction of the rotavirus vaccine, which made it possible to scale up coverage rapidly.

"Within about a year from vaccine introduction, we were able to reach up to 90% of the population. It is vitally important that rotavirus vaccines reach all children, especially the most vulnerable living in poorer settings where the impact of vaccination is greatest," said one of the authors Dr Charles Mwansambo, Chief of Health Services for Malawi.

Credit: 
University of Liverpool

Novel blood test predicts kidney cancer risk and survival five years prior to diagnosis

A critical biomarker of kidney disease may help predict clear cell kidney cancer - the most common form of kidney cancer - years before clinical diagnosis. Kidney injury molecule-1 (KIM-1) can be detected in the urine and blood and is generally present at low levels in healthy individuals. Prior research by leaders at Brigham and Women's Hospital has shown that KIM-1 is an important and highly predictive marker for kidney injury. In a new study published in Clinical Cancer Research, BWH investigators, along with colleagues from Beth Israel Deaconess Medical Center, Imperial College London, and the International Agency for Research on Cancer, explore whether a blood test can detect higher concentrations of KIM-1 in patients who will go on to develop kidney cancer up to five years later. Their results show that KIM-1 substantially helped distinguish those who went on to develop kidney cancer from those who did not.

"Early detection of kidney cancer can be lifesaving. We can cure kidney cancer when we detect it at an early stage, but patients with advanced kidney cancer have a very high death rate," said Venkata Sabbisetti, PhD, a research faculty member in the BWH Renal Division and a co-author of the study. "However, kidney cancer is asymptomatic and many patients present with advanced kidney cancer at the time of diagnosis. Our results suggest that with further refinement, KIM-1 has the potential to identify patients with early, curable kidney cancer."

Sabbisetti and colleagues measured KIM-1 concentrations in samples from patients enrolled in the European Prospective Investigation into Cancer and Nutrition (EPIC). The team compared KIM-1 levels from 190 participants who went on to develop renal cell carcinoma (RCC) within the next five years to 190 matched participants (same age, body mass index, smoking status, etc.) who didn't develop kidney cancer. Higher levels of KIM-1 were associated with a higher risk of developing kidney cancer and with poorer survival.

The team reported that adding KIM-1 to a model for predicting kidney cancer risk approximately doubled the accuracy of that model. KIM-1 was substantially more sensitive for kidney cancer detection than prostate-specific antigen is for prostate cancer. However, given the rarity of kidney cancer, plasma KIM-1 alone may not be suited for early detection in the general population.

"We envision that KIM-1 will be useful in settings where the risk of kidney cancer is higher, such as patients undergoing abdominal CT scanning, where KIM-1 could be used to stratify risk of RCC," the authors wrote. "This will be particularly important given the rise of routine CT scans and the strong association between number of CT scans and number of nephrectomies performed at the regional level in the U.S., indicating a substantial burden of overdiagnosis."

Credit: 
Brigham and Women's Hospital

Dealing with digital distraction

SAN FRANCISCO -- Our digital lives may be making us more distracted, distant and drained, according to research presented at the annual convention of the American Psychological Association.

For instance, even minor phone use during a meal with friends was enough to make the diners feel distracted and to reduce their enjoyment of the experience, one study found.

"People who were allowed to use their phones during dinner had more trouble staying present in the moment," said Ryan Dwyer, MA, of the University of British Columbia, lead author of a study that was presented during a symposium on how digital technology is affecting relationships. "Decades of research on happiness tell us that engaging positively with others is critical for our well-being. Modern technology may be wonderful, but it can easily sidetrack us and take away from the special moments we have with friends and family in person."

Dwyer and his colleagues conducted two studies - a field experiment in a restaurant and a survey. The restaurant experiment included more than 300 adults and university students in Vancouver, British Columbia. Participants were either asked to keep their phones on the table with the ringer or vibration on or to put their phones on silent and place them in a container on the table during the meal.

After eating, the participants filled out a questionnaire detailing their feelings of social connectedness, enjoyment, distraction and boredom, as well as the amount of phone use and what they did on their phones during the meal.

The researchers found that people who had their phones easily accessible during the experiment not only used them more than those with their phones put away, but they also reported feeling more distracted and enjoyed the experience less.

The survey portion included more than 120 participants from the University of Virginia. Participants were surveyed five times a day for one week and were asked to report on how they were feeling and what they had been doing in the 15 minutes before completing the survey.

Results showed that people reported feeling more distracted during face-to-face interactions if they had used their smartphone compared with face-to-face interactions where they had not used their smartphone. The students also said they felt less enjoyment and interest in their interaction if they had been on their phone.

"The survey findings were especially notable because of the negative effects of phone use among university students, who are commonly known as digital natives," said Elizabeth Dunn, PhD, of the University of British Columbia and co-author of the study and symposium presenter. "We assumed that this generation would be more adept at multi-tasking between using their phones and interacting with others, but we found that even moderate levels of phone use undermined the benefits of engaging with others."

Another study presented in the session found that compassionate people spend less time on social media than people who are more self-centered and narcissistic.

In addition, people with lower emotional intelligence, or those who have difficulty identifying, describing and processing their emotions, used social media more often than those who are more in touch with their feelings, according to the study.

"People who are uncomfortable with their own and others' emotions may be more comfortable online," said Sara Konrath, PhD, of Indiana University. "We think that they may prefer text-based interactions that allow them more time to process social and emotional information."

This study built upon previous research showing that more narcissistic people use social media more often than less narcissistic people. Virtually no research has been done on how emotional intelligence relates to social media use, according to Konrath.

She and her colleagues analyzed data from four studies of more than 1,200 adult participants and used existing scales that assessed narcissism, empathy, emotional intelligence and emotion recognition. The studies also asked questions about how frequently participants checked and posted on Facebook, Twitter and Instagram.

More empathic people used Twitter less frequently than those who were not as caring and compassionate toward others, the researchers found. Also, people who were more likely to be able to see the world from another's perspective did not spend as much time on Facebook and Instagram. Another interesting finding was that people who scored high on a test of reading others' emotions used Twitter and Facebook less often.

Conversely, more narcissistic people and those who feel overwhelmed by the emotional experiences of others spent more time on all three social media sites.

"Does being more emotionally intelligent and empathic cause people to avoid social media, or are lower empathy people more drawn to it? It could also be the opposite: Perhaps frequently using social media can impair empathy and emotional intelligence," said Konrath. "We cannot determine causality with this study. We need more research to better understand how online digital technology affects people, for better or for worse."

Other research presented found that pre-teens became better at reading non-verbal cues from their peers after five days with no screen time and college-age participants bonded better with their friends during in-person interactions versus video chat, audio chat or instant messaging.

Session 2164: "How Smartphone Use Spreads and Undermines Enjoyment in Face-to-Face Social Interactions," and "Emotional Intelligence and Social Media Use," Symposium, Friday, Aug. 10, 11 a.m. PDT, Room 308, Level Three-South Building, Moscone Center, 747 Howard St., San Francisco, Calif.

Presentations are available from the APA Public Affairs Office.

Credit: 
American Psychological Association

More than 40 percent of women with asthma may develop COPD, but risk may be reduced

image: More than 4 in 10 women with asthma may go on to develop chronic obstructive pulmonary disease (COPD).

Image: 
ATS

Aug. 10, 2018 -- More than 4 in 10 women with asthma may go on to develop chronic obstructive pulmonary disease (COPD), according to a study conducted in Ontario, Canada, and published online in the Annals of the American Thoracic Society.

In "Asthma and COPD Overlap in Women: Incidence and Risk Factors," Teresa To, PhD, and coauthors report that of the 4,051 women with asthma included in their study, 1,701, or 42 percent, developed COPD. On average, the women were followed for about 14 years after being diagnosed with asthma.

The researchers examined risk factors for developing asthma and COPD overlap syndrome, known as ACOS. Those who develop ACOS experience increased exacerbations and hospitalizations and have a lower quality of life, compared to those who have asthma or COPD alone.

"Previous studies have found an alarming rise in ACOS in women in recent years and that the mortality rate from ACOS was higher in women than men," said Dr. To, a professor in the Graduate School of Public Health at the University of Toronto in Canada. "We urgently need to identify and quantify risk factors associated with ACOS in women to improve their health and save lives."

The authors report that individual risk factors played a more significant role in the development of ACOS than exposure to fine particulate matter, a major air pollutant that, because of its microscopic size, penetrates deep into the lungs.

Women who had a more than five-pack-year smoking history, meaning they had smoked more than the equivalent of a pack of cigarettes a day for five years, were much more likely to develop ACOS than those who smoked fewer cigarettes or never smoked.
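The pack-year measure described above is a simple product of packs smoked per day and years of smoking. A hypothetical illustration (the smoking histories below are examples, not data from the study):

```python
def pack_years(packs_per_day: float, years: float) -> float:
    """Pack-years = average packs smoked per day multiplied by years smoked."""
    return packs_per_day * years

# One pack a day for five years sits exactly at the study's
# five-pack-year threshold...
print(pack_years(1.0, 5.0))   # 5.0

# ...and so does half a pack a day for ten years: different
# habits can yield the same cumulative exposure.
print(pack_years(0.5, 10.0))  # 5.0
```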

However, ACOS did not affect only those who smoke: 38 percent of the women who developed ACOS in the study had never smoked.

In addition to smoking, the study identified obesity, rural residence, lower education levels and unemployment as significant risk factors for ACOS. The authors speculate that these factors, indicative of low socioeconomic status, may result in suboptimal access to care, under-treatment of asthma and poor compliance with medications, all of which lead to more frequent asthma attacks. These attacks in turn may lead to airway remodeling that increases the chances of developing ACOS.

The researchers noted that they lacked the data to investigate this association directly. Study limitations also include not having information about exposure to second-hand smoke and exposure to air pollution over the entire time the women were followed.

The authors wrote that they were encouraged by the fact that most of the risk factors identified in their study were modifiable.

"The adverse impact of smoking and obesity on health may be even worse in those who are already living with asthma or COPD," said Dr. To, who is also senior scientist, Child Health Evaluative Sciences, at The Hospital for Sick Children (SickKids). "Identifying modifiable risk factors in the progression from asthma to COPD is an essential first step in developing prevention strategies that lead to a healthy, active lifestyle."

Credit: 
American Thoracic Society

Elderly patients on dialysis have a high risk of dementia

Older kidney disease patients who are sick enough to require the blood-filtering treatment known as dialysis are at high risk of dementia, including Alzheimer's disease, according to a study led by scientists at Johns Hopkins Bloomberg School of Public Health.

The study, published Aug. 9 in the Clinical Journal of the American Society of Nephrology, found evidence that older kidney disease patients had a substantially higher risk of being diagnosed with dementia than community-dwelling older adults.

"The dementia risk in this population seems to be much higher than what we see among healthy community-dwelling older adults," says study lead author Mara McAdams-DeMarco, assistant professor of epidemiology at the Bloomberg School.

To McAdams-DeMarco and her colleagues, the findings suggest that doctors should be doing more to monitor, and if possible to slow or prevent, cognitive decline among older dialysis patients. "The high incidence of dementia seems to be overlooked in this population," she says.

Cognitive decline and dementia, including Alzheimer's disease, are principally age-related and relatively common in the elderly. However, research suggests that kidney disease appears to worsen the problem. Studies over the past two decades have found evidence that as kidney function declines, cognitive functions are apt to decline as well. One recent study in dialysis patients found that this kidney-related cognitive decline was particularly noticeable for executive functions such as attention, impulse control and working memory.

The precise biological mechanism linking kidney disease to brain problems isn't yet clear, but kidney disease has been linked to poor blood flow in the brain, and researchers suspect that this is a key factor.

To get a better understanding of the dementia problem among elderly patients with advanced kidney disease, McAdams-DeMarco and colleagues examined a large national kidney disease registry, focusing on 356,668 Medicare patients older than 66 years who had initiated dialysis due to end-stage kidney disease during 2001-2013.

Their analysis was aimed mainly at estimating the risk of a dementia diagnosis within a given period after initiating dialysis. For the female patients in this group, the estimated risk was 4.6 percent for a dementia diagnosis within a year, 16 percent within 5 years, and 22 percent--a nearly one in four chance--within 10 years. For males, the corresponding figures were slightly lower at 3.7, 13 and 19 percent.

Alzheimer's disease represented a significant proportion of dementia diagnoses: The one-year risk of this form of dementia was 0.6 percent for women and 0.4 percent for men.

The study was not designed to compare dialysis patients directly to healthy people of the same age; even so, the dementia risk among these patients was considerably higher than what would be expected in this age group. For example, a well-known prior study following residents of a Massachusetts town found that community-dwelling 65-year-olds had only a 1.0 to 1.5 percent incidence of dementia within 10 years, while for 75-year-olds the incidence was only about 7.5 percent. By contrast, in this study the researchers determined that the 10-year risk of dementia after starting dialysis was 19 percent for patients in the sample aged 66 to 70, and 28 percent among 76- to 80-year-olds.

Even the Alzheimer's disease risk among the dialysis patients seemed higher than normal--for example, 4.3 percent of the 66-70-year-olds were diagnosed with the disease within 10 years of starting dialysis, compared to a 10-year incidence of less than 1 percent among 65-year-olds in the Massachusetts study. That suggests that older patients with end-stage kidney disease may even be vulnerable to Alzheimer's disease.

McAdams-DeMarco and colleagues also found that older dialysis patients with a dementia diagnosis were about twice as likely to die at any time in the study period, compared to older dialysis patients without a dementia diagnosis.

As stark as these findings are, they may understate the problem. "We know from other studies that only about half of patients with dementia receive a diagnosis, so the figures in this study could be seen as a lower limit," McAdams-DeMarco says.

She and her colleagues suggest that more in-depth studies need to be done to gauge the true extent of the dementia problem among older end-stage kidney disease patients. "Patients starting dialysis generally meet with health care providers a few times per week, so in principle there is ample opportunity to do at least brief cognitive screening," she says.

She also recommends more studies of potential measures to prevent dementia among these vulnerable patients. "We're currently setting up a large clinical trial to identify appropriate interventions to preserve cognitive function in these patients," McAdams-DeMarco says.

Credit: 
Johns Hopkins Bloomberg School of Public Health

Kaiser Permanente Northern California's colorectal cancer screening program saves lives

Kaiser Permanente members in Northern California are 52 percent less likely to die from colorectal cancer since the health care system launched a comprehensive, organized screening program, according to a new study in the specialty's top journal, Gastroenterology.

"Since we launched our screening program we have seen a remarkable decline in the number of cases of colorectal cancer and related deaths across a large, diverse population," said gastroenterologist and co-lead author Theodore R. Levin, MD, clinical lead for Kaiser Permanente's colorectal cancer screening in Northern California.

The study, "Effects of Organized Colorectal Cancer Screening on Cancer Incidence and Mortality in a Large, Community-based Population," confirms that since Kaiser Permanente Northern California's screening program for colorectal cancer was rolled out between 2006 and 2008, screening completion as recommended by the U.S. Preventive Services Task Force increased to 83 percent among those eligible (adults 50 to 75 years old) by 2015, compared to 66 percent nationally. In that same timeframe, new cases of colorectal cancer in the United States dropped 26 percent.

Researchers compared the periods before and after the organized Kaiser Permanente screening program was rolled out between 2006 and 2008. The study found that mortality from colorectal cancer decreased 52.4 percent from approximately 31 deaths to 15 deaths per 100,000 people; and the incidence fell 25.5 percent from approximately 96 cases to 71 cases per 100,000 people.
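The percentage declines follow directly from the before-and-after rates. A quick check, using the rounded per-100,000 figures quoted above (so the results differ slightly from the study's exact 52.4 and 25.5 percent, which are based on unrounded rates):

```python
def relative_decline(before: float, after: float) -> float:
    """Percent decline from a 'before' rate to an 'after' rate."""
    return (before - after) / before * 100

# Mortality: roughly 31 down to 15 deaths per 100,000 people
print(round(relative_decline(31, 15), 1))  # 51.6 (study's exact figure: 52.4)

# Incidence: roughly 96 down to 71 cases per 100,000 people
print(round(relative_decline(96, 71), 1))  # 26.0 (study's exact figure: 25.5)
```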

"This most recent study is the culmination of more than two decades of groundbreaking Kaiser Permanente research and clinical care initiatives. It provides evidence for dramatic improvements to colorectal cancer screening by having an organized approach to making sure people get screened," said The Permanente Medical Group gastroenterologist and co-lead author Douglas A. Corley, MD, PhD.

Screening saves lives

Colorectal cancer is the second-leading cause of cancer deaths in the United States. The American Cancer Society estimates more than 140,000 new cases will be diagnosed in the U.S. this year, and it's expected to cause more than 50,000 deaths during 2018.

Fortunately, regular screening allows for the early detection of colorectal cancers and polyps that can become colorectal cancer. Cancer detected early is more likely to be cured and removing polyps early can prevent the development of colorectal cancer.

The U.S. Preventive Services Task Force recommends three screening methods for colorectal cancer for adults beginning at age 50:

Fecal testing (such as with the fecal immunochemical test or "FIT"), every year

Flexible sigmoidoscopy every 5 years, and/or

Colonoscopy every 10 years

After a positive fecal test or sigmoidoscopy, physicians order a colonoscopy, the procedure that examines the full colon and can remove polyps.

Drs. Levin and Corley, both research scientists with the Kaiser Permanente Northern California Division of Research, noted that Kaiser Permanente has conducted, and continues to conduct, critical research on all three of these methods, with important studies dating back to the 1980s.

"Colonoscopy has long been an effective screening tool for colorectal cancer, but it can be expensive and time-consuming to deliver in large populations, and many people are unwilling to undergo the test," Dr. Levin said. "We have found that Kaiser Permanente members are more than willing to be screened with the FIT kit, which has greatly contributed to our high screening rates."

An initial Kaiser Permanente research goal -- led by James Allison, MD, FACP, an emeritus research scientist in the Division of Research -- was to generate evidence that the guaiac-based fecal occult blood test and/or sigmoidoscopy conducted in large, average-risk populations could save lives and decrease the incidence of colorectal cancer.

The Division of Research conducted landmark studies on the use of flexible sigmoidoscopy, which began at Kaiser Permanente facilities in the mid-1990s; these studies formed the basis for U.S. Preventive Services Task Force guideline recommendations.

In 1996, Kaiser Permanente conducted the first U.S. study showing that FIT kits had superior performance characteristics to the fecal occult blood test. A 2007 study provided evidence used by the U.S. Preventive Services Task Force to recommend FITs as a screening option in their guidelines. And a recent multicenter study within Kaiser Permanente was the first large U.S. study to estimate the effectiveness of colonoscopy for reducing deaths from colorectal cancer. (Although colonoscopy is commonly used, it has not been studied in large, randomized trials.)

Model screening program

Within a few years of initiating its population-based screening program, Kaiser Permanente was able to dramatically increase its screening rates by mailing FIT kits to the homes of its Northern California members of recommended screening age, 50 to 75 years old; systematically reminding members when they are due for screening; and quickly processing a large volume of FITs -- upwards of 3,000 per day -- that are mailed directly to a Kaiser Permanente laboratory.

"Kaiser Permanente's screening program for colorectal cancer in Northern California is a perfect example of how a large health care system can expertly use technology and data to create a program that promotes health, prevents illness and saves the lives of its members on an unprecedented scale," said Yi-Fen Chen, MD, associate executive director for quality and research of The Permanente Medical Group.

In 2014, the National Colorectal Cancer Roundtable set a national goal of screening at least 80 percent of those eligible by 2018. Kaiser Permanente Northern California achieved the 80 percent screening rate by 2011. The Kaiser Permanente FIT-based outreach program, combined with colonoscopy, has become a model for similar programs to maximize the number of people screened in the United States, including the Veterans Administration, and internationally.

Dr. Corley now also leads the National Cancer Institute's Population-based Research Optimizing Screening through Personalized Regimens (or PROSPR) consortium, a multisite effort to evaluate and improve cancer screening processes, including colorectal cancer. The consortium's research has included important studies on the quality and effectiveness of colonoscopies and adenoma detection rates.

"Screening markedly decreases deaths from colorectal cancer and enables people to live healthier lives," Dr. Corley said. "The future of colorectal cancer research and care is furthering proven ways to increase screening rates; better understand the best ages to start, repeat and stop screening; and continue to improve the ease and effectiveness of the tests themselves."

Credit: 
Kaiser Permanente

Community health centers can help boost rates of colorectal cancer screening

An innovative program in community health centers to mail free colorectal cancer screening tests to patients' homes led to a nearly 4 percentage point increase in CRC screening, compared to clinics without the program, according to a Kaiser Permanente study published today in JAMA Internal Medicine.

According to the National Association of Community Health Centers, approximately 24 million people in the United States receive care at federally qualified health centers, often called community health or safety net clinics. These underserved patients historically have low rates of CRC screening compared to the general population.

"With such a large number of individuals receiving care in the safety-net setting, an improvement in CRC screening rates of even a few percentage points can have a major impact in terms of cancers detected and lives saved," said lead author Gloria Coronado, PhD, an investigator at the Kaiser Permanente Center for Health Research in Portland, Oregon.

The study, "Strategies and Opportunities to Stop Colon Cancer in Priority Populations (STOP CRC)," took place in 26 clinics representing eight health centers in Oregon and California. More than 41,000 adults aged 50-64 met the study criteria of being due for CRC screening between February 2014 and February 2015.

Half of the clinics were randomized to implement the program after receiving training and support, and the other half continued to deliver usual care without the program. For clinics implementing the program, the process began with customization of their electronic health record systems to identify patients who were due for CRC screening. The clinics then mailed an introductory letter to these patients, explaining they would soon be receiving a screening test in the mail.

Next, clinics mailed the screening tests to eligible patients' homes. The clinics used the fecal immunochemical test (or FIT), a simple test that detects small amounts of blood in the stool and can be done easily at home. Individuals with a positive FIT result were encouraged to get a follow-up colonoscopy to look for cancer or pre-cancerous polyps. Finally, as the last step of the program, clinics mailed a reminder letter to patients' homes, encouraging them to complete and return their FIT kits.

Compared to the control group clinics, clinics that delivered the intervention had a significantly higher proportion of patients who were screened for CRC. The percentage of patients who completed a FIT kit was 3.4 points higher, and the percentage of patients who received any type of CRC screening was 3.8 points higher in intervention clinics compared to control clinics.

The clinical effectiveness of mailing FIT kits to patients' homes had already been established in previous research, including a pilot study by Coronado and her team. But this new, much larger study showed the program can also work well when clinic staff -- not researchers -- are responsible for implementing it.

"This was a real-world, pragmatic trial, which is quite a bit different from a carefully controlled research environment," explained Coronado. "Our team provided clinics with the EHR tools needed to identify and contact patients who were due for screening; we trained clinic staff to use the tools; we provided letter templates, pictographic instructions and other materials; and we used a collaborative learning model to offer ongoing support. But ultimately, clinic staff were responsible for integrating the intervention into their care processes."

The study team observed significant variation across health centers in successful implementation of the program. The proportion of eligible patients who were mailed a FIT kit ranged from 6.5 percent to 68.2 percent. Of eligible patients who did receive a FIT kit in the mail, reminder letters had a major impact on return rates. Clinics that consistently sent out reminder letters after sending FIT kits had a return rate of 25 percent, compared to clinics that did so inconsistently (14 percent) or not at all (6 percent).

"Community health centers are very busy places with many competing priorities," said senior author Beverly Green, MD, MPH, of the Kaiser Permanente Washington Health Research Institute. "Our study showed that while FIT outreach programs can be a great way to increase colorectal cancer screening rates in this underserved population, we need to identify additional strategies to support program implementation in health centers with limited resources."

Coronado and her team are building on this research with a new study, "Participatory Research to Advance Colon Cancer Prevention." The researchers are working with community advisors to adapt and spread a direct-mail FIT kit and reminder program, with the ultimate goal of increasing the effectiveness of reminders by ensuring they meet the specific needs of diverse population subgroups.

Credit: 
Kaiser Permanente

Juveniles with conduct problems face high risk of premature death

We already know that adolescents with conduct and/or substance use problems are at increased risk for premature death, mainly from substance-related deaths, traffic accidents, and violent deaths (related to suicide, assault, or legal intervention). This prospective study of more than 3,700 US juveniles found an independent association between conduct disorder and mortality hazard. In other words, the connection between conduct disorder and risk of early death persists even after accounting for other contributing factors such as sex, ethnicity, familial factors, and substance use.

This is important because conduct disorder can be treated and improved, potentially reducing the risk of early death.

And the risk is substantial. This study followed 1463 US youths with combined conduct disorder (CD) and substance use disorder (SUD), 1399 of their siblings, and 904 community controls. It found that the relative risk of mortality in the first two groups was 4.99 times that of the controls. Over an average follow-up time of 16 years, 96 deaths were observed among adolescents with CD/SUD and their siblings versus 8 deaths among controls.
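The arithmetic behind a crude risk ratio built from counts like these can be sketched as follows. This is an illustration of the calculation only: the study's 4.99 figure is a covariate-adjusted hazard ratio from survival models, so this naive version does not reproduce it.

```python
# Crude risk ratio from the counts reported above (illustration only).
deaths_exposed, n_exposed = 96, 1463 + 1399   # CD/SUD youths plus their siblings
deaths_control, n_control = 8, 904            # community controls

risk_exposed = deaths_exposed / n_exposed     # proportion who died, exposed group
risk_control = deaths_control / n_control     # proportion who died, controls
crude_rr = risk_exposed / risk_control
print(f"crude risk ratio = {crude_rr:.2f}")
```

The gap between this crude ratio and the published 4.99 reflects the adjustments (for follow-up time, demographics, and familial factors) that a survival model makes and a simple proportion cannot.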

The study used the definition of conduct disorder provided by the APA's Diagnostic and Statistical Manual of Mental Disorders (DSM-IV): "A repetitive and persistent pattern of behavior in which the basic rights of others or major age-appropriate societal norms or rules are violated," which includes aggression toward people and animals, property destruction, deceitfulness or theft, and serious violation of rules.

Lead author Richard Border (University of Colorado Boulder) adds this caveat: "Because this study focused on juveniles with severe CD and SUD, the extent to which the results might generalize to individuals with moderate CD and SUD is unclear. But our results indicate that adolescents with severe CD are at marked risk for premature death beyond that which can be explained by substance use problems and sociodemographic factors, which should make them a prime target for future treatment research."

Credit: 
Society for the Study of Addiction

Pass the salt: Study finds average consumption safe for heart health

image: Andrew Mente, first author of the study and a researcher with The Population Health Research Institute at McMaster University and Hamilton Health Sciences in Hamilton, Ontario, Canada.

Image: 
Hamilton Health Sciences

HAMILTON, ON (Aug. 9, 2018) - New research shows that for the vast majority of individuals, sodium consumption does not increase health risks except for those who eat more than five grams a day, the equivalent of 2.5 teaspoons of salt.

Fewer than five per cent of individuals in developed countries exceed that level.

The large, international study also shows that even for those individuals there is good news. Any health risk of sodium intake is virtually eliminated if people improve their diet quality by adding fruits, vegetables, dairy foods, potatoes, and other potassium-rich foods.

The research, published today in The Lancet, is by scientists of the Population Health Research Institute (PHRI) of McMaster University and Hamilton Health Sciences, along with their research colleagues from 21 countries.

The study followed 94,000 people, aged 35 to 70, for an average of eight years in communities from 18 countries around the world, and found an associated risk of cardiovascular disease and stroke only where average intake was greater than five grams of sodium a day.

China is the only country in their study where 80 per cent of communities have a sodium intake of more than five grams a day. In the other countries, the majority of the communities had an average sodium consumption of 3 to 5 grams a day (equivalent to 1.5 to 2.5 teaspoons of salt).

"The World Health Organization recommends consumption of less than two grams of sodium -- that's one teaspoon of salt -- a day as a preventative measure against cardiovascular disease, but there is little evidence of improved health outcomes for individuals who achieve such a low intake," said Andrew Mente, first author of the study and a PHRI researcher.

He added that the American Heart Association recommends even less -- 1.5 grams of sodium a day for individuals at risk of heart disease.

"Only in the communities with the most sodium intake -- those over five grams a day of sodium -- which is mainly in China, did we find a direct link between sodium intake and major cardiovascular events like heart attack and stroke.

"In communities that consumed less than five grams of sodium a day, the opposite was the case. Sodium consumption was inversely associated with myocardial infarction or heart attacks and total mortality, with no increase in stroke."

Mente added: "We found all major cardiovascular problems, including death, decreased in communities and countries where there is an increased consumption of potassium which is found in foods such as fruits, vegetables, dairy foods, potatoes and nuts and beans."

The information for the research article came from the ongoing, international Prospective Urban Rural Epidemiology (PURE) study run by the PHRI. Mente is also an associate professor of the Department of Health Research Methods, Evidence and Impact at McMaster University.

Most previous studies relating sodium intake to heart disease and stroke were based on individual-level information, said Martin O'Donnell, co-author of the report, a PHRI researcher and an associate clinical professor of medicine at McMaster.

"Public health strategies should be based on best evidence. Our findings demonstrate that community-level interventions to reduce sodium intake should target communities with high sodium consumption, and should be embedded within approaches to improve overall dietary quality.

"There is no convincing evidence that people with moderate or average sodium intake need to reduce their sodium intake for prevention of heart disease and stroke," said O'Donnell.

Besides Canada, this research paper involved individual and community information from the countries of Argentina, Bangladesh, Brazil, Chile, China, Colombia, India, Iran, Malaysia, occupied Palestinian territory, Pakistan, Philippines, Poland, Saudi Arabia, South Africa, Sweden, Tanzania, Turkey, United Arab Emirates, and Zimbabwe.

Credit: 
McMaster University

The Lancet: Early age of type 1 diabetes diagnosis linked to greater heart risks and shorter life expectancy, compared to later diagnosis

Life expectancy for individuals with younger-onset disease is on average 16 years shorter compared to people without diabetes, and 10 years shorter for those diagnosed at an older age

Being diagnosed with type 1 diabetes at a young age is associated with more cardiovascular complications and higher risk of premature death than being diagnosed later in life, independent of disease duration. The findings, published in The Lancet, come from a large observational study in Sweden that followed over 27,000 individuals with type 1 diabetes and more than 135,000 matched controls for an average of 10 years. With around half of individuals with type 1 diabetes diagnosed before the age of 14, the authors stress the need to consider wider and earlier use of cardioprotective measures such as statins and blood pressure lowering drugs in this high-risk population.

"Although the relative risk of cardiovascular disease is increased after an early diabetes diagnosis, the absolute risk is low", says Dr Araz Rawshani from the University of Gothenburg in Sweden who co-led the research. "However, age at disease onset appears to be an important determinant of survival as well as cardiovascular outcomes in early adulthood, warranting consideration of earlier treatment with cardioprotective drugs."[1]

The new estimates suggest that individuals diagnosed before the age of 10 have a 30-times greater risk of serious cardiovascular outcomes like heart attack (0.31 cases per 100,000 person years for participants with diabetes vs 0.02 cases in every 100,000 person-years for controls) and heart disease (0.5 vs 0.03) than those in the general population, whilst risk levels are around six times higher for people diagnosed between ages 26 and 30 (0.87 vs 0.25 and 1.80 vs 0.46 respectively).

People with younger-onset type 1 diabetes are four times as likely to die from any cause (0.61 vs 0.17), and have more than seven times the risk of dying from cardiovascular disease (0.09 vs 0.02) than their diabetes-free counterparts. In contrast, people first diagnosed between ages 26 and 30 face a lower (three-fold) risk of dying from any cause (1.9 vs 0.6) and cardiovascular disease (0.56 vs 0.15) compared to their peers without diabetes.

Professor Naveed Sattar, co-author, University of Glasgow (UK), explains: "While the absolute risk levels are higher in individuals who develop diabetes when older, simply due to age being a strong risk factor, the excess risk compared to healthy controls is much higher in those who developed diabetes when younger. If this higher excess risk persists over time in such individuals, they would be expected to have highest absolute risks at any given subsequent age. Indeed, those who develop type 1 diabetes when under 10 years of age experience the greatest losses in life expectancy, compared to healthy controls. This is something we did not fully appreciate before." [1]

The impact of type 1 diabetes on younger people should not be underestimated, and there is a need to consider adding recommendations about age of onset in future guidelines, say the authors.

Type 1 diabetes mellitus is the second most common chronic disease in children, accounting for 85% of diabetes in the under 20s. But it's not unusual to develop the disease as an adult. Worldwide, the incidence of type 1 diabetes in children aged 14 years and younger has risen by 3% a year since the 1980s.

It's well known that people with type 1 diabetes are at increased risk of health problems and have shorter life expectancies, partly due to premature cardiovascular disease. But, until now, the impact of age of diagnosis on this excess mortality and cardiovascular risk was unclear.

To provide more evidence, the researchers calculated the excess risk of all-cause mortality, cardiovascular mortality, acute heart attack, stroke, cardiovascular disease, coronary heart disease, heart failure, and atrial fibrillation in 27,195 individuals from the Swedish National Diabetes Register compared to 135,178 controls matched for age, sex, and county from the general population (average age 29 years).

Individuals with diabetes, who registered between January 1988 and December 2012, were divided into groups by age at diagnosis--0-10 years, 11-15 years, 16-20 years, 21-25 years, and 26-30 years--and followed up for an average of 10 years.

The researchers adjusted for a range of factors that may have influenced the results including age, sex, marital status, income, educational level, region of birth, diabetes duration, and previous history of cardiovascular complications.

Cardiovascular risks and survival were strongly related to age at disease onset, with people diagnosed before the age of 10 having five-times greater risk of heart attack and coronary heart disease than those diagnosed at age 26-30 years. The younger group also had much higher risk of heart failure and stroke than peers without diabetes and those diagnosed at an older age (figure 3).

Excess risks were particularly pronounced in women, with those diagnosed before age 10 facing a 60-fold higher risk of heart disease (0.48 cases per 100,000 person years for participants with diabetes vs 0.02 cases in every 100,000 person-years for controls) and 90-times increased risk of heart attack than matched controls (0.25 vs 0.01). In comparison, men with young-onset diabetes have a 17 times greater risk of developing heart disease and 15 times higher risk of having a heart attack in early adulthood compared to those in the general population (0.53 vs 0.05 and 0.36 vs 0.03).

These estimates for early-onset disease are substantially higher than recent estimates by the American Heart Association and American Diabetes Association, which do not consider age of onset as a risk stratifier, and report that women with type 1 diabetes are at seven times increased risk and men at three times the risk of developing heart disease.

Life expectancy was also markedly shorter for women with type 1 diabetes. Women who develop the condition before 10 years of age die on average around 18 years earlier than their diabetes-free counterparts (average life expectancy 70.9 years vs 88.6 years), whilst men with early onset type 1 diabetes die around 14 years earlier (69.1 years vs 83.3 years). Individuals diagnosed at 26-30 years old lose, on average, about 10 years of life.

The authors speculate that loss of beta cells, which contributes to glycaemic load and is more severe and rapid among those with younger-onset disease, could be a contributing factor to the increased risk of cardiovascular-related death. Whilst they adjusted for disease duration, longer exposure to higher glucose levels in individuals who develop diabetes as children might also contribute to greater heart disease risks. However, the authors acknowledge that their findings show observational associations rather than cause and effect.

Professor Sattar adds: "People with early onset diabetes should more often be considered for cardioprotective drugs such as statins and blood pressure lowering medication when they reach 30-40 years of age. Currently, only around 10-20% of individuals with type 1 diabetes are taking statins by the age of 40. Also, improving glycaemic control and smoking cessation programmes could meaningfully prolong the lives of these individuals. The good news, however, is that recent technological advances are helping younger patients manage their glucose levels better." [1]

The authors note some limitations including that they did not have information about patients' glycaemic control before enrolment in the register. What's more, they only included patients who had the condition for 20 years or less to provide contemporary comparisons of cardiovascular risk that reflect current diabetes management. Key strengths include the large cohort, individual controls, adjustment for diabetes duration, the range of age subgroups, and variety of cardiovascular outcomes. Life expectancy analyses used the entire cohort, regardless of duration of diabetes.

Writing in a linked Comment, Marina Basina and David M Maahs, Stanford University (USA) say: "These data will increase attention towards cardioprotection at younger ages and specifically for those with an earlier age of type 1 diabetes onset. Practitioners need a stronger evidence base, including confirmatory reports from other registries and clinical trials, to clarify proper therapy and translate research findings to care guidelines and clinical practice to improve mortality and cardiovascular disease outcomes for individuals with type 1 diabetes."

Credit: 
The Lancet

Why do some microbes live in your gut while others don't?

image: Katherine Pollard (left) and Patrick Bradley (right) identified genes that may help microbes live successfully in the human gut.

Image: 
Elisabeth Fall/Gladstone Institutes

SAN FRANCISCO, CA--August 9, 2018--Trillions of tiny microbes and bacteria live in your gut, each with their own set of genes. These gut microbes can have both beneficial and harmful effects on your health, from protecting you against inflammation to causing life-threatening infections. To keep out pathogens yet encourage the growth of beneficial microbes, scientists have been trying to find ways to target specific microbial genes.

Katherine Pollard, PhD, is one of these scientists. Her team at the Gladstone Institutes is interested in better understanding how microbes colonize the gut. They want to identify the genes that help microbes pass through the stomach's harsh environment and survive in the lower gastrointestinal tract.

"Until now, this has not been an easy feat," explains Pollard, senior investigator and director of the Gladstone Institute of Data Science and Biotechnology. "Most microbes in the gut have evolved from related species, so they share many common genes. It's difficult to single out the genes that actually influence a microbe's ability to survive in the gut environment."

A new study published in the scientific journal PLOS Computational Biology led by Patrick Bradley, a postdoctoral scholar in the Pollard lab, found a new approach to identify the genes that may be important to help microbes live successfully in the human gut.

An Old Method Reveals New Insights

The microbes that colonize the human body can be difficult to study using traditional experimental methods, and not merely due to their sheer number. Given that scientists lack the tools to grow and study many microbe species in the lab, identifying their genes is labor- and time-intensive.

New computational methods and DNA sequencing provide a solution to determine which microbes are usually present in a person's gut, and what genes are in the microbes' genomes. However, the researchers at Gladstone showed that just looking at the genes shared by gut microbes, without accounting for the microbes' common ancestry, can lead to many false discoveries.

To avoid these pitfalls, they developed a novel approach to address the issue. Bradley applied a technique called phylogenetic linear modeling, which has often been used in ecology, but rarely in genomics.

"With this method, we use information from an evolutionary tree that maps out the historical relationship between different species," explained Bradley. "We were the first to directly apply this method to metagenomics data, which comes from the collective genetic material from the microbes present in the human body."

The team used this approach to analyze public data from hundreds of individuals in industrialized countries. As a result, Bradley and Pollard found thousands of genes across different species that are prevalent in the gut. They also looked for--and found--genes in microbes that are actually more prevalent in the gut than in other parts of the body, suggesting that these genes may be specific to this environment. The researchers believe the genes they identified may help microbes colonize the gut, for example, by allowing microbes to survive in acidic environments, such as the stomach.
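The core idea of phylogenetic linear modeling can be sketched with generalized least squares under a Brownian-motion model, where trait covariance between two species is proportional to their shared branch length on the tree. Everything below (the covariance matrix, the four species, the gene and prevalence values) is hypothetical, not data from the study.

```python
import numpy as np

# Hypothetical shared-branch-length covariance for 4 species: species 1-2
# and species 3-4 are close relatives; the two pairs are distantly related.
V = np.array([
    [1.0, 0.6, 0.2, 0.2],
    [0.6, 1.0, 0.2, 0.2],
    [0.2, 0.2, 1.0, 0.7],
    [0.2, 0.2, 0.7, 1.0],
])

x = np.array([1.0, 1.0, 0.0, 0.0])   # gene presence/absence (hypothetical)
y = np.array([0.9, 0.8, 0.3, 0.2])   # prevalence in the gut (hypothetical)

# GLS estimate: beta = (X' V^-1 X)^-1 X' V^-1 y, weighting observations by
# the inverse phylogenetic covariance so related species don't count twice.
X = np.column_stack([np.ones_like(x), x])  # intercept + gene indicator
Vinv = np.linalg.inv(V)
beta = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
print("intercept, slope:", beta)
```

Ordinary least squares would treat the four species as independent points; the phylogenetic weighting is what guards against the false discoveries the Gladstone team describes, since closely related species share genes by ancestry rather than by adaptation to the gut.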

In addition, the scientists used the same technique to compare gut microbes in health versus disease. They found genes that are associated with bacteria that are more prevalent in patients with Crohn's disease than in healthy patients.

"In Crohn's disease, some bacteria with anti-inflammatory properties seem to be depleted," said Bradley. "If we can identify genes that improve gut colonization specifically in people with Crohn's, then down the road, we could potentially help treat patients by engineering new versions of these anti-inflammatory bacteria that would survive better in that environment."

A First Step to Improving Gut Health

The new computational approach developed by Pollard and Bradley could lead to the development of new therapies to maintain or improve gut health.

"If we want to target individual microbial genes, we first need to understand the role they play in colonizing the gut," said Pollard, who is also a professor at UCSF and a Chan Zuckerberg Biohub investigator. "This could yield opportunities to design better probiotics or prevent invasion of the gut by harmful pathogens like C. difficile."

This method leverages ideas from many fields, including bioinformatics, microbial ecology and evolution, biochemistry, biostatistics, and the study of the human microbiome. It can also be applied to identify genes associated with microbes in other environments besides the human gut.

"Our study shows that by using methods that account for the evolutionary relationship between microbes, we can predict genes that might be important in a particular environment with much greater accuracy than standard models allow," said Bradley. "Our hope is that other scientists will realize they can get so much more out of the data they already have by using our approach."

Pollard and Bradley are now working to develop a web tool that will allow other academics--particularly wet lab biologists who might not have the required in-house expertise--to upload their own data and use this new computational technique to obtain helpful results.

Credit: 
Gladstone Institutes

American College of Rheumatology: CMS decision an affront to America's sickest Medicare patients

ATLANTA - The American College of Rheumatology (ACR) today expressed its extreme disappointment with a new Centers for Medicare and Medicaid Services (CMS) decision to allow Medicare Advantage (MA) plans to implement step therapy for Part B drugs and cross-manage Part B and D drug utilization. The policy change threatens patient access to drugs covered under Medicare Part B for the 54 million Americans living with rheumatic diseases. This policy puts insurance companies in control of patient treatment plans. Compromising medical decision making between doctors and patients prevents timely access to medications that effectively control disease.

"Put simply, this policy change is a gross affront to America's sickest Medicare patients - individuals living with diseases like inflammatory arthritis and cancer - who depend on timely access to safe, affordable, and high-quality treatments," said David Daikh, PhD, MD, President of the ACR. "Utilization management techniques like step therapy prevent and delay important treatments for rheumatic disease patients, which can result in irreversible joint or organ damage. At the same time that medical research is showing that early institution of effective treatment prevents such damage, CMS is instituting a policy that will make it much more difficult for patients to get this treatment in time. We urge CMS to reconsider this policy and ensure that all Americans continue to have access to the most appropriate and effective therapy as determined by their health care team."

Step therapy - also known as "fail first" - is a troubling practice employed by a majority of insurers that forces patients to try therapies preferred by the insurance company before being approved for the therapy their doctor prescribed - even when doctors doubt the "insurer preferred" option will be effective. Utilized by both public and private insurers, step therapy undermines the clinical judgment of healthcare providers, leads to delays in effective therapy, and puts patients' health at unnecessary risk.

The ACR has long opposed utilization management techniques such as step therapy - in addition to others such as prior authorization, specialty tiering, and high cost-sharing - because they can prevent and delay important treatments for patients. In comments submitted to CMS last month, the ACR urged policymakers to protect patient access to Part B therapies and to instead address the issue of high treatment costs by facilitating the development of alternative payment models, expanding patient access to cost and coverage information at the time of treatment and improving FDA's capacity and manufacturer ability to bring safe, effective biosimilars to market, which will increase competition and lower costs. The ACR also supports practices continuing to negotiate better overall drug spending through Part B than what currently occurs in Part D, as suggested by HHS's own dashboard. Yet rather than addressing underlying causes of the high drug costs, this CMS policy seeks to reduce costs for insurers by limiting the ability of patients to receive the appropriate medications to treat their disease.

Furthermore, the ACR expressed concern over how these changes are being implemented and urged CMS to put any proposed changes through the formal rulemaking process so that patients and healthcare providers may be able to weigh in on the details of such a proposal.

"A change this seismic - one that has significant consequences for patient access to life-saving drugs - should go through the formal comment and rule-making process," Dr. Daikh concluded.

Credit: 
American College of Rheumatology

Epigenetic reprogramming of human hearts found in congestive heart failure

image: Adam Wende and Mark Pepin

Image: 
UAB

BIRMINGHAM, Ala. - Congestive heart failure is a terminal disease that affects nearly 6 million Americans. Yet its management is limited to symptomatic treatments because the causal mechanisms of congestive heart failure -- including its most common form, ischemic cardiomyopathy -- are not known. Ischemic cardiomyopathy is the result of restricted blood flow in coronary arteries, as occurs during a heart attack, which starves the heart muscle of oxygen.

Researchers at the University of Alabama at Birmingham have now described an underlying mechanism that reprograms the hearts of patients with ischemic cardiomyopathy, a process that differs from patients with other forms of heart failure, collectively known as dilated (non-ischemic) cardiomyopathies. This points the way toward future personalized care for ischemic cardiomyopathy.

The study used heart tissue samples collected at UAB during surgeries to implant small mechanical pumps, which assist in pumping blood, alongside the hearts of patients with end-stage heart failure. As a routine part of this procedure, a small piece of heart tissue is excised and ultimately discarded as medical waste. The current study acquired these samples from the left ventricles of five ischemic cardiomyopathy patients and six non-ischemic cardiomyopathy patients, all men between ages 49 and 70.

The research team, led by Adam Wende, Ph.D., assistant professor in the UAB Department of Pathology, found that epigenetic changes in ischemic cardiomyopathy hearts likely reprogram the heart's metabolism and alter cellular remodeling in the heart. Epigenetics is a field that describes molecular modifications known to alter the activity of genes without changing their DNA sequence.

One well-established epigenetic change is the addition or removal of methyl groups to the cytosine bases of DNA. Generally, hyper-methylation is associated with reduction of gene expression, and conversely, hypo-methylation correlates with increased gene expression.
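The inverse relationship between methylation and expression described above can be illustrated with a toy calculation; the numbers below are synthetic, not data from the study.

```python
import numpy as np

# Simulate 50 hypothetical genes where more promoter methylation means
# lower expression, then recover the expected negative correlation.
rng = np.random.default_rng(0)
methylation = rng.uniform(0.0, 1.0, 50)                        # fraction methylated
expression = 10 * (1 - methylation) + rng.normal(0, 0.5, 50)   # inverse trend + noise

r = np.corrcoef(methylation, expression)[0, 1]
print(f"methylation-expression correlation = {r:.2f}")  # strongly negative
```

In real data the trend is noisier and gene-specific, which is why the UAB team paired genome-wide methylation maps with expression profiling rather than relying on the rule alone.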

Wende and colleagues found an epigenetic signature in the hearts of patients with ischemic cardiomyopathy that differed from the non-ischemic hearts. Furthermore, this signature was found to reflect a long-known metabolic change in ischemic cardiomyopathy, in which the heart's fuel preference switches from oxidative metabolism, which uses oxygen to produce energy in cells as healthy hearts do, to an anaerobic metabolism that does not need oxygen. This anaerobic metabolic preference is seen in fetal hearts; however, after birth, the baby's heart quickly changes to oxidative metabolism.

"Altogether, we believe that epigenetic changes encode a so-called 'metabolic plasticity' in failing hearts, the reversal of which may repair the ischemic and failing heart," Wende said.

The researchers found that increased DNA methylation correlated with reduced expression of genes involved in oxidative metabolism. In particular, the transcription factor KLF15, an upstream regulator of metabolic gene expression, was found to be suppressed by the epigenetic regulator EZH2. Conversely, the researchers found hypo-methylation of genes involved in anaerobic, glycolytic metabolism.
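The pattern described here, hyper-methylation paired with lower expression of oxidative genes and hypo-methylation paired with higher expression of glycolytic genes, amounts to checking for an inverse relationship gene by gene. The sketch below illustrates that check with made-up numbers; the gene names and values are hypothetical, not data from the study:

```python
# Illustrative sketch with hypothetical numbers (not the study's data):
# classify genes by the direction of their methylation change and check
# whether expression moved in the opposite direction.

# (gene, methylation change in ischemic vs. non-ischemic, expression change)
genes = [
    ("oxidative_gene_A",  +0.30, -1.2),  # hyper-methylated, expression down
    ("oxidative_gene_B",  +0.25, -0.8),
    ("glycolytic_gene_C", -0.20, +1.5),  # hypo-methylated, expression up
    ("glycolytic_gene_D", -0.15, +0.9),
]

for name, d_meth, d_expr in genes:
    inverse = (d_meth > 0) != (d_expr > 0)  # opposite signs?
    print(f"{name}: methylation {d_meth:+.2f}, expression {d_expr:+.2f}, "
          f"inverse relationship: {inverse}")
```

In the real analysis this comparison is done genome-wide with statistical testing, but the directional logic is the same.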

This contribution by EZH2 offers a new molecular target for further mechanistic studies that may aid precision-based heart disease therapies. Of note, co-author Sooryanarayana Varambally, who has spent over 15 years studying this protein, has already made progress using small-molecule inhibitors of EZH2 to treat various cancers.

The Wende-led study, now published in the Nature journal Laboratory Investigation, employed a wide array of bioinformatics tools. First author Mark Pepin used publicly available programs to create a fully automated computational pipeline, which is provided as an online supplement to the paper. This protocol, written in the R programming language, allowed the investigators both to analyze their multi-omics datasets and to compare their findings with those of animal-based studies and public data repositories. "Supplying the coding scripts," Wende said, "is our way of demonstrating the rigor and reproducibility that should be expected of any bioinformatics study."

Pepin is a sixth-year M.D.-Ph.D. student at UAB and is currently completing the Ph.D. portion of his training in the Medical Scientist Training Program.

The UAB team also performed cell culture experiments showing repression of KLF15 after EZH2 over-expression in rat cardiomyoblasts, and they demonstrated that this repression depended on EZH2 having an intact SET catalytic domain.

Credit: 
University of Alabama at Birmingham

Inducing labor at 39 weeks decreases need for cesarean section

image: Inducing labor in healthy women at 39 weeks into their pregnancy reduces the need for cesarean section and is at least as safe for mother and baby as waiting for spontaneous labor, according to a new study.

Image: 
University of Utah Health

Inducing labor in healthy women at 39 weeks into their pregnancy reduces the need for cesarean section and is at least as safe for mother and baby as waiting for spontaneous labor. Choosing to induce could also reduce the risk that mothers will develop preeclampsia and that newborns will need respiratory support after delivery, according to a study publishing online in the New England Journal of Medicine on August 8.

"This doesn't mean that everyone should be induced at 39 weeks," says the study's co-author Robert Silver, M.D., chair of Obstetrics & Gynecology at University of Utah Health and a Maternal-Fetal Medicine physician at Intermountain Healthcare in Salt Lake City. Kim Hall, R.N., B.S.N., a research nurse coordinator at U of U Health and Intermountain Healthcare, is also a co-author on the study.

"Electing to induce labor is a reasonable option that may give the best chance for vaginal delivery and improve outcomes," says Silver.

The results come from 6,106 first-time mothers enrolled in the randomized ARRIVE clinical trial, carried out at 41 hospitals participating in the National Institutes of Health-supported Maternal-Fetal Medicine Units Network. More than 1,200 of the women were enrolled at the Utah MFMUN site, consisting of University Hospital and Intermountain Medical Center, the largest enrolling site in the trial.

A Rising C-Section Rate

Driving the study is the high rate of babies delivered by C-section in the U.S., which rose steadily for years and has held at 32 percent since 2016. Medically unnecessary cesarean deliveries in healthy first-time mothers account for 80 percent of those deliveries, a point of concern.

Although the procedure is generally safe, the major surgery increases risk for complications to both mother and baby, and to future pregnancies. Women who deliver by C-section once are more likely to continue delivering that way, increasing the likelihood of high-risk complications such as placenta accreta.

For years, health care providers had been taught to avoid inducing labor in healthy, first-time mothers based on the belief that inducing increases the chance for C-section births. However, recent results from small, observational studies indicated that this may not necessarily be the case.

ARRIVE was a prospective trial designed to test this premise by examining outcomes from two groups of healthy, first-time mothers. One group elected to induce labor at 39 weeks, when the baby is full term and it is considered safe for mothers to give birth. The other group took part in expectant management or "watchful waiting," the routine practice of waiting for spontaneous labor but undergoing active intervention should a medical need arise.

Inducing Labor vs. Waiting

On average, women who chose to induce at 39 weeks delivered nearly one week earlier than women who waited for spontaneous labor. C-section delivery was significantly less likely after elective induction than after expectant management (18.6 vs. 22.2 percent).

Based on these data, the researchers estimate that inducing labor at 39 weeks could eliminate the need for one C-section for every 28 deliveries.
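The one-in-28 figure follows directly from the two C-section rates reported above; it is the standard "number needed to treat" calculation, reproduced here as a quick check:

```python
# Number needed to treat (NNT): how many elective inductions would be
# needed to avoid one C-section, given the rates reported in the trial.
induction_rate = 0.186   # C-section rate with elective induction at 39 weeks
expectant_rate = 0.222   # C-section rate with expectant management

risk_difference = expectant_rate - induction_rate  # absolute risk reduction
nnt = 1 / risk_difference

print(f"Absolute risk reduction: {risk_difference:.3f}")
print(f"Number needed to treat: {nnt:.1f}")  # about 27.8, i.e. roughly 1 in 28
```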

"We're always trying to find the safest way to deliver babies and take care of our patients," says M. Sean Esplin, M.D., an associate professor of Obstetrics and Gynecology at U of U Health and chief of Maternal-Fetal Medicine at Intermountain Healthcare. "If the primary goal is to keep rates of C-sections down, then elective induction is an option."

Choosing to induce labor at 39 weeks is at least as safe as spontaneous labor, according to results from the study. A composite score measuring several health indicators in newborns -- including death, seizures, hemorrhage and trauma -- was not significantly different between the two groups.

Inducing labor was linked to significant improvement in two specific outcomes: women were less likely to develop preeclampsia (9 vs. 14 percent), and rates of respiratory distress decreased in newborns. Silver says that the placenta tends not to function as well later in pregnancy, possibly explaining why mothers and babies who deliver earlier may fare better.
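Using the rounded preeclampsia percentages above, the relative size of the benefit can be worked out as follows; this is an illustrative back-of-envelope calculation from the rounded figures, not a number taken from the paper:

```python
# Preeclampsia rates reported above (rounded percentages).
induced_rate = 0.09    # rate with elective induction at 39 weeks
expectant_rate = 0.14  # rate with expectant management

relative_risk = induced_rate / expectant_rate
relative_reduction = 1 - relative_risk

print(f"Relative risk: {relative_risk:.2f}")
print(f"Relative risk reduction: {relative_reduction:.0%}")  # about 36%
```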

The study's findings held true regardless of the woman's age, ethnicity and BMI. Currently, researchers are evaluating whether inducing delivery at 39 weeks is cost-effective.

"These results open the door for pregnant women and their health care providers to talk about what the woman wants to do," says Michael Varner, M.D., vice chair for research in Obstetrics and Gynecology at U of U Health and primary investigator of the Utah MFMUN.

"The opinions that matter most come from the women we serve," says Varner.

Credit: 
University of Utah Health