Body

Infertility treatment linked with slightly higher risk of pregnancy complications

Women who have undergone infertility treatment, such as in vitro fertilization, are more likely to experience severe pregnancy complications, according to new research published in CMAJ (Canadian Medical Association Journal) at http://www.cmaj.ca/lookup/doi/10.1503/cmaj.181124.

These include severe postpartum hemorrhage, admission to the intensive care unit and sepsis.

The background rate in Canada of any severe complication is approximately 10 to 15 for every 1000 births. Maternal deaths are even rarer, occurring in 10 or fewer per 100,000 births in Canada. During pregnancy, such complications are often sudden and difficult to predict. It is important to identify women who may be at risk for these "near miss" events so that worse outcomes, including death, may be averted.

"We found that the women who received infertility treatment, especially in vitro fertilization, were about 40% more likely to experience a severe pregnancy complication compared with women who gave birth without any treatment," says lead author Dr. Natalie Dayan, Research Institute of the McGill University Health Centre, Montréal, Quebec. "However, it is important to remember that the absolute number of women who develop these complications remains quite small, meaning that for most women who cannot conceive naturally, this treatment is a very safe and effective method of becoming pregnant and having a child."

In Canada, 1 in 6 couples is affected by infertility and many turn to infertility treatment, with about 18,000 pregnancies occurring after treatments with assisted reproductive technology each year. Fertility experts in Ontario have generated data not only to evaluate the rate of success of these treatments, but also to conduct appropriate surveillance of the mother's health after treatment.

Canadian researchers looked at data on 813,719 live births and stillbirths in Ontario hospitals between 2006 and 2012. They identified 11,546 women who conceived through infertility treatment and matched them with 47,553 women with similar characteristics who conceived without assistance. The women who conceived with infertility treatment were typically older, reported higher incomes, were more often first-time mothers and carried multiple fetuses.

A severe maternal morbidity event occurred in 30.8 per 1000 infertility-treated pregnancies and in 22.2 per 1000 untreated pregnancies. This higher risk was seen among recipients of in vitro fertilization, but not among recipients of other forms of infertility treatment, such as intrauterine insemination or ovulation induction with medication.
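The roughly 40% higher risk quoted earlier follows directly from these two rates; a quick, purely illustrative check (numbers taken from the text above):

```python
# Severe maternal morbidity rates reported in the study (per 1000 births)
treated = 30.8    # infertility-treated pregnancies
untreated = 22.2  # untreated pregnancies

# Relative risk: how much more likely a severe event is after treatment
relative_risk = treated / untreated
print(f"Relative risk: {relative_risk:.2f}")  # ~1.39, i.e. about 40% higher
```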

The current study, like others before it, shows that maternal age greater than 40 years and being pregnant with twins or triplets are each linked with a higher rate of these complications.

Infertility treatment is often given to older women, and multiple pregnancy is also more likely after infertility treatment. The authors note that "[w]hether specific components of treatment using in vitro fertilization, such as the dose of ovarian hyperstimulation or fresh versus frozen embryo transfer, worsen maternal health, or whether the increased risk is a reflection of those who require or choose in vitro fertilization, remains to be determined."

However, the present study does suggest a small added risk from the treatment itself.

In recent years, the medical and scientific community has made worthy efforts to promote optimal maternal health before infertility treatment. In addition, fertility specialists now often choose to transfer only one embryo at a time to avoid risks associated with multiple pregnancies.

This study will promote further in-depth research to understand how such infertility treatment protocols may be further modified to minimize these rare but important health risks and increase the chances of a successful and safe pregnancy for the mother and her child.

Credit: 
Canadian Medical Association Journal

Shared genetic marker offers new promise in targeting specific ovarian and lung cancers

Two new papers, published simultaneously in Nature Communications and led by researchers at McGill University, offer promise that a drug currently used to treat estrogen-positive breast cancer may be effective against two other cancers, one rare and one common.

The breakthrough discovery launching this research came in 2014 when Dr. William Foulkes, James McGill Professor in the Departments of Medicine, Oncology and Human Genetics at McGill's Faculty of Medicine, showed that small cell carcinoma of the ovary, hypercalcemic type (SCCOHT), a rare but highly fatal cancer which primarily strikes younger women, is caused by mutations in the gene SMARCA4.

The challenge became how to find a way to exploit this genetic deficiency to better treat these patients. Stepping up to the challenge was Dr. Sidong Huang, Assistant Professor in the Department of Biochemistry at McGill's Faculty of Medicine, and senior author on both papers. Having trained for many years in Functional Genomics, Dr. Huang joined McGill with the goal of working on important problems in oncology. "Working on something like SCCOHT seemed an obvious choice as it is a unique genetic disease driven by loss of a single gene, SMARCA4," explains Dr. Huang, who is also a member of McGill's Goodman Cancer Research Centre.

A collaboration between Dr. Huang and Dr. Foulkes ensued, along with fellow McGill Professor Dr. Janusz Rak at the Research Institute of the McGill University Health Centre (RI-MUHC), Dr. Barbara Vanderhyden at the Ottawa Hospital Research Institute and Dr. Sriram Venneti at the University of Michigan. Through their work, Dr. Huang and his PhD student Yibo Xue, the papers' first author, were able to identify that targeting the cyclin-dependent kinases 4/6 (CDK4/6) exposed a vulnerability in SMARCA4-deficient cancers.

"What's clinically exciting about this work is that CDK4/6 inhibitors have been used for years, so they are very well known and their safety profile is established," notes Dr. Foulkes who is also Head of the Cancer Genetics Laboratory at the Lady Davis Institute of the Jewish General Hospital and a researcher at the RI-MUHC.

"In the case of SCCOHT, in particular, it is encouraging to find existing drugs that may prove effective because this is such a rare cancer that it is unlikely to be the subject of dedicated drug development," adds Dr. Huang. "Furthermore, patients may also benefit from the anti-tumour immunity triggered by CDK4/6 inhibitors as recently shown in other cancers, in addition to the direct tumour inhibition by these drugs."

Although much more common than SCCOHT, non-small cell lung cancer (NSCLC) can be very difficult to cure. "We extended the SCCOHT work to NSCLC as we realised about 10% of these common tumours also lack SMARCA4," explains Xue.

Evidence obtained both in vitro, in human cancer cells, and in vivo, in animal models, shows that CDK4/6 inhibitors are effective at quelling SMARCA4-deficient tumours.

"The fact these drugs worked so well was a bit surprising," says Dr. Foulkes. "Perhaps it works because the protein which is targeted is at critically low levels in the tumour - just enough to keep the tumour alive, but still susceptible to blocking."

"This is in contrast to the initial application of this class of drugs to treat breast cancers that often express elevated levels of the same protein," adds Dr. Huang. "Thus, our findings potentially broaden the applications of these drugs."

The precise mechanism by which these particular inhibitors work in the different cancers is yet to be definitively determined. But, says Dr. Foulkes, this is an academic question; so long as the drugs prove effective, the clinical impact is undeniable.

"The next step is seeing if these drugs work in patients with SCCOHT or NSCLC with SMARCA4 deficiency and identifying other additional drug targets in these cancers to be inhibited in combination with CDK4/6 inhibitors to overcome potential resistance," says Dr. Huang.

"The dream would be to cure these cancers, but any hint of a response would be a positive step forward, in particular because current treatments for women stricken with SCCOHT have limited effectiveness," concludes Dr. Foulkes.

Credit: 
McGill University

Let's talk about sex ... after childbirth

The feedback from doctors seemed almost appalling.

One woman remembers her gynecologist stating, "Well, girl, you better, because if you don't, somebody else will."

Another said, "My doctor was really excited to tell [my partner] at six weeks that I was ready to go."

Yes, we're talking sex. After childbirth.

Resuming sexual activity after pregnancy isn't always like riding a bike, especially for mothers experiencing postpartum pain, fatigue and stress. Yet, many couples are led to believe there is a hard-and-fast point at which they can restart sexual intercourse, according to 70 in-depth interviews with women in South Carolina.

The findings were recently published online in the journal Culture, Health & Sexuality.

"Among participants, the most frequent recommendation from health providers was to resume sex after the six-week postpartum visit," said Andrea DeMaria, an assistant professor in Purdue University's College of Health and Human Sciences who led the study. "But we found some women were ready before six weeks due to personal and partner desire, while other women expressed difficulties resuming sex, including pain and exhaustion from caring for a new baby."

As part of the study, research subjects retold their conversations with their doctors about postpartum sex, including the two examples quoted above.

The American College of Obstetricians and Gynecologists recently revised its recommendations on postpartum care, stating it "should be an ongoing process, rather than a single encounter, and that all women have contact with their ob-gyns or other obstetric care providers within the first three weeks postpartum," according to a news release issued by the professional organization.

Although the recommendation was designed to reduce maternal morbidity and mortality, the move represents a departure from the current "one-size-fits-all" approach to postpartum care, DeMaria said.

"Providers should communicate to their patients pre- and postpartum that women have varied experiences with resuming sexual activity after birth, and there is not one strict recommendation or guideline that applies to everyone," she said.

The in-depth interviews reinforced previous findings that individual women significantly differ in how they experience postpartum sexual desires and pleasures, which are often influenced physically by delivery mode and psychologically by self-confidence and body image. The study also highlighted the need for candid conversations about the subject among mothers, partners and doctors, even at the prenatal stage.

"If health care providers can bring this up and normalize these different experiences, then women and partners will be more aware of what they should be on the lookout for, that these feelings they're experiencing are normal," said Stephanie Meier, a doctoral student at Purdue and co-author of the paper. "Those conversations should continue throughout prenatal and postpartum."

The study is part of a larger oral histories project recording women's reproductive health experiences across generations, including menstruation, contraception, childbirth and sexual violence. The recordings will be archived for future reference, Meier said.

"Researchers and people who are interested in the history of South Carolina and the community can go back and listen to these recordings so those stories aren't lost," she said.

New mothers seeking advice on postpartum health should contact their personal health care provider, DeMaria said. Women also can reference postpartum toolkits provided by the American College of Obstetricians and Gynecologists or contact Indiana's Office of Women's Health.

The work aligns with Purdue's Giant Leaps celebration, acknowledging the university's global advancements in health, longevity and quality of life as part of Purdue's 150th anniversary. This is one of the four themes of the yearlong celebration's Ideas Festival, designed to showcase Purdue as an intellectual center solving real-world issues.

Credit: 
Purdue University

SFU researchers find new clues to controlling HIV

image: Simon Fraser University professor Mark Brockman (l) is part of an international research team that is investigating a connection between infection control and how well antiviral T cells respond to diverse HIV sequences.

Image: 
SFU

The immune system is the body's best defense in fighting diseases like HIV and cancer. Now, an international team of researchers is harnessing the immune system to reveal new clues that may help in efforts to produce an HIV vaccine.

SFU professor Mark Brockman and co-authors from the University of KwaZulu-Natal in South Africa have identified a connection between infection control and how well antiviral T cells respond to diverse HIV sequences.

Brockman explains that HIV adapts to the human immune system by altering its sequence to evade helpful antiviral T cells.

"So to develop an effective HIV vaccine, we need to generate host immune responses that the virus cannot easily evade," he says.

Brockman's team has developed new laboratory-based methods for identifying antiviral T cells and assessing their ability to recognize diverse HIV sequences.

"T cells are white blood cells that can recognize foreign particles called peptide antigens," says Brockman. "There are two major types of T cells--those that 'help' other cells of the immune system, and those that kill infected cells and tumours."

Identifying the T cells that attack HIV antigens sounds simple, but Brockman says three biological factors are critical to a T cell-mediated immune response. And in HIV infection, all three are highly genetically diverse.

He explains that for a T cell to recognize a peptide antigen, the antigen must first be presented on the cell surface by human leukocyte antigen (HLA) proteins, which are inherited.

And since many thousands of possible HLA variants exist in the human population, every person responds differently to infection. In addition, since HIV is highly diverse and evolves constantly during untreated infection, the peptide antigen sequence also changes.

Matching T cells against the HLA variants and HIV peptide antigens expressed in an individual is a critical step in the routine research process. But, says Brockman, "our understanding of T cell responses will be incomplete until we know more about the antiviral activity of individual T cells that contribute to this response."

It is estimated that a person's T cell "repertoire" is made up of a possible 20-100 million unique lineages of cells that can be distinguished by their T cell receptors (TCR), of which only a few will be important in responding to a specific antigen.

So to reduce the study's complexity, the team examined two highly related HLA variants (B81 and B42) that recognize the same HIV peptide antigen (TL9) but are associated with different clinical outcomes following infection.

By looking at how well individual T cells recognized TL9 and diverse TL9 sequence variants that occur in circulating HIV strains, the researchers found that T cells from people who expressed HLA B81 recognized more TL9 variants compared to T cells from people who expressed HLA B42.

Notably, a group of T cells in some B42-expressing individuals displayed a greater ability to recognize TL9 sequence variants. The presence of these T cells was associated with better control of HIV infection.

This study demonstrates that individual T cells differ widely in their ability to recognize peptide variants and suggests that these differences may be clinically significant in the context of a diverse or rapidly evolving pathogen such as HIV.

Much work needs to be done to create an effective vaccine. However, says Brockman, "Comprehensive methods to assess the ability of T cells to recognize diverse HIV sequences, such as those reported in this study, provide critical information to help design and test new vaccine strategies."

Credit: 
Simon Fraser University

New UC study may help guide treatment of pediatric anxiety

Researchers from the University of Cincinnati (UC) examined common medications prescribed for children and adolescents with anxiety disorders, to determine which are the most effective and best-tolerated. This study revealed that the selective serotonin reuptake inhibitors (SSRIs) performed best overall compared to other types of medications.

The results, available online in the Journal of Clinical Psychiatry, include the largest amount of data to date for analyses of pediatric anxiety disorder treatments. The study examined more than a dozen medications from 22 randomized controlled trials.

"Clinicians have limited data to help them select among evidence-based medication treatments for their patients with anxiety. This meta-analysis provides guidance in terms of medication-specific differences in efficacy and tolerability among medications that are commonly used to treat pediatric patients with anxiety disorders," says Jeffrey Strawn, MD, associate professor in the Department of Psychiatry and Behavioral Neuroscience at the UC College of Medicine and lead author on the study.

According to the American Academy of Pediatrics (AAP), anxiety disorders are the most common type of mental health disorder in children. Anxiety affects approximately 8 percent of all children and adolescents. Symptoms of anxiety can include having recurring fears, aversions to social situations or being unable to control worries and can manifest as serious medical conditions: trouble sleeping, difficulty concentrating, even heart and digestive problems.

"Our study synthesizes evidence from multiple individual trials to guide clinicians and patients in deciding which medication to use when treating children and adolescents with anxiety disorders," said Eric Dobson, MD, a psychiatry resident at the Medical University of South Carolina in Charleston, who conducted the study while a medical student at UC.

The authors identified trials published between 1971 and 2018, comparing 13 commonly used medications with placebo or with other medications--including antidepressants--for the acute treatment of anxiety disorders in children and adolescents. A total of 2,623 patients (average age: 11½ years) had been randomly assigned to receive a medication or placebo, and the patients had generalized, separation or social anxiety disorders of at least moderate severity.

The researchers looked at the number of patients who responded to treatment as well as the proportion who discontinued the study because of adverse events, i.e., side effects. In anxious youth, treatment response was greater with SSRIs than with serotonin-norepinephrine reuptake inhibitors (SNRIs). SNRIs prolong the activity of the neurotransmitters serotonin and norepinephrine, while SSRIs act predominantly to prolong the effects of serotonin. In terms of discontinuation and tolerability, SSRIs were the best-tolerated class of medication, while tricyclic antidepressants were the least tolerated. Tricyclic antidepressants increase levels of norepinephrine and serotonin and block the action of the neurotransmitter acetylcholine, which may give rise to some of their side effects.

"This comprehensive evaluation comparing efficacy and tolerability of treatments in pediatric anxiety disorders suggests that SSRIs are superior to SNRIs and all other classes of medications," says Dobson.

"These findings confirm the recommendations from the American Academy of Child and Adolescent Psychiatry that SSRIs be considered as the first-line medication treatment for anxiety in youth," adds Strawn.

Credit: 
University of Cincinnati

In prenatal testing, 'genomics' sometimes sees what genetic tests can't

NEW YORK, NY (Jan. 31, 2019) -- A new kind of prenatal genetic testing can improve obstetricians' ability to diagnose the underlying causes of fetal anomalies found during prenatal ultrasounds. But the results require expert interpretation, according to a study by researchers at Columbia University Vagelos College of Physicians and Surgeons. The study was published in The Lancet.

The work, led by Ronald Wapner, MD, director of reproductive genetics at Columbia University's Institute for Genomic Medicine (IGM) and vice chair of research in obstetrics and gynecology, and David Goldstein, PhD, director of the IGM, clarifies the utility and limitations of such tests.

Why it Matters

In about 3 percent of pregnancies, ultrasound imaging will reveal a significant fetal physical anomaly. Knowing the cause of the anomaly can help doctors and parents be better prepared, both during the pregnancy and after delivery.

However, doctors are often hard-pressed to identify the cause. Standard genetic tests are able to identify the cause of fewer than half of such anomalies.

When a cause cannot be identified, families often embark on a diagnostic odyssey that can last for years until the exact cause can be determined. Parents are also left not knowing whether future pregnancies could be similarly affected.

Background

"If there is an anomaly detected at ultrasound, the current standard of care...is to obtain a sample of amniotic fluid and perform karyotyping to determine if the fetus has the right number of chromosomes and if small regions are missing," says Vimla Aggarwal, MBBS, director of Columbia's precision genomics laboratory and co-lead author of the study. But this test can only pinpoint the underlying cause for about 40 percent of anomalies found on ultrasound, leaving the majority of families in the dark.

To address this, some clinicians have begun offering whole exome sequencing--a technique that reads all protein-coding genes in the genome in fine detail--to obtain a genetic diagnosis in pregnancies with undiagnosed abnormalities. However, only a handful of small, selective studies have looked at the utility of the technique as a prenatal diagnostic tool, and much of the science connecting gene variants to fetal anomalies remains unsettled.

What the researchers did

In this study, the Columbia team enrolled 234 pregnant women at NewYork-Presbyterian/Columbia University Irving Medical Center with abnormal ultrasound findings but whose standard genetic tests were negative. By sequencing the genomes of the parents and fetuses, the researchers were able to diagnose an additional 10 percent (24) of the fetuses with a known genetic disorder; combined with standard testing, this provides a diagnosis for almost half of affected pregnancies. The diagnostic rate increased with the number of ultrasound anomalies present: 6 percent with single anomalies and 19 percent with multiple anomalies.

Another 20 percent (46) of fetuses had gene sequence signatures that were suggestive, though not definitive, of a genetic disorder.

What it Means

"Based on our findings, whole exome sequencing could serve as a valuable addition to standard prenatal genetic tests, with the potential to improve perinatal care for infants with genetic conditions and ease parents' fears by offering a clear diagnosis," says Wapner, who is also a maternal-fetal medicine expert at NewYork-Presbyterian/Columbia University Irving Medical Center.

Since the science surrounding genomic analysis is still developing, some of the gene sequence patterns had been associated, but not definitively linked, to the specific developmental abnormality. Clinicians need to balance their desire to give patients definitive answers against the sometimes murky state of genomic science. A multidisciplinary team of experts, including clinical and molecular geneticists, genetic counselors, developmental biologists, and maternal-fetal medicine specialists, is needed to ensure accurate interpretation of the new test results.

What's Next

"Future studies are needed to determine whether performing whole exome sequencing on fetuses during pregnancy will lead to improved care and reproductive counseling," Wapner adds. Such studies are ongoing at Columbia.

As more information about the genetics of fetal anomalies comes to light, many of the suggestive gene signatures discovered in this study may ultimately be determined to be the cause of an anomaly. This could increase the diagnostic yield of whole exome sequencing to 7 in 10 cases, say the authors. Sequencing data may also be used to develop better tools to treat the fetus before and after delivery.
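The yields quoted in this article are consistent with one another; a small, illustrative sketch of the arithmetic, using only the numbers given in the text above:

```python
total = 234       # test-negative pregnancies enrolled in the study
definitive = 24   # diagnosed with a known disorder via whole exome sequencing
suggestive = 46   # suggestive, though not definitive, gene signatures

print(f"Definitive yield: {definitive / total:.0%}")  # ~10%
print(f"Suggestive yield: {suggestive / total:.0%}")  # ~20%

# Standard karyotyping already resolves about 40% of anomalies (per the
# article); if the suggestive signatures were all confirmed, the combined
# yield would approach 40% + 10% + 20% = 70%, i.e. about 7 in 10 cases.
combined = 0.40 + (definitive + suggestive) / total
print(f"Potential combined yield: {combined:.0%}")    # ~70%
```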

Credit: 
Columbia University Irving Medical Center

Study: Collaborative video games could increase office productivity

image: Team video gaming photo illustration

Image: 
BYU Photo

Move over, trust falls and ropes courses: it turns out playing video games with coworkers is the real path to better performance at the office.

A new study by four BYU information systems professors found newly formed work teams experienced a 20 percent increase in productivity on subsequent tasks after playing video games together for just 45 minutes. The study, published in AIS Transactions on Human-Computer Interaction, adds to a growing body of literature finding positive outcomes of team video gaming.

"To see that big of a jump -- especially for the amount of time they played -- was a little shocking," said co-author and BYU associate professor Greg Anderson. "Companies are spending thousands and thousands of dollars on team-building activities, and I'm thinking, go buy an Xbox."

For the study, researchers recruited 352 individuals and randomly organized them into 80 teams, making sure no participants with pre-existing relationships were on the same team. For their initial experimental task, each team played in a geocaching competition called Findamine, an exercise created by previous IS researchers which gives players short, text-based clues to find landmarks. Participants were incentivized with cash rewards for winning the competition.

Following their first round of Findamine, teams were randomly assigned to one of three conditions before being sent out to geocache again: 1) team video gaming, 2) quiet homework time or 3) a "goal training" discussion on improving their geocaching results. Each of these conditions lasted 45 minutes and those in the video gaming treatment chose to play either Rock Band or Halo 4 -- games selected because they are both familiar and require coordinated efforts among players.

The researchers found that while the goal-training teams reported a higher increase in team cohesion than the video-gaming teams, the video gamers increased actual performance on their second round of Findamine significantly, raising average scores from 435 to 520.
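The 20 percent productivity gain reported above follows directly from those two average scores; a quick, purely illustrative check:

```python
# Average Findamine scores for video-gaming teams, rounds 1 and 2
before, after = 435, 520

gain = (after - before) / before  # fractional improvement between rounds
print(f"Improvement: {gain:.1%}")  # ~19.5%, i.e. the ~20% reported
```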

"Team video gaming may truly be a viable -- and perhaps even optimal -- alternative for team building," said lead researcher Mark Keith, associate professor of information systems at BYU.

Researchers also said people don't need to be avid video gamers to see the positive effects of gaming together; they observed that video game novices established communication norms and built working relationships with new teammates even more quickly as they learned the nuances of the game.

There is one caveat to the finding, however: the study was done with teams of individuals who didn't know each other. Researchers admit that if team members are already familiar with each other, competitive video gaming may reinforce biases and negative relationships developed from previous experiences.

Credit: 
Brigham Young University

Study examines barriers to exercise experienced by dialysis patients

Washington, DC (January 29, 2019) -- A new study has identified several barriers that make it difficult for dialysis patients to exercise. The study, which appears in an upcoming issue of the Clinical Journal of the American Society of Nephrology (CJASN), also explored the benefits that these patients would like to gain from exercising, if they were able to do so.

People with end stage kidney disease tend to be sedentary and are at high risk of worsening functional impairment after starting dialysis. Exercise may mitigate this risk and help to improve patients' quality of life, but there is often a disconnect between what researchers and patients think is important. To address this, a team led by Deborah Zimmerman, MD, MSc and Danielle Moorman, MD, MSc (The Ottawa Hospital and the University of Ottawa) designed a study to better understand patient perspectives about the benefits and barriers to exercise, the types of exercise that patients are interested in, and the types of outcomes that are most important to them if they were to exercise. The researchers also examined whether these differ depending on patients' age and the type of dialysis that they use.

For the study, the investigators surveyed 423 patients with end stage kidney disease who were undergoing dialysis. Among the major findings:

The most desired benefits of exercise were improved energy (18%) and strength (14%).

Priorities beyond energy and strength differed by modality: improved sleep was selected by peritoneal dialysis patients, maintenance of independence by in-center hemodialysis patients, and longevity by home dialysis patients.

Older patients were most interested in improving energy and strength, as well as maintaining independence, while young patients were interested in improving energy, longevity, and transplant candidacy.

Only 25% of patients could exercise without difficulty; the major barriers for the remaining patients were feeling too tired (55%), too weak (49%), or short of breath (50%).

If patients were to exercise, they wanted to exercise at home (73%) using a combination of aerobic and resistance training (41%), regardless of dialysis type or age.

"The majority of dialysis patients included in this study, regardless of modality, believe exercise would be beneficial, but report several barriers to participating in an exercise program that will need to be addressed in any proposed exercise program and/or clinical study," said Dr. Zimmerman. "The fatigue and weakness experienced by patients may mandate an exercise program that can be incorporated into their activities of daily living at home or in their neighborhood."

In an accompanying Patient Voice editorial, Nichole Jefferson noted that, as a chronic kidney disease patient of over 15 years, the article resonated with her on different levels. She also explained that at various times over the course of her disease, she has viewed exercise in different terms. "Reading this article and the surveys utilized made me realize that maybe we need to re-define exercise," she wrote.

Credit: 
American Society of Nephrology

Do women with breast cancer have a higher risk of atrial fibrillation?

Philadelphia, January 29, 2019 - Patients with breast cancer may have an increased incidence of atrial fibrillation (AF), say researchers. A retrospective study in Denmark has found that women with breast cancer have an increased risk of developing AF within three years following their cancer diagnosis compared with other women of the same age. The results are published in HeartRhythm, the official journal of the Heart Rhythm Society and the Cardiac Electrophysiology Society.

According to the World Cancer Research Fund, breast cancer is the most common cancer in women worldwide and the second most common cancer overall. There were over two million new cases in 2018. Investigators hypothesized that patients with breast cancer may have a lower threshold for developing AF as breast cancer induces inflammation, a known risk factor for AF.

"Modern treatment regimens ensure that approximately 80 percent of breast cancer patients become long-term survivors," explained lead investigator Maria D'Souza, MD, of the Herlev and Gentofte Hospital, Cardiology Department, Hellerup, Denmark. "Healthy survivorship can be threatened, however, by long-term complications resulting from both the cancer and related treatments. Notably, increased frequencies of cardiovascular disease, especially heart failure and ischemic heart disease, have been observed in survivors. We hypothesized that women with breast cancer could also be more prone to developing AF because breast cancer induces inflammation."

Using nationwide registries in Denmark, investigators analyzed the long-term incidence of AF in patients with breast cancer compared with the general population. They identified patients diagnosed with breast cancer between 1998 and 2015 and then matched 74,155 female breast cancer patients according to age and sex with 222,465 individuals from the general population (ratio 1:3). Long-term incidence of AF was estimated by cumulative incidence curves and multivariable Cox regression models.
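The 1:3 age- and sex-matched design described above can be sketched as a toy exact-matching routine. This is illustrative only: the function name and data layout are ours, and the registry study's actual matching procedure is more involved than sampling on two fields.

```python
import random

def match_controls(cases, population, ratio=3, seed=0):
    """Toy 1:ratio exact matching on (age, sex).

    cases, population: lists of dicts with 'id', 'age', 'sex' keys.
    Returns {case_id: [control_id, ...]} with controls sampled
    without replacement across cases.
    """
    rng = random.Random(seed)
    # Index potential controls by their matching key.
    pool = {}
    for person in population:
        pool.setdefault((person["age"], person["sex"]), []).append(person)
    matched = {}
    for case in cases:
        candidates = pool.get((case["age"], case["sex"]), [])
        k = min(ratio, len(candidates))
        picked = rng.sample(candidates, k)
        matched[case["id"]] = [p["id"] for p in picked]
        for p in picked:  # remove used controls from the pool
            candidates.remove(p)
    return matched
```

With 74,155 cases and a ratio of 3, a routine like this would yield the 222,465 matched controls reported in the study, assuming enough eligible candidates per age group.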

The investigators found that female patients with breast cancer had an increased risk of AF, and that the risk depended on age and time since diagnosis. Patients younger than 60 years had a more than doubled risk in the first six months after diagnosis and an 80 percent higher risk from six months to three years after diagnosis. Patients older than 60 years had a risk similar to that of the general population during the first six months but a 14 percent increased risk from six months to three years after diagnosis.

"This study was the first to show that women with recent breast cancer had an increased risk of developing AF. Our findings should encourage doctors to focus on the risk of AF in patients with recent breast cancer in order to diagnose and treat as early as possible, and researchers to search for increased risk of AF looking at the cancer itself, treatment, genetic predisposition, and shared life style risk factors," said Dr. D'Souza. "Ultimately, earlier treatment may result in better stroke prevention."

In an accompanying editorial, Ankur Karnik, MD, of the Evans Department of Medicine, Cardiovascular Medicine Section, Boston University School of Medicine, Boston, MA, USA, and colleagues commented that this study provides valuable insights from a large nationwide cohort with results generalizable to women of European ancestry. However, they caution that there are several factors to be considered when interpreting the results. Follow-up was only three years, which may be too short a time for the cardiotoxic effects of breast cancer treatment to fully manifest. The multivariable Cox regression model used in the study did not account for competing risk of death - the three-year mortality risk was larger than the risk of AF in both groups.

The study does however suggest several substantive research questions. Can the risk of AF in patients with breast cancer be explained by shared risk factors? Is it a multiple-hit phenomenon in which a pro-inflammatory state and breast cancer treatments add insult to injury? Are there certain chemotherapeutic regimens or cumulative radiation doses that raise AF risk? Are there subsets of women with breast cancer at sufficient risk for AF who may merit more intensive monitoring?

"While we do not consider broad-based monitoring for AF in women with breast cancer is warranted at this time, the work of D'Souza et al. is a contribution to the burgeoning field of cardio-oncology and provides support for further research into the potentially bidirectional relationship between cancer and AF," noted Dr. Karnik.

Credit: 
Elsevier

Exploring the connection between hearing loss and cognitive decline

Hearing loss affects tens of millions of Americans and its global prevalence is expected to grow as the world's population ages. A new study led by investigators at Brigham and Women's Hospital adds to a growing body of evidence that hearing loss is associated with higher risk of cognitive decline. These findings suggest that hearing loss may help identify individuals at greater risk of cognitive decline and could provide insights for earlier intervention and prevention.

"Dementia is a substantial public health challenge that continues to grow. There is no cure, and effective treatments to prevent progression or reverse the course of dementia are lacking," said lead author Sharon Curhan, MD, MSc, a physician and epidemiologist in the Channing Division for Network Medicine at the Brigham. "Our findings show that hearing loss is associated with new onset of subjective cognitive concerns which may be indicative of early stage changes in cognition. These findings may help identify individuals at greater risk of cognitive decline."

Curhan and colleagues conducted an eight-year longitudinal study among 10,107 men aged ≥62 years in the Health Professionals Follow-up Study (HPFS). They assessed subjective cognitive function (SCF) scores based on responses to a six-item questionnaire administered in 2008, 2012 and 2016. SCF decline was defined as a new report of at least one SCF concern during follow-up.

The team found that hearing loss was associated with higher risk of subjective cognitive decline. Compared with men with no hearing loss, the relative risk of cognitive decline was 30 percent higher among men with mild hearing loss, 42 percent higher among men with moderate hearing loss, and 54 percent higher among men with severe hearing loss who did not use hearing aids.

Researchers were also interested in whether hearing aids might modify this risk. Among men with severe hearing loss who used hearing aids, the risk of cognitive decline was somewhat lower (37 percent higher than in men with no hearing loss), but it was not statistically significantly different from the risk among those who did not use hearing aids. The authors note that this may reflect limited statistical power, or that if a difference truly exists, its magnitude may be modest.

The authors also note that the study was limited to predominantly older white male health professionals. This allowed for greater control of variability but further studies in additional populations would be helpful. In addition, the study relies on self-reported hearing loss and subjective measures of cognitive function. In the future, the team plans to investigate the relationships between self-reported hearing loss, change in audiometric hearing thresholds, and changes in cognition in women using several different assessment measures.

"Whether there is a temporal association between hearing loss and cognitive decline and whether this relation is causal remains unclear," said Curhan. "We plan to conduct further longitudinal studies of the relation of hearing loss and cognition in women and in younger populations, which will be informative."

Credit: 
Brigham and Women's Hospital

Waist-stature ratio can indicate the risk of cardiovascular disease even in healthy men

image: Physically active men who are not overweight but who have a relatively high waist-stature ratio are more likely to develop heart disorders, according to a study by Brazilian researchers.

Image: 
Vitor Engrácia Valenti

Health experts have warned for years that men and women with excess abdominal fat run a greater risk of developing cardiovascular problems. However, individuals with abdominal or central obesity are not the only ones in danger, according to a new study.

The study found that physically active men who were not overweight but whose waist-stature ratio (WSR) was close to the risk threshold were also more likely to develop heart disorders than individuals with lower WSRs.

The study was conducted by Brazilian researchers affiliated with São Paulo State University (UNESP) in Presidente Prudente and Marília in collaboration with colleagues at Oxford Brookes University in the UK. The study resulted from a research project supported by São Paulo Research Foundation - FAPESP and is published in the journal Scientific Reports.

"We found that non-overweight, physically active, healthy individuals without a history of metabolic or cardiovascular disease but with WSRs close to the risk factor limit were more likely to develop heart disorders than individuals with less accumulated fat in the waist area," Vitor Engrácia Valenti, a professor at UNESP Marília and principal investigator for the study, told.

According to Valenti, recent research suggests that the WSR (waist circumference divided by height) is a more accurate predictor of cardiovascular risk than the body mass index (BMI), a widely used measure of body fat.

The researchers further investigated this hypothesis by analyzing the autonomic recovery of heart rate after aerobic exercise in healthy men with different WSRs. To this end, 52 physically active healthy men aged 18-30 were divided into the following three groups according to WSR: between 0.40 and 0.449, which is below the risk threshold for cardiovascular disease; between 0.45 and 0.50, which is close to the threshold; and between 0.50 and 0.56, which is above the threshold.
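The WSR calculation and the grouping just described can be sketched in Python. This is a hedged illustration: the function names and the example measurements are ours, and the thresholds simply mirror the three groups reported above.

```python
def waist_stature_ratio(waist_cm: float, height_cm: float) -> float:
    """Waist circumference divided by height (same units for both)."""
    return waist_cm / height_cm

def wsr_group(wsr: float) -> str:
    """Assign a WSR to one of the study's three groups."""
    if 0.40 <= wsr < 0.45:
        return "below risk threshold"
    elif 0.45 <= wsr < 0.50:
        return "close to risk threshold"
    elif 0.50 <= wsr <= 0.56:
        return "above risk threshold"
    return "outside studied range"

# Example: an 80 cm waist and 175 cm height give a WSR of about 0.457,
# placing this hypothetical participant close to the risk threshold.
print(wsr_group(waist_stature_ratio(80, 175)))
```

Because both measurements share the same units, the ratio is dimensionless, which is part of why the WSR travels well across populations of different heights.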

The participants were tested on two separate days with a 48-hour interval between the two tests. On the first day, they remained seated and at rest for 15 minutes, and then performed a maximum effort test on a treadmill. After this bout of aerobic exercise, they remained standing and at rest for three minutes and then seated and at rest for the next 57 minutes, totalling one hour of recovery from the exertion.

"This test proved they were all physically active. They weren't athletes, but they were in the habit of playing soccer on weekends, for example," Valenti said.

On the second day, they warmed up for five minutes and then ran at 60% of their maximum effort for 25 minutes.

Their heart rate and heart rate variability were measured while at rest and six times during the recovery hour to assess their speed of autonomic recovery after physical activity.

"Autonomic heart rate recovery time is a good indicator of the risk of cardiovascular complications immediately after aerobic exercise and of developing heart disease," Valenti said. "If the heart rate takes a long time to return to normal, this indicates that the individual runs a significant risk of developing a heart disorder."

Interaction with nervous system

Analysis of the measurements showed that the autonomic recovery was slower in the groups with WSRs close to and above the risk threshold for heart disease after both the maximum effort test and moderate aerobic exercise.

"We found that volunteers in the group with WSRs close to the risk limit were also more likely to develop cardiovascular disorders," Valenti said.

The researchers at UNESP performed statistical analyses involving correlation coefficient tests and linear regression models to look for significant links between WSRs and heart rate variability after physical activity.

The results of the statistical analyses suggested that WSR and heart rate variability were most strongly correlated during the first ten minutes of the post-exercise recovery period, when the parasympathetic nervous system (PNS) was being reactivated.

Among other functions, the PNS, one of the three divisions of the autonomic nervous system, slows heart rate and reduces blood pressure via the release of hormones.

"We found that PNS activity diminished as WSR increased. This heightens the risk of cardiovascular disturbance," Valenti said.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Heroin injections linked to substantial rise in bacterial heart infections

A study of people who inject drugs found a significant increase in the risk of infective endocarditis, a serious infection of the lining of the heart, possibly linked to increasing use of the opioid hydromorphone. The study is published in CMAJ (Canadian Medical Association Journal).

Infective endocarditis can be life-threatening.

"We observed a substantial increase in the risk of infective endocarditis among people who inject drugs, which is associated with hydromorphone's increasing share of the prescription opioid market," write the authors, including first author Dr. Matthew Weir, associate scientist at Lawson Health Research Institute and assistant professor at Schulich School of Medicine & Dentistry, Western University, London, Ontario.

Researchers looked at Ontario data on drug users from linked health administrative databases at ICES between April 2006 and September 2015. There were 60 529 admissions to hospital of people who inject drugs and, of these, 733 had infective endocarditis linked to injecting drugs. Although overall admission rates for people who inject drugs were stable over the study period, admissions for infective endocarditis increased from 13.4 every three months (up to the fourth quarter of 2011) to 35.1 every three months in the period afterward.

Whereas the percentage of opioid prescriptions attributed to controlled-release oxycodone declined rapidly when it was removed from the market by its manufacturer in the fourth quarter of 2011, hydromorphone prescriptions increased from 16% at the start of the study to 53% by the end.

The researchers expected that an increase in risk of infective endocarditis would occur when controlled-release oxycodone was removed from the Canadian market; however, they found that the rise began before removal.

"Although our observations do not support our hypothesis that the loss of controlled-release oxycodone increased the use of hydromorphone, they do support our suspicion that hydromorphone may be playing a role in the increasing risk of infective endocarditis," says coauthor Dr. Michael Silverman, associate scientist at Lawson and associate professor at Schulich School of Medicine & Dentistry.

The increase in the risk of infective endocarditis is consistent with the findings of other studies, but the observed timing of the increase was novel.

"Both the rise in this severe complication of injection drug use and the possible association with hydromorphone require further study," suggest the authors.

"The risk of infective endocarditis among people who inject drugs: a retrospective, population-based time series analysis" is published January 28, 2019

Credit: 
Canadian Medical Association Journal

Study examines long-term opioid use in patients with severe osteoarthritis

New research published in Arthritis & Rheumatology, an official journal of the American College of Rheumatology, reveals that prescription opioids are commonly used long-term to treat pain in older patients with severe osteoarthritis. The study also found substantial statewide variation in rates of treatment with long-term opioid therapy for osteoarthritis, which was not fully explained by differences in patient characteristics or access to healthcare providers.

Long-term use of prescription opioids for the treatment of chronic pain carries the risk of dependence and other serious harms. Osteoarthritis in the hip or knee is a common source of chronic pain in the United States, as it affects nearly 30 million US adults and has a prevalence that is expected to rise with the aging of the population.

To evaluate long-term opioid use in patients with severe osteoarthritis and to examine differences based on geography and healthcare access, Rishi J. Desai, MS, PhD, of Brigham and Women's Hospital and Harvard Medical School, and his colleagues analyzed 2010-2014 Medicare data on osteoarthritis patients undergoing total joint replacement.

The analysis included 358,121 patients with an average age of 74 years. One in six patients used long-term prescription opioids (≥90 days) for pain management in the year leading up to total joint replacement, with an average duration of approximately seven months. More strikingly, nearly 20 percent of the long-term users consumed an average daily dose of ≥50 morphine milligram equivalents, an amount that was identified by recent guidelines as potentially imparting a high risk of opioid-related harms.
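The 50 morphine-milligram-equivalent (MME) daily-dose threshold can be illustrated with a small sketch. The conversion factors below follow commonly published CDC MME tables and are an assumption on our part; the study's exact dose-standardization method is not described here.

```python
# Assumed conversion factors (commonly published CDC MME table values).
MME_FACTORS = {
    "morphine": 1.0,
    "oxycodone": 1.5,
    "hydrocodone": 1.0,
    "hydromorphone": 4.0,
    "codeine": 0.15,
}

def daily_mme(drug: str, dose_mg: float, doses_per_day: int) -> float:
    """Average daily dose expressed in morphine milligram equivalents."""
    return MME_FACTORS[drug] * dose_mg * doses_per_day

# Hypothetical example: oxycodone 10 mg taken four times daily
# works out to 60 MME/day, above the 50-MME guideline threshold.
mme = daily_mme("oxycodone", 10, 4)
print(mme, mme >= 50)
```

The example shows why seemingly moderate per-dose amounts can cross the guideline threshold once frequency and potency are accounted for.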

The percentage of long-term opioid users among patients with advanced osteoarthritis varied widely across states, ranging from 8.9 percent in Minnesota to 26.4 percent in Alabama. Access to primary care providers was only modestly associated with rates of long-term opioid use (an average adjusted difference of 1.4 percent between areas with the highest and lowest concentrations of primary care providers), while access to rheumatologists was not associated with long-term opioid use.

"These findings suggest that regional prescribing practices are key determinants of prescription opioid use in chronic pain patients, and geographically targeted dissemination strategies for safe opioid prescribing guidelines may be required to address the high use observed in certain states," said Dr. Desai.

Credit: 
Wiley

Chickens genetically modified to lay human proteins in eggs offer future therapy hope

image: Scientists at the University of Edinburgh's Roslin Institute have produced GM chickens that make human proteins in their eggs, offering a more cost-effective method of producing certain types of drugs.

Image: 
Norrie Russell, The Roslin Institute

Chickens that are genetically modified to produce human proteins in their eggs can offer a cost-effective method of producing certain types of drugs, research suggests.

The study - which has initially focused on producing high quality proteins for use in scientific research - found the drugs work at least as well as the same proteins produced using existing methods.

High quantities of the proteins can be recovered from each egg using a simple purification system and there are no adverse effects on the chickens themselves, which lay eggs as normal.

Researchers say the findings provide sound evidence for using chickens as a cheap method of producing high quality drugs for use in research studies and, potentially one day, in patients.

Eggs are already used for growing viruses that are used as vaccines, such as the flu jab. This new approach is different because the therapeutic proteins are encoded in the chicken's DNA and produced as part of the egg white.

The team have initially focused on two proteins that are essential to the immune system and have therapeutic potential - a human protein called IFNalpha2a, which has powerful antiviral and anti-cancer effects, and the human and pig versions of a protein called macrophage-CSF, which is being developed as a therapy that stimulates damaged tissues to repair themselves.

Just three eggs were enough to produce a clinically relevant dose of the drug. As chickens can lay up to 300 eggs per year, researchers say their approach could be more cost-effective than other production methods for some important drugs.

Researchers say they haven't produced medicines for use in patients yet but the study offers proof-of-principle that the system is feasible and could easily be adapted to produce other therapeutic proteins.

Protein-based drugs, which include antibody therapies such as Avastin and Herceptin, are widely used for treating cancer and other diseases.

For some of these proteins, the only way to produce them with sufficient quality involves mammalian cell culture techniques, which are expensive and have low yields. Other methods require complex purification systems and additional processing techniques, which raise costs.

Scientists have previously shown that genetically modified goats, rabbits and chickens can be used to produce protein therapies in their milk or eggs. The researchers say their new approach is more efficient, produces better yields and is more cost-effective than these previous attempts.

The study was carried out at the University of Edinburgh's Roslin Institute and Roslin Technologies, a company set up to commercialise research at The Roslin Institute.

The research is published in BMC Biotechnology. The Roslin Institute receives strategic funding from the Biotechnology and Biological Sciences Research Council.

Professor Helen Sang, of the University of Edinburgh's Roslin Institute, said: "We are not yet producing medicines for people, but this study shows that chickens are commercially viable for producing proteins suitable for drug discovery studies and other applications in biotechnology."

Dr Lissa Herron, Head of the Avian Biopharming Business Unit at Roslin Technologies, said: "We are excited to develop this technology to its full potential, not just for human therapeutics in the future but also in the fields of research and animal health."

Dr Ceri Lyn-Adams, Head of Science Strategy, Bioscience for Health with BBSRC, said: "These recent findings provide a promising proof of concept for future drug discovery and potential for developing more economical protein-based drugs."

Credit: 
University of Edinburgh

Asthma Controller Step Down Yardstick -- treatment guidance for when asthma improves

ARLINGTON HEIGHTS, IL (January 25, 2019) - When asthma symptoms improve, there's reason for celebration by both allergist and patient. But once symptoms are better, how do health care practitioners go about stepping down asthma medication to make sure a patient's needs are still met? The Asthma Controller Step Down Yardstick, a new guideline from the American College of Allergy, Asthma and Immunology (ACAAI), offers an "operation manual". It helps health care professionals understand how to identify when a patient is ready to step down their treatment, and what the process might involve.

"There is a gap in information when it comes to guiding allergists and other health practitioners through the process of stepping down controller therapy," says allergist Bradley Chipps, MD, immediate ACAAI past president and lead author of the guideline. "We have yardsticks that address stepping up asthma controller medication, but this document addresses how to reverse the process for patients whose asthma has been well controlled for at least three months - or longer for the highest risk patients."

The guideline outlines both reasons to consider stepping down treatment and reasons not to.

Consider stepping down treatment to:

Re-assess a current diagnosis of asthma.

Decrease the potential adverse effects of asthma medications.

Address patient and family preferences about taking medications.

Reduce the burden of treatment (e.g., time to take medications, remembering to take medications, having to take medications at work or school).

Reduce the costs of treatment.

Simplify therapy and enhance adherence with treatment.

Consider not stepping down treatment when:

Reducing asthma medication may lead to an increased risk of having an asthma exacerbation or loss of control.

It is unclear whether the patient is using his/her asthma medications as indicated (e.g., whether the patient has already self-reduced treatment).

Seasonal maintenance therapy is needed (e.g., during the patient's allergy season or viral season).

"Stepping down controller therapy serves several purposes," says allergist Leonard Bacharier, MD, co-author of the guideline. "It identifies the minimum effective treatment that will maintain well-controlled asthma based on both impairment and risk domains. And it minimizes the risk of adverse effects from higher doses of medication(s) than may be needed to maintain control. It can also simplify the patient's treatment regimen and may enhance adherence, because reducing exposure to higher doses of medication(s) is generally consistent with patient values and preferences."

The guideline offers specific recommendations regarding appropriate medications and dosing for patients on step 2 through step 5 asthma therapy.

"Stepping down asthma therapy is an important component of managing patients with asthma, from the mildest end of the severity spectrum to the most difficult-to-treat patient," says Dr. Chipps. "Although current guidelines recommend stepping down therapy in a patient with stable asthma, the operational focus has been on how to step up treatment when asthma is inadequately controlled. This Yardstick provides recommendations for how to step-down therapy using guideline-based severity levels."

Credit: 
American College of Allergy, Asthma, and Immunology