Looking in the depths of the Great Red Spot to find water on Jupiter

image: The Great Red Spot is the dark patch in the middle of this infrared image. It is dark due to the thick clouds that block thermal radiation. The yellow strip denotes the portion of the Great Red Spot used in astrophysicist Gordon L. Bjoraker's analysis.

Image: 
NASA's Goddard Space Flight Center/Gordon Bjoraker

For centuries, scientists have worked to understand the makeup of Jupiter. It's no wonder: this mysterious planet is the biggest one in our solar system by far, and chemically, the closest relative to the Sun. Understanding Jupiter is a key to learning more about how our solar system formed, and even about how other solar systems develop.

But one critical question has bedeviled astronomers for generations: Is there water deep in Jupiter's atmosphere, and if so, how much?

Gordon L. Bjoraker, an astrophysicist at NASA's Goddard Space Flight Center in Greenbelt, Maryland, reported in a recent paper in the Astronomical Journal that he and his team have brought the Jovian research community closer to the answer.

By looking from ground-based telescopes at wavelengths sensitive to thermal radiation leaking from the depths of Jupiter's persistent storm, the Great Red Spot, they detected the chemical signatures of water above the planet's deepest clouds. The pressure of the water, the researchers concluded, combined with their measurements of another oxygen-bearing gas, carbon monoxide, implies that Jupiter has 2 to 9 times more oxygen than the Sun. This finding supports theoretical and computer-simulation models that have predicted abundant water (H2O) on Jupiter, made of oxygen (O) tied up with molecular hydrogen (H2).

The revelation was stirring given that the team's experiment could have easily failed. The Great Red Spot is full of dense clouds, which makes it hard for electromagnetic energy to escape and teach astronomers anything about the chemistry within.

"It turns out they're not so thick that they block our ability to see deeply," said Bjoraker. "That's been a pleasant surprise."

New spectroscopic technology and sheer curiosity gave the team a boost in peering deep inside Jupiter, which has an atmosphere thousands of miles deep, Bjoraker said: "We thought, well, let's just see what's out there."

The data Bjoraker and his team collected will supplement the information NASA's Juno spacecraft is gathering as it circles the planet from north to south once every 53 days.

Among other things, Juno is looking for water with its own infrared spectrometer and with a microwave radiometer that can probe deeper than anyone has seen -- to 100 bars, or 100 times the atmospheric pressure at Earth's surface. (Altitude on Jupiter is measured in bars, which represent atmospheric pressure, since the planet does not have a solid surface like Earth's from which to measure elevation.)
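As a back-of-the-envelope aside, the bar unit maps almost one-to-one onto Earth's sea-level pressure, which is why "100 bars" can be glossed as roughly 100 times the pressure at Earth's surface. A minimal sketch (the constants are standard values; the function name is ours):

```python
# Quick illustration of the "bar" unit used for depth on Jupiter.
# 1 bar = 100,000 Pa; Earth's standard sea-level pressure is 101,325 Pa.

BAR_IN_PA = 100_000           # pascals per bar
EARTH_SEA_LEVEL_PA = 101_325  # one standard atmosphere in pascals

def bars_to_earth_atmospheres(bars: float) -> float:
    """Express a pressure in bars as multiples of Earth's sea-level pressure."""
    return bars * BAR_IN_PA / EARTH_SEA_LEVEL_PA

# Juno's microwave radiometer probes to about 100 bars:
print(round(bars_to_earth_atmospheres(100), 1))  # -> 98.7 Earth atmospheres
```

So "100 bars" is a slight rounding up; the exact figure is about 98.7 Earth atmospheres.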

If Juno returns similar water findings, thereby backing Bjoraker's ground-based technique, it could open a new window into solving the water problem, said Goddard's Amy Simon, a planetary atmospheres expert.

"If it works, then maybe we can apply it elsewhere, like Saturn, Uranus or Neptune, where we don't have a Juno," she said.

Juno is the latest spacecraft tasked with finding water, likely in gas form, on this giant gaseous planet.

Water is a significant and abundant molecule in our solar system. It spawned life on Earth and now lubricates many of its most essential processes, including weather. It's a critical factor in Jupiter's turbulent weather, too, and in determining whether the planet has a core made of rock and ice.

Jupiter is thought to be the first planet to have formed by siphoning the elements left over from the formation of the Sun as our star coalesced from an amorphous nebula into the fiery ball of gases we see today. Until several decades ago, a widely accepted theory was that Jupiter was identical in composition to the Sun: a ball of hydrogen with a hint of helium -- all gas, no core.

But evidence is mounting that Jupiter has a core, possibly 10 times Earth's mass. Spacecraft that previously visited the planet found chemical evidence that it formed a core of rock and water ice before it mixed with gases from the solar nebula to make its atmosphere. The way Jupiter's gravity tugs on Juno also supports this theory. There's even lightning and thunder on the planet, phenomena fueled by moisture.

"The moons that orbit Jupiter are mostly water ice, so the whole neighborhood has plenty of water," said Bjoraker. "Why wouldn't the planet -- which is this huge gravity well, where everything falls into it -- be water rich, too?"

The water question has stumped planetary scientists; virtually every time evidence of H2O materializes, something happens to put them off the scent. A favorite example among Jupiter experts is NASA's Galileo spacecraft, which dropped a probe into the atmosphere in 1995 that wound up in an unusually dry region. "It's like sending a probe to Earth, landing in the Mojave Desert, and concluding the Earth is dry," pointed out Bjoraker.

In their search for water, Bjoraker and his team used radiation data collected from the summit of Maunakea in Hawaii in 2017. They relied on the most sensitive infrared telescope on Earth at the W.M. Keck Observatory, and also on a new instrument that can detect a wider range of gases at the NASA Infrared Telescope Facility.

The idea was to analyze the light energy emitted through Jupiter's clouds in order to identify the altitudes of its cloud layers. This would help the scientists determine temperature and other conditions that influence the types of gases that can survive in those regions.

Planetary atmosphere experts expect that there are three cloud layers on Jupiter: a lower layer made of water ice and liquid water, a middle one made of ammonia and sulfur, and an upper layer made of ammonia.

To confirm this through ground-based observations, Bjoraker's team looked at wavelengths in the infrared range of light where most gases don't absorb heat, allowing chemical signatures to leak out. Specifically, they analyzed the absorption patterns of a form of methane gas. Because Jupiter is too warm for methane to freeze, its abundance should not change from one place to another on the planet.

"If you see that the strengths of the methane lines vary from inside to outside of the Great Red Spot, it's not because there's more methane here than there," said Bjoraker, "it's because there are thicker, deeper clouds that are blocking the radiation in the Great Red Spot."

Bjoraker's team found evidence for the three cloud layers in the Great Red Spot, supporting earlier models. The deepest cloud layer is at 5 bars, the team concluded, right where the temperature reaches the freezing point for water, said Bjoraker, "so I say that we very likely found a water cloud." The location of the water cloud, plus the amount of carbon monoxide that the researchers identified on Jupiter, confirms that Jupiter is rich in oxygen and, thus, water.

Bjoraker's technique now needs to be tested on other parts of Jupiter to get a full picture of global water abundance, and his data squared with Juno's findings.

"Jupiter's water abundance will tell us a lot about how the giant planet formed, but only if we can figure out how much water there is in the entire planet," said Steven M. Levin, a Juno project scientist at NASA's Jet Propulsion Laboratory in Pasadena, Calif.

Credit: 
NASA/Goddard Space Flight Center

Gum disease treatment may improve symptoms in cirrhosis patients

Rockville, Md. (August 29, 2018)--Routine oral care to treat gum disease (periodontitis) may play a role in reducing inflammation and toxins in the blood (endotoxemia) and improving cognitive function in people with liver cirrhosis. The study is published ahead of print in the American Journal of Physiology--Gastrointestinal and Liver Physiology.

Cirrhosis, which is a growing epidemic in the U.S., is the presence of scar tissue on the liver. When severe, it can lead to liver failure. Complications of cirrhosis can include infections throughout the body and hepatic encephalopathy, a buildup of toxins in the brain caused by advanced liver disease. Symptoms of hepatic encephalopathy include confusion, mood changes and impaired cognitive function.

Previous research shows that people with cirrhosis have changes in gut and salivary microbiota -- bacteria that populate the gastrointestinal tract and mouth -- which can lead to gum disease and a higher risk of cirrhosis-related complications. In addition, studies have found that people with cirrhosis have increased levels of inflammation throughout the body, which is associated with hepatic encephalopathy.

Researchers studied two groups of volunteers who had cirrhosis and mild-to-moderate periodontitis. One group received periodontal care ("treated"), including teeth cleaning and removal of bacterial toxins from the teeth and gums. The other group was not treated for gum disease ("untreated"). The research team collected blood, saliva and stool samples before and 30 days after treatment. Each volunteer took standardized tests to measure cognitive function before and after treatment.

The treated group, especially those with hepatic encephalopathy, had increased levels of beneficial gut bacteria that could reduce inflammation, as well as lower levels of endotoxin-producing bacteria in the saliva when compared to the untreated group. The untreated group, on the other hand, demonstrated an increase in endotoxin levels in the blood over the same time period. The improvement in the treated group "could be related to a reduction in oral inflammation leading to lower systemic inflammation, or due to [less harmful bacteria] being swallowed and affecting the gut microbiota," the research team wrote.

Cognitive function also improved in the treated group, suggesting that the reduced inflammation levels in the body may minimize some of the symptoms of hepatic encephalopathy in people who are already receiving standard-of-care therapies for the condition. This finding is relevant because there are no further therapies approved by the U.S. Food and Drug Administration to alleviate cognition problems in this population, the researchers said. "The oral cavity could represent a treatment target to reduce inflammation and endotoxemia in patients with cirrhosis to improve clinical outcomes."

Credit: 
American Physiological Society

Removable balloon is as good as permanent stent implant for opening small blocked arteries

Munich, Germany - 28 Aug 2018: A removable balloon is as good as a permanent stent implant for opening small blocked arteries, according to late breaking results from the BASKET-SMALL 2 trial presented in a Hot Line Session today at ESC Congress 2018 and simultaneously published in The Lancet.

Principal investigator Professor Raban Jeger, of the University Hospital Basel, Switzerland, said: "The results of this trial move us a step closer towards treating small blocked arteries without having to insert a permanent implant."

One of the standard treatments for opening blocked arteries is to insert an expandable metal tube (stent) covered with drugs via a catheter. The stent remains in the body permanently. In smaller arteries there is a risk that tissue will grow inside the stent and narrow it, causing the artery to become blocked a second time (in-stent restenosis), or that a blood clot will develop on the stent (stent thrombosis) and cause a heart attack or stroke.

Balloons covered with drugs, also inserted using a catheter, are approved in Europe to reopen stented arteries that have become blocked a second time. The balloon is removed after the procedure.

BASKET-SMALL 2 is the largest randomised trial to examine whether drug coated balloons are as good as drug-eluting stents for opening small arteries that have become blocked for the first time. The effectiveness of the two treatments was evaluated by comparing the rate of major adverse cardiac events (MACE) at 12 months.

Between 2012 and 2017 the trial enrolled 758 patients with a first-time lesion in an artery smaller than 3 mm in diameter. The average age of study participants was 68 years, 72% had stable coronary artery disease and 28% had an acute coronary syndrome (heart attack or unstable angina).

Patients were randomised to receive drug coated balloon angioplasty (382 patients) or second-generation drug-eluting stent implantation (376 patients). The balloon was coated with iopromide and paclitaxel, and the stents were covered with everolimus or paclitaxel.

After the procedure, patients were followed up for 12 months for the occurrence of MACE, which included death from cardiac causes, non-fatal heart attack, and the need to reopen the artery due to it becoming blocked again (called target vessel revascularisation). Secondary endpoints included the single components of MACE at 12 months, and major bleeding at 12 months.

At 12 months, there was no difference in the rates of MACE between patients who received a stent (7.5%) and patients who underwent the balloon procedure (7.6%) (p=0.918). Professor Jeger said: "The BASKET-SMALL 2 trial met its primary endpoint of non-inferiority for major adverse cardiac events at 12 months. This is a long-awaited milestone in clinical evidence for the drug coated balloon technique, which so far has primarily been used for the treatment of in-stent restenosis."

There were no statistically significant differences between groups in the rates of the individual components of the primary endpoint at 12 months: rates of cardiac death were 3.1% versus 1.3% (p=0.113), rates of nonfatal heart attack were 1.6% versus 3.5% (p=0.112), and rates of target vessel revascularisation were 3.4% versus 4.5% (p=0.438) in the balloon versus stent groups, respectively. The rate of major bleeding at 12 months was similar in the balloon (1.1%) and stent (2.4%) groups (p=0.183).
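For readers who want to see why MACE rates of 7.6% and 7.5% in groups of this size are statistically indistinguishable, here is an illustrative two-proportion z-test. This is only a sketch: the event counts are inferred from the reported percentages, and the trial itself used a formal non-inferiority analysis, so the p-value here will not exactly match the published p=0.918.

```python
# Back-of-the-envelope check of the headline MACE comparison:
# ~7.6% of 382 balloon patients vs ~7.5% of 376 stent patients.
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)          # pooled event proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# ~29 of 382 balloon vs ~28 of 376 stent events (counts inferred from the percentages)
z, p = two_proportion_z(29, 382, 28, 376)
print(f"z = {z:.2f}, p = {p:.2f}")  # p is far above 0.05: no detectable difference
```

The tiny z-statistic shows that a difference of 0.1 percentage points between arms of roughly 380 patients each is well within chance variation.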

"The potential benefits of a stent-free option to treat small blocked arteries are numerous," said Professor Jeger. "With no permanent implant left after the procedure, the problem of tissue growth and clot formation within the stent is eliminated. In addition, there may be no need for prolonged treatment with anticlotting medicines, which has been controversial since it increases the risk of bleeding."

He concluded: "Drug coated balloon angioplasty has the possibility to become the standard treatment for small blocked arteries. We will continue to monitor patients in the trial for a further two years for major adverse cardiac events, stent thrombosis, and bleeding."

Credit: 
European Society of Cardiology

126 patient and provider groups to CMS: Proposed E/M service cuts will hurt sickest patients

WASHINGTON, DC - A broad coalition of 126 patient and provider groups - led by leading national organizations including the American College of Rheumatology - today sent a letter to the Centers for Medicare and Medicaid Services (CMS) urging the agency not to move forward with a proposal that would significantly reduce Medicare reimbursements for evaluation and management (E/M) services provided by specialists. The groups cite concerns that these time-intensive services - which include examinations, disease diagnosis and risk assessments, and care coordination - are already grossly under-compensated, and that additional payment cuts would worsen workforce shortages in already strained specialties like rheumatology.

The proposal, which was included in the 2019 Physician Fee Schedule proposed rule, would consolidate billing codes for E/M office visits, resulting in a flat payment for all E/M visits regardless of the complexity of the visit. Though the proposal was intended to reduce Medicare provider documentation and reporting burdens, it would also result in significant payment cuts for specialty care involving face-to-face visits with patients who have complex care needs, penalizing doctors who treat sicker patients or patients with multiple, chronic conditions.

"We applaud CMS for recognizing the problems with the current evaluation and management documentation guidelines and codes and for including a significant proposal to address them in the CY 2019 physician fee schedule proposed rule," the letter reads. "However, we urge CMS to reconsider this proposal to cut and consolidate evaluation and management services, which would severely reduce Medicare patients' access to care by cutting payments for office visits, adversely affecting the care and treatment of patients with complex conditions, and potentially exacerbate physician workforce shortages."

The groups warn that payment cuts of this magnitude will not only compromise patient access to care by forcing physicians to spend less time with their patients but could create a disastrous ripple effect throughout the U.S. health care system, discouraging medical students from pursuing specialties that provide complex care and disincentivizing doctors from taking new Medicare patients altogether.

"Not only will this result in an additional burden on patients with more copayments and costs associated with time and travel, it will also reduce the quality of care, particularly for patients with complex medical conditions," the letter continues.

The proposed cuts go against the recommendations of the Medicare Payment Advisory Commission (MedPAC), an independent advisory commission to the Medicare program, which earlier this year proposed increasing reimbursement for E/M services, given the time and intensity they require and the fact that E/M services are already undervalued relative to other physician services.

"We therefore urge CMS not to move forward with the proposal as it currently stands, and instead convene stakeholders to identify other strategies to reduce paperwork and administrative burden that do not threaten patient access to care," the letter concludes.


Credit: 
American College of Rheumatology

CU researchers identify potential target for treating pain during surgery

AURORA, Colo. (Aug. 28, 2018) - A research team led by faculty of the University of Colorado School of Medicine has published a study that improves the understanding of the pain-sensing neurons that respond to tissue injury during surgery.

The team, led by Slobodan Todorovic, MD, PhD, Professor of Anesthesiology at the School of Medicine and the Neuroscience Graduate Program on the CU Anschutz Medical Campus, reports its findings today in the journal Science Signaling.

"We investigated the potential role and molecular mechanisms of nociceptive ion channel dysregulation in acute pain conditions such as those resulting from skin and soft tissue incision," Todorovic said.

Nociceptors are a type of receptor that exists to sense pain when the body is harmed. When activated, nociceptors notify the brain about the injury. In their study, the CU-led team looked at a specific channel for transmitting that information, aiming to develop a better understanding of potential ways to address pain after surgery.

By gaining a better understanding of how these nociceptors work, the researchers aim to identify potential new therapies for pain during surgery and to decrease the need for narcotics.

"Although opioids are very effective in treating the acute pain associated with surgical procedures, their use is associated with serious side effects, which include constipation, urinary retention, impaired cognitive function, respiratory depression, tolerance, and addiction," Todorovic and his co-authors write. "More than 12 million people in the United States abused prescription opioids in 2010 alone, resulting in more overdose deaths than heroin and cocaine combined. The necessity to treat this acute type of pain is of paramount importance since its duration and intensity influence the recovery process after surgery, as well as the onset of chronic post-surgical pain."

Credit: 
University of Colorado Anschutz Medical Campus

Getting to the roots of our ancient cousin's diet

image: Paranthropus robustus fossil from South Africa SK 46 (discovered 1936, estimated age 1.9-1.5 million years) and the virtually reconstructed first upper molar used in the analyses.

Image: 
Kornelius Kupczik, Max Planck Institute for Evolutionary Anthropology

Food needs to be broken down in the mouth before it can be swallowed and digested further. How this is done depends on many factors, such as the mechanical properties of the foods and the morphology of the masticatory apparatus. Palaeoanthropologists spend a great deal of their time reconstructing the diets of our ancestors, as diet holds the key to understanding our evolutionary history. For example, a high-quality diet (and meat-eating) likely facilitated the evolution of our large brains, whilst the lack of a nutrient-rich diet probably underlies the extinction of some other species (e.g., Paranthropus boisei). The diet of South African hominins, however, has remained particularly controversial.

Using non-invasive high-resolution computed tomography technology and shape analysis, the authors deduced the main direction of loading during mastication (chewing) from the way the tooth roots are oriented within the jaw. By comparing the virtual reconstructions of almost 30 hominin first molars from South and East Africa, they found that Australopithecus africanus had much more widely splayed roots than both Paranthropus robustus and the East African Paranthropus boisei. "This is indicative of increased laterally-directed chewing loads in Australopithecus africanus, while the two Paranthropus species experienced rather vertical loads", says Kornelius Kupczik of the Max Planck Institute for Evolutionary Anthropology.

Paranthropus robustus, unlike any of the other species analysed in this study, exhibits an unusual orientation, i.e. "twist", of the tooth roots, which suggests a slight rotational and back-and-forth movement of the mandible during chewing. Other morphological traits of the P. robustus skull support this interpretation. For example, the structure of the enamel also points towards complex, multidirectional loading, whilst the species' unusual microwear pattern can conceivably be explained by a different jaw movement rather than by mastication of novel food sources. Evidently, it is not only what hominins ate and how hard they bit that determined their skull morphology, but also the way in which the jaws were brought together during chewing.

The new study demonstrates that the orientation of tooth roots within the jaw has much to offer for an understanding of the dietary ecology of our ancestors and extinct cousins. "Perhaps palaeoanthropologists have not always been asking the right questions of the fossil record: rather than focusing on what our extinct cousins ate, we should equally pay attention to how they masticated their foods", concludes Gabriele Macho of the University of Oxford.

Molar root variation in hominins is therefore telling us more than previously thought. "For me as an anatomist and a dentist, understanding how the jaws of our fossil ancestors worked is very revealing as we can eventually apply such findings to the modern human dentition to better understand pathologies such as malocclusions", adds Viviana Toro-Ibacache from the University of Chile and one of the co-authors of the study.

Credit: 
Max Planck Institute for Evolutionary Anthropology

Take a vacation -- it could prolong your life

image: Figure of the intervention and control groups.

Image: 
European Society of Cardiology

Munich, Germany - 28 Aug 2018: Taking vacations could prolong life. That's the finding of a 40-year study presented today at ESC Congress and accepted for publication in The Journal of Nutrition, Health & Aging.

"Don't think having an otherwise healthy lifestyle will compensate for working too hard and not taking holidays," said Professor Timo Strandberg, of the University of Helsinki, Finland. "Vacations can be a good way to relieve stress."

The study included 1,222 middle-aged male executives born in 1919 to 1934 and recruited into the Helsinki Businessmen Study in 1974 and 1975. Participants had at least one risk factor for cardiovascular disease (smoking, high blood pressure, high cholesterol, elevated triglycerides, glucose intolerance, overweight).

Participants were randomised into a control group (610 men) or an intervention group (612 men) for five years. The intervention group received oral and written advice every four months to do aerobic physical activity, eat a healthy diet, achieve a healthy weight, and stop smoking. When health advice alone was not effective, men in the intervention group also received drugs recommended at that time to lower blood pressure (beta-blockers and diuretics) and lipids (clofibrate and probucol). Men in the control group received usual healthcare and were not seen by the investigators.

As previously reported, the risk of cardiovascular disease was reduced by 46% in the intervention group compared to the control group by the end of the trial. However, at the 15-year follow-up in 1989, there had been more deaths in the intervention group than in the control group.

The analysis presented today extended the mortality follow-up to 40 years (2014) using national death registers and examined previously unreported baseline data on amounts of work, sleep, and vacation. The researchers found that the death rate was consistently higher in the intervention group compared to the control group until 2004. Death rates were the same in both groups between 2004 and 2014.

Shorter vacations were associated with excess deaths in the intervention group: men who took three weeks or less annual vacation had a 37% greater chance of dying between 1974 and 2004 than those who took more than three weeks. Vacation time had no impact on risk of death in the control group (see figures).

Professor Strandberg said: "The harm caused by the intensive lifestyle regime was concentrated in a subgroup of men with shorter yearly vacation time. In our study, men with shorter vacations worked more and slept less than those who took longer vacations. This stressful lifestyle may have overruled any benefit of the intervention. We think the intervention itself may also have had an adverse psychological effect on these men by adding stress to their lives."

Professor Strandberg noted that stress management was not part of preventive medicine in the 1970s but is now recommended for individuals with, or at risk of, cardiovascular disease. In addition, more effective drugs are now available to lower lipids (statins) and blood pressure (angiotensin-converting enzyme inhibitors, angiotensin receptor blockers, calcium channel blockers).

He concluded: "Our results do not indicate that health education is harmful. Rather, they suggest that stress reduction is an essential part of programmes aimed at reducing the risk of cardiovascular disease. Lifestyle advice should be wisely combined with modern drug treatment to prevent cardiovascular events in high-risk individuals."

Credit: 
European Society of Cardiology

Anxiety, depression, other mental distress may increase heart attack, stroke risk in adults over 45

DALLAS, Aug. 28, 2018 - Adults ages 45 or older who experience psychological distress such as depression and anxiety may have an increased risk of developing cardiovascular disease, according to new research in Circulation: Cardiovascular Quality and Outcomes, an American Heart Association journal.

In a study of 221,677 participants from Australia, researchers found that:

among women, high/very high psychological distress was associated with a 44 percent increased risk of stroke; and

in men ages 45 to 79, high/very high versus low psychological distress was associated with a 30 percent increased risk of heart attack, with weaker estimates in those 80 years old or older.

The association between psychological distress and increased cardiovascular disease risk was present even after accounting for lifestyle behaviors (smoking, alcohol intake, dietary habits, etc.) and disease history.

"While these factors might explain some of the observed increased risk, they do not appear to account for all of it, indicating that other mechanisms are likely to be important," said Caroline Jackson, Ph.D., the study's senior author and a Chancellor's Fellow at the University of Edinburgh in Edinburgh, Scotland.

The research involved participants who had not experienced a heart attack or stroke at the start of the study and who were part of the New South Wales 45 and Up Study that recruited adults ages 45 or older between 2006 and 2009.

Researchers categorized psychological distress as low, medium, or high/very high using a standard psychological distress scale in which people self-assess their level of distress.

The 10-question survey asks questions such as: "How often do you feel tired out for no good reason?" "How often do you feel so sad that nothing could cheer you up?" "How often do you feel restless or fidgety?"

Of the participants - 102,039 men (average age 62) and 119,638 women (average age 60) - 16.2 percent reported having moderate psychological distress and 7.3 percent had high/very high psychological distress.

During follow-up of more than four years, 4,573 heart attacks and 2,421 strokes occurred. The absolute risk - overall risk of developing a disease in a certain time period - of heart attack and stroke rose with each level of psychological distress.

The findings add to the existing evidence that there may be an association between psychological distress and increased risk of heart attack and stroke, she said. But they also support the need for future studies that focus on the underlying mechanisms connecting psychological distress with cardiovascular disease and stroke risk, and that look to replicate the observed differences between men and women.

Mental disorders and their symptoms are thought to be associated with increased risk of heart disease and stroke, but previous studies have produced inconsistent findings and the interplay between mental and physical health is poorly understood.

People with symptoms of psychological distress should be encouraged to seek medical help because, aside from the impact on their mental health, symptoms of psychological distress appear to also impact physical health, Jackson said. "We encourage more proactive screening for symptoms of psychological distress. Clinicians should actively screen for cardiovascular risk factors in people with these mental health symptoms."

All factors analyzed in this research, apart from the outcomes of heart attack and stroke, were identified at the same point in time, which made it difficult for researchers to understand the relationship between psychological distress and variables such as unhealthy behaviors like smoking and poor diet. With that analysis approach, they may have underestimated the effect psychological distress has on the risk of heart attack and stroke.

Credit: 
American Heart Association

Current advice to limit dairy intake should be reconsidered

Munich, Germany - 28 Aug 2018: The consumption of dairy products has long been thought to increase the risk of death, particularly from coronary heart disease (CHD), cerebrovascular disease, and cancer, because of dairy's relatively high levels of saturated fat. Yet evidence for any such link, especially among US adults, is inconsistent. With the exception of milk, which appears to increase the risk of CHD, dairy products have been found to protect against both total mortality and mortality from cerebrovascular causes, according to research presented today at ESC Congress 2018, the annual congress of the European Society of Cardiology. Therefore, current guidelines to limit consumption of dairy products, especially cheese and yogurt, should be relaxed; at the same time, the drinking of non-fat or low-fat milk should be recommended, especially for those who consume large quantities of milk.

"A meta-analysis of 29 cohort studies published in 2017 found no association between the consumption of dairy products and either cardiovascular disease (CVD) or all-cause mortality," said Professor Maciej Banach, from the Department of Hypertension at the Medical University of Lodz, Poland. "Yet a large 20-year prospective study of Swedish adults, also published in 2017, found that higher consumption of milk was associated with a doubling of mortality risk, including from CVD, in the cohort of women."

Professor Banach and his co-researchers examined data from the 1999-2010 National Health and Nutrition Examination Survey (NHANES), a study of 24,474 adults with a mean age of 47.6 years, 51.4% of whom were female. (NHANES is conducted by the US Centers for Disease Control and Prevention.) During the follow-up period of 76.4 months, 3,520 total deaths were recorded, including 827 cancer deaths, 709 cardiac deaths, and 228 cerebrovascular disease deaths. The researchers found consumption of all dairy products to be associated with a 2% lower total mortality risk and consumption of cheese to be associated with an 8% lower total mortality risk (hazard ratio [HR]: 0.98, 95% confidence interval [CI]: 0.95-0.99; HR: 0.92, 95% CI: 0.87-0.97, respectively). For cerebrovascular mortality, they found a 4% lower risk with total dairy consumption and a 7% lower risk with milk consumption (HR: 0.96, 95% CI: 0.94-0.98; HR: 0.93, 95% CI: 0.91-0.96, respectively).
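The percent risk reductions quoted above follow directly from the hazard ratios: a HR below 1.0 means lower risk, and the reduction is (1 - HR) × 100. A minimal sketch (not the study's own code) recovering those figures and the conventional significance check from the reported confidence intervals:

```python
# Converting the reported hazard ratios (HRs) into the percent risk
# reductions quoted in the text: percent reduction = (1 - HR) * 100.
def pct_reduction(hr: float) -> int:
    """Percent lower risk implied by a hazard ratio below 1.0."""
    return round((1 - hr) * 100)

def is_significant(ci_low: float, ci_high: float) -> bool:
    """A HR is conventionally significant if the 95% CI excludes 1.0."""
    return ci_high < 1.0 or ci_low > 1.0

# Figures reported in the NHANES analysis above: (HR, CI low, CI high)
results = {
    "total dairy, total mortality": (0.98, 0.95, 0.99),
    "cheese, total mortality":      (0.92, 0.87, 0.97),
    "total dairy, cerebrovascular": (0.96, 0.94, 0.98),
    "milk, cerebrovascular":        (0.93, 0.91, 0.96),
}

for name, (hr, lo, hi) in results.items():
    print(f"{name}: {pct_reduction(hr)}% lower risk, "
          f"significant={is_significant(lo, hi)}")
```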

A meta-analysis by Professor Banach and his co-researchers of 12 prospective cohort studies with 636,726 participants who were followed for approximately 15 years confirmed these results. But milk consumption was also associated with a 4% higher CHD mortality, while consumption of fermented dairy products such as yogurt was associated with a 3% lower rate of total mortality. The yogurt finding, however, was not statistically significant after further adjustment (Q4: HR: 0.98, p=0.125).

The researchers concluded that among US adults, higher total dairy consumption protected against both total mortality and mortality from cerebrovascular causes. At the same time, higher milk consumption was associated with an increased risk of CHD, an association that needs further study. Causality, however, could be difficult to determine, as most people who consume milk also consume other dairy products.

"In light of the protective effects of dairy products," said Professor Banach, "public health officials should revise the guidelines on dairy consumption. And given the evidence that milk increases the risk of CHD, it is advisable to drink fat-free or low-fat milk."

Credit: 
European Society of Cardiology

Physicians deserve answers as public service loan forgiveness program hangs in the balance

PHILADELPHIA --With medical school loan debt averaging $200,000, many physicians pursue the Public Service Loan Forgiveness Program that eliminates federal student loans after 10 years of service in the public sector. But the fate of the program hangs in the balance, as government officials signal a desire to end it, leaving physicians in a lingering uncertainty that's unnecessary and unfair, health policy experts from the Perelman School of Medicine at the University of Pennsylvania and three other medical institutions argue in a new commentary published in the Annals of Internal Medicine.

"There are justifiable reasons to support the program and also justifiable reasons to change it," senior author David A. Asch, MD, a professor of Medicine and Medical Ethics and Health Policy and executive director of the Penn Center for Health Care Innovation, and his colleagues wrote. "But there are no justifiable reasons to keep recent graduates in suspense."

The program, which began in 2007 under President George W. Bush, forgives remaining federal debt for borrowers who have made 120 payments, or 10 years' worth of loan repayments, while employed at nonprofit or public institutions. Those conditions are particularly favorable for physicians, as most begin their careers in residency programs - which count towards years of repayment - at nonprofit medical centers or hospitals, and often continue in that setting afterwards. According to the authors, a third of 2017 graduates who borrowed money for medical school report planning to use the program.

Recently, the program has been threatened with elimination by Congress and the Trump administration, with little guidance about the fate of current borrowers.

A U.S. House of Representatives proposal would make borrowers ineligible for the program after July 1, 2019 - and leaves unclear whether borrowers who took out federal loans before that date would be grandfathered in. Further muddying the waters, the Department of Education retroactively reversed certifications for some lawyers working in nonprofit institutions, and indicated that certifications are now temporary and subject to final approval by the department.

It's what the authors call an "unnecessary uncertainty" that deserves clarity and swift decisions. "Physicians are trained to handle uncertainty but that does not excuse leaving new physicians facing uncertainties that can be easily resolved," they wrote. "Even as we consider new approaches toward financing training for public service, we should insist on clarity for those who have already pursued it."

The program is not without its problems, including issues such as insufficient cost estimates.

While it could help the mounting problem of educational debt in America, the designers of the program may have had lower-paid public school teachers in mind and not anticipated how many higher-paid physicians would be eligible, or how physicians' long years of training would increase the proportion of their debt the program forgives. "It's structured to encourage borrowers to minimize current loan repayment to maximize eventual loan forgiveness," says Asch. "That's a strategy that could work out well for physicians and other borrowers," he added, "but it's also a risky one--one that could lead young physicians to accumulate even more debt as they delay paying off loans."

Physicians pursuing higher earning specialties with typically longer training, such as neurosurgery, also end up responsible for less of their debt repayment than physicians pursuing lower earning specialties with typically shorter training, such as family medicine. "That probably was not an intended consequence of the program's structure, but it's nevertheless a perverse outcome, given recognized shortages and relatively low pay in primary care fields compared to subspecialties," says Justin Grischkan, MD, the lead author of the study and a resident physician in Internal Medicine at the Massachusetts General Hospital in Boston.

Still, expunging the program, rather than fixing it, would not only worsen educational debt, it may also prevent people from entering the medical workforce. "Physicians who are planning to use the program are more likely to graduate with higher debt, receive less scholarship support and come from backgrounds with lower parental income," Grischkan says. As more medical students come from high-income backgrounds that can support their education without debt (nearly one third of graduates), overall debt is further concentrated among those with the most need, the authors said.

"A medical degree is increasingly out of reach of many who might contribute to a workforce more responsive to diverse national needs," the authors wrote. Ending that program "may remove a financial support critical to national interests."

The authors estimate annualized costs for the program at around $1 billion for physician medical education. Ending it could redirect that $1 billion more efficiently toward health care workforce goals, but it's more plausible that those funds would go elsewhere, they said. "A flawed program may be better than none at all."

Credit: 
University of Pennsylvania School of Medicine

Many Arctic pollutants decrease after market removal and regulation

image: Persistent Organic Pollutants, also known as POPs, can have lasting impacts on both people and wild animals in the Arctic. Research shows some POPs are decreasing in the region after being pulled from market or regulated around the globe.

Image: 
Arturo de Frias Marques (https://commons.wikimedia.org/wiki/File:Polar_Bear_AdF.jpg)

Levels of some persistent organic pollutants (POPs) regulated by the Stockholm Convention are decreasing in the Arctic, according to an international team of researchers who have been actively monitoring the northern regions of the globe.

POPs are a diverse group of long-lived chemicals that can travel long distances from their source of manufacture or use. Many POPs were used extensively in industry, consumer products or as pesticides in agriculture. Well-known POPs include chemicals such as DDT and PCBs (polychlorinated biphenyls), and some of the products they were used in included flame retardants and fabric coatings.

Because POPs were found to cause health problems for people and wildlife, they were largely banned or phased out of production in many countries. Many have been linked to reproductive, developmental, neurological and immunological problems in mammals. The accumulation of DDT, a well-known and heavily used POP, was also linked to eggshell-thinning in fish-eating birds, such as eagles and pelicans, in the late 20th century, and caused catastrophic population declines for those animals.

In 2001, 152 countries signed a United Nations treaty in Stockholm, Sweden intended to eliminate, restrict or minimize unintentional production of 12 of the most widely used POPs. Later amendments added more chemicals to the initial list. Today, more than 33 POP chemicals or groups are covered by what is commonly called the "Stockholm Convention," which has been recognized by 182 countries.

"This paper shows that following the treaty and earlier phase-outs have largely resulted in a decline of these contaminants in the Arctic," says John Kucklick, a biologist from the National Institute of Standards and Technology (NIST) and the senior U.S. author on the paper, published August 23 in Science of the Total Environment. "When POP use was curtailed, the change was reflected by declining concentrations in the environment."

"In general, the contaminants that are being regulated are decreasing," says Frank Rigét from the Department of Bioscience, Aarhus University, Denmark, and lead author.

POPs are particularly problematic in the Arctic because the ecosystem there is especially fragile, and pollution can come from both local sources and from thousands of miles away due to air and water currents. POPs also bioaccumulate. This means that they build up faster in animals and humans than they can be excreted, and that exposure can increase up the food chain. Plankton exposed to POPs in water are eaten by schools of fish, which are in turn eaten by seals or whales, and with each jump up the food chain the amount of POPs increases. The same is true for terrestrial animals. A large mammal's exposure, therefore, can be large and long-lasting.
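The food-chain amplification described above can be sketched numerically. The per-step magnification factor here is hypothetical, chosen only to illustrate how concentrations compound with each trophic jump; it is not a figure from the study:

```python
# Illustrative sketch of biomagnification: a POP concentration is
# multiplied by some factor at each jump up the food chain, so top
# predators carry far higher burdens than the water around them.
water_conc = 1.0              # arbitrary units of POP at the base
magnification_per_step = 10   # hypothetical factor per trophic jump

food_chain = ["plankton", "fish", "seal", "polar bear"]
conc = water_conc
for level in food_chain:
    print(f"{level}: {conc:g} units")
    conc *= magnification_per_step
```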

Indigenous people living in northern coastal areas such as Alaska often consume more fish and other animals that come from higher on the food chain than the average American. Such communities, therefore, are potentially exposed to larger amounts of these pollutants.

For almost two decades beginning in 2000, Kucklick and Rigét worked in conjunction with scientists from Denmark, Sweden, Canada, Iceland and Norway to track POPs in the fat of several marine mammals and in the tissue of shellfish and seabirds. They also monitored air in the Arctic circle for pollution.

To gain a fuller picture of how the deposition of POPs might have changed over time, the study included specimens archived since the 1980s and '90s in special storage facilities around the globe. The U.S. specimens were provided by the NIST Biorepository, located in Charleston, South Carolina. Samples archived in that facility are part of the Alaska Marine Mammal Tissue Archival Project (AMMTAP) or the Seabird Tissue Archival and Monitoring Project (STAMP). Both collections are conducted in collaboration with other federal agencies.

The study pooled more than 1,000 samples taken over the course of several decades from many different locations throughout the Arctic Circle. In general, the so-called legacy POPs--those that have been eliminated or restricted from production--were shown to be decreasing over the past two to three decades, although some had decreased more than others.

The biggest decreases were in a byproduct of the pesticide lindane, α-HCH, with a mean annual decline of 9 percent in Arctic wildlife.

The research team found PCBs had decreased as well. Most industrial countries banned PCBs in the 1970s and '80s, and their production was reduced under the Stockholm Convention in 2004. Previously, the compounds had been widely used in electrical systems. In this study, it was found that their presence had decreased by almost 4 percent per year across the Arctic region since being pulled from the market.

Two of the legacy POPs listed under Stockholm, β-HCH and HCB, showed only small declines of less than 3 percent per year. β-HCH was part of a heavily-used pesticide mixture with the active ingredient lindane and HCB was used both in agriculture and industry.

A small number of the legacy POPs had increased in a few locations, although some of those were at sites suspected to be influenced by strong, still-existing local pollution sources.

Notably, the flame retardant hexabromocyclododecane (HBCDD) showed an annual increase of 7.6 percent. HBCDD was one of 16 additional POPs added to the Stockholm Convention as of 2017 and is recommended for elimination from use, with certain exemptions.
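The annual trend rates quoted above compound over time: a steady change of r percent per year multiplies a concentration by (1 + r/100) each year. A quick sketch of what the reported rates imply over a decade or two (none of these figures come from the study's raw data, only from the quoted annual rates):

```python
# Compounding the reported annual trend rates over multi-year spans.
def change_after(annual_pct: float, years: int) -> float:
    """Multiple of the starting concentration after `years` of a
    steady annual_pct change (e.g. -9 means a 9% decline per year)."""
    return (1 + annual_pct / 100) ** years

# α-HCH declining 9% per year: ~15% of the starting level after 20 years
print(f"α-HCH after 20 y: {change_after(-9, 20):.0%} of starting level")
# PCBs declining ~4% per year: ~44% remains after 20 years
print(f"PCBs after 20 y: {change_after(-4, 20):.0%} of starting level")
# HBCDD rising 7.6% per year: roughly doubles in a decade
print(f"HBCDD after 10 y: {change_after(7.6, 10):.0%} of starting level")
```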

Most of the research conducted for this paper was a direct result of the 2001 treaty stipulations, which included a requirement that sponsors participate in ongoing, long-term biological monitoring. Although the U.S. participated in the research, it has not ratified the treaty. It is expected that work on the treaty will continue as new POPs are identified.

This recent research work highlights the usefulness of long-term data and international scientific collaboration, says Rigét. "You really need to gather more than 10 years of data before you can see the trend because in the short term there can be some small fluctuations," he notes. "Looking at this data also showed us how to be more economical and avoid over-sampling in the future."

Credit: 
National Institute of Standards and Technology (NIST)

Environmentally friendly farming practices used by nearly 1/3 of world's farms

image: Washington State University soil scientist John Reganold is part of an international team that found nearly one-third of the world's farms have adopted more environmentally friendly practices while continuing to be productive.

Image: 
Washington State University

PULLMAN, Wash. - Nearly one-third of the world's farms have adopted more environmentally friendly practices while continuing to be productive, according to a global assessment by 17 scientists in five countries.

The researchers analyzed farms that use some form of "sustainable intensification," a term for various practices, including organic farming, that use land, water, biodiversity, labor, knowledge and technology to both grow crops and reduce environmental impacts like pesticide pollution, soil erosion, and greenhouse gas emissions.

Writing in the journal Nature Sustainability, the researchers estimate that nearly one-tenth of the world's farmland is under some form of sustainable intensification, often with dramatic results. They have seen that the new practices can improve productivity, biodiversity and ecosystem services while lowering farmer costs. For example, they document how West African farmers have increased yields of maize and cassava; some 100,000 farmers in Cuba increased their productivity 150 percent while cutting their pesticide use by 85 percent.

Sustainable intensification "can result in beneficial outcomes for both agricultural output and natural capital," the researchers write.

"Although we have a long way to go, I'm impressed by how far farmers across the world and especially in less developed countries have come in moving our food-production systems in a healthy direction," said John Reganold, Washington State University Regents Professor of Soil Science and Agroecology and a co-author of the paper. Reganold helped identify farming systems that meet sustainable intensification guidelines and analyze the data.

Less developed countries tend to see the largest improvements in productivity, while industrialized countries "have tended to see increases in efficiency (lower costs), minimizing harm to ecosystem services, and often some reductions in crop and livestock yields," the authors write.

Jules Pretty, the study's lead author and a professor of environment and society at the University of Essex in England, first used the term "sustainable intensification" in a 1997 study of African agriculture. While the word "intensification" typically applies to environmentally harmful agriculture, Pretty used the term "to indicate that desirable outcomes, such as more food and better ecosystem services, need not be mutually exclusive."

The term now appears in more than 100 scholarly papers a year and is central to the United Nations Sustainable Development Goals.

For the Nature Sustainability paper, the researchers used scientific publications and datasets to screen some 400 sustainable intensification projects, programs and initiatives around the world. They chose only those that were implemented on more than 10,000 farms or 10,000 hectares, or nearly 25,000 acres. They estimate that 163 million farms covering more than a billion acres are affected.

The researchers focused on seven different farming changes in which "increases in overall system performance incur no net environmental cost." The changes include an advanced form of Integrated Pest Management that involves Farmer Field Schools teaching farmers agroecological practices, such as building the soil, in more than 90 countries. Other changes include pasture and forage redesign, trees in agricultural systems, irrigation water management, and conservation agriculture, including the soil-saving no-till technique used in eastern Washington.

Sustainable intensification "has been shown to increase productivity, raise system diversity, reduce farmer costs, reduce negative externalities and improve ecosystem services," the researchers write. They say it has now reached a "tipping point" in which it can be more widely adopted through governmental incentives and policies.

"Stronger government policies across the globe are now needed to support the greater adoption of sustainable intensification farming systems so that the United Nations Sustainable Development Goals endorsed by all members of the UN are met by 2030," said Reganold. "This will help provide sufficient and nutritious food for all, while minimizing environmental impact and enabling producers to earn a decent living."

Credit: 
Washington State University

New target could prevent progression of liver damage to cancer

image: Dr. Anatolij Horuzsko, reproductive immunologist in the Georgia Cancer Center and Department of Medicine at the Medical College of Georgia at Augusta University

Image: 
Phil Jones, Senior Photographer, Augusta University

AUGUSTA, Ga. (Aug. 27, 2018) - Problems like obesity and alcoholism appear to chronically trigger a receptor in the liver that is known to amplify inflammation in response to invaders like bacteria, scientists report.

The relentless, increased activity of TREM-1 in turn accelerates injury and scarring of the liver, a first step toward cirrhosis and liver cancer, says Dr. Anatolij Horuzsko, reproductive immunologist in the Georgia Cancer Center and Department of Medicine at the Medical College of Georgia at Augusta University.

TREM-1, or triggering receptor expressed on myeloid cells-1, is known to help turn up inflammation short-term to help us deal with external invaders. It has increased activity immediately after an injury as well, when increased inflammation, damage cleanup and collagen production aid healing.

But Georgia Cancer Center scientists report in the Journal of Clinical Investigation the first evidence that when activated by chronic offending agents, like obesity and hepatitis, TREM-1 instead contributes to a destructive level of inflammation that results in liver damage and possibly cancer.

The unhealthy transformation can occur in five to 50 years, depending on factors like the level of insult, and may be largely reversible up to the point of cirrhosis, if the offending agent is stopped, and the liver's natural ability to regenerate takes over.

Horuzsko and his colleagues think TREM-1 could one day be another point of intervention, possibly with a drug that could return TREM-1 activation to normal levels on resident, garbage-eating, watchdog immune cells called Kupffer cells.

"Right now we have treatment for hepatitis C, for example, which is very efficient, if we treat it before too much damage is done. But we don't have treatment for alcohol- or obesity-related damage," Horuzsko says.

They already are doing experiments with a drug that, because of its structure, should enable tamping down of TREM-1, but long-term goals include a drug that would target this receptor on Kupffer cells.

It's known that inflammation is a key process in the thickening and scarring of the liver called fibrosis, and that tamping down inflammation can help prevent fibrosis progression. But just how inflammation and fibrosis happen at the cellular and molecular level is largely unknown, says Horuzsko, the study's corresponding author.

Their work in both animal models and human tissue indicates TREM-1 is essential to both.

In the liver, TREM-1 is found primarily on Kupffer cells, the liver's resident macrophages, as well as monocytes, a type of white blood cell that can also become garbage-eating macrophages.

TREM-1's expression is limited in a healthy human liver but its activation goes up short-term following an insult, like a laceration.

To look at what happens in the face of a chronic problem, the scientists created a model of chronic liver disease - mimicking the sustained insult that obesity or high alcohol consumption might cause - using carbon tetrachloride, a poisonous solvent found in oils, varnishes and resin. They found TREM-1 activation went up and stayed up on a larger number of Kupffer cells in the liver as well as other immune cells circulating in the body.

When they deleted TREM-1 from the model, it reduced inflammation, injury and subsequent fibrosis. When they gave TREM-1 back to the mice, inflammation and related damage came back with a vengeance, leading them to dub TREM-1 the main target that drives fibrogenesis.

They found TREM-1 even recruits other pro-inflammatory cells from the bone marrow to the liver, many of which could become macrophages as well, which further multiplies the inflammation, liver cell damage and death.

"This creates a loop," says Horuzsko, of increased activity on many fronts. "This creates chronic inflammation - with no bacterium or virus involved - which is important to the development of liver disease."

As liver cells die in the face of chronic inflammation, they release their innards, called damage associated molecular patterns, or DAMPs, when they get outside the cells. DAMPs further activate TREM-1 on the macrophages and the damaging momentum builds, he says.

That's where collagen and fibrosis set in. Stellate cells in the liver are normally quiescent and mainly store vitamin A. When TREM-1 gets activated on the macrophages, it also activates the macrophages themselves which, in turn, activate stellate cells.

Stellate cells literally change their shape, release vitamin A and start to make collagen. Collagen is a component of connective tissue that typically helps hold tissues and blood vessels together and aids wound healing. The liver already has some collagen, but in this scenario too much gets deposited and liver function suffers.

"Efficiency goes down and it causes additional damage to liver cells that already have been damaged by something like hepatitis or obesity," Horuzsko says. The liver of a patient with cirrhosis, for example, is overrun with collagen, he notes.

Blood levels of the enzymes alanine aminotransferase, or ALT, and aspartate aminotransferase, or AST, are indicators of liver injury and both went up and remained high in their models. However, in mice where TREM-1 was knocked out, rates went up only short-term before returning to pre-injury levels, another indicator of TREM-1's role in persistent inflammation and resulting damage, Horuzsko says.

They also found that while mice with and without TREM-1 both recruited additional immune cells, such as more macrophages and monocytes, from their bone marrow immediately after the injury, 72 hours later the levels were much higher both in the blood and livers of the mice that also still had TREM-1.

To look further at the role of Kupffer cells when TREM-1 is out of the picture, they first removed the cells from both mice models, then gave Kupffer cells that contained TREM-1 back to both, and both were able to cause localized damage and recruit immune cells from the marrow to further bolster inflammation. But when they put TREM-1-deficient Kupffer cells back in normal mice, the exaggerated inflammation and liver damage did not happen.

Likewise, they found markedly increased infiltration of TREM-1 expressing cells in patients with liver fibrosis.

"TREM-1 is a molecule that can be very dangerous and is normally very controlled in the body," Horuzsko says. In fact, one of the diagnostic criteria for body-wide infection, or sepsis, is the level of TREM-1 protein in the fluid portion of a patient's blood. And, in hepatitis B related liver cancer in humans, high levels of TREM-1 expression on stellate cells are considered an indicator of poor prognosis.

"The balance in our body is very, very tightly regulated and important. Alcohol, obesity, hepatitis viruses all change the balance," Horuzsko says.

The scientists suspect their findings of TREM-1 gone wild will hold true in other organs including the lungs, heart and kidneys, which also have TREM-1 on their macrophages.

Liver cancer rates have risen dramatically in the United States, 43 percent in men and 40 percent in women, from 2000-16, according to a report released this summer by the Centers for Disease Control and Prevention. In May 2017, the CDC reported that newly reported cases of hepatitis C tripled between 2010-15, and the American Cancer Society says liver cancer rose from the ninth to the sixth leading cause of cancer death from 2000-16.

The most common causes of liver cancer include infection with the hepatitis B or C virus, heavy alcohol use, obesity and diabetes, according to the CDC.

The liver is part of the gastrointestinal tract and filters blood coming from the GI tract before the blood circulates to the rest of the body. Its many functions include secreting bile, which helps us absorb fats and eliminate waste; producing cholesterol, triglycerides and blood clotting factors; and detoxifying chemicals. The liver is the heaviest solid organ in the body and sits on the right side of the body behind the lower ribs.

Credit: 
Medical College of Georgia at Augusta University

Beluga whales and narwhals go through menopause

Scientists have discovered that beluga whales and narwhals go through the menopause - taking the total number of species known to experience this to five.

Aside from humans, the species now known to experience menopause are all toothed whales - belugas, narwhals, killer whales and short-finned pilot whales.

Almost all animals continue reproducing throughout their lives, and scientists have long been puzzled about why some have evolved to stop.

The new study, by the universities of Exeter and York and the Center for Whale Research, suggests menopause has evolved independently in three whale species (it may have evolved in a common ancestor of belugas and narwhals).

"For menopause to make sense in evolutionary terms, a species needs both a reason to stop reproducing and a reason to live on afterwards," said first author Dr Sam Ellis, of the University of Exeter.

"In killer whales, the reason to stop comes because both male and female offspring stay with their mothers for life - so as a female ages, her group contains more and more of her children and grandchildren.

"This increasing relatedness means that, if she keeps having young, they compete with her own direct descendants for resources such as food.

"The reason to continue living is that older females are of great benefit to their offspring and grand-offspring. For example, their knowledge of where to find food helps groups survive."

The existence of menopause in killer whales is well documented due to more than four decades of detailed study.

Such information on the lives of belugas and narwhals is not available, but the study used data on dead whales from 16 species and found dormant ovaries in older beluga and narwhal females.

Based on the findings, the researchers predict that these species have social structures which - as with killer whales - mean females find themselves living among more and more close relatives as they age.

Research on ancestral humans suggests this was also the case for our ancestors. This, combined with the benefits of "late-life helping" - where older females benefit the social group but do not reproduce - may explain why menopause has evolved.

Senior author Professor Darren Croft said: "It's hard to study human behaviour in the modern world because it's so far removed from the conditions our ancestors lived in.

"Looking at other species like these toothed whales can help us establish how this unusual reproductive strategy has evolved."

Although individuals of many species may fail to reproduce late in life, the researchers looked for evidence of an "evolved strategy" where females had a significant post-reproductive lifespan.

Credit: 
University of Exeter

Percutaneously reducing secondary mitral regurgitation in heart failure appears futile

Munich, Germany - 27 Aug 2018: Percutaneously reducing secondary mitral regurgitation appears futile when tested in all heart failure patients, according to late breaking research presented today in a Hot Line Session at ESC Congress 2018 and published in the New England Journal of Medicine.1

European guidelines state that percutaneous edge-to-edge repair may be considered in patients with secondary mitral regurgitation and heart failure considered high risk for open heart surgery.2 The intervention involves inserting one or more clips through the femoral vein then attaching the two mitral valve leaflets together so that they close more effectively.

Professor Jean-Francois Obadia, principal investigator, Civil Hospices of Lyon, France, said: "We show for the first time that despite reducing secondary mitral regurgitation, percutaneous repair of the mitral valve does not improve survival or symptoms, or reduce heart failure hospitalisations compared to standard medical treatment alone. This strongly suggests that this procedure is futile in patients with heart failure and secondary mitral regurgitation."

Around 2% of adults in developed countries have heart failure, rising to more than 10% of people over 70 years of age.3 Typical symptoms include breathlessness, ankle swelling, and fatigue, caused by the heart's inability to pump enough blood out of the left ventricle to meet the body's needs.

In the advanced stages of heart failure the left ventricle dilates, causing the mitral valve to close insufficiently and incorrectly allow blood to leak back into the left atrium (called mitral regurgitation). It is labelled "secondary" because the mitral valve is structurally normal, but does not work properly due to a dilated left ventricle. Although associated with a worse prognosis, there is no evidence that reducing secondary mitral regurgitation improves survival.3

The MITRA.fr study examined whether this procedure could reduce the rate of all-cause death or unscheduled hospitalisation for heart failure over 12 months compared to optimal medical treatment alone.

A total of 304 patients with symptomatic heart failure, poor left ventricular function (left ventricular ejection fraction 15-40%), and severe secondary mitral regurgitation were enrolled from 37 hospitals in France. Patients received optimal medical treatment with an angiotensin-converting enzyme inhibitor (if not tolerated, an angiotensin receptor blocker was substituted), a beta-blocker, a mineralocorticoid receptor antagonist, and a diuretic.

Patients were then randomly assigned to undergo percutaneous mitral valve repair or no intervention (control group). Patients were followed up for 12 months.

More than 90% of procedures were performed successfully and there were no significant safety concerns. Mitral regurgitation was substantially reduced in patients who underwent the procedure compared to those who received medical therapy alone. Nevertheless, there was no statistically significant difference in the primary endpoint of death or unscheduled heart failure rehospitalisation at 12 months, which occurred in 55% of patients receiving the intervention and 52% of patients treated with medical therapy alone (p=0.53).
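The null result can be sanity-checked with back-of-envelope arithmetic. A crude two-proportion z-test on the reported event rates, assuming the 304 patients split roughly evenly between arms, likewise finds no significant difference; this is not the trial's own statistical model, which reported p=0.53:

```python
# Rough two-proportion z-test (normal approximation) on the reported
# 12-month event rates. Arm sizes of 152 each are an assumption based
# on the 304 enrolled patients; the trial's own analysis gave p=0.53.
import math

def two_prop_z(p1: float, n1: int, p2: float, n2: int) -> float:
    """Two-sided p-value for H0: equal event proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail

p = two_prop_z(0.55, 152, 0.52, 152)
print(f"approximate p = {p:.2f}")  # well above 0.05: no significant difference
```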

Professor Obadia said: "The reduction in mitral regurgitation that was achieved with the intervention did not translate into a clinical benefit in patients with heart failure. There is therefore no reason to perform this procedure in all patients with heart failure and secondary mitral regurgitation. Other randomised trials could examine whether there are subgroups that might be more suitable candidates."

Credit: 
European Society of Cardiology