Chronic kidney disease patients face continual, significant gaps in care

Patients with chronic kidney disease (CKD) have a high prevalence of uncontrolled hypertension and diabetes, as well as statin use below the recommended guidelines for cholesterol control, according to a study by researchers at UC San Francisco.

While effective treatments exist for the more than 30 million Americans with CKD, nearly 50 percent of such patients analyzed from 2006 to 2014 continued to suffer from uncontrolled hypertension and 40 percent from uncontrolled diabetes, the researchers said. The study appears online July 11, 2019, in the Clinical Journal of the American Society of Nephrology.

"Even when physicians are aware of a patient's CKD diagnosis, there are substantial gaps in quality of care," said lead author Sri Lekha Tummalapalli, MD, MBA, a nephrology fellow at UCSF. "The lack of improvement over a decade highlights a more urgent need for CKD-specific quality measures and the implementation of quality-improvement interventions."

CKD, a condition of reduced kidney function or kidney damage, affects about 13.6 percent of the U.S. adult population and is expected to grow to 14.4 percent by 2020 and 16.7 percent by 2030. Combined with end-stage renal disease (ESRD), it results in high morbidity, mortality and health care costs. According to the U.S. Renal Data System, among fee-for-service Medicare patients, total medical costs in 2016 exceeded $79 billion for CKD and another $35 billion for ESRD patients.

However, CKD management is complex, involving multiple interventions to protect patient health and prevent kidney failure, such as lifestyle changes and/or medications that control hypertension, high cholesterol and diabetes.

In the study, Tummalapalli and her colleagues used the National Ambulatory Medical Care Survey to review visits by CKD patients to office-based outpatient practices over a nine-year period. They reviewed blood pressure measurement, uncontrolled hypertension and uncontrolled diabetes, as well as the use of certain medications in patients with hypertension, statin use in patients aged 50 years and older, and nonsteroidal anti-inflammatory drug (NSAID) use.

Overall, they assessed 7,099 visits for CKD patients. No statistically significant difference was found in the prevalence of uncontrolled hypertension over time: 46 percent in 2006-2008 to 48 percent in 2012-2014. They also found that 40 percent of the patients had uncontrolled diabetes in 2012-2014. NSAID use recorded in the medical record was low, averaging 3 percent from 2006 to 2014. Statin use in CKD patients aged 50 years and older with high cholesterol was low and statistically unchanged during the study period, from 29 percent in 2006-2008 to 31 percent in 2012-2014, despite guidelines for their use by the American College of Cardiology and the American Heart Association.

The researchers cited a lack of dedicated, CKD-specific quality metrics and insufficient knowledge of specific guidelines as reasons for the overall poor quality of CKD care. Low rates of nephrology referral may further drive decreased adherence to quality indicators, they said, along with payment models and care delivery systems that do not support population health-based interventions.

Most CKD is treated in primary care settings, so efforts towards improved CKD management must involve primary care physicians as a central component of multispecialty care teams, Tummalapalli said, while addressing their limited time and competing demands.

"Chronic disease management in all patients, and particularly those with CKD, is essential to slow disease progression and reduce the risk of kidney failure and cardiovascular events," Tummalapalli said. "Improving the control of hypertension and diabetes is extremely challenging and requires multifaceted efforts to deliver care more effectively and support lifestyle modification and medication adherence. Building these systems that are efficient and scalable will be the task of health care over the next decade."

Credit: 
University of California - San Francisco

AGS commends bipartisan leaders on bringing training legislation closer to law

image: Founded in 1942, the American Geriatrics Society (AGS) is a nationwide, not-for-profit society of geriatrics healthcare professionals that has -- for more than 75 years -- worked to improve the health, independence, and quality of life of older people. Our nearly 6,000 members include geriatricians, geriatric nurses, social workers, family practitioners, physician assistants, pharmacists, and internists. The Society provides leadership to healthcare professionals, policymakers, and the public by implementing and advocating for programs in patient care, research, professional and public education, and public policy. For more information, visit AmericanGeriatrics.org.

Image: 
(C) 2019, American Geriatrics Society

New York (July 11, 2019)--As members of the House Committee on Energy & Commerce move to debate, amend, and revise a host of important health proposals, the American Geriatrics Society (AGS) again pledged enthusiastic support for one of the Committee's most important bills under consideration: The Educating Medical Professionals and Optimizing Workforce Efficiency and Readiness (EMPOWER) for Health Act of 2019 (H.R. 2781).

Introduced by Congresswoman Jan Schakowsky (D-Ill.) and House Energy & Commerce Health Subcommittee Ranking Member Michael Burgess (R-Texas) earlier this year, the bill reauthorizes workforce training programs under Title VII of the Public Health Service Act. Among these initiatives are the Geriatrics Workforce Enhancement Program (GWEP) and the Geriatrics Academic Career Awards (GACAs), both critical to the care all Americans need as our country continues to age.

The GWEPs educate and engage the broader frontline workforce and family caregivers, and focus on opportunities to improve the quality of care delivered to older adults--particularly in underserved and rural areas. The GACAs represent an essential complement to the GWEP. Grounded in health professions education, GACAs ensure we can equip early career clinician-educators to become leaders in geriatrics training and research.

The EMPOWER for Health Act would authorize funding of $51 million annually through 2024, allowing current and future awardees to:

Educate and engage with family caregivers by training providers who can assess and address their care needs and preferences.

Promote interprofessional team-based care by transforming clinical training environments to integrate geriatrics and primary care delivery systems.

Improve the quality of care delivered to older adults by providing education to families and caregivers on critical care challenges such as Alzheimer's disease and related dementias.

Support clinician-educators engaged in geriatrics education and research to develop the next generation of innovators to improve care outcomes and care delivery.

Through a legislative process known as "markup," today the Health Subcommittee of the House Committee on Energy & Commerce reviewed and unanimously approved, by voice vote, the bill's language and proposals, moving it one step closer to becoming law.

"Representatives Schakowsky and Burgess introduced an incredibly strong draft with H.R. 2781, in large part because they worked so collaboratively with expert partners like the AGS," noted AGS Chief Executive Officer Nancy Lundebjerg, MPA. "We commend them for their support of the geriatrics workforce training programs and for their ongoing efforts to improve care of older Americans."

Introduced in the U.S. House of Representatives on May 16, the EMPOWER for Health Act draws considerable insights from the Eldercare Workforce Alliance (EWA), a collaborative comprised of more than 30 member organizations co-convened by the AGS. Like EWA itself, the EMPOWER for Health Act under consideration by the Health Subcommittee now reflects the diverse expertise of millions of health professionals who support older Americans.

"The future we're working for at the AGS--a future when all older Americans have access to high-quality, person-centered care--begins by building the workforce to make that possible, and by ensuring that workforce can connect us to the tools and supports we need as we age," concluded Lundebjerg, MPA. "We look forward to a very-near future when the EMPOWER for Health Act can make that possible when it becomes law."

Credit: 
American Geriatrics Society

Finger-prick blood test could safely reduce antibiotic use in patients with COPD

A simple finger-prick blood test could help prevent unnecessary prescribing of antibiotics for people with the lung condition chronic obstructive pulmonary disease (COPD), according to a new study by researchers from Cardiff University, University of Oxford and King's College London.

With funding from the National Institute for Health Research, the team demonstrated that using a C-reactive protein (CRP) finger-prick blood test resulted in 20% fewer people using antibiotics for COPD flare-ups.

Importantly, this reduction in antibiotic use did not have a negative effect on patients' recovery over the first two weeks after their consultation at their GP surgery, or on their well-being or use of health care services over the following six months.

Safely reducing the use of antibiotics in this way may help in the battle against antibiotic resistance.

More than a million people in the UK have COPD, a lung condition associated with smoking and other environmental pollutants. People living with the condition often experience exacerbations, or flare-ups, and when this happens, three out of four are prescribed antibiotics. However, two-thirds of these flare-ups are not caused by bacterial infections, and antibiotics often do not benefit patients.

Professor Nick Francis, from Cardiff University's School of Medicine, said: "Governments, commissioners, clinicians, and patients living with COPD around the world are urgently seeking tools to help them know when it is safe to withhold antibiotics and focus on treating flare-ups with other treatments.

"This is a patient population that are often considered to be at high risk from not receiving antibiotics, but we were able to achieve a reduction in antibiotic use that is about twice the magnitude of that achieved by most other antimicrobial stewardship interventions, and demonstrate that this approach was safe."

The finger-prick test measures the amount of C-reactive protein, a marker of inflammation that rises rapidly in the blood in response to serious infections. People with a COPD flare-up who have a low CRP level in the blood appear to receive little benefit from antibiotic treatment.

Professor Chris Butler, from the University of Oxford, said: "This rigorous clinical trial speaks directly to the pressing issues of: preserving the usefulness of our existing antibiotics; the potential of stratified, personalised care; the importance of contextually appropriate evidence about point-of-care testing in reducing unnecessary antibiotic use; and enhancing the quality of care for people with the common condition of chronic obstructive pulmonary disease.

"Most antibiotics are prescribed in primary medical care, and many of these prescriptions do not benefit patients: point of care testing is being vigorously promoted as a critical solution for better targeted antibiotic prescribing. However, there have been virtually no trials of point of care tests that measure impact on clinician behaviour, patient behaviour and patient outcomes. Acute exacerbations of chronic pulmonary disease account for considerable proportion of unnecessary antibiotic use, but a good solution to the problem in ambulatory care (where most of the antibiotics are prescribed) has not been identified until now. Ours is the first trial of biomarker guided management of AECOPD in ambulatory care, and has found an effect that should be practice-changing."

Jonathan Bidmead and Margaret Barnard were the patient and public representatives on the PACE study, providing a voice for patients with COPD. Bidmead commented: "We need to highlight not only how many people are saved by antibiotics but also that many are harmed through unnecessary antibiotic use. As a COPD sufferer, I know that antibiotics are routinely used at the first sign of an exacerbation. This study has shown that doctors can use a simple finger-prick test in a consultation to better identify those instances where antibiotics will probably do no good and may even do some harm. This can help us focus on other treatments that may be more helpful for some exacerbations."

Professor Hywel Williams, Director of the NIHR's Health Technology Assessment (HTA) Programme, said: "This is a really important study which provides clear evidence that a simple biomarker blood test, carried out in GP surgeries on people with chronic obstructive pulmonary disease experiencing flare-ups, has the potential to reduce unnecessary prescribing of antibiotics without adversely affecting recovery from these flare-ups. This in turn helps tackle the wider global health hazards of antimicrobial resistance (AMR).

"The NIHR is committed to research in areas of greatest health need, such as AMR. This study is one of a number which we have funded over the last few years in this crucial area, in our sustained effort to tackle this worldwide threat."

Credit: 
Cardiff University

Study suggests arthroscopy more effective than MRI for chondral defects of the knee

Using arthroscopy to stage a lesion in the chondral area of the knee is more accurate than magnetic resonance imaging, according to researchers from the Rothman Institute, La Jolla, Calif. The findings were presented today at the American Orthopaedic Society for Sports Medicine Annual Meeting.

Chondral injuries of the knee are a common source of pain in athletes, but one of the main methods of diagnosing and staging these injuries, MRI, has a specificity of 73 percent and a sensitivity of 42 percent. Using arthroscopy to stage the degree of the injury is a more accurate way to evaluate the knee prior to surgery.
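For context, sensitivity and specificity have standard definitions in terms of true and false positives and negatives; the formulas below are those conventional definitions, not additional results from this study:

\[
\text{sensitivity} = \frac{TP}{TP + FN}, \qquad \text{specificity} = \frac{TN}{TN + FP}
\]

On those definitions, a sensitivity of 42 percent means MRI misses more than half of the chondral lesions that are actually present, while a specificity of 73 percent means roughly a quarter of lesion-free knees are incorrectly read as positive.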

The doctors reviewed 98 patients who had autologous chondrocyte implantation (ACI), osteochondral allograft transplantation (OCA) and meniscus allograft transplantation (MAT).

"Based on our review, a change in treatment plan was made in 47 percent of cases in which staging arthroscopy was used to evaluate articular cartilage surfaces," said lead researcher Dr. Hytham S. Salem of Rothman Institute.

Arthroscopy is performed after a standard sterile skin preparation and involves injecting local anesthetic subcutaneously at the portal sites and within the knee joint. It is often performed in the office while patients are awake and alert.

"The results of our study indicate that staging arthroscopy is an important step in determining the most appropriate treatment plan for chondral defects prior to OCA, ACI and MAT," Salem said. "Addressing all knee's pathology can be important for the success of cartilage restoration surgery, and treatment plans may change based on the extent and location of cartilage damage."

Credit: 
American Orthopaedic Society for Sports Medicine

Mosquito surveillance uncovers new information about malaria transmission in Madagascar

Riley Tedrow, PhD, a medical entomologist at Case Western Reserve University School of Medicine, has uncovered new findings about malaria transmission in Madagascar. In a recent study published in PLoS Neglected Tropical Diseases, he also describes real-world application of an effective mosquito surveillance strategy using low cost traps and a recently reported tool that simultaneously tests each mosquito for its species, what it fed on, and the presence of malaria parasites.

Conducting research in remote villages in Madagascar, Tedrow discovered that female Anopheles mosquitoes, the only mosquitoes that can transmit malaria, bite more often and have more varied diets than typically assumed. These findings could result in better understanding of how the disease is transmitted as well as enhance malaria-prevention strategies.

Specifically, Tedrow found that feeding behavior in the mosquitoes that he collected frequently showed evidence of multiple blood-meal hosts (single host = 53.6%, two hosts = 42.1%, three hosts = 4.3%). The predominant mosquito host was cow, followed by pig, then human.

Additionally, he discovered that the propensity for mosquitoes to feed on humans increased from 27% to 44% between December 2017 and April 2018, when he conducted the study. This suggests that host preferences could vary from season to season, again raising implications for surveillance and eradication campaigns.

Tedrow also found that certain species of Anopheles mosquitoes that are typically considered less important for malaria transmission, and therefore more likely to be overlooked in surveillance and eradication campaigns, were often infected with the Plasmodium parasite that causes malaria. "This hidden reservoir of malaria parasites could hinder malaria eradication," said Tedrow. "The strategy used in this study could easily be adapted to other countries at risk for malaria, possibly uncovering equally complex transmission dynamics that may impact our approach to disease control."

In the same study Tedrow reports that QUEST, his modified, outdoor-based, tennis-net-sized trap, can supplement current mosquito-control interventions, which focus on indoor sources of malaria. "Outdoor trapping can pick up species that other sampling methods might miss out on," he said.

In addition, Tedrow describes his application in Madagascar of BLOODART, a tool he developed that combines an existing malaria test with new host and mosquito-species analysis techniques. BLOODART enables efficient evaluation of hundreds of mosquitoes by simultaneously identifying the species of each mosquito, determining what it has fed on, and diagnosing the presence of malaria parasites, all from a single mosquito abdomen.

Despite intensive international efforts to combat the malady, there were 219 million cases of malaria worldwide and 435,000 subsequent deaths in 2017, with most (92% and 93%, respectively) occurring in Africa.

Credit: 
Case Western Reserve University

Adding immunotherapy after initial treatment can benefit metastatic lung cancer patients

image: Treating metastatic non-small cell lung cancer (NSCLC) patients with the immunotherapy drug pembrolizumab after they've completed locally ablative therapy almost tripled the median progression-free survival (PFS) compared to the historical average.

Image: 
Penn Medicine

PHILADELPHIA - Treating metastatic non-small cell lung cancer (NSCLC) patients with the immunotherapy drug pembrolizumab after they've completed locally ablative therapy - meaning all known sites of their cancer were either treated with surgery, radiation, or other definitive treatments - almost tripled the median progression-free survival (PFS) compared to the historical average. Research from the Abramson Cancer Center at the University of Pennsylvania found the median PFS of study participants was 19.1 months, compared to 6.6 months. JAMA Oncology published the findings today.

Lung cancer is, by far, the leading cause of cancer death in America, and NSCLC is the most common type. Chemotherapy is the standard treatment, but recent data have shown that patients with limited metastatic disease may have more options.

"Multiple trials have shown that if we use these definitive treatment techniques on all visible tumor sites, patients can end up with better outcomes than what they would get with chemotherapy alone, so our trial went one step further and added immunotherapy," said lead author Joshua M. Bauml, MD, an assistant professor of Hematology-Oncology in Penn's Perelman School of Medicine. The study's senior author was Corey J. Langer, MD, a professor of Hematology-Oncology at Penn.

For this study, 45 NSCLC patients with four or fewer metastatic sites underwent locally ablative therapy, then received pembrolizumab. In addition to the significant increase in overall median PFS, the median PFS from the start of pembrolizumab was 18.7 months. Importantly, the treatment did not lead to any new safety issues or decreases in patient quality of life.

"Our understanding of which metastatic patients may benefit from curative therapies as opposed to palliative therapies is still evolving, but our data show promise that the addition of immunotherapy can bring make a difference," Bauml said.

Researchers say the approach needs further study and that they are still evaluating the impact of this combination on overall survival. However, they note that this study accrued 45 patients from February 2015 through September 2017, a significant number for a single site and evidence that a larger, multicenter, randomized controlled trial to test this approach is feasible.

Credit: 
University of Pennsylvania School of Medicine

Study questions if tongue-tie surgery for breastfeeding is always needed

New research raises questions as to whether too many infants are getting tongue-tie and lip tether surgery (also called frenulectomy) to help improve breastfeeding, despite limited medical evidence supporting the procedure. In a new study, published July 11 in JAMA Otolaryngology--Head & Neck Surgery, nearly 63 percent of children who were referred to a pediatric ear, nose and throat (ENT) surgeon for tongue-tie and/or upper lip tether surgery ended up not needing the procedure, and were able to successfully breastfeed following a thorough feeding evaluation from a multidisciplinary team of clinicians, including a speech-language pathologist. A feeding evaluation program implemented on a wider scale may prevent infants from getting a surgery that might not be beneficial to improve breastfeeding, according to the study's authors.

"We have seen the number of tongue-tie and upper lip tether release surgeries increase dramatically nationwide without any real strong data to show these are effective for breastfeeding," says Christopher J. Hartnick, MD, MS, Director of the Division of Pediatric Otolaryngology and the Pediatric Airway, Voice and Swallowing Center at Massachusetts Eye and Ear. "We don't have a crystal ball that can tell us which infants might benefit most from the tongue-tie or upper lip release, but this preliminary study provides concrete evidence that this pathway of a multidisciplinary feeding evaluation is helping prevent babies from getting this procedure."

Tongue-tie, or ankyloglossia, is a condition an infant is born with wherein a piece of tissue, called the lingual frenulum, connects too tightly from the tongue to the floor of the mouth. Infants can also experience upper lip ties, in which a different tissue, the frenulum of the upper lip, is attached too tightly to the gum. In some cases, this restriction in movement can result in difficulty with breastfeeding or, in rarer cases, may affect dental health or speech later in childhood.

Breastfeeding is recommended by numerous health organizations worldwide as a preferred method of infant feeding for the newborn's growth and development. When there are difficulties breastfeeding, including the baby not latching on, or gaining weight, or when the mother is in pain, many new parents seek a consultation, which may result in surgery to clip the tongue tie, sometimes called a frenotomy, frenectomy, or frenulectomy.

Despite a lack of medical literature linking the surgery to improved breastfeeding, the number of these procedures has been rapidly rising in recent years, the authors point out, noting that the Kids' Inpatient Database in the United States estimated a 10-fold increase in tongue-tie surgeries from 1,279 in 1997 to 12,406 in 2012.

Prompted by these rising rates and an influx of parents seeking second opinions, Dr. Hartnick and colleagues formed a multidisciplinary feeding evaluation program at the Pediatric Airway, Voice, and Swallowing Center at Mass. Eye and Ear, consisting of clinicians from different medical specialties such as pediatric otolaryngology, pulmonology, gastroenterology, and speech language pathology, including staff from Massachusetts General Hospital.

The researchers examined 115 newborns who were referred to the clinic for tongue-tie surgery with a pediatric ENT. There, each mother-newborn pair met with a pediatric speech-language pathologist, who performed a comprehensive feeding evaluation including clinical history, an oral exam and observation of breastfeeding. The pathologist then offered real-time feedback and strategies to address the hypothesized cause of their breastfeeding challenges.

Following the multidisciplinary feeding evaluation, 62.6 percent of the newborns did not undergo the surgeries. Although all of the referrals were for tongue-tie surgery specifically, 10 (8.7 percent) underwent lip-tie surgery alone and 32 (27.8 percent) underwent both lip- and tongue-tie surgery.

Future multicenter trials are planned, and the researchers also plan follow-up outcomes studies comparing infants who did and did not undergo tongue-tie surgery longer term.

The study's authors call for best practice guidelines to be developed to help with this decision-making throughout the medical community.

"We've learned that an interdisciplinary collaboration is key to a thorough feeding evaluation " says study co-author Cheryl Hersh, MS, CCC-SLP, a pediatric speech-language pathologist at MassGeneral Hospital for Children, who sees patients at the Mass. Eye and Ear Pediatric Airway, Voice, and Swallowing Center. "This is still a work in progress, but we have learned a great deal about what we can do differently to help our patients and their families. In doing so, we have been able to identify many babies who are having breastfeeding problems that are not related to their lip and tongue anatomy.

Tongue-tie and upper lip tether release surgeries are relatively safe outpatient procedures performed with local anesthetic, with risks similar to those of any surgical procedure, including pain and infection. Parents have also reported experiencing psychological pain, or guilt, from the feeding difficulties and the resulting surgery. There can also be significant out-of-pocket costs: the procedure is performed by a range of providers, including dental professionals, pediatric otolaryngologists, and neonatologists, with often unpredictable cost and insurance coverage.

Credit: 
Mass Eye and Ear

Cincinnati researchers say early puberty in girls may be 'big bang theory' for migraine

image: Vincent Martin, MD, professor at the University of Cincinnati College of Medicine.

Image: 
Joe Fuqua/University of Cincinnati

CINCINNATI--Adolescent girls who reach puberty at an earlier age may also have a greater chance of developing migraine headaches, according to new research from investigators at the University of Cincinnati (UC) College of Medicine.

"We know that the%age of girls and boys who have migraine is pretty much the same until menstruation begins," says Vincent Martin, MD, professor in the Division of General Internal Medicine and director of the Headache and Facial Pain Center at the UC Gardner Neuroscience Institute. "When the menstrual period starts in girls, the prevalence goes way up, but what our data suggests is that it occurs even before that."

The findings will be presented by Martin at the American Headache Society 61st Annual Scientific Meeting Saturday, July 13, in Philadelphia.

Nationally, about 10% of school age children suffer from migraine, according to the Migraine Research Foundation (MRF). As adolescence approaches, the incidence of migraine increases rapidly in girls, and by age 17, about 8% of boys and 23% of girls have experienced migraine, the MRF reports.

Martin and a team of researchers were part of a longitudinal study looking at 761 adolescent girls from sites in Cincinnati, New York and the San Francisco Bay area. The girls ranged in age from 8 to 20, and the study took place over a 10-year period beginning in 2004. Girls enrolled in the study at age 8-10 were examined during study visits every six to 12 months. Researchers determined when they showed initial signs of thelarche (breast development), pubarche (pubic hair growth) and menarche (start of menstrual periods).

Girls answered a headache questionnaire to find out if they suffered from migraine headache, no migraine or probable migraine--the latter is defined as meeting all the diagnostic criteria for migraine except one. The average age at which they completed the survey was 16.

Of those surveyed, 85 girls (11%) were diagnosed with migraine headache while 53 (7%) had probable migraine and 623 (82%) had no migraine, according to Martin, also a UC Health physician specializing in migraine.

Researchers found that girls with migraine had an earlier age of thelarche (breast development) and the onset of menarche (menstrual periods) than those with no migraine. On average breast development occurred four months earlier in those with migraine while menstruation started five months earlier. There was no difference in the age of pubarche (pubic hair development) between those with migraine and no migraine.

"There was a 25% increase in the chance of having migraine for each year earlier that a girl experienced either thelarche or menarche," says Susan Pinney, PhD, professor in the UC Department of Environmental Health and lead investigator on the study. "This suggests a strong relationship between early puberty and the development of migraine in adolescent girls."

The age of onset of thelarche, pubarche or menarche did not differ between those with probable migraine and no migraine, says Pinney.

Previous research suggests that migraine often starts with the onset of menstrual cycles during menarche in adolescent girls. But this study looks at earlier stages of puberty such as thelarche and pubarche, explains Martin.

"To suggest the origins of migraine may occur actually before menstrual periods begin is pretty novel," says Martin. "At each of these stages, different hormones are starting to appear in girls. During pubarche, testosterone and androgens are present, and during thelarche, there is the very first exposure to estrogen. Menarche is when a more mature hormonal pattern emerges. Our study implies that the very first exposure to estrogen could be the starting point for migraine in some adolescent girls. It may be the Big Bang Theory of migraine."

So is there anything that one can do to prevent an early puberty?

"Studies suggest that childhood obesity is associated with early puberty," says Martin, who is also president of the National Headache Foundation. "Keeping your weight down might prevent the early onset of puberty. Future studies will need to be done to determine if strategy will decrease also the likelihood of developing migraine."

Credit: 
University of Cincinnati

Scientific statement on predicting survival for cardiac arrest survivors

To better facilitate research on appropriately determining prognosis after cardiac arrest and to establish better treatments for recovering from brain injury, a working group composed of a Johns Hopkins Medicine physician and American Heart Association (AHA) experts has released a scientific statement that provides best practices on how to predict recovery in comatose survivors. The statement was released in the July 11 issue of Circulation.

At this time, there aren't any rules or set criteria for how to carry out a study to predict recovery. Because current practice rests on low-quality, flawed research, policy-driven decisions may result in prediction errors: a poor outcome may be forecast for patients who could have a good outcome, or vice versa. Moreover, the lack of standards for predicting outcomes has made it all but impossible to properly study therapies that could potentially heal the brain and the rest of the body after resuscitation from cardiac arrest.

To develop this scientific statement, the AHA Emergency Cardiovascular Care Science Subcommittee formed an international panel of experts in the adult and pediatric specialties of neurology, cardiology, emergency medicine, intensive care medicine and nursing. The group's goal is for the clinical research community to develop an accurate, precise clinical test for most patients after resuscitation from a cardiac arrest to determine likely prognosis.

"We owe it to patients and families to ensure we are doing the best to both not prolong unnecessary suffering while balancing that with not withdrawing care too soon if the person has the potential to recover with a reasonably good quality of life," says Romergryko Geocadin, M.D., the chair of the expert panel and professor of neurology, neurosurgery, and anesthesiology and critical care medicine at the Johns Hopkins University School of Medicine. "At the current state of affairs, we have to acknowledge the limitations in our practices in this area because we don't have high-quality science to back our decision-making."

According to the statement, about 8% of the more than 320,000 people who have cardiac arrest outside of a health care setting in the U.S. are released from the hospital with a good outcome, whereas the vast majority of resuscitated patients end up in a coma or another state of impaired consciousness due to brain injury. Most of the deaths are attributed to brain injury, yet only 10% of these patients show clinical signs of brain death. Most die after being removed from life support because it's predicted that they will have little brain function and will most likely not recover.

Currently, many physicians wait 48 hours after a cardiac arrest for a patient to awaken from a coma, and some even opt to wait 72 hours. But due to testing limitations and other confounding factors, such as therapeutic hypothermia, predicting an outcome may be biased and premature.

During a cardiac arrest, there are two stages of brain injury: One is due to lack of oxygen and the other happens, ironically, after blood returns. Healing may not begin until after the patient has cleared this hurdle, which may take at least a week after the cardiac arrest. This further muddies the decision for how long to wait for a patient to awaken. Sedatives may also influence some of the diagnostics that determine brain function, so the authors generally recommend waiting seven days or until after the patient comes off sedatives, whichever happens later.

"One possible reason that every single drug that has been tested in clinical trials to heal brain injury after cardiac arrest may have failed is because the studies are designed to look for these drug effects at 30 or 90 days after successful resuscitation from cardiac arrest, but we don't allow most of the patients time to recover for that period. Instead, early predictions on recovery (within 72 hours) are made based on low quality studies," says Geocadin. "By providing this statement, health care providers can use this as a guide to develop better, more rigorous studies that can inform how to undertake better clinical studies that will lead to better practice medicine and develop helpful treatments for our patients."

The authors reviewed the current diagnostics available and their limitations to test brain function, such as assessing reflexes, stimulating sensory nerves in the arm, measuring pupil dilation after shining a penlight in the eye, using electroencephalogram to evaluate for seizures, applying MRI and computerized tomography brain imaging, and more. By using existing or yet to be developed tools properly in better designed studies, they hope researchers can adopt these procedures or enhance them to create better diagnostics for predicting long-term brain function.

The statement offers clinician researchers parameters for setting up their studies, such as how many people they need to enroll, what statistical methods to use, when to reassess function in those who do recover, how to avoid bias and how to apply protocols consistently.

The statement's final section addresses ethical issues such as respecting patient or family wishes about being on life support and do-not-resuscitate orders. The authors note that quality of life is an important factor and stress that there is currently limited data regarding long-term outcomes after awakening, so more work needs to be done.

Credit: 
Johns Hopkins Medicine

New data on ctDNA as a biomarker for detecting cancer progression presented at ASCO

HERCULES, Calif. - July 11, 2019 - Scientists presented more than 30 abstracts featuring Bio-Rad's Droplet Digital PCR (ddPCR) technology at the American Society of Clinical Oncology (ASCO) Annual Meeting in Chicago, May 31-June 4. Many of these studies used liquid biopsy powered by the ddPCR platform to measure circulating tumor DNA (ctDNA) and evaluate ctDNA's potential as a biomarker for guiding cancer treatment decisions and predicting efficacy.

Detecting and analyzing ctDNA in liquid biopsy samples is gaining recognition as a less invasive method of monitoring disease progression. Researchers are not only testing known mutations in various cancers but are also discovering new mutations that may become important biomarkers. However, wider clinical adoption requires that these ctDNA biomarkers be tested in various cancers and clinical conditions. The studies below highlight efforts to evaluate ctDNA as a potential biomarker vehicle to monitor treatment efficacy, tumor progression, or tumor recurrence.

Mutant BRAF ctDNA is a potential biomarker for advanced melanoma treatment efficacy

There are currently no validated blood-based biomarkers for monitoring treatment efficacy in patients with advanced melanoma. David Polsky, MD, PhD, of NYU Langone Medical Center, and his colleagues are one of the teams that are studying the potential of ctDNA as a biomarker for treatment efficacy in this disease.

The researchers examined ctDNA containing a mutated gene known as BRAF, which is commonly associated with melanoma, in 345 patients undergoing therapy. Using ddPCR technology, Polsky's team was able to determine that baseline concentration of ctDNA with this mutation negatively correlated with patient survival. Additionally, Polsky found that patients survived longer when their levels of mutated BRAF ctDNA dropped to undetectable levels after four weeks of treatment.

"The reason we could conduct this quantitative analysis was because the ddPCR platform delivers highly accurate and precise measurements of the amount of mutated DNA in a sample," said Polsky.

ctDNA may not reliably predict progression of melanoma metastases in the brain

Jenny Lee, MD, of Macquarie University in Sydney and the Melanoma Institute Australia, and her colleagues discovered the limitations of using ctDNA to analyze the progression of advanced melanoma when the cancer has metastasized specifically to the brain.

Following up on an earlier study, Lee and her colleagues used ddPCR technology to analyze ctDNA levels in 48 patients with advanced melanoma and brain metastases. All the patients were receiving immune checkpoint therapy. Eight had metastases in their brains but nowhere else. The researchers found that an absence of detectable tumor ctDNA at the start or at an early stage of treatment was a good sign. Specifically, it was associated with superior progression-free survival and overall survival, though this did not apply to responses observed in brain metastases.

As to why that might be the case, Lee noted that the metastases in the brain may have been smaller than those in other parts of the body and that the blood-brain barrier could have filtered out ctDNA before it reached the blood.

The team concluded that though ctDNA could be useful for monitoring treatment efficacy in metastatic melanoma, it may not be helpful in detecting brain metastasis or monitoring whether the brain tumors respond to immune checkpoint therapy. This study has potentially important implications because up to half of melanoma patients exhibit brain metastases at some point in their treatment.

Chimeric cell-free DNA can track tumor recurrence in hepatocellular carcinoma

One third of patients with hepatocellular carcinoma (HCC) experience tumor recurrence within the first year following surgery. Detecting and tracking recurrent tumors is essential for physicians to implement therapeutic trials at the right time.

Most patients with HCC have been infected with hepatitis B virus (HBV) and harbor virus-host (vh) chimera DNA, the result of viral DNA integrating into HCC chromosomes. To determine whether vh-chimera DNA could serve as a more sensitive biomarker of HCC progression than currently available tests, Ya-Chun Wang, PhD, of TCM Biotech International Corporation, and colleagues used ddPCR technology to quantify vh-chimera DNA in plasma samples from patients with HBV-related HCC before and after tumor removal surgery. The researchers detected and quantified vh-chimera DNA before surgery in 44 of 50 patients.

Of the patients whose plasma samples still contained vh-chimera DNA two months after surgery, 82 percent had a tumor recur in the following year. Furthermore, in all but two patients, the vh-chimera DNA matched that of the original clone, indicating that the majority of recurrences come from original HCC cells.

The authors note that the findings support vh-chimera DNA as a biomarker to complement existing ones in detecting the existence of HBV-HCC and in tracking tumor recurrence based on vh-chimera clonalities.

"The ddPCR assay provides an advantage for sensitive quantification of chimera DNA," Wang said. "In our current practice, the target chimera DNA in blood can be detected as low as one to two copies per reaction."

Credit: 
CG Life

Quantum sensor breakthrough using naturally occurring vibrations in artificial atoms

A team of scientists, led by the University of Bristol, have discovered a new method that could be used to build quantum sensors with ultra-high precision.

When individual atoms emit light, they do so in discrete packets called photons.

When this light is measured, this discrete or 'granular' nature leads to especially low fluctuations in its brightness, as two or more photons are never emitted at the same time.

This property is particularly useful in developing future quantum technologies, where low fluctuations are key, and has led to a surge of interest in engineered systems that act like atoms when they emit light, but whose properties are more easily tailored.

These 'artificial atoms', as they are known, are typically made from solid materials and are in fact much larger objects, in which the presence of vibrations is unavoidable and usually considered to be detrimental.

However, a collaborative team, led by the University of Bristol, has now established that these naturally occurring vibrations in artificial atoms can surprisingly lead to an even greater suppression of fluctuations in brightness than that present in natural atomic systems.

The authors, who include academics from the Universities of Sheffield and Manchester, show that these low fluctuations could be used to build quantum sensors that are inherently more accurate than those possible without vibrations.

Their findings are published today in the journal Nature Communications.

Dr Dara McCutcheon, principal investigator of the research and Lecturer in Quantum Engineering from the University of Bristol's School of Physics said: "The implications of this research are quite far reaching.

"Usually one always thinks of the vibrations present in these relatively large artificial atoms as being detrimental to the light they emit, as typically the vibrations jostle the energy levels, with the resulting fluctuations imprinted onto the emitted photons.

"What's happening here though, is that at low temperatures the vibrational environment acts to cool the system - in a sense freezing the energy levels, and in turn suppressing fluctuations on the emitted photons."

This work points towards a new vision for these artificial atoms, in which their solid-state nature is actually put to good use to produce light that couldn't be made using natural atomic systems.

It also opens the door to a new set of applications which use artificial atoms for quantum enhanced sensing, ranging from small scale magnetometry that could be used to measure signals in the brain, all the way up to full-scale gravitational wave detection revealing cosmic processes at the centre of galaxies.

Credit: 
University of Bristol

Redesign of opioid medication management shows impact in rural clinics

In rural practices, a system redesign resulted in declines in both the proportion of patients on high-dose opioids and the number of patients receiving opioids. The "Six Building Blocks," a team-based redesign of opioid medication management within smaller practice settings addressing policy changes, patient agreements, patient tracking, in-clinic support, and success metrics, was implemented in 20 clinic locations across eastern Washington and central Idaho. Among patients aged 21 years and over, there was a 2.2% decline in patients receiving high-dose opioids over a period of 15 months, compared to a 1.3% decline in the control group. Similarly, a 14% decline was observed in the total number of patients receiving opioids in the intervention clinics, compared to a 4.8% decline in the control group. The results indicate that efforts to redesign care by primary care teams, guided by the Six Building Blocks framework, can improve opioid prescribing practices and possibly reduce dependency.

Credit: 
American Academy of Family Physicians

'Traffic light' food labels reduce calories purchased in hospital cafeteria

BOSTON - A new study by Massachusetts General Hospital (MGH) investigators released today in JAMA Network Open, a publication of the Journal of the American Medical Association, showed that labeling food choices in a hospital cafeteria with simple "traffic-light" symbols indicating their relative health value was associated with a reduction in calories purchased by employees, and that the dietary changes were sustained over two years.

In the program, green labels indicated the healthiest foods, yellow labels less healthy foods, and red labels the least healthy, based on positive and negative criteria including whether the main ingredient was a fruit, vegetable, whole grain, and so forth, and the amount of saturated fat.

Researchers used employee ID numbers to track the purchases of 5,695 employees buying food at MGH cafeterias. After establishing a three-month baseline period, the researchers tracked purchases made after the labels were added and again after product-placement changes made healthier choices more accessible. The interventions remained in place at MGH cafeterias, and the study analyzed data over two years after the traffic-light labels were first introduced.

The researchers found that the proportion of green-labeled foods purchased increased while the proportion of the least healthy foods purchased decreased.

The current study, a retrospective analysis using newly available item-level calorie data, associated the labeling with a reduction in calories over the two-year period studied and found that the biggest calorie decreases were seen in red-labeled food purchases. "So that indicates that not only were employees consuming fewer calories at work," said lead author Anne N. Thorndike, MD, MPH, "but also that they were improving the quality of the calories they were purchasing."

For employees who visited the cafeterias most frequently, the estimated reduction in calories equated to a weight loss of up to 2 kg (4.4 pounds) over time. However, Thorndike stressed, "this is not a weight loss program." Data show that people gain an average of one to two pounds per year. "If a program like this could help guarantee that every adult maintained a steady weight rather than continuing to gain," she said, "we could start to reverse the obesity epidemic."

Prior research evaluating the impact of food labeling interventions on calories purchased has been either lab-based or cross-sectional, assessing a single food or meal choice. "The difference with our study is that it looked at real-world purchases by employees over several months," said Thorndike.

A third of the nearly 150 million Americans who are employed are obese, and the prevalence of obesity is increasing across all industries, including healthcare. Obesity and diet-related diseases such as diabetes and cardiovascular disease reportedly contribute to higher absenteeism and lower productivity as well as to approximately $200 billion in healthcare costs nationwide. Employees frequently acquire meals at work, and a recent nationally representative household survey found that workplace food was high in calories from saturated fat and sugars, often consisting of items such as pizza, regular soft drinks, cookies, and brownies. Effective strategies for reducing nonnutritive energy intake during the workday could help address the rising prevalence of obesity in the United States and worldwide.

"More workplaces should be doing these kind of interventions," Thorndike said. "Wellness programs typically end after a certain period, but programs like this, that people are exposed to every day when they go to work, become part of the workplace culture. That's how you get people to make long-term changes."

Thorndike believes the labels helped employees to make the healthier choices they wanted to make. "A red label is a reminder that something is not healthy at the time you're about to make the purchase," she said. "The labels are for people who are trying to make a healthy choice but don't have time to look at the nutrition-facts panel. They want something quick and easy so they can make the choice and get back to work."

Credit: 
Massachusetts General Hospital

Study identifies new potential target in glioblastoma

image: 'The field has postulated for years that cancer stem cells are a small population within the tumor but critical because they mediate treatment resistance and cancer resistance,' said Dr. Sunit Das, a scientist at St. Michael's Keenan Research Centre for Biomedical Science and The Arthur and Sonia Labatt Brain Tumour Research Centre at SickKids. 'We've now found proof of that speculation.'

Image: 
St. Michael's Hospital

TORONTO, July 10, 2019 -- Researchers are hopeful that new strategies could emerge for slowing the growth and recurrence of the most common primary brain cancer in adults, glioblastoma, based on the results of a study published today in Cancer Research.

Research led by Toronto's St. Michael's Hospital and The Hospital for Sick Children (SickKids) suggests the protein ID1 is critical to tumour initiation and growth and also impacts the disease's response to chemotherapy. ID1 is a protein that keeps other genes from being activated or repressed by binding to their activators or inhibitors. In this work, scientists found that ID1 helps maintain cancer stem cells in glioblastoma, making them less susceptible to treatment.

"The field has postulated for years that cancer stem cells are a small population within the tumour but critical because they mediate treatment resistance and cancer resistance," said Dr. Sunit Das, a scientist at St. Michael's Keenan Research Centre for Biomedical Science and The Arthur and Sonia Labatt Brain Tumour Research Centre at SickKids. "We've now found proof of that speculation."

Researchers found that when they "turned off" the protein ID1 in lab models and human cells, using CRISPR technology or pimozide, a drug traditionally used to treat psychosis and Tourette's syndrome, glioblastoma tumour growth slowed. The team also found that turning off the protein altogether helped tumours become less resistant to chemotherapy.

Glioblastoma is an aggressive form of brain cancer, representing 15 per cent of all primary brain tumours, and is often difficult to treat. Therapy generally involves the combination of several approaches to control the disease, but there is currently no cure. The diagnoses of the late Tragically Hip singer Gord Downie and U.S. Sen. John McCain have raised the profile of glioblastoma in recent years.

"The average survival rate for glioblastoma is less than two years and we unfortunately don't have too many options to offer these patients," said Dr. Das, who is also a neurosurgeon at St. Michael's.

"Our findings suggest that we may be able to enhance the effectiveness of therapies we already have, such as chemotherapy, as opposed to taking many years to create entirely new therapies."

In lab models, researchers found that inhibiting ID1 slows the progression of tumours in glioblastoma, breast adenocarcinoma and melanoma. In human tissue, they found that the protein caused cells to be more resistant to chemotherapy treatment in glioblastoma. Turning off this protein using the medication pimozide increased overall survival and caused glioblastoma tumours to recur less frequently, progress less and grow more slowly.

"Targeting the protein with medication may present a novel and potentially promising strategy for patients with glioblastoma," Dr. Das said.

The next steps for this research, Dr. Das explained, are to look at the development of new inhibiting medications for ID1 and commence a trial to ensure that the targeting is effective.

Credit: 
St. Michael's Hospital

Is elevated systolic blood pressure associated with risk for valvular heart disease?

What The Study Did: A group of 329,237 men and women of white British ancestry with genetic data in the UK Biobank and blood pressure measurements were included in a study that examined the association between systolic blood pressure and risk of major valvular heart disease.

Authors: Kazem Rahimi, M.D., F.R.C.P., of the University of Oxford in Oxford, United Kingdom, is the corresponding author.

(doi:10.1001/jamacardio.2019.2202)

Editor's Note: The article includes conflict of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

#  #  #

Media advisory: The full study is linked to this news release.

Embed this link to provide your readers free access to the full-text article. This link will be live at the embargo time: https://jamanetwork.com/journals/jamacardiology/fullarticle/2737872?guestAccessKey=bb1acfca-2e16-424b-9ed4-8d7f39dffcfa&utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_content=tfl&utm_term=071019

Credit: 
JAMA Network