Culture

Improved detection of atrial fibrillation could prevent disabling strokes

image: An implantable device that monitors for atrial fibrillation over 12 months is more than three times more effective at detecting the condition than a 30-day external monitor, and could help prevent disabling strokes in patients, according to an Alberta clinical trial.

Image: 
Supplied

A clinical trial compared two devices for monitoring and detecting atrial fibrillation (AF), or an irregular heartbeat, in ischemic stroke patients: an implantable device that monitors over 12 months and an external device that monitors over a 30-day period. The trial found that the implantable device is more than three times more effective in detecting AF, and that both are a significant improvement over the current standard of care in Alberta, Canada.

The Post-Embolic Rhythm Detection With Implantable Versus External Monitoring (PER DIEM) study, led jointly by University of Alberta and University of Calgary researchers, was published today in the journal JAMA. The findings are expected to significantly change practice in how clinicians look for AF in Albertan patients following ischemic stroke.

"We know that (the current method of monitoring) isn't as effective as it could be in picking up atrial fibrillation from this study because regardless of which arm of the study patients went into, we were picking up anywhere from five to 15 per cent extra atrial fibrillation," said Brian Buck, a stroke neurologist and associate professor of medicine at the U of A. "We found in the study there were a lot of patients with undetected atrial fibrillation, even after they received the standard cardiac monitoring."

Atrial fibrillation causes about one in four strokes in Alberta. Detecting it early is key to preventing further disabling strokes in patients who have already experienced ischemic stroke, a type of stroke caused by a blockage in an artery that supplies blood to the brain. If atrial fibrillation is detected, clinicians have treatments--mainly blood thinners--that can reduce the risk of stroke by almost 70 per cent.

The standard test in Alberta for AF is a 24-hour electrocardiogram monitor. In the PER DIEM trial, 300 Albertan patients who had suffered a stroke were randomized to one of two new devices that can monitor for AF for longer durations. The study showed that the implantable device picked up three times more new AF than the 30-day monitor (15 per cent versus five per cent). All of the patients in the clinical trial with new AF were started on blood thinners.

"We didn't expect that we would get such a dramatic increase with the longer recording, even though it intuitively makes sense," said study co-author Michael Hill, professor of neurology at the University of Calgary and senior medical director for stroke with Alberta Health Services' Cardiovascular and Stroke Strategic Clinical Network. "Most people suspected that detection rates apply to only certain subtypes of ischemic stroke. This study showed that theory is not correct."

"We believe that those patients that were identified with atrial fibrillation are now, for the rest of their lives, going to have a much lower risk of having a stroke in the future," added Buck, who is also a member of the U of A's Neuroscience and Mental Health Institute.

One of the patients who took part in the trial was Norman Mayer, the sitting mayor of the central Alberta community of Camrose for the past 32 years. Mayer recalls being admitted to the emergency department about five years ago after not feeling well and experiencing sudden pain. After examination, the clinicians on duty informed him that he had likely experienced a minor stroke.

After being stabilized, Mayer was informed of the clinical trial and given the option of participating. After giving his consent, he was randomly assigned to the group of patients who were given the implantable monitoring device.

"It was the luck of the draw, and the advantage of it was that it's inconspicuous and wearing (an external device) would not have been very appealing to me," said Mayer. "So I had (the implantable device) tucked into my chest. It's there and nobody knows about it except for me and my doctor.

"It gives you a bit of a comfort level, I guess. It's not bothering you. It's just there and a part of life," he added. "It gives you the feeling that if something was to go wrong, somebody's going to be in touch to let you know (what steps need to be taken)."

With better monitoring, clinicians may be able to diagnose much more AF after stroke and dramatically reduce the risk of future disabling stroke. According to the Canadian Agency for Drugs and Technologies in Health (CADTH), the external device costs about $1,000 per patient to administer, while the implantable device typically costs over $5,000 per patient. The implantable device had the added advantage of remote monitoring, reducing the need for trips to the hospital--an important consideration for rural Albertans. The team says an in-depth cost-benefit analysis is needed to determine the best approach to providing superior care while also delivering savings for the health-care system.

"The biggest problem with stroke is that it dramatically impacts people's lives. So you take a healthy, independent person with a big disabling stroke, and they often end up being dependent on others for help with care. So that's terrible for patients and it's very expensive for the system. If you can prevent even a few of those big disabling strokes per year, it helps the person and reduces the burden on the health system," said Buck.

"This new evidence will help guide selection of what strategy is best going forward," added Hill. "We need to go beyond (24-hour monitoring) for Albertan patients. But if the system is going to pay for this technology, we need to know more definitively that patients are going to end up with lower stroke rates in the future."

The researchers say studies to this point have shown a trend in that direction, but more work is needed to prove it definitively. They hope to address those questions in future research.

Credit: 
University of Alberta Faculty of Medicine & Dentistry

If countries implement Paris pledges with cuts to aerosols, millions of lives can be saved

image: The UC San Diego team wanted to explore the tradeoffs countries would face by taking aerosols into consideration while concurrently making CO2 cuts to implement Paris pledges. Their model provides a country-by-country breakdown of the impacts of aerosol reductions across the eight economic sectors which cause emissions.

Image: 
UC San Diego

Aerosol reductions that would take place as countries meet climate goals could contribute to global cooling and prevent more than one million annual premature deaths over a decade, according to a new study from the University of California San Diego.

The landmark Paris Agreement of 2016 does not address emissions of aerosols--fine particulates like soot that cause pollution. Nonetheless, findings from the recent study, authored by researchers at UC San Diego's Scripps Institution of Oceanography and the School of Global Policy and Strategy, suggest that aerosol accounting should be explicitly incorporated into international climate policy.

This accounting is crucial because as countries implement their greenhouse gas reduction targets under the Paris climate agreement, their choices about which sectors to target will also reduce the aerosols that are co-emitted, which will have major impacts on public health and global temperatures.

"Joint consideration of greenhouse gases and aerosols is critical," said Pascal Polonik, a Ph.D. student at Scripps Oceanography and first author of the paper published in Earth's Future. "Polluting particles, known as aerosols, are emitted in tandem with greenhouse gases but aren't accounted for. While all greenhouse gas emissions might be thought of as unambiguously harmful, aerosols are more complicated. All aerosols are harmful to human health but they also often help counteract global warming by cooling the Earth's surface."

It is estimated that emissions of aerosols from burning fossil fuels like coal and diesel are responsible for nine million premature deaths worldwide each year. Though most aerosols have a cooling effect because they reflect sunlight, certain types, such as black carbon, have a warming effect.

The UC San Diego team wanted to explore the tradeoffs countries would face by taking aerosols into consideration while concurrently making CO2 cuts to implement Paris pledges.

Their model provides a country-by-country breakdown of the impacts of aerosol reductions across the eight economic sectors which cause emissions. For each country, the authors consider three scenarios. The first scenario prioritizes air quality, targeting aerosol cuts to the "dirtiest" sectors that emit the most solid particles. The second prioritizes temperatures by targeting industries that emit aerosols that most contribute to warming, and the third, dubbed the "politically expedient" approach, reduces emissions from all economic sectors equally.

Preventing as many as one million premature deaths per year by cutting emissions from certain sectors first

The authors find that by 2030, the three scenarios would prevent as many as one million premature deaths every year and would produce global temperature differences of the same magnitude as those from the greenhouse gas reductions themselves.

The study demonstrates the importance of domestic decisions for reducing emissions because making cuts to certain sectors can produce cleaner air and save more lives, or further reduce warming.

For example, the U.S. could choose to save more lives by targeting aerosol emissions in the industrial production, shipping, or residential/commercial sectors. It could also choose to limit warming more with cuts to the solvents, residential/commercial and waste sectors.

To the authors' surprise, the third scenario, which may be most politically feasible to implement as policy, can lead to both more deaths and less cooling in certain places, such as Africa, China, the Middle East and South America.

"Implementing cuts equally and making each industry do their fair share may be the easiest way to implement climate policy in a democratic society like the U.S. where there are many competing political interests," said co-author Kate Ricke, assistant professor with Scripps Oceanography and the School of Global Policy and Strategy. "However, there are real benefits to being thoughtful about how aerosols factor into climate policy outcomes. There may be big benefits to cutting emissions from certain sectors first."

The research is critical for the U.S., which is currently renegotiating its Paris Agreement climate pledge.

"Our analysis does indicate some considerable tradeoffs between temperature and health outcomes that will need to be contended with in meeting near-term emission reductions goals," said Jennifer Burney, the Marshall Saunders Chancellor's Endowed Chair in Global Climate Policy and Research at the School of Global Policy and Strategy.

In India, for example, emission cuts in the transportation sector could save more lives, while cuts in the residential sector would produce more cooling.

The authors note that because the tradeoffs vary considerably for each region, countries are likely to have different priorities for weighing reduction of warming versus protection of public health when making climate policy decisions.

The conclusion, they emphasize, is that there are many ways to achieve the same magnitude of greenhouse gas reduction pledged in the Paris Agreement, but the aerosol emissions that "ride along" with those cuts may vary a lot depending on which sectors are targeted. As such, the authors write, "we believe that this is a strong case for explicitly considering aerosols when constructing climate policy."

Credit: 
University of California - San Diego

Chip inserted under the skin may better identify patients at risk of recurrent stroke

BOSTON - For patients who have experienced certain common types of stroke, a small chip inserted under the skin may help physicians predict their likelihood of experiencing a second stroke, and therefore their likelihood of benefiting from preventive therapy. The findings come from a recent clinical trial published in the Journal of the American Medical Association and led by investigators at Massachusetts General Hospital (MGH) and Northwestern University Feinberg School of Medicine.

Each year, approximately 800,000 strokes occur in the United States, and as many as one-fourth occur in people who experienced a previous stroke. Investigators have been searching for ways to identify patients who are likely to experience a recurrent stroke, as these individuals could be candidates for taking certain medications such as blood thinners. One group of patients who face an elevated risk of recurrent strokes are those with atrial fibrillation--an irregular and often rapid heart rate--that often goes undetected and untreated. (Irregular heartbeats can allow blood to pool in the heart, which can cause clots to form and travel to the brain.)

Recent research has shown that a small chip inserted under the skin can monitor the heart rate and rhythm, and help physicians detect atrial fibrillation in patients who previously experienced what's called a cryptogenic stroke, one with no identified cause despite thorough patient testing. Now investigators have tested the chip--less than 1¾ inches long and 1/6 inch thick and called an insertable cardiac monitor--in patients who experienced a stroke caused by narrowing of a large artery like the carotid artery, or blockage of a small artery deep in the brain where atrial fibrillation would be unexpected.

In the Stroke of Known Cause and Underlying Atrial Fibrillation (STROKE AF) trial, 492 patients were randomized and completed 12 months of follow-up after receiving either an insertable cardiac monitor within 10 days of an initial stroke or usual care consisting of external cardiac monitoring through electrocardiograms or other tracking methods.

The chip detected atrial fibrillation in 12.1% of patients, compared with 1.8% detected through usual care. The team noted that the episodes of atrial fibrillation were not brief, with most lasting at least one hour. Most stroke experts would recommend that patients with this degree of atrial fibrillation start taking blood thinners to prevent a future stroke.

"We found that a significant minority of patients with stroke not thought to be related to atrial fibrillation actually have atrial fibrillation, but we can only find it with an implantable monitor," says lead author Richard A. Bernstein, MD, PhD, a professor of Neurology at Northwestern University Feinberg School of Medicine.

Adds senior author Lee H. Schwamm, MD, C. Miller Fisher Chair of Vascular Neurology at MGH: "Based on the study findings, we believe that patients with stroke who are similar to those in the STROKE AF Trial should now undergo long-term cardiac monitoring with an insertable cardiac monitor to identify unsuspected atrial fibrillation."

Schwamm notes that for every eight patients monitored, clinicians could expect to find atrial fibrillation in one of them in the first year. "This could dramatically change the treatment recommendations by their doctor," he says.
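
Schwamm's one-in-eight figure is simply the trial's detection rate inverted; as a rough arithmetic check (our calculation, not the authors'):

\[ \frac{1}{0.121} \approx 8.3 \qquad \text{versus} \qquad \frac{1}{0.018} \approx 56, \]

that is, about one new atrial fibrillation diagnosis for every eight patients monitored with the chip for a year, compared with roughly one per 56 patients under usual care.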

Next steps in this research include identifying patient factors that predict the development of atrial fibrillation and the duration and extent of the arrhythmia. Additional studies are being explored to further understand the association between silent atrial fibrillation and recurrent stroke of all types.

Credit: 
Massachusetts General Hospital

UCI-led study sheds light on mysterious genotype-phenotype associations

image: Shown are the genetic variants associated with alternative mRNA 3'UTR lengths. A change from the T allele to the C allele leads to the lengthening of a 3'UTR measured by a ruler. These 3'UTR associated genetic variants provide substantial new insights into the molecular mechanisms underlying many human complex traits and diseases.

Image: 
UCI School of Medicine

Irvine, CA - June 1, 2021 - A new study analyzing the association between an individual's genetics (genotype) and their observable characteristics resulting from the interaction of genetics and the environment (phenotype) contributes new knowledge to the understanding of human complex traits and diseases.

The study, titled "An atlas of alternative polyadenylation quantitative trait loci (3′aQTLs) contributing to complex trait and disease heritability," was recently published in Nature Genetics. Led by University of California, Irvine professor of bioinformatics Wei Li, PhD, the Grace B. Bell chair of the Department of Biological Chemistry at UCI's School of Medicine, this new research unlocks how much differences in people's genes account for differences in their traits and what can be attributed to the effects of the surrounding environment.

"Our research is of particularly importance as it offers interpretations that explain how natural variations can shape human phenotypic diversity and tissue-specific diseases," said Li. "Our most exciting finding was that certain events in human genes can explain a substantial proportion of trait heritability."

It was well known that 3'-UTR alternative polyadenylation (APA) occurs in approximately 70 percent of human genes and substantively impacts cellular processes such as proliferation, differentiation and tumorigenesis. But, until now, the association of APA events with disease risk and complex human traits was not well understood.

Genome-wide association studies have identified thousands of noncoding variants associated with human traits and diseases. However, the functional interpretation of these variants has proven to be a major challenge. In this study, researchers constructed a multi-tissue atlas of human 3′ UTR alternative polyadenylation (APA) quantitative trait loci (3′aQTLs), containing approximately 0.4 million common genetic variants associated with the APA of target genes, identified in 46 tissues isolated from 467 individuals (Genotype-Tissue Expression Project).
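
In broad strokes, a QTL analysis of this kind asks, variant by variant, whether genotype predicts a molecular phenotype -- here, relative 3′UTR usage. The sketch below illustrates that core regression in Python on invented toy data; it is a simplified illustration of the general technique, not the authors' pipeline, and every variable in it is made up for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy data: genotype dosage (0, 1 or 2 copies of an allele) for 467
# individuals, and a 3'UTR-usage index for one gene in one tissue.
dosage = rng.integers(0, 3, size=467).astype(float)
utr_usage = 0.4 * dosage + rng.normal(0.0, 1.0, size=467)  # planted effect

# A 3'aQTL test in miniature: regress APA usage on genotype dosage.
# A significant slope means the variant is associated with 3'UTR length;
# repeating this across variants, genes and tissues yields an atlas.
result = stats.linregress(dosage, utr_usage)
print(f"slope = {result.slope:.3f}, p = {result.pvalue:.2e}")
```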

"From our findings we could show that specific molecular features associated with human phenotypic variations contribute substantially to the molecular mechanisms underlying human complex traits and diseases," explained Li.

The research team is continuing to study these molecular mechanisms to test the novel 3′aQTL genes for diabetes, prostate cancer, Alzheimer's disease and amyotrophic lateral sclerosis (ALS).

Credit: 
University of California - Irvine

Adults With Cognitive Impairment Who Use Pain Medication Have Higher Falls Risk

Older adults with cognitive impairment are two to three times more likely to fall compared with those without cognitive impairment. What's more, the increasing use of pain medications for chronic pain by older adults adds to their falls risk. Injuries from falls range from minor bruising to more serious hip fractures, broken bones and even head injuries. Because falls are a leading cause of injury for people aged 65 and older, they are an important public health issue to study, with the goal of allowing these adults increased safety and independence as they age.

Although elevated risk of falls due to use of pain medication by older adults has been widely studied, less is known about how pain medication use affects the falls risk of older adults living with cognitive impairment. In a study recently published in Age and Ageing, researchers at Texas A&M University examined a national sample to identify the relationship between pain medication use and falls among older adults based on their cognitive status. The team included Texas A&M Health Center for Population Health and Aging postdoctoral research associate Aya Yoshikawa, DrPH; center co-director Matthew Lee Smith, PhD, MPH; and center founding director Marcia G. Ory, PhD, MPH.

Using data from the National Health and Aging Trends Study (NHATS), the team analyzed associations between pain medication use and recent falls by cognitive status. The data used were self-reported measures except for cognitive test scores, which were derived from the NHATS validated algorithm based on physician diagnosis, cognitive domain (memory, orientation and executive function) test scores, and AD8 Dementia Screening Interview test scores.

Falls were identified as "yes" or "no" answers to the definition of "any fall, slip, or trip in which you lose your balance and land on the floor or ground or at a lower level" in the past month. Frequency of taking pain medication in the past month was identified as seven days a week, five to six days a week, two to four days a week, once a week or less, and never. Information about specific pain medications was not identified in this study.

Additional measures included age, race/ethnicity, education, living arrangement, balance or coordination problems, being bothered by pain, and number of chronic conditions.
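
The published models are more detailed, but the shape of such an analysis -- regressing fall status on pain-medication frequency, cognitive status and covariates -- can be sketched in a few lines of Python. Everything below (variable names, coefficients, data) is invented for illustration and is not the NHATS analysis itself.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500

# Invented toy data mimicking the structure of the survey measures:
# pain-medication days per week (0-7), a probable-dementia indicator
# and an age covariate.
df = pd.DataFrame({
    "pain_days": rng.integers(0, 8, size=n),
    "dementia": rng.integers(0, 2, size=n),
    "age": rng.normal(78, 6, size=n),
})

# Simulate a yes/no fall outcome with a planted medication effect among
# those with dementia, mirroring the direction of the reported finding.
log_odds = -3 + 0.2 * df.pain_days * df.dementia + 0.02 * (df.age - 78)
df["fell"] = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

# Logistic regression of falls on medication frequency by cognitive
# status; the interaction term carries the dementia-specific risk.
model = smf.logit("fell ~ pain_days * dementia + age", data=df).fit(disp=False)
print(model.params)
```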

The researchers found that among the 7,491 community-dwelling participants in the study, 8.3 percent had possible dementia while 8.2 percent had probable dementia.

"Although there was no significant difference in being bothered by pain by cognitive status, people living with dementia took medication for pain more frequently than those with no dementia," Yoshikawa said. "Older age, not non-Hispanic white race/ethnicity, lower levels of education, living alone, and having more chronic conditions were associated with people living with dementia versus those with no dementia. People living with dementia were more likely to report at least one fall in the past month and worry about falling down and balance/coordination."

In addition, the researchers found that among persons with probable dementia, pain medication use was associated with an increased likelihood of recent falls, and that taking pain medication two days a week or more was likewise associated with an increased risk.

"These results support that the risk of falls associated with pain medication is elevated among those with higher levels of cognitive impairment," Yoshikawa said. "The different relationships of pain medication with falls by cognitive status can be partly explained by the severity of cognitive impairment among older adults."

Finally, the researchers note that the findings in this study have practical implications for falls prevention strategies and programs.

"To address the risk of falls associated with pain medication, especially for probable dementia, it is essential to conduct screening and medication reconciliation in the health care system. The provision of education about pain medication and alternative pain management programs is critical to preventing falls," Yoshikawa said. "There is need for fall prevention programs that encourage both exercise training for improving one's balance and reducing worry about falling down through fall management strategies."

Credit: 
Texas A&M University

Modulating rapamycin target protein promotes autophagy, lowering toxic Huntingtin protein

Researchers worldwide are focused on clearing the toxic mutant Huntingtin protein that leads to neuronal cell death and systemic dysfunction in Huntington's disease (HD), a devastating, incurable, progressive neurodegenerative genetic disorder. Scientists in the Buck Institute's Ellerby lab have found that targeting the protein called FK506-binding protein 51, or FKBP51, promotes the clearing of those toxic proteins via autophagy, a natural process whereby cells recycle damaged proteins and mitochondria and use them for nutrition.

Publishing in Autophagy, researchers showed that FKBP51 promotes autophagy through a new mechanism that could avoid worrisome side effects associated with rapamycin, an immune-suppressing drug which also extends lifespan in mice. They show that both rapamycin and SAFit2, a small pharmacological inhibitor of FKBP51, protect HD neurons, but that the mechanisms of the two drugs are distinct.

The possibility of avoiding the negative side effects of rapamycin

Researchers focused on a family of binding proteins called FKBPs, and specifically on FKBP51, which was the most changed in mouse and human stem cell models of HD. During the course of the study, scientists found that FKBP51 acts on a pathway independent of mTOR (mammalian Target of Rapamycin), the pathway associated with rapamycin. Scientists also identified a small molecule, SAFit2, which crossed the blood-brain barrier, promoted autophagy and reduced the toxic disease-causing protein through that mTOR-independent pathway.

"Rapamycin can have both positive and negative effects and this new molecule could give us a way to go after the toxic proteins without those complications," said Buck Professor Lisa Ellerby, PhD, director of the study, who added that the findings are also significant for the aging field. "We know that FKBP's get dysregulated during aging, a phenomena which likely contributes to the accumulation of toxic proteins associated with other age-related diseases. SAFit2, which is neuroprotective, could give us another option to promote autophagy and clear out disease-causing proteins or proteins accumulated during disease and aging which are correlated with other conditions." FKPB51 has been implicated in Parkinson's and Alzheimer's diseases as well as post-traumatic stress disorder and schizophrenia.

The first author of the work, Barbara Bailus, PhD, is a former postdoc in the Ellerby lab. "The fact that SAFit2 crosses the blood-brain barrier is significant," said Bailus, who is now an assistant professor of genetics at the Keck Graduate Institute in Claremont, CA. "In our mouse models of HD, the small molecule interacted with FKBP51 and cleared toxic proteins in both the cortex and the striatum, which is part of the neural circuit necessary for voluntary movement."

The Ellerby lab will do pre-clinical work with SAFit2, which was developed by a collaborator, Felix Hausch, PhD, at the Technical University of Darmstadt in Germany.

Current status of clinical trials for HD

The recent failure of an experimental drug tested in Europe and Canada against HD highlights the desperation of patients who are forced to deal with a malady that usually sees its victims dying about 20 years following the onset of observable symptoms. The drug, an antisense oligonucleotide (ASO) developed by Ionis and Roche, was designed to silence the gene responsible for HD and had to be injected into the fluid-filled space between the thin layers of tissue that cover the brain and spinal cord. While the details of the failed trial are not yet published, Ellerby says the drug appeared not to diffuse into the entire brain, the ASOs may have unanticipated toxic effects, and the ASOs do not reach all affected peripheral tissues. HD affects coordination and leads to cognitive decline and psychiatric problems.

"While we had hoped that this drug would ultimately work for patients in desperate need of treatment, those of us in the field have been aware that we need less invasive treatments for HD that are more likely to be easily tolerated," said Ellerby. "I don't know if we'll be able to do that with this small molecule, but at this point it does show potential and we look forward to evaluating its effects in pre-clinical experiments."

Credit: 
Buck Institute for Research on Aging

Sloan Kettering Institute scientists learn what fuels the 'natural killers' of the immune system

Despite a name straight from a Tarantino movie, natural killer (NK) cells are your allies when it comes to fighting infections and cancer. If T cells are like a team of specialist doctors in an emergency room, NK cells are the paramedics: They arrive first on the scene and perform damage control until reinforcements arrive.

Part of our innate immune system, which dispatches these first responders, NK cells are primed from birth to recognize and respond to danger. Learning what fuels NK cells is an active area of research in immunology, with important clinical implications.

"There's a lot of interest right now in NK cells as a potential target of immunotherapy," says Joseph Sun, an immunologist in the Sloan Kettering Institute. "The more we can understand what drives these cells, the better we can program them to fight disease."

First in Line

Previous work from researchers at MSK and elsewhere has shown that T cells rely on aerobic glycolysis to carry out their protective functions. But whether NK cells depend on this form of metabolism to power their own activities was not known.

Because Dr. Sun and his colleagues studied NK cells in animals instead of a dish, they could establish what type of metabolism NK cells use and compare it to T cells in a natural setting. They found that NK cells ramp up aerobic glycolysis about five days prior to when T cells respond with their own glycolytic surge.

"This fits with the idea that NK cells are innate immune cells that are really critical for mounting a rapid response," Dr. Sheppard says.

The findings are relevant to ongoing efforts to use NK cells as immunotherapy in people with cancer and other conditions. In particular, they have implications for using NK cells as a form of cell therapy -- when cells are grown outside a patient and then infused back into the patient's blood.

"If you're growing these cells in a dish and you push them to divide too rapidly, they may not have as much potential to undergo aerobic glycolysis when you put them into a patient," Dr. Sheppard says.

The takeaway for researchers designing clinical trials is this: They must find a balance between encouraging NK cells to multiply and preserving their stamina. These NK cells are the paramedics of our immune system, so it's important to keep them speedy and responsive.

Credit: 
Memorial Sloan Kettering Cancer Center

How an elephant's trunk manipulates air to eat and drink

image: Andrew Schulz led the study as a Georgia Tech mechanical engineering Ph.D. student.

Image: 
Andrew Schulz, Georgia Tech

New research from the Georgia Institute of Technology finds that elephants dilate their nostrils in order to create more space in their trunks, allowing them to store up to nine liters of water. They can also suck up three liters per second -- 30 times faster than a human sneeze, with air moving at some 150 meters per second (330 mph).

The Georgia Tech College of Engineering study sought to better understand the physics of how elephants use their trunks to move and manipulate air, water, food and other objects. They also sought to learn if the mechanics could inspire the creation of more efficient robots that use air motion to hold and move things.

While octopuses use jets of water to move and archerfish shoot water above the surface to catch insects, the Georgia Tech researchers found that elephants are the only animals able to use suction on land and underwater.

The paper, "Suction feeding by elephants," is published in the Journal of the Royal Society Interface.

"An elephant eats about 400 pounds of food a day, but very little is known about how they use their trunks to pick up lightweight food and water for 18 hours, every day," said Georgia Tech mechanical engineering Ph.D. student Andrew Schulz, who led the study. "It turns out their trunks act like suitcases, capable of expanding when necessary."

Schulz and the Georgia Tech team worked with veterinarians at Zoo Atlanta, studying elephants as they ate various foods. For large rutabaga cubes, for example, the animal grabbed and collected them. It sucked up smaller cubes and made a loud vacuuming sound, or the sound of a person slurping noodles, before transferring the vegetables to its mouth.

To learn more about suction, the researchers gave elephants a tortilla chip and measured the applied force. Sometimes the animal pressed down on the chip and breathed in, suspending the chip on the tip of its trunk without breaking it. It was similar to a person inhaling a piece of paper onto their mouth. Other times the elephant applied suction from a distance, drawing the chip to the edge of its trunk.

"An elephant uses its trunk like a Swiss Army Knife," said David Hu, Schulz's advisor and a professor in Georgia Tech's George W. Woodruff School of Mechanical Engineering. "It can detect scents and grab things. Other times it blows objects away like a leaf blower or sniffs them in like a vacuum."

By watching elephants inhale liquid from an aquarium, the team was able to time the durations and measure volume. In just 1.5 seconds, the trunk sucked up 3.7 liters, the equivalent of 20 toilets flushing simultaneously.

An ultrasonic probe was used to take trunk wall measurements and see how the trunk's inner muscles work. By contracting those muscles, the animal dilates its nostrils up to 30 percent. This decreases the thickness of the walls and expands nasal volume by 64 percent.
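
Those two figures are consistent with simple geometry. As a back-of-the-envelope check (our simplification, not a calculation from the paper), treat the nasal passage as a cylinder of fixed length, so its volume scales with the square of its radius:

\[ V \propto r^{2} \quad\Rightarrow\quad 1.28^{2} \approx 1.64, \]

meaning a radial dilation of just under 30 percent is enough to produce the observed 64 percent gain in nasal volume.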

"At first it didn't make sense: an elephant's nasal passage is relatively small and it was inhaling more water than it should," said Schulz. "It wasn't until we saw the ultrasonographic images and watched the nostrils expand that we realized how they did it. Air makes the walls open, and the animal can store far more water than we originally estimated."

Based on the pressures applied, Schulz and the team suggest that elephants inhale at speeds that are comparable to Japan's 300-mph bullet trains.

Schulz said these unique characteristics have applications in soft robotics and conservation efforts.

"By investigating the mechanics and physics behind trunk muscle movements, we can apply the physical mechanisms -- combinations of suction and grasping -- to find new ways to build robots," Schulz said. "In the meantime, the African elephant is now listed as endangered because of poaching and loss of habitat. Its trunk makes it a unique species to study. By learning more about them, we can learn how to better conserve elephants in the wild."

Credit: 
Georgia Institute of Technology

New evidence may change timeline for when people first arrived in North America

image: Andrew Somerville made an unexpected discovery while studying the origins of agriculture.

Image: 
Christopher Gannon, Iowa State University

AMES, Iowa - An unexpected discovery by an Iowa State University researcher suggests that the first humans may have arrived in North America more than 30,000 years ago - nearly 20,000 years earlier than originally thought.

Andrew Somerville, an assistant professor of anthropology in world languages and cultures, says he and his colleagues made the discovery while studying the origins of agriculture in the Tehuacan Valley in Mexico. As part of that work, they wanted to establish a date for the earliest human occupation of the Coxcatlan Cave in the valley, so they obtained radiocarbon dates for several rabbit and deer bones that were collected from the cave in the 1960s as part of the Tehuacan Archaeological-Botanical Project. The dates for the bones suddenly took Somerville and his colleagues in a different direction with their work.

Dates for the bone samples from the base of the cave ranged from 33,448 to 28,279 years old. The results are published in the academic journal Latin American Antiquity. Somerville says even though previous studies had not dated items from the bottom of the cave, he was not expecting such old ages. The findings add to the debate over a long-standing theory that the first humans crossed the Bering Land Bridge into the Americas about 13,000 years ago.

"We weren't trying to weigh in on this debate or even find really old samples. We were just trying to situate our agricultural study with a firmer timeline," Somerville said. "We were surprised to find these really old dates at the bottom of the cave, and it means that we need to take a closer look at the artifacts recovered from those levels."

Somerville says the findings provide researchers with a better understanding of the chronology of the region. Previous studies relied on charcoal and plant samples, but he says the bones were a better material for dating. However, questions still remain. Most importantly, is there a human link to the bottom layer of the cave where the bones were found?

To answer that question, Somerville and Matthew Hill, ISU associate professor of anthropology, plan to take a closer look at the bone samples for evidence of cut marks that indicate the bones were butchered by humans using stone tools, or thermal alterations that suggest the bones were boiled or roasted over a fire. He says the possible stone tools from the early levels of the cave may also yield clues.

"Determining whether the stone artifacts were products of human manufacture or if they were just naturally chipped stones would be one way to get to the bottom of this," Somerville said. "If we can find strong evidence that humans did in fact make and use these tools, that's another way we can move forward."

Year-long journey to even find the bones

Not only was this discovery unexpected, but the process of tracking down the animal bones to take samples was more than Somerville anticipated. The collection of artifacts from the 1960s Tehuacan Archaeological-Botanical Project was distributed to different museums and labs in Mexico and the United States, and it was unclear where the animal bones were sent.

After a year of emails and cold calls, Somerville and his collaborator, Isabel Casar from the National Autonomous University of Mexico, had a potential lead for a lab in Mexico City. The lab director, Joaquin Arroyo-Cabrales, agreed to give Somerville and Casar a tour to help search for the missing collection. The tour proved to be beneficial. Among the countless boxes of artifacts, they found what they were looking for.

"Having spent months trying to locate the bones, we were excited to find them tucked away on the bottom shelf in a dark corner of the lab," Somerville said. "At the time, we felt that was a great discovery, we had no idea it would lead to this."

Once he located the bones, Somerville got permission from the Mexican government to take small samples - about 3/4 inch in length and 1/4 inch in width - from 17 bones (eight rabbits and nine deer) for radiocarbon dating. If closer examination of the bones provides evidence of a human link, Somerville says it will change what we know about the timing and how the first people came to America.

"Pushing the arrival of humans in North America back to over 30,000 years ago would mean that humans were already in North America prior to the period of the Last Glacial Maximum, when the Ice Age was at its absolute worst," Somerville said. "Large parts of North America would have been inhospitable to human populations. The glaciers would have completely blocked any passage over land coming from Alaska and Canada, which means people probably would have had to come to the Americas by boats down the Pacific coast."

Credit: 
Iowa State University

Innovative surgical simulator is a significant advance in training trauma teams

video: The Advanced Modular Manikin Enhances Surgical Team Experience During a Trauma Simulation: Results of a Single-Blinded Randomized Trial.

Image: 
American College of Surgeons

Key takeaways

The surgical simulator can realistically simulate multiple trauma scenarios at once, compared with traditional simulators that can only simulate one or a limited number of conditions.

Trauma team members who tested the simulator preferred it for its realism, physiologic responses, and feedback.

The benefits of this innovative simulator may be able to extend to other surgical procedures and settings.

CHICAGO (June 1, 2021): Simulators have long been used for training surgeons and surgical teams, but traditional simulator platforms typically have a built-in limitation: they often simulate one or a limited number of conditions that require performance of isolated tasks, such as placing an intravenous catheter, instead of simulating -- and providing opportunities for feedback on -- the performance of multiple interventions that a trauma victim may require at the same time. To overcome this limitation, the Advanced Modular Manikin (AMM), an innovative simulation platform that allows integration of other simulation devices, was developed and field tested with support from the Department of Defense (DoD).

The DoD subcontracted with the American College of Surgeons (ACS) Division of Education to conduct field testing of the AMM. The results have been published online in advance of print by the Journal of the American College of Surgeons. Robert M. Sweet, MD, FACS, MAMSE, of the department of surgery at the University of Washington, served as principal investigator (PI) of the DoD contract to build the AMM. Ajit K. Sachdeva, MD, FACS, FRCSC, FSACME, MAMSE, Director, Division of Education, American College of Surgeons, served as the PI for the subcontract to conduct field testing.

The investigators reported that members of trauma teams at a testing site preferred the integrated AMM platform including a "peripheral" simulator over the "peripheral" simulator alone, in terms of realism, physiologic responses, and feedback they receive on the multiple and overlapping interventions they perform on a simulated trauma patient. Corresponding study author Dimitrios Stefanidis, MD, PhD, FACS, FASMBS, FSSH, of the department of surgery at Indiana University School of Medicine, Indianapolis, described the AMM as "more of a platform rather than a manikin."

The DoD supported development of the AMM through a contract with the University of Minnesota and the University of Washington. The goal was to create an open-source simulation platform that permits integration of a number of simulators, known as "peripherals," into a singular, comprehensive training platform. A Steering Committee, composed of leaders and staff of the ACS Division of Education and the Research and Development Committee of the ACS-Accredited Education Institutes, along with leaders from the Development Team of the AMM Project, created the model for field testing the AMM.

"The AMM platform, along with the 'peripherals,' can help to address the need for more robust simulators that focus on open procedures and interprofessional teamwork," Dr. Sachdeva explained. "The ability to integrate the anatomic and physiologic elements of the simulation is an important advance. The experience with the trauma scenario may readily be extended to other surgical procedures and settings."

Corresponding author Dr. Stefanidis explained that with most traditional simulators, instructors have to manipulate vital signs to respond to specific actions of the learner. He pointed out that the AMM promotes "a learner experience that is more based on the actual physiology of what's happening to the patient." The AMM platform allows different members of the trauma team to perform different tasks concurrently--one inserts a breathing tube, another starts an intravenous line, another performs a splenectomy. "All of these interventions impact the physiology," he said.

The researchers evaluated team experience ratings of 14 trauma teams consisting of 42 individual members who performed tasks on the integrated AMM platform and the standalone "peripheral" simulator. Team experience ratings were higher for the integrated AMM platform as compared with the standalone "peripheral" simulator. Among the team members, surgeons and first responders rated their experience significantly higher than anesthesiologists, who noted higher workload ratings. In focus groups, the team members said they preferred the AMM platform because of its increased realism, and for the way it responded physiologically to their actions and the feedback it provided.

Dr. Stefanidis explained how the AMM can potentially aid in training trauma teams. "Trauma requires exemplary teamwork," he said. "When we see patients who are injured, there are typically multiple providers who take care of them simultaneously--trauma surgeons, emergency room physicians, anesthesiologists, orthopedic surgeons, neurosurgeons, nurses, respiratory therapists, etc. So, it's extremely important to also be able to train these teams in a low-stress simulation environment, such as by using the AMM, where they can hone their skills, individually and as a team, and perform at their best when faced with the very high-stress clinical environment."

The AMM platform offers other benefits for improving the training and proficiency of trauma teams, said the field study PI, Dr. Sachdeva. "Specific training models could be standardized and the situation made increasingly complex in this safe simulation environment," he said.

Credit: 
American College of Surgeons

Analysis reveals global 'hot spots' where new coronaviruses may emerge

Berkeley -- Global land-use changes -- including forest fragmentation, agricultural expansion and concentrated livestock production -- are creating "hot spots" favorable for bats that carry coronaviruses and where conditions are ripe for the diseases to jump from bats to humans, finds an analysis published this week by researchers at the University of California, Berkeley, the Politecnico di Milano (Polytechnic University of Milan) and Massey University of New Zealand.

While the exact origins of the SARS-CoV-2 virus remain unclear, scientists believe that the disease likely emerged when a virus that infects horseshoe bats was able to jump to humans, either directly through wildlife-to-human contact, or indirectly by first infecting an intermediate animal host, such as the pangolin, sometimes known as the scaly anteater. Horseshoe bats are known to carry a variety of coronaviruses, including strains that are genetically similar to ones that cause COVID-19 and severe acute respiratory syndrome (SARS).

The new study used remote sensing to analyze land use patterns throughout the horseshoe bat's range, which extends from Western Europe through Southeast Asia. By identifying areas of forest fragmentation, human settlement and agricultural and livestock production, and comparing these to known horseshoe bat habitats, they identified potential hot spots where habitat is favorable for these bat species, and where these so-called zoonotic viruses could potentially jump from bats to humans. The analysis also identified locations that could easily become hot spots with changes in land use.

"Land use changes can have an important impact on human health, both because we are modifying the environment, but also because they can increase our exposure to zoonotic disease," said study co-author Paolo D'Odorico, a professor of environmental science, policy and management at UC Berkeley. "Every formal land use change should be evaluated not only for the environmental and social impacts on resources such as carbon stocks, microclimate and water availability, but also for the potential chain reactions that could impact human health."

Most of the current hot spots are clustered in China, where a growing demand for meat products has driven the expansion of large-scale, industrial livestock farming. Concentrated livestock production is particularly concerning because the practice brings together large populations of genetically similar, often immune-suppressed animals that are highly vulnerable to disease outbreaks, the researchers said.

The analysis also found that parts of Japan, the northern Philippines and China south of Shanghai are at risk of becoming hot spots with further forest fragmentation, while parts of Indochina and Thailand may transition into hot spots with increases in livestock production.

"The analyses aimed to identify the possible emergence of new hot spots in response to an increase in one of three land use attributes, highlighting both the areas that could become suitable for spillover and the type of land use change that could induce hot spot activation," said study co-author Maria Cristina Rulli, a professor in hydrology and water and food security at the Politecnico di Milano in Italy. "We hope these results could be useful for identifying region-specific targeted interventions needed to increase resilience to coronavirus spillovers."

Human encroachment into natural habitat can also indirectly increase exposure to zoonotic disease by reducing valuable biodiversity. When forest lands become fragmented and natural habitats are destroyed, species that require very specific habitat to survive, called "specialists," may dwindle or even go extinct. Without competition from specialists, "generalist" species, which are less picky about their habitat, can take over.

Horseshoe bats are a generalist species and have often been observed in areas characterized by human disturbance. Earlier work by Rulli, D'Odorico and study co-author David Hayman has also linked forest fragmentation and habitat destruction in Africa to outbreaks of the Ebola virus.

"By creating conditions that are disadvantageous to specialist species, generalist species are able to thrive," D'Odorico said. "While we are unable to directly trace the transmission of SARS-CoV-2 from wildlife to humans, we do know that the type of land use change that brings humans into the picture is typically associated with the presence of these bats who are known to carry the virus."

While China has been a leader in tree planting and other greening efforts over the past two decades, many of the trees have been planted in discontinuous land areas or forest fragments. To tilt the ecological balance back in favor of specialist species, creating continuous areas of forest cover and wildlife corridors is more important than increasing total tree cover.

"Human health is intertwined with environmental health and also animal health," D'Odorico said. "Our study is one of the first to connect the dots and really drill down into the geographic data on land use to see how humans are coming into contact with species that might be carriers."

Credit: 
University of California - Berkeley

Ethnically diverse research identifies more genetic markers linked to diabetes

image: Cassandra Spracklen is an assistant professor of biostatistics and epidemiology in the UMass Amherst School of Public Health and Health Sciences.

Image: 
UMass Amherst

By ensuring ethnic diversity in a large-scale genetic study, an international team of researchers, including a University of Massachusetts Amherst genetic epidemiologist, has identified more regions of the genome linked to type 2 diabetes-related traits.

The findings, published May 31 in Nature Genetics, broaden the understanding of the biological basis of type 2 diabetes and demonstrate that expanding research into different ancestries yields better results. Ultimately the goal is to improve patient care worldwide by identifying genetic targets to treat the chronic metabolic disorder. Type 2 diabetes affects and sometimes debilitates more than 460 million adults worldwide, according to the International Diabetes Federation. About 1.5 million deaths were directly caused by diabetes in 2019, the World Health Organization reports.

Cassandra Spracklen, assistant professor of biostatistics and epidemiology in the UMass Amherst School of Public Health and Health Sciences, is part of the international MAGIC collaboration. That group of more than 400 global academics conducted the genome-wide association meta-analysis, led by the University of Exeter in the United Kingdom.

"Our findings matter because we're moving toward using genetic scores to weigh up a person's risk of diabetes," says Spracklen, one of the paper's lead authors.

Up to now, some 88% of genomic research of this type has been conducted in white European-ancestry populations. "We know that scores developed exclusively in individuals of one ancestry don't work well in people of a different ancestry," Spracklen adds.

The team analyzed data across a wide range of cohorts, encompassing more than 280,000 people without diabetes. Researchers looked at glycemic traits, which are used to diagnose diabetes and monitor sugar and insulin levels in the blood.

Thirty percent of the overall cohort comprised individuals of East Asian, Hispanic, African-American, South Asian and sub-Saharan African origin. By including them, the researchers discovered 24 more loci - or regions of the genome - linked to glycemic traits than if they had conducted the research in Europeans alone.

"Type 2 diabetes is an increasingly huge global health challenge- with most of the biggest increases occurring outside of Europe," says Inês Barroso, professor of diabetes at the University of Exeter, who led the research. "While there are a lot of shared genetic factors between different countries and cultures, our research tells us that they do differ in ways that we need to understand. It's critical to ensuring we can deliver a precision diabetes medicine approach that optimizes treatment and care for everyone."

First author Ji Chen, a data science expert at the University of Exeter, adds: "Beyond the moral arguments for ensuring research is reflective of global populations, our work demonstrates that this approach generates better results."

Though some loci were not detected in all ancestries, the team found it useful to capture information about glycemic traits in individual ancestries.

"This is important as increasingly healthcare is moving toward a more precise approach," Spracklen says. "Failing to account for genetic variation according to ancestry will impact our ability to accurately diagnose diabetes."

Credit: 
University of Massachusetts Amherst

Suitable thread type and stitch density for Ghanaian public basic school uniforms

The quality of a sewn garment depends on the quality of its seams, which are its basic structural elements. The factors affecting seam quality include sewing thread type and stitch density, and making the right choices helps produce quality seams. However, the sewing threads and stitch densities best suited to a particular fabric can only be determined through testing.

Dr. Patience Danquah Monnie, from the University of Cape Coast, Ghana, with fellow researchers, conducted research aimed at determining the sewing thread brand and stitch density suitable for seams in a selected fabric (79% polyester and 21% cotton) used for public basic school uniforms in Ghana. The research employed a 2×3 factorial design involving two brands of sewing thread, labelled A and B, and three stitch densities: 10, 12 and 14. A total of 81 specimens were prepared from the selected fabric. The parameters under investigation included fabric weight, strength and elongation, as well as seam strength, seam elongation and seam efficiency.

The gathered data were analyzed with SPSS (Predictive Analytics Software). For the fabric, means and standard deviations of the yarn count, weight, strength and elongation, along with the linear density of the sewing threads, were determined. Analysis of variance and independent-samples t-tests at the 0.05 alpha level were employed in testing the hypotheses.
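
For readers who want to run this kind of test outside SPSS, a 2×3 factorial analysis of variance takes only a few lines in Python. The seam-strength values below are invented placeholders, not the study's measurements; only the design (two thread brands × three stitch densities) mirrors the paper.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Placeholder data for a 2x3 factorial design: two thread brands,
# three stitch densities (SPI), three replicate specimens per cell.
df = pd.DataFrame({
    "thread":   ["A"] * 9 + ["B"] * 9,
    "spi":      [10, 10, 10, 12, 12, 12, 14, 14, 14] * 2,
    "strength": [212, 208, 215, 220, 224, 219, 228, 231, 226,
                 225, 229, 222, 236, 240, 238, 251, 247, 253],
})

# Two-way ANOVA with interaction, mirroring the 2x3 factorial analysis;
# the F-tests cover thread brand, stitch density and their interaction.
model = ols("strength ~ C(thread) * C(spi)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```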

The researchers found significant differences in seam strength, efficiency and elongation between the two sewing thread brands and across the three stitch densities, in both the warp and weft directions of the fabric sample. Thread brand B with a stitch density of 14 performed best in terms of seam strength, elongation and efficiency. The researchers concluded that thread brand B at 14 stitches per inch (SPI) is suitable for seams in the fabric selected for Ghanaian public basic school uniforms, as this combination produced greater strength, efficiency and elongation than thread A and SPIs of 10 and 12.

It is recommended that members of the Association of Ghana Apparel Manufacturers (AGAM) and the Tailors and Dressmakers Association use sewing thread brand B at 14 SPI in uniform production in Ghana to achieve quality seams.

Credit: 
Bentham Science Publishers

Brain activity reveals when white lies are selfish

image: People with higher selfish motivation in Pareto white lies showed increased VMPFC activity (red) for selfish lies and increased RMPFC activity (blue) for altruistic lies. In addition, their neural representations in the VMPFC were similar between selfish and Pareto lies, but those in the RMPFC were dissimilar between altruistic and Pareto lies.

Image: 
Kim and Kim, JNeurosci 2021

You may think a little white lie about a bad haircut is strictly for your friend's benefit, but your brain activity says otherwise. Distinct activity patterns in the prefrontal cortex reveal when a white lie has selfish motives, according to new research published in JNeurosci.

White lies -- formally called Pareto lies -- can benefit both parties, but their true motives are encoded by the medial prefrontal cortex (MPFC). This brain region computes the value of different social behaviors, with some subregions focusing on internal motivations and others on external ones. Kim and Kim predicted that activity patterns in these subregions could elucidate the true motive behind white lies.

The research team deployed a stand-in for white lies, having participants tell lies to earn a reward for themselves, for an unknown person, or for both. The team used fMRI to measure participants' MPFC activity and, by comparing the brain activity during white lies with that during purely selfish and purely altruistic lies, could predict the true motivation behind the lies. Selfish white lies elicited greater activity in the ventral and rostral MPFC. Activity patterns in the ventral subregion were similar to those of selfish lies, while activity patterns in the rostral subregion were dissimilar to those of altruistic lies.
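The similarity comparison described above is typically done by correlating voxel-wise activity patterns between conditions, with a higher correlation read as a more similar neural representation. The sketch below illustrates that idea with synthetic data; the arrays, voxel count and condition structure are assumptions for illustration, not the authors' pipeline.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_voxels = 200  # hypothetical voxel count for an MPFC subregion

# Synthetic mean activity patterns (one value per voxel) per condition;
# the Pareto pattern is constructed to resemble the selfish one.
selfish = rng.normal(size=n_voxels)
pareto = 0.8 * selfish + 0.2 * rng.normal(size=n_voxels)
altruistic = rng.normal(size=n_voxels)

r_selfish, _ = pearsonr(pareto, selfish)
r_altruistic, _ = pearsonr(pareto, altruistic)
print(f"Pareto vs selfish:    r = {r_selfish:.2f}")    # high similarity
print(f"Pareto vs altruistic: r = {r_altruistic:.2f}")  # low similarity
```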

Credit: 
Society for Neuroscience

Tai chi about equal to conventional exercise for reducing belly fat in middle aged and older adults

Below please find summaries of new articles that will be published in the next issue of Annals of Internal Medicine. The summaries are not intended to substitute for the full articles as a source of information. This information is under strict embargo and by taking it into possession, media representatives are committing to the terms of the embargo not only on their own behalf, but also on behalf of the organization they represent.

1. Tai chi about equal to conventional exercise for reducing belly fat in middle-aged and older adults

HD video soundbites of the author discussing the findings are available to download at http://www.dssimon.com/MM/ACP-tai-chi

Abstract: https://www.acpjournals.org/doi/10.7326/M20-7014

URL goes live when the embargo lifts

A randomized controlled trial found that tai chi is about as effective as conventional exercise for reducing waist circumference in middle-aged and older adults with central obesity. Central obesity, or weight carried around the midsection, is a major manifestation of metabolic syndrome and a common health problem in this cohort. The findings are published in Annals of Internal Medicine.

Tai chi is a form of mind-body exercise often described as "meditation in motion." It is practiced in many Asian communities and is becoming increasingly popular in Western countries, with more than 2 million people practicing it in the United States. While it is known to be a suitable activity for older people, including those who are not active, there has previously been little evidence of tai chi's health benefits.

Researchers from the University of Hong Kong randomly assigned more than 500 adults over 50 with central obesity to a regimen of tai chi, conventional exercise, or no exercise over 3 months. Participants in the tai chi and exercise groups met for instructor-led workouts for 1 hour, 3 times a week, for 12 weeks. The tai chi program consisted of the Yang style of tai chi, the most common style adopted in the literature, and the conventional exercise program consisted of brisk walking and strength training activities. Waist circumference and other indicators of metabolic health were measured at baseline, 12 weeks, and 38 weeks. The researchers found that both the tai chi and conventional exercise groups had reductions in waist circumference relative to control. The reduction in waist circumference had a favorable impact on HDL cholesterol, or so-called good cholesterol, but did not translate into detectable differences in fasting glucose or blood pressure.
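As a rough illustration of the trial's primary comparison -- mean change in waist circumference from baseline, by arm, relative to control -- here is a sketch with made-up numbers; the group sizes and effect sizes are invented for illustration and are not the trial's results.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 180  # hypothetical participants per arm

# Simulated waist circumference (cm) at baseline and follow-up for
# three arms: tai chi, conventional exercise, no-exercise control.
baseline = rng.normal(95.0, 8.0, size=(3, n))
true_effect = np.array([[-1.8], [-1.3], [-0.2]])  # invented mean changes
follow_up = baseline + true_effect + rng.normal(0.0, 2.0, size=(3, n))

change = (follow_up - baseline).mean(axis=1)
for arm, delta in zip(["tai chi", "exercise", "control"], change):
    print(f"{arm:8s} mean change: {delta:+.2f} cm "
          f"(difference vs control: {delta - change[2]:+.2f} cm)")
```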

According to the study authors, their findings are good news for middle-aged and older adults who have central obesity but may be averse to conventional exercise due to preference or limited mobility.

Media contacts: For an embargoed PDF, please contact Angela Collom at acollom@acponline.org. To speak with the corresponding author, Parco M. Siu, PhD, please email pmsiu@hku.hk.

2. COVID-19 disease and cost burden especially high among older adults, particularly those of color

Abstract: https://www.acpjournals.org/doi/10.7326/M21-1102

Editorial: https://www.acpjournals.org/doi/10.7326/M21-2187

URL goes live when the embargo lifts

A retrospective observational study found that the COVID-19 disease burden among adults aged 65 years or older was substantially higher than in the general U.S. population, especially among those of non-White race/ethnicity. The findings are published in Annals of Internal Medicine.

Researchers from the Centers for Disease Control and Prevention (CDC) used Medicare claims data for 28.1 million fee-for-service beneficiaries to examine the characteristics and medical costs of older adults diagnosed with COVID-19 from April through December 2020. The data showed that the hospitalization rate was more than 60 times higher, and the mortality rate 2.5 times higher (4.2%), for these older adults than for the general population. The average cost per COVID-19-related hospitalization was considerable ($21,752) among older adults, but hospitalization costs decreased with age across the 5 medical outcomes considered. According to the authors, possible reasons include higher mortality rates among older patients, resulting in shorter hospital stays and lower costs (inpatient length of stay was shorter among patients aged ≥75 years than among those aged 65 to 74 years); the lower likelihood of younger adults becoming seriously ill (those who did reach the point of hospitalization may have had substantial risks and complications); and less aggressive care with increasing age.

The data also showed that people of color accounted for a disproportionate share of hospitalizations and deaths during the pandemic. Black, Hispanic, and Asian/Pacific Islander older adults had a higher probability of death and of receiving ventilator support during hospitalization than non-Hispanic White patients. This finding highlights the importance of identifying effective strategies to promote COVID-19 vaccine uptake among non-White persons aged 65 years or older to mitigate the increased disease and economic burden.

Media contacts: For an embargoed PDF, please contact Angela Collom at acollom@acponline.org. To speak with the corresponding author, Yuping Tsai, PhD, please contact Kristen Nordlund at hok4@cdc.gov.

3. Financial penalties not associated with improvements in quality of outpatient dialysis centers

Abstract: https://www.acpjournals.org/doi/10.7326/M20-6662

URL goes live when the embargo lifts

An observational study found that performance-based financial penalties under the Centers for Medicare & Medicaid Services (CMS) End-Stage Renal Disease Quality Incentive Program (ESRD-QIP) were not associated with improvement in the quality of outpatient dialysis centers. The penalties also did not seem to affect any of the individual outcome measures studied. The findings are published in Annals of Internal Medicine.

There are 500,000 patients on dialysis in the United States today. CMS spends about $100,000 per person per year on dialysis care, which amounts to about 6-7% of the total Medicare budget and almost 1% of the total federal government budget. In 2012, CMS began levying performance-based financial penalties against outpatient dialysis centers under the mandatory ESRD-QIP. For many reasons, including the complexity of the program, its effect on quality had never been measured.

Researchers from the Center for Healthcare Outcomes and Policy and the University of Michigan studied publicly available Medicare data to determine whether penalization was associated with improvement in dialysis center quality. The data showed that 1,109 (19.0%) outpatient dialysis centers received penalties in 2017 based on performance in 2015. Penalization was not associated with improvement in total performance scores in 2017 or 2018. This was consistent across a range of different types of centers and individual quality metrics included in the program's total performance score. According to the authors, these findings are significant because they can help Medicare improve the program, which has broad implications for the quality of outpatient dialysis in the United States.
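The core comparison here resembles a simple contrast in score changes between penalized and non-penalized centers. The sketch below illustrates that shape of analysis with invented score data; the group sizes are loosely based on the figures above, but the values, the Welch t-test choice, and the overall setup are assumptions, not the authors' methods or results.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)

# Invented changes in total performance score (0-100 scale) between
# the penalty year and the follow-up years, by penalty status.
penalized_change = rng.normal(0.5, 5.0, size=1109)       # penalized centers
non_penalized_change = rng.normal(0.4, 5.0, size=4728)   # remaining centers

t, p = ttest_ind(penalized_change, non_penalized_change, equal_var=False)
print(f"Mean change, penalized:     {penalized_change.mean():+.2f}")
print(f"Mean change, non-penalized: {non_penalized_change.mean():+.2f}")
print(f"Welch t-test: t = {t:.2f}, p = {p:.3f}")
```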

Media contacts: For an embargoed PDF, please contact Angela Collom at acollom@acponline.org. To speak with the corresponding author, Kyle H. Sheetz, MD, MSc, please contact Kara Gavin at kegavin@med.umich.edu.

Credit: 
American College of Physicians