Culture

Housing First proves cost-effective, especially for the most vulnerable homeless group

Image: 
Eric Latimer

Canadians spend big money dealing with the consequences of homelessness, but that money could be spent far more effectively. According to a new McGill-led analysis, housing homeless people with severe mental illness is even more cost-effective than housing homeless people with moderate needs. A Housing First strategy aimed at helping these individuals regain and keep permanent housing generates savings equal to about two-thirds of its cost.

Housing First is an approach to addressing homelessness that centres on moving people into housing immediately and then providing additional services and supports as needed. While previous research has shown that Housing First is an effective way to help homeless people obtain and keep housing, this study, published in Psychiatric Services, provides the largest evaluation to date of the cost-effectiveness of the Housing First approach for homeless individuals with severe mental illness.

In Canada, the annual costs to society for persons struggling with both homelessness and severe mental illness are high - about $75,000 per person, compared to about $51,000 for homeless people with moderate needs. These costs are often related to health services, emergency shelters, and policing.

"Although provincial governments have been willing to support Housing First for people with moderate needs thanks to funding from the federal government, few have been willing to do the same out of their own budgets for people with the greatest needs, who struggle with severe mental illness," says lead author Eric Latimer, a Professor in the Department of Psychiatry at McGill University and Research Scientist at the Douglas Research Centre. "As part of planning for the post-COVID period, it would be a good investment for provincial governments to allocate more funds to Housing First programs to help this most vulnerable group regain permanent housing."

Their analysis, which covers five Canadian cities from coast to coast - Moncton, Montreal, Toronto, Winnipeg, and Vancouver - shows that investing in Housing First for people with high needs is excellent value for money.

Housing First and how it works

Unlike conventional services that aim to transition people into housing over longer periods of time, Housing First offers rent subsidies and support services so people can enter housing right away, usually in residences integrated into the community. Together, these supports lower the likelihood of people falling back into homelessness.

Reducing the cost of homelessness

According to the researchers, investing in Housing First for people struggling with homelessness and severe mental illness is the most cost-effective way of spending limited public dollars to help these individuals regain and keep permanent housing.

Most of the cost of Housing First for people struggling with severe mental illness is offset by savings in other areas, such as emergency shelters, reducing the net cost of the intervention from about $20,000 to about $6,300 (a 69% offset) per person per year. For people with moderate needs, the intervention is less expensive, about $14,500, but the savings are smaller (46%), leaving a net cost of about $7,900. Each additional day of stable housing costs about $42 for the high-needs group, compared to $56 for people with moderate needs. In either case, Housing First costs about the same as many other housing interventions that provincial governments already pay for, while providing permanent rather than temporary housing.
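
These figures are easy to check; the sketch below reproduces the arithmetic using only the gross costs and offset rates reported above (an illustration, not the study's cost model, which covers many more service categories):

```python
# Illustrative arithmetic only: gross costs and offset rates as reported above.
def net_cost(gross_annual_cost, offset_rate):
    """Net annual cost per person after savings in other service areas."""
    return gross_annual_cost * (1 - offset_rate)

high_needs = net_cost(20_000, 0.69)      # ~$6,200/person/year (reported: ~$6,300)
moderate_needs = net_cost(14_500, 0.46)  # ~$7,800/person/year (reported: ~$7,900)

print(f"High needs:     ${high_needs:,.0f} net per person per year")
print(f"Moderate needs: ${moderate_needs:,.0f} net per person per year")
```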

"We know that Housing First is a cost-effective solution for people with moderate needs; this new research demonstrates that for people with the most needs, the savings are even more dramatic. You get more bang for your buck by serving this group, in terms of reducing costs of shelters, health visits, and incarcerations," says Latimer.

Credit: 
McGill University

Health IT improves engagement in preconception health to reduce racial disparities

New research from Boston Medical Center highlights the benefits of using health technology to engage African American and Black women earlier in preconception care in an effort to close the gap on racial disparities in birth outcomes and maternal mortality. Published in The Lancet Digital Health, findings showed that using Gabby, an online health technology system that delivers simulated care, increased the rate of maintaining and acting on identified preconception care risks by 16 percent after six months, compared to patients receiving a letter listing risks and suggesting a follow-up with a clinician. The results were maintained after 12 months.

In the United States, Black women have more than twice the risk of delivering a low-birth-weight infant, and four times the risk of maternal mortality, compared to white women. To address these racial disparities, researchers used the health information technology Gabby to communicate key health messages and overcome barriers to providing health education and counseling. Gabby, an embodied conversational agent with computer-generated characters, simulates face-to-face conversation, allowing women to select the risks they want to discuss, learn about the importance of preconception health, and listen to advice on how to take action.

"There is an overarching need to test new interventions in this high-risk population of women," said Brian Jack, MD, a family medicine physician at Boston Medical Center and director of the Boston University Center for Health System Design & Implementation. "It is now well established that mitigating a wide array of health risks at the time of conception can have profound and enduring effects not only on the health of the woman and her newborn, but also on the long-term health of children into adulthood. It is an important finding that a health information technology system can help to reduce these risks during the preconception period."

Prenatal care comes too late to impact the most critical time of embryonic development, especially for women who enter pregnancy with pre-existing conditions that could impact the health of both mother and baby. These conditions include physical and behavioral health conditions, exposure to risky medications or environmental conditions, genetic disorders, substance use disorder, unhealthy diet or weight, domestic abuse, or other concerns. By interacting with women to identify progress, give feedback and assess readiness, Gabby creates a customizable list of identified preconception care risks to assist in tracking progress.

The randomized trial included 528 women aged 18-34, from 35 states, who self-identified as African American and/or Black and were not pregnant. Women in the intervention group were given access to Gabby, which assessed 102 preconception risks and delivered 12 months of tailored dialogue using synthesized speech, nonverbal behavior, visual aids, and health behavior change techniques. Women in the control group received a letter listing their preconception risks and encouraging them to talk with a clinician.

"We wanted to create a way for patients to take control of their health outside of the doctor's office," said Jack, also a professor of family medicine at Boston University School of Medicine. "A digital conversation agent like Gabby allows for 24/7 access to accurate health information delivered through interactive dialogue based on best clinical practices."

Scalable health information technology like Gabby could be used as a population health tool to help health systems deliver preconception care to eligible women. It could also assist clinicians by collecting data ahead of visits, informing patients of risks, improving patient-centered discussions, and directly addressing clinicians' time constraints.

Credit: 
Boston Medical Center

Study shows socioeconomic status linked to heart failure mortality in United States

image: UH Cleveland Medical Center

Image: 
University Hospitals

CLEVELAND -- Heart failure is a medical condition that results when the heart muscle is not strong enough to effectively circulate blood. A variety of treatments exist to address this disease, yet it continues to carry a poor prognosis. A new study from University Hospitals showed that a person's address can help predict their risk of dying from heart failure.

The study found that the variability in heart failure mortality in the United States is at least partially explained by measures of wealth and socioeconomic status. In the United States, counties that have high rates of poverty and other measures of social deprivation also have higher rates of death from heart failure. Socioeconomic status may play an important role in heart failure because access to expensive medications and other therapies can be more available in affluent communities. Furthermore, areas with higher poverty levels may have reduced access to quality healthcare in general.

The study analyzed 1,254,991 heart failure deaths across 3,048 counties between 1999 and 2018. The investigators used multiple indicators of employment, poverty, income, housing and education to determine a person's level of socioeconomic deprivation.
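
The release does not say how the indicators were combined into a single measure. A common approach, shown here as a hypothetical sketch rather than the authors' actual method, is to standardize each county-level indicator and average the z-scores into one deprivation index:

```python
import numpy as np

# Hypothetical county-level indicators (rows = counties); NOT the study's data.
# Columns: unemployment rate, poverty rate, median income (negated so that
# higher always means more deprived), crowded housing, share without a diploma.
indicators = np.array([
    [0.06, 0.18, -48_000, 0.04, 0.15],
    [0.04, 0.10, -61_000, 0.02, 0.09],
    [0.09, 0.25, -39_000, 0.07, 0.22],
])

# Z-score each indicator across counties, then average into a single index.
z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)
deprivation_index = z.mean(axis=1)  # higher = more socioeconomically deprived
print(deprivation_index.round(2))
```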

This study highlights the importance of addressing socioeconomic factors to improve heart failure outcomes nationally.

"Analysis of trends in heart failure mortality shows that these disparities have persisted throughout the last two decades" said Graham Bevan, MD, a resident physician at University Hospitals and the first author of the study.

"Living in a particular county should not mean you're more likely to die from heart failure," said Sadeer G. Al-Kindi, MD, cardiologist with UH Harrington Heart & Vascular Institute. "University Hospitals has a history of addressing health care disparities in underserved communities and armed with the information from this study, we can thoughtfully create solutions to better serve these populations."

Credit: 
University Hospitals Cleveland Medical Center

How effective does a COVID-19 vaccine need to be to stop the pandemic?

image: What does it mean if a COVID-19 vaccine were to offer 80 percent efficacy? From "Vaccine Efficacy Needed for a COVID-19 Coronavirus Vaccine to Prevent or Stop an Epidemic as the Sole Intervention," by Sarah M. Bartsch et al. https://doi.org/10.1016/j.amepre.2020.06.011

Image: 
PHICOR

Ann Arbor, August 25, 2020 - The American Journal of Preventive Medicine, published by Elsevier, is committed to publishing the most robust, evidence-based research and commentary on COVID-19 as it unfolds, to keep readers up to date and aware of issues relevant to community and individual health during this continually evolving global outbreak. All articles featured here are freely available.

New computational model finds that a COVID-19 vaccine will have to be at least 80 percent effective to achieve a complete "return to normal"

Researchers around the world are racing to find a COVID-19 vaccine to eliminate the need for social distancing, mask wearing, and limits on interpersonal gatherings. In a new study, a computer simulation model found that if 75 percent of the population gets vaccinated, the vaccine has to have an efficacy (ability to protect against infection) of at least 70 percent to prevent an epidemic and at least 80 percent to extinguish an ongoing epidemic. If only 60 percent of the population gets vaccinated, the thresholds are even higher, around 80 percent to prevent an epidemic and 100 percent to extinguish an ongoing epidemic. "Some are pushing for a vaccine to come out as quickly as possible so that life can 'return to normal.' However, we have to set appropriate expectations. Just because a vaccine comes out doesn't mean you can go back to life as it was before the pandemic," notes lead investigator Bruce Y. Lee, MD, MBA, Public Health Informatics, Computational and Operations Research, CUNY Graduate School of Public Health and Health Policy, New York, NY, USA. "It is important to remember that a vaccine is like many other products - what matters is not just that a product is available, but also how effective it is." The investigators say the results of their study can provide targets for vaccine developers as well as shape expectations for policy makers, business leaders, and the general public.
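
The study's thresholds come from a detailed computer simulation, but they can be sanity-checked against classic herd-immunity arithmetic, as in the sketch below (the reproduction number R0 = 2.5 is an assumed illustrative value, not a figure from the study, and this back-of-the-envelope check is not the authors' model):

```python
# Back-of-the-envelope herd-immunity check; NOT the study's simulation model.
# Transmission stops growing once coverage * efficacy >= 1 - 1/R0.
R0 = 2.5  # assumed basic reproduction number, for illustration only

def required_efficacy(coverage, r0=R0):
    """Minimum vaccine efficacy for herd immunity at a given coverage."""
    herd_threshold = 1 - 1 / r0
    return herd_threshold / coverage

for coverage in (0.75, 0.60):
    print(f"Coverage {coverage:.0%}: efficacy >= {required_efficacy(coverage):.0%}")
# Coverage 75%: efficacy >= 80%
# Coverage 60%: efficacy >= 100%
```

Under these assumptions, the simple formula lands close to the simulation's "extinguish an ongoing epidemic" thresholds, though the full model also captures dynamics the formula ignores.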

"Vaccine Efficacy Needed for a COVID-19 Coronavirus Vaccine to Prevent or Stop an Epidemic as the Sole Intervention," by Sarah M. Bartsch, MPH, Kelly J. O'Shea, BSFS, Marie C. Ferguson, MSPH, Maria Elena Bottazzi, PhD, Patrick T. Wedlock, MSPH, Ulrich Strych, PhD, James A. McKinnell, MD, Sheryl S. Siegmund, MS, Sarah N. Cox, MSPH, Peter J. Hotez, MD, PhD, and Bruce Y. Lee, MD, MBA (https://doi.org/10.1016/j.amepre.2020.06.011). Author contact: Bruce Y. Lee at bruceleemdmba@gmail.com.

Like indoor smoking bans, mask wearing should be considered a fundamental occupational health protection

Mask requirements to prevent the spread of COVID-19 are often presented as an infringement on individual rights. In this article, the authors note that bans against indoor smoking were enacted to protect workers' health on the basis that individual rights do not extend to the imposition of risk on others. Similarly, masks should be required to protect workers by limiting the diffusion of particulates that carry COVID-19. Clear and consistent government policies on indoor mask wearing would remove some of the burden from business owners, but in the absence of policy, business owners can also require masks, knowing that healthy workplaces have higher productivity. "Much like stepping outside to smoke, wearing a mask until the pandemic is resolved may feel like a nuisance; however, both pose a relatively small inconvenience when compared to workers' rights to a healthy, safe work environment," observes lead author Mike Vuolo, PhD, Department of Sociology, The Ohio State University, Columbus, OH, USA.

"COVID-19 Mask Requirements as a Workers' Rights Issue: Parallels to Smoking Bans," by Mike Vuolo, PhD, Brian C. Kelly, PhD, and Vincent Roscigno, PhD (https://doi.org/10.1016/j.amepre.2020.07.001). Author contact: Mike Vuolo at : +1 979 985 0185 or vuolo.2@osu.edu.

Associations between social vulnerabilities and increased COVID-19 infection vary among US counties, suggesting need for different strategies to address the pandemic

A new study confirms that social vulnerability is associated with increased prevalence of COVID-19 infection in the United States. However, the specific vulnerability factors most important in predicting infection - minority status and language, household composition and disability, and transportation and housing - vary among regions and counties. For example, in the Pacific Northwest, minority status and language and household composition and disability were more predictive of COVID-19 case counts. In the Gulf Coast states, housing and transportation were more predictive. Lead investigator Ibraheem M. Karaye, MD, DrPH, Epidemiology Program, University of Delaware, Newark, DE, USA, explains, "In the US, social vulnerability to COVID-19 is highly 'local,' so while a coordinated Federal response is needed to control COVID-19 nationally, local jurisdictions should, where possible given limited funding and staff, address specific vulnerable groups with interventions designed to mitigate the spread of the pandemic."

"The Impact of Social Vulnerability on COVID-19 in the U.S.: An Analysis of Spatially Varying Relationships," by Ibraheem M. Karaye, MD, DrPH, and Jennifer A. Horney, PhD, MPH (https://doi.org/10.1016/j.amepre.2020.06.006). Author contact: Ibraheem M. Karaye at karaye@udel.edu.

Study identifies racial and socioeconomic disparities in testing and positive results for COVID-19 in New York City

A new study has found that in New York City, COVID-19 testing has not been proportional to need. Researchers conducted a statistical analysis of the relationship between race and socioeconomic factors - such as household income, gross rent, poverty, education, and working-class status - and both the rate of testing for the virus and the proportion of positive results. They found that, adjusted for population, the total number of tests performed was significantly higher in neighborhoods with more white residents, while the highest proportions of positive tests were recorded in nonwhite neighborhoods and in areas defined by lower socioeconomic status. "COVID-19 testing is a key component of public health efforts to contain the pandemic," says lead investigator Emanuela Taioli, MD, PhD, Director of the Institute for Translational Epidemiology, Icahn School of Medicine at Mount Sinai, and the Center for Disaster Health, Trauma, and Resilience, New York, NY, USA. "Our findings show that in New York City, there is an urgent need for widespread testing and public health outreach for the most vulnerable communities."

"Disparities in COVID-19 Testing and Positivity in New York City," by Wil Lieberman-Cribbin, MPH, Stephanie Tuminello, MPH, Raja M. Flores, MD, and Emanuela Taioli, MD, PhD (https://doi.org/10.1016/j.amepre.2020.06.005). Author contact: Emanuela Taioli at emanuela.taioli@mountsinai.org.

Increase in mental distress during the rise of the COVID-19 pandemic associated with greater use of traditional and social media to learn about the disease

In a nationally representative sample of US adults surveyed between March 10 and March 31, 2020, researchers found that individuals who reported spending more time on social media and consulting a greater number of traditional media sources to learn about the disease were more likely to report higher levels of mental distress than those with less media exposure. People who responded later in the survey, as a national emergency was declared and schools and businesses closed, reported greater media exposure and mental distress. "A pandemic of this scale in the era of social media is unprecedented. We need to consider how exposure to social media, and other sources of media like television or newspapers, might affect mental health during this time," says first author Kira E. Riehm, MSc, Department of Mental Health, Bloomberg School of Public Health, Johns Hopkins University, Baltimore, MD, USA. The researchers suggest that engaging with supportive friends and family online and seeking information only from evidence-based sources like the CDC or WHO could support mental health.

"Associations Between Media Exposure and Mental Distress Among U.S. Adults at the Beginning of the COVID-19 Pandemic," by Kira E. Riehm, MSc, Calliope Holingue, PhD, Luther G. Kalb, PhD, Daniel Bennett, PhD, Arie Kapteyn, PhD, Qin Jiang, MA, Cindy Veldhuis, PhD, Renee M. Johnson, PhD, M. Daniele Fallin, PhD, Frauke Kreuter, PhD, Elizabeth A. Stuart, PhD, and Johannes Thrul, PhD (https://doi.org/10.1016/j.amepre.2020.06.008). Author contact: Kira E. Riehm at kriehm@jhu.edu.

People thought to have COVID-19 can face discrimination, regardless of actual disease status, leading to increased anxiety and depression

A national survey asked individuals if they were treated with less courtesy and respect than others; received poorer service at restaurants or stores; found people acted as if they were afraid of them; or felt threatened or harassed because they were suspected of having COVID-19. The rate of perceived discrimination, regardless of actual disease status, increased from March 2020, at the start of the pandemic, to April 2020. Asians and non-Hispanic blacks perceived more discrimination, as did individuals of any race or ethnicity who wore face masks. Perceiving discrimination was associated with increased anxiety and depression. Lead investigator Ying Liu, PhD, Center for Economic and Social Research, University of Southern California, Los Angeles, CA, USA, comments, "COVID-related discrimination is real and serious. It reflects social bias against racial and ethnic minorities and those who wear facemasks, and it disproportionately worsens mental despair in vulnerable populations. It is critical and timely for policy makers and the general public to become aware of this problem."

"Perceived Discrimination and Mental Distress amid the 2019 Coronavirus Disease Pandemic: Evidence from the Understanding America Study," by Ying Liu, PhD, Brian Karl Finch, PhD, Savannah G. Brenneke, MPH, Kyla Thomas, PhD, and PhuongThao D. Le, PhD, MPH (https://doi.org/10.1016/j.amepre.2020.06.007). Author contact: Ying Liu at liu.ying@usc.edu.

Credit: 
Elsevier

Rates of e-cigarette and marijuana use not associated with vaping-related lung injuries

Higher rates of e-cigarette and marijuana use in U.S. states did not result in more e-cigarette or vaping-related lung injuries (known as EVALI), a new study from the Yale School of Public Health finds.

Published in the journal Addiction, the study estimates the relationship between states' total reported EVALI cases per capita as of January 2020, and pre-outbreak rates of adult vaping and marijuana use. Results show that higher rates of vaping and marijuana use are associated with fewer EVALI cases per capita.

"If e-cigarette or marijuana use per se drove this outbreak, areas with more engagement in those behaviors should show a higher EVALI prevalence," said Assistant Professor Abigail Friedman, the study's author. "This study finds the opposite result. Alongside geographic clusters of high EVALI prevalence states, these findings are more consistent with locally available e-liquids or additives driving the EVALI outbreak than a widely used, nationally-available product."

The Centers for Disease Control and Prevention began a cross-state investigation into vaping-related lung injuries in August 2019 and has since confirmed over 2,800 cases and 68 deaths. In February 2020, the CDC concluded its national updates and officially classified vitamin E acetate, an additive long linked to EVALI and most common in informally sourced THC e-liquids--i.e., purchased off the street or home-mixed--as "a primary cause of EVALI."

The EVALI outbreak has motivated a variety of state and federal legislation to restrict sales of nicotine e-cigarettes, including a temporary ban on all e-cigarette sales in Massachusetts in late-2019 and bans on flavored e-cigarette sales in several states and localities. However, if the goal was to reduce EVALI risks, the study suggests that those policies may have targeted the wrong behavior.

A negative relationship between EVALI prevalence and pre-outbreak rates of vaping and marijuana use suggests that well-established markets may have crowded out use of riskier, informally sourced e-liquids, Friedman said.

Indeed, the five earliest states to legalize recreational marijuana--Alaska, California, Colorado, Oregon and Washington--all had less than one EVALI case per 100,000 residents aged 12 to 64. None of the highest EVALI-prevalence states--Utah, North Dakota, Minnesota, Delaware and Indiana--allowed recreational marijuana use.

Interestingly, Friedman notes that two of the highest-prevalence states' medical marijuana laws forbid smokable marijuana. "If this policy led some recreational marijuana smokers to switch to vaping THC, perhaps in order to avoid detection, it would have increased their likelihood of exposure to contaminated e-liquids when those came on the market. This may have contributed to the higher EVALI prevalence in those states."

It may be important for policymakers to consider the potential unintended consequences of policies that forbid smokable marijuana while allowing THC e-liquids going forward.

Credit: 
Yale School of Public Health

Importance of rainfall highlighted for tropical animals

image: Rain directly affects birds and other animals, yet scientists haven't recognized the role of precipitation in an organism's ecological niche. A new theoretical framework from University of Illinois and Kansas State University scientists fills the gap.

Image: 
Cristian Bonilla Poveda

URBANA, Ill. - Imagine a tropical forest and you might conjure up tall trees hung with vines, brightly colored birds, howling monkeys, and ... rain. Indeed, precipitation patterns, along with temperature, dictate where tropical forests are distributed around the world, but surprisingly, scientists know very little about the direct effects of rainfall on animals.

A new conceptual framework, developed by University of Illinois and Kansas State University researchers, calls for the scientific community to formally consider the role of precipitation in an organism's ecological niche - the set of biological and environmental factors that optimize life for a given critter.

"We understand exactly how most animals respond to temperature, but the same is not true for rain," says Alice Boyle, associate professor in the Division of Biology at Kansas State and lead author on the Trends in Ecology & Evolution article. "When animal biologists see rainfall effects in their studies, they assume it must be about how plants are responding to rainfall and how that affects the food supply for the organisms they're studying. But there can be direct physiological consequences of rain related to feeding behavior, predation, pathogens, and more. There's a lot more going on than food supply."

In the article, Boyle and co-authors Elsie Shogren and Jeff Brawn propose and define what they call the hygric niche: the collection of physiological, behavioral, and ecological processes and interactions predicting how endothermic, or warm-blooded, organisms perform under a given precipitation scenario.

"Prior to this, there was no unifying conceptual framework to understand why responses to precipitation might differ between species or even within the same species, depending on the location of the study," Boyle says. "We've heard from scientists who have said, 'Wow, how come I've never thought about this before?' I think this new framework is probably going to change the way many people study the distributions, physiology, and demographic responses of endotherms."

Brawn, professor in the Department of Natural Resources and Environmental Sciences at Illinois, adds, "This concept has implications for conservation of sensitive organisms, long term. In terms of planning where to invest conservation dollars or where to prioritize habitat, we should be looking at rainfall refugia where precipitation regimes are likely to stay intact over time."

Because the effects of temperature and moisture are so difficult to disentangle, the team developed the concept of the hygric niche using decades of bird and mammal research from the tropics. Temperature in these equatorial landscapes varies little on an annual basis, but rainfall can vary widely, with some locations experiencing distinct dry and wet seasons and others experiencing daily precipitation throughout the year. But unfortunately, in many tropical locations, these millennia-old patterns are now shifting due to climate and land use change.

"Human-caused changes to climate are resulting in some areas getting wetter, and other areas getting drier. Also, it is not just the amount of precipitation that is changing; the timing and magnitude of storms are also changing, and we have very little idea of how any of this will affect animals," according to the authors.

In their article, Brawn and Boyle describe ways in which precipitation (too little or too much) can affect organisms at the individual, population, and community levels. While rain clearly affects food supply, it also can affect foraging behavior, reproductive and population growth rates, and competitive interactions in subtle ways that might be difficult for researchers to trace back to any particular source. And even small shifts in tropical rainfall patterns could have large effects.

"Even if you can see intact forest out to the horizon, if the precipitation regimes change, the integrity of that ecosystem may be compromised. And that's concerning," Brawn says.

Although the concept was conceived with tropical systems in mind, the researchers suggest it can and should be applied to ecosystems and organisms outside the tropics, with a bit of tweaking and further study.

"I work in both tropical and grassland systems and my major focus of research in grassland birds, one of the most threatened group of birds in North America, is understanding how temporal variation in precipitation affects those populations. So the questions and the concepts are broadly applicable," Boyle says. "It's just that it was more tractable to lay them out and argue for their importance in tropical systems."

Laying out a new ecological concept requires lots of testing by the research community to identify its limitations, and that's just what Boyle and Brawn hope will happen.

"The next steps involve the research community testing key assumptions and predictions of our model," they say. "One of the hardest but most important tasks is to understand whether rainfall affects different animal species for the same or different reasons. Is it really mostly about food, or are these less-obvious physiological costs more important than we thought? Answering these questions will be crucial to doing effective conservation and climate change mitigation in the tropics."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Hydroxychloroquine reduces in-hospital COVID-19 mortality

An Italian observational study contributes to the ongoing debate regarding the use of hydroxychloroquine in the current pandemic. The research, conducted on 3,451 patients treated in 33 hospitals across Italy, shows that use of this drug was associated with a 30% reduction in the risk of death in hospitalized patients affected by Covid-19.

Published in the European Journal of Internal Medicine, the study was coordinated by the Department of Epidemiology and Prevention of the I.R.C.C.S. Neuromed, Pozzilli, in collaboration with Mediterranea Cardiocentro, Naples, and the University of Pisa, with the participation of 33 hospitals forming the CORIST collaboration (COvid-19 RISk and Treatments). Researchers analysed data regarding current and previous diseases, therapies followed before the infection and drugs administered in the hospital specifically for the treatment of COVID-19. All this information was compared with the evolution and the final in-hospital outcome of the infection.

"We observed - explains Augusto Di Castelnuovo, epidemiologist at the Neuromed Department of Epidemiology and Prevention, currently at Mediterranea Cardiocentro in Naples - that patients treated with hydroxychloroquine had a 30% lower in-hospital mortality rate compared to those not receiving this treatment. Our data were subjected to extremely rigorous statistical analysis, taking into account all the variables and possible confounding factors that could come into play. The drug efficacy was evaluated in various subgroups of patients. The positive results of hydroxychloroquine treatment remained unchanged, especially in those patients showing a more evident inflammatory state at the moment of admission to hospital".

"While waiting for a vaccine - says Licia Iacoviello, Director of the Department of Epidemiology and Prevention at Neuromed and professor of Public Health at the University of Insubria at Varese - identifying effective therapies against COVID-19 is an absolute priority. We hope that our research will make an important contribution to the international debate on the role of hydroxychloroquine in the treatment of hospitalized patients for coronavirus. Further observational studies and ongoing clinical tials will of course be needed to better assess the role of this drug and the most appropriate administration methods. However, data from the CORIST collaboration support the use of hydroxychloroquine. At variance with some studies carried out in other Countries, where efficacy of the drug was not observed, it is interesting to note that the doses of hydroxychloroquine adopted in Italy (200 mg, twice a day) are lower than the ones used in those researches".

"In past months - comments Giovanni de Gaetano, President of Neuromed - the World Health Organization recommended a stop to the use of hydroxychloroquine on the basis of an international observational study, subsequently retracted. Now the new data from the CORIST study, resulting from a 'real life' national collaboration, might help Health Authorities better clarify the role of this drug in the treatment of COVID-19 patients".

The CORIST Collaboration

CORIST (COvid-19 RISk and Treatments) is a collaboration among 33 Italian clinical centers devoted to the collection and study of data on COVID-19 patients. It is a study carried out in the "real life" of the Italian National Health System, bringing together the experiences of large and small clinical centers, from Lombardy to Sicily.

The I.R.C.C.S. Neuromed

The Institute for Research, Hospitalization and Health Care (I.R.C.C.S.) Neuromed in Pozzilli, Italy, is a landmark at the Italian and international levels for research and therapy in the field of neurological and cardiovascular diseases. It is a centre in which doctors, researchers, staff and the patients themselves form an alliance aimed at ensuring the best level of service and cutting-edge treatments, guided by the most advanced scientific developments.

Credit: 
Istituto Neurologico Mediterraneo Neuromed I.R.C.C.S.

New tool for identifying endangered corals could aid conservation efforts

Image: 
Iliana Baums, Penn State

Coral conservation efforts could get a boost from a newly developed genotyping "chip"--the first of its kind for corals. The chip allows researchers to genetically identify corals and the symbiotic algae that live within the coral's cells, a vital step for establishing and maintaining genetic diversity in reef restoration efforts. The chip and its accompanying online analysis pipeline help to democratize the genetic identification of coral biodiversity, making it accessible to conservation biologists who might not have access to the laboratory and computational resources needed to extract DNA and analyze the data.

A paper describing the new chip appears in the journal Scientific Reports.

"Corals around the world are endangered due to warming oceans," said Iliana Baums, professor of biology at Penn State and leader of the research team. "We designed this genotyping chip to help restoration and conservation efforts. There is very little overhead needed to use the chip, so small restoration operations can access coral genetic identification to help them maximize reef health by ensuring coral populations are genetically diverse."

The chip, also called a microarray, uses more than 30,000 single nucleotide polymorphisms (SNPs)--locations in the coral genome where a single letter in the DNA alphabet can vary among different corals in the Acroporid family. The Acroporid family contains the largest number of species of any coral family, and its members are common in the Caribbean Sea and the Pacific Ocean. The chip was designed using Caribbean corals but can also be used to analyze Pacific species, and it allows researchers to identify the symbiotic algae that reside in the coral cells.

Corals can reproduce asexually by fragmentation, so Caribbean reefs are often dominated by corals that all can be traced back to a single origin and are therefore genetically nearly identical--researchers refer to these related corals as a "genet." The chip is sensitive enough to allow researchers to reliably distinguish members of different genets within the same coral species.
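
As a concrete illustration of what distinguishing genets involves (a hypothetical sketch of the identity-by-state idea, not the actual STAG pipeline or its thresholds): clone-mates from the same genet should match at nearly every SNP, so a simple similarity score with a high cutoff can separate them from distinct genets.

```python
import numpy as np

# Hypothetical genotype calls at a handful of SNP loci, coded 0/1/2 (copies of
# the alternate allele). The real array assays >30,000 loci; this sketch just
# shows the basic identity-by-state comparison, not the STAG pipeline.
coral_a = np.array([0, 1, 2, 1, 0, 2, 1, 1])
coral_b = np.array([0, 1, 2, 1, 0, 2, 1, 1])  # putative clone-mate of a
coral_c = np.array([2, 1, 0, 1, 2, 0, 1, 2])  # unrelated colony

def ibs_similarity(g1, g2):
    """Fraction of loci with identical genotype calls (identity by state)."""
    return float(np.mean(g1 == g2))

SAME_GENET_CUTOFF = 0.99  # assumed; in practice calibrated on known clones
for name, other in (("b", coral_b), ("c", coral_c)):
    sim = ibs_similarity(coral_a, other)
    verdict = "same genet" if sim >= SAME_GENET_CUTOFF else "different genets"
    print(f"a vs {name}: similarity {sim:.2f} -> {verdict}")
```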

"One way to increase genetic diversity in a reef is to make sure it is built by individuals of more than one genet," said Baums. "Because all of the corals on a reef could be members of the same genet, it is vital to have a reliable way to identify them and our chip provides this to researchers in the field."

To use the SNP chip, which was developed at Penn State and licensed to Thermo Fisher Scientific, which produces the Affymetrix microarrays, researchers can simply send a sample of coral to a commercial laboratory. At the lab, DNA is extracted and run on the chip, and the resulting data are returned to the researcher. The researcher can then upload the data files into the online analysis pipeline called Standard Tools for Acroporid Genotyping (STAG). The analysis is performed and the data maintained in a customized "Science Gateway" on the open-source, web-based Galaxy platform, a resource for data-rich biomedical research also developed at Penn State.

"With the SNP chip and STAG pipeline we can help ensure that researchers around the world can genetically identify corals in a standardized way," said Baums. "The database maintained in the Science Gateway allows researchers to compare samples, identify novel strains, and track coral diversity through time."

Credit: 
Penn State

Hydroxychloroquine plus azithromycin increases heart risk, finds global study

NEW YORK, NY -- The combination of hydroxychloroquine (HCQ) and azithromycin (AZM) has been linked to significant cardiovascular risks, including mortality, in the largest safety study ever performed on both HCQ and HCQ+AZM. This network study, led by the Observational Health Data Sciences and Informatics community, was recently published in Lancet Rheumatology.

OHDSI has established an international network of researchers and observational health databases with a central coordinating center housed at the Department of Biomedical Informatics at Columbia University.

In patients with rheumatoid arthritis, short-term (30-day) HCQ treatment was found to carry no excess risk of complications, but long-term HCQ treatment was associated with a 65% relative increase in cardiovascular-related mortality, compared to sulfasalazine.

HCQ+AZM was associated with a cardiovascular mortality risk more than twice as high (2.19) as that of the comparator treatment, even in the short term, based on findings from more than 320,000 users of the combination therapy. The combination was also associated with a 15-20% increased rate of angina/chest pain and heart failure.

This study, first published on MedRxiv, has already made significant impacts in the healthcare community. On April 23, the European Medicines Agency (EMA) cited the study in a warning about the risk of serious side effects with chloroquine and hydroxychloroquine. In July, the EMA again highlighted the study, among other efforts within the OHDSI community, in its eighth revision of The European Network of Centres for Pharmacoepidemiology and Pharmacovigilance (ENCePP) Guide on Methodological Standards in Pharmacoepidemiology.

This is the first published study to be generated from the OHDSI COVID Study-a-thon, a global effort in March to set the foundation for OHDSI efforts to design and execute network observational studies around characterization, patient-level prediction and population-level effect estimation to inform decision-making around the global pandemic. Multiple studies, several of which are highlighted later, have been posted to MedRxiv and are currently under peer review.

HCQ, a drug commonly used in the treatment of malaria, lupus and rheumatoid arthritis (RA), gained early attention during the pandemic as a potential COVID-19 treatment.

"Hydroxychloroquine, both alone and in combination with azithromycin, gained strong consideration as a potential COVID treatment without a large-scale study of its overall safety profile," said Daniel Prieto-Alhambra, PhD, co-senior author on this study. "We had access to an unprecedented amount of data on this drug, and we were relieved to find no worrying side effects in the short-term use of hydroxychloroquine. However, when prescribed in combination with azithromycin, it may induce heart failure and cardiovascular mortality and we would urge caution in using the two together."

This study examined more than 950,000 HCQ users through deidentified electronic health records and administrative claims data over a 20-year period. Records were collected from 14 different databases spanning six nations (Germany, Japan, Netherlands, Spain, United Kingdom, United States) and then mapped to the OMOP Common Data Model to generate this large-scale analysis.

"At medical school we were taught to 'first do no harm' and to me, our study focuses on this core belief of modern medicine," said Jennifer Lane, MD, who served as co-lead author on this study along with Jamie Weaver. "OHDSI has the power to investigate this question in a very thorough way and to go through rigorous steps. We are looking at patients from the general population, which is why it is so important to look at data from multiple countries. There are reasons why you may get bias from one data source, but if we find a signal in the Netherlands, and we find it in Spain, and we find it in the U.S., then we know we have something."

The study was developed and executed by the OHDSI (Observational Health Data Sciences and Informatics) community, a multi-stakeholder, interdisciplinary collaborative to bring out the value of health data through large-scale analytics. All solutions are open-source, and links to the study protocol, code and results are posted at the bottom of this release.

"It required a global effort to generate this level of reproducible, reliable real-world evidence to inform decision-making around COVID treatment," said Patrick Ryan, PhD, co-senior author on this study. "Our community collaborated for years to develop the high-level analytics which set the course for these studies. Standardizing data for nearly 1,000,000 patients on hydroxychloroquine provides confidence in these findings, and we are pleased to see that this study has already helped make a positive clinical impact as treatment options continue to be evaluated."

Credit: 
Columbia University Irving Medical Center

Trust the power of markets

Organizations that use ad hoc groups or committees to make decisions might do better to crowdsource their decisions, says UC Riverside-led research.

The study found that people trust groups even though they are susceptible to manipulation and can make poor decisions. Information markets, in which people bet on potential outcomes, tend to make more accurate decisions, but people trust them less. Once people get used to using markets, however, they trust them more, making markets a useful decision-making tool for large organizations.

"Our key finding was that transparency and trust are why people prefer groups even though markets outperform them," said first author Boris Maciejovsky, an associate professor of management in UC Riverside's School of Business. "People are skeptical of algorithms. Markets will be used less until people get used to them."

The study raises additional questions for future research about the productivity of working from home during the COVID-19 pandemic. The fact that people have less trust in computer-mediated markets than in face-to-face interactions might reduce communication efficiency, and thus potentially work performance, while working remotely.

Information markets work like a game. People place bets on predicted outcomes in an online forum. Familiar types of prediction markets include those for the Oscars or elections. Markets, for example, often do better than exit polls at predicting the outcome of elections. Markets make accurate predictions by pooling and aggregating the diverse beliefs of many participants.

Groups and committees, by contrast, are typically smaller and therefore contain more homogenous knowledge and information. Groups are also easier to manipulate. Conflicts among members, misalignment of organizational and individual goals, and persuasive negotiation or voting can all lead to poor decisions.

In large companies or organizations that use information markets, all employees are typically given the same amount of money and place odds-based bets on the potential outcomes of a situation or strategy in an online forum that everyone can see. Markets receive truthful information by asking participants to "put their money where their mouth is," and do not require alignment of organizational and personal goals. People who bet on the winning outcome get paid, giving an incentive to participate in the market. As in any betting situation, both correct and incorrect decisions have financial consequences.
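
The release does not specify the trading mechanism these markets use. One common design for internal prediction markets is an automated market maker such as Hanson's logarithmic market scoring rule (LMSR); the sketch below is purely illustrative and is not taken from the study.

```python
import math

# Minimal LMSR prediction market for two outcomes (e.g., "strategy succeeds"
# vs. "strategy fails"). Illustrative only; the study's markets may differ.
class LMSRMarket:
    def __init__(self, liquidity=100.0):
        self.b = liquidity   # higher b means prices move more slowly per trade
        self.q = [0.0, 0.0]  # shares sold so far for each outcome

    def _cost(self, q):
        return self.b * math.log(sum(math.exp(x / self.b) for x in q))

    def price(self, outcome):
        """Current market-implied probability of an outcome."""
        exps = [math.exp(x / self.b) for x in self.q]
        return exps[outcome] / sum(exps)

    def buy(self, outcome, shares):
        """Buy shares of an outcome; returns the cash cost of the trade."""
        before = self._cost(self.q)
        self.q[outcome] += shares
        return self._cost(self.q) - before

market = LMSRMarket()
print(f"Implied probability before trading: {market.price(0):.2f}")  # 0.50
paid = market.buy(0, 40)  # a trader backs outcome 0 with 40 shares
print(f"Trade cost {paid:.2f}; implied probability now {market.price(0):.2f}")
```

Each winning share pays out a fixed amount (conventionally one unit), so traders profit exactly when they move prices toward the truth, which is how a market asks participants to "put their money where their mouth is."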

Given their decision-making success, Maciejovsky and co-author David Budescu, a psychology professor at Fordham University, wondered why information markets are not used more widely by large organizations, which usually prefer groups and committees.

The first experiment had college students select a candidate for a managerial position in either a face-to-face group or an online market. Each category had the same information about the candidate. Group participants were given various roles and financial incentives, which were manipulated by the researchers in different ways. Market participants were manipulated with various financial incentives. Control groups and markets were not manipulated.

The results showed that groups outperform markets when the members share incentives and interests. However, markets outperform groups when conflicts of interest exist among members. Interestingly, people failed to notice the detrimental effect of conflict within groups and put more trust in groups than in markets.

Participants in the next experiment were either asked to watch a video of three people discussing job candidates or to watch market trading on a screen. Half the participants in the video cohort were told about conflicts of interest some of the candidates had. The market cohort was told that they could infer the merits of the candidates by observing market activity. Afterward, all participants were asked to evaluate their group or market on a number of attributes including transparency, benevolence, efficiency, familiarity, fairness, integrity, and predictability. The results confirmed the findings of the first study: People perceive groups to be more transparent, fair, and honest than markets.

To find out why people trust groups in spite of the demonstrably bad effects of intragroup conflicts, the third experiment recruited people who worked for a large forecasting project that sometimes uses information markets to make predictions and sometimes relies on teams or individual predictions. The forecasters from this project participated in a replication of the second study described above. The results showed a halo effect--everyone trusted the institution with which they were most familiar, which was committees. However, the more experience people had using information markets, the more they trusted them.

"It's hard at first to trust abstract mechanisms like markets," Maciejovsky said. "But our research shows that markets are reliable and less susceptible to bias. Large organizations could benefit from using information markets."

The research also hints at an unexpected potential outcome of the COVID-19 pandemic, where working from home has made in-person meetings impossible.

"Perhaps as people become more comfortable making business decisions in an environment of decreased interpersonal contact and increased reliance on technology decision markets will be seen as less threatening and find wider use in American organizations," Maciejovsky said.

Credit: 
University of California - Riverside

Why we distort probability

The chances of a commercial airliner crashing are vanishingly small -- and yet many people are uncomfortable flying. Vaccination for many common childhood diseases entails almost no risk -- but parents still worry. Human perception of probabilities -- especially very small and very large probabilities -- can be markedly distorted, and these distortions can lead to potentially disastrous decisions.

But why we distort probability is unclear. While the question has been previously studied, there is no consensus on its causes.

A team of scientists from New York University and Peking University, using experimental research, has now concluded that our cognitive limitations lead to probability distortions and to subsequent errors in decision-making. The researchers have developed a model of human cognitive limitations and tested its predictions experimentally, as reported in the latest issue of the journal Proceedings of the National Academy of Sciences.

The team, which included New York University's Laurence Maloney as well as Peking University's Hang Zhang, a professor, and Xiangjuan Ren, a post-doctoral fellow, initiated the analysis by examining the nature of the distortions as a potential clue for explaining the phenomenon.

"Probability distortion limits human performance in many tasks, and we conjectured that the observed changes in probability distortion with task was a kind of partial compensation for human limitations," explains Maloney. "A marathon runner with a sprained ankle will not run as well as she might have with ankle intact, but the awkward, limping gait we observe could in fact be an optimal compensation for injury."

The key step in the model is the recoding of probabilities that depends on the range of probabilities in a task.

"Much like a variable magnification microscope, the brain can represent a wide range of probabilities, but not very accurately, or a narrow range at high precision," explains Maloney. "If, for example, a task involves reasoning about the probability of various causes of death, for example, then the probabilities are all very small (thankfully) and small differences are important. We can set the microscope to give us high resolution over a limited window of very small probabilities. In another task we might accept less precision in return for the ability to represent a much wider range of probabilities."

Zhang, Ren, and Maloney set out to test this model in two experiments, one in which subjects made typical economic decisions under risk (e.g. choosing between a 50:50 chance of $200 and the certainty of $70) and one involving judgements of relative frequency (the relative frequency of black and white dots appearing on a computer screen). The two experiments together tapped into the basic ways we use probability and frequency in everyday life. The researchers found that their model predicted human performance far better than any previous model.

They discovered that -- like the marathon runner -- people's limitations were costly but, subject to those limitations, we do as well as we possibly can.

Credit: 
New York University

UC Berkeley demographers put COVID-19 death toll into perspective

With over 170,000 COVID-19 deaths to date, and 1,000 more each day, America's life expectancy may appear to be plummeting. But in estimating the magnitude of the pandemic, University of California, Berkeley, demographers have found that COVID-19 is likely to shorten the average U.S. lifespan in 2020 by only about a year.

Seeking to put current COVID-19 mortality rates into historic, demographic and economic perspective, UC Berkeley demographers Ronald Lee and Joshua Goldstein calculated the consequences of U.S. lives lost to COVID-19 in 2020 using two scenarios. One was based on a projection of 1 million deaths for the year, the other on the more likely projection of 250,000 deaths.

Their findings, published online last week in the Proceedings of the National Academy of Sciences journal, conclude that 1 million deaths in 2020 would cut three years off the average U.S. life expectancy, while 250,000 deaths would reduce lifespans by about a year.

That said, without the societal efforts undertaken to lessen the impact of COVID-19, projections suggest there could have been 2 million deaths by the end of 2020, which would have reduced the average U.S. lifespan by five years, the researchers pointed out.

Their estimated drop in life expectancy is modest, in part, because 250,000 deaths is not a large increase on top of the 3 million non-COVID-19 deaths expected for 2020, and because older people, who typically have fewer remaining years of life than others, account for most COVID-19 fatalities, the study notes.
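
The release does not show the calculation, but the reported numbers are consistent with a standard demographic approximation, sketched below with assumed round parameter values (this is an illustration, not necessarily the exact method of the PNAS paper): a proportional increase delta in death rates at all ages lowers life expectancy by roughly H * e0 * ln(1 + delta), where H is the life table entropy.

```python
import math

# Rough life-expectancy arithmetic; illustrative approximation only.
E0 = 77.5                    # assumed pre-pandemic U.S. life expectancy, years
H = 0.15                     # assumed U.S. life table entropy
BASELINE_DEATHS = 3_000_000  # non-COVID U.S. deaths expected in 2020 (from text)

def life_expectancy_drop(covid_deaths):
    delta = covid_deaths / BASELINE_DEATHS  # proportional mortality increase
    return H * E0 * math.log(1 + delta)

for deaths in (250_000, 1_000_000):
    print(f"{deaths:>9,} deaths -> ~{life_expectancy_drop(deaths):.1f}-year drop")
# ~0.9 years for 250,000 deaths and ~3.3 years for 1,000,000 deaths,
# in line with the study's "about a year" and "three years".
```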

Still, while COVID-19 mortality rates remain lower than those of the 1918 Spanish flu pandemic, the coronavirus epidemic could be just as devastating as the longer-lasting HIV and opioid epidemics if mitigation efforts fail, the researchers said.

"The death toll of COVID-19 is a terrible thing, both for those who lose their lives and for their family, friends, colleagues and all whom their lives touched. Those are real people, not abstract statistics," said Lee, a UC Berkeley professor emeritus of demography and associate director of the campus's Center for the Economics and Demography of Aging.

"But the population perspective helps put this tragedy in a broader context. As we work to contain this epidemic, it is important to know that we have been through such mortality crises before," he added.

Goldstein's and Lee's measures are based on factors that include a current U.S. population of 330 million, age-specific death rates and the economic valuation of saved lives.

Among their other findings:

One million COVID-19 deaths in the U.S. in 2020 would be the equivalent of U.S. mortality levels in 1995, adding three years to each American's biological age, albeit temporarily.

The age gap (old versus young) for people dying from COVID-19 is marginally wider than during pre-pandemic times, while the male-female gap is slightly narrower. The researchers found similar death-by-age patterns across several countries.

The economic cost of lives lost to COVID-19 in the U.S. is in the trillions of dollars. According to standard government measures, the demographers estimated that the loss of 1 million lives in 2020 would amount to between $10.2 and $17.5 trillion, while the amount for 250,000 deaths would range from $1.5 to $2.5 trillion.

Credit: 
University of California - Berkeley

Memory protein

When UC Santa Barbara materials scientist Omar Saleh and graduate student Ian Morgan sought to understand the mechanical behaviors of disordered proteins in the lab, they expected that after being stretched, one particular model protein would snap back instantaneously, like a rubber band.

Instead, this disordered protein relaxed slowly, taking tens of minutes to relax into its original shape -- a behavior that defied expectations, and hinted at an inner structure that was long thought to exist, but has been difficult to prove.

"The speed of relaxation is important because it gives us some insight into the structural organization of the protein," said Morgan, the lead author in a paper published in Physical Review Letters. "This is important because the structural organization of a protein is usually related to its biological function."

While the function of a folded protein is associated with its fixed "folds" -- a well-defined three-dimensional structure -- disordered proteins, with their unstable structures, derive their functions from their dynamics.

"More than 40% of human proteins are at least partially unfolded, and they are often linked to critical biological processes as well as debilitating diseases," Morgan said.

The slow relaxation is in fact a behavior typically reserved for folded proteins.

"In the 1980s it was discovered that folded proteins exhibit slow relaxations," Morgan said, in a behavior typical of glasses -- a class of materials that are neither truly liquid nor crystalline solid states, but can exhibit characteristics of either state.

"We have been studying folded proteins for a long time and have developed a lot of good tools for them, so it was quickly figured out that the slow relaxations could be explained by a mechanism by which 'frustrated' molecules trying to fit themselves in a small space," Morgan said -- a mechanism called "jamming." "This explanation helped us better understand the structure of folded proteins and explain glassy behavior in a lot of other systems."

However, the protein that the researchers were stretching, by means of a device known as a magnetic tweezer, was a disordered protein. By definition, it wasn't trying to pack many molecules into a small space, so it shouldn't run into the jamming problem, Saleh said.

"So, when we observed slow relaxations, it either meant our definition of the protein was wrong or there had to be another mechanism," Morgan said.

Furthermore, by allowing the stretched protein to relax but stretching it again with less force before it had a chance to fully relax, the researchers found that the protein "remembered" its previous stretching -- initially lengthening, as expected under the applied force, but then slowly relaxing over time. Conceptually, Morgan explained, the longer the protein is stretched, the longer it takes to relax; hence it "remembers" how long it was pulled.

To explain these unexpected, glassy behaviors, the researchers drew inspiration from some rather mundane objects: crumpled paper and memory foam. Both structurally disordered systems, they exhibit a similar slow, logarithmic relaxation after being subjected to forces, and particularly in the case of the foam, a "memory" effect.

For the researchers, the behaviors suggested that, like memory foam and crumpled paper, the internal structure of the protein is not a single, fixed unit but a collection of several independent substructures, ranging from weak to strong, that respond to applied forces over different lengths of time. For instance, strong substructures may withstand a certain amount of strain before being pulled apart and be the first to relax, whereas weak ones will stretch under smaller forces and take longer to relax.
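One way to see how a collection of independent substructures can produce both slow, roughly logarithmic relaxation and the memory effect described above is to superpose many exponential relaxers whose timescales are spread over several decades. The toy model below is built on that idea; it illustrates the general mechanism, not the authors' analysis, and all timescales and parameters are hypothetical.

```python
import numpy as np

# Toy model: extension is carried by many independent substructures,
# each relaxing exponentially with its own timescale. Timescales are
# spread uniformly in log (0.01 s to 1000 s); their superposition
# decays roughly as -log(t) across that window -- far slower than any
# single exponential ("rubber band") would.

rng = np.random.default_rng(0)
taus = 10 ** rng.uniform(-2, 3, size=2000)   # relaxation times, seconds

def extension(t, stretch_time):
    """Excess extension at time t after the force is reduced.

    Only substructures with timescales shorter than the stretch
    duration were disrupted during the pull -- a crude way to encode
    the observed memory: longer stretches recruit slower substructures,
    so the ensemble takes longer to relax afterward.
    """
    disrupted = taus < stretch_time          # which substructures unfolded
    return np.exp(-t / taus[disrupted]).sum() / taus.size

for t in (0.1, 1.0, 10.0, 100.0):
    print(f"t = {t:6.1f} s   after 1 s stretch: {extension(t, 1.0):.3f}"
          f"   after 1000 s stretch: {extension(t, 1e3):.3f}")
```

In this toy picture, the roughly equal drop in extension per decade of time is the logarithmic relaxation, and the dependence on the stretch duration is the memory.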

Based on this notion of multiple substructures, and confirming it against experimental data, the researchers determined that the protein's logarithmic relaxation rate is inversely proportional to the stretching force.
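In compact form (the notation here is chosen for illustration, not taken from the paper), a logarithmic relaxation with a force-dependent rate can be written as

```latex
x(t) \approx x_0 - \Gamma(F)\,\ln\!\left(1 + \frac{t}{\tau}\right),
\qquad \Gamma(F) \propto \frac{1}{F},
```

where $x(t)$ is the extension, $\Gamma(F)$ the logarithmic relaxation rate, $F$ the stretching force, and $\tau$ a short microscopic timescale.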

"The stronger the stretching force applied to the disordered protein, the more the protein relaxed in the same amount of time," Saleh explained.

"Mechanical disordered systems with similar structural arrangements tend to be remarkably durable," Morgan said. "They also have different mechanical properties depending on how much you pull and compress them. This makes them very adaptable, depending on the magnitude and frequency of the force." Understanding the structure behind this ability to adapt could open the door to future dynamic materials, that, Morgan said, "just like your brain, helps them filter out unimportant information and makes them more efficient at storing repeated stimuli."

Credit: 
University of California - Santa Barbara

Revised code could help improve efficiency of fusion experiments

image: PPPL physicist Stuart Hudson was part of an international team of researchers that has upgraded a key computer code for calculating forces acting on magnetically confined plasma in fusion energy experiments.

Image: 
Elle Starkman / PPPL Office of Communications

An international team of researchers led by the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) has upgraded a key computer code for calculating forces acting on magnetically confined plasma in fusion energy experiments. The upgrade will be part of a suite of computational tools that will allow scientists to further improve the design of breakfast-cruller-shaped facilities known as stellarators. Together, the three codes in the suite could help scientists bring efficient fusion reactors closer to reality.

The revised software lets researchers more easily determine the boundary of the plasma in stellarators. Used in concert with two complementary codes, it could help find a stellarator configuration that improves the performance of the design: one companion code determines the optimal location of the plasma within the stellarator's vacuum chamber to maximize the efficiency of the fusion reactions, and the other determines the shape the external electromagnets must have to hold the plasma in that position.

The revised software, called the "free-boundary stepped-pressure equilibrium code (SPEC)," is one of a set of tools scientists can use to tweak the performance of plasma to more easily create fusion energy. "We want to optimize both the plasma position and the magnetic coils to balance the force that makes the plasma expand while holding it in place," said Stuart Hudson, a physicist and deputy head of the Theory Department at PPPL, and lead author of the paper reporting the results in Plasma Physics and Controlled Fusion.

"That way we can create a stable plasma whose particles are more likely to fuse. The updated SPEC code enables us to know where the plasma will be for a given set of magnetic coils."

Fusion combines light elements in the form of plasma -- the hot, charged state of matter composed of free electrons and atomic nuclei -- and in the process generates massive amounts of energy in the sun and stars. Scientists are seeking to replicate fusion in devices on Earth for a virtually inexhaustible supply of safe and clean power to generate electricity.

Plasma stability is crucial for fusion. If plasma bounces around inside a stellarator, it can escape, cool, and tamp down the fusion reactions, in effect quenching the fusion fire. An earlier version of the code, also developed by Hudson, could only calculate how forces were affecting a plasma if the researchers already knew the plasma's location. Researchers, however, typically don't have that information. "That's one of the problems with plasmas," Hudson said. "They move all over the place."

The new version of the SPEC code helps solve the problem by allowing researchers to calculate the plasma's boundary without knowing its position beforehand. Used in coordination with a coil-design code called FOCUS and an optimization code called STELLOPT -- both of which were also developed at PPPL -- SPEC lets physicists simultaneously ensure that the plasma will have the best fusion performance and the magnets will not be too complicated to build. "There's no point optimizing the shape of the plasma and then later finding out that the magnets would be incredibly difficult to construct," Hudson said.
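A conceptual sketch of how such a pipeline fits together appears below. The function names and the toy "physics" are stand-ins invented for illustration -- SPEC, FOCUS, and STELLOPT each have their own real interfaces, which are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-ins for the real codes, for illustration only.

def spec_free_boundary(coil_params):
    """Stand-in for SPEC: given coil parameters, return a 'plasma
    boundary' (here just a smooth function of the coils)."""
    return np.tanh(coil_params)

def confinement_error(boundary):
    """Stand-in for a STELLOPT-style physics objective: distance of
    the boundary from some target shape."""
    target = np.linspace(0.1, 0.9, boundary.size)
    return np.sum((boundary - target) ** 2)

def coil_complexity(coil_params):
    """Stand-in for a FOCUS-style engineering penalty: discourage
    coil shapes that would be hard to build."""
    return 0.01 * np.sum(np.diff(coil_params) ** 2)

def objective(coil_params):
    # Optimize plasma performance and coil buildability together,
    # rather than fixing the plasma first and discovering later that
    # the coils are impractical -- the point Hudson makes above.
    boundary = spec_free_boundary(coil_params)
    return confinement_error(boundary) + coil_complexity(coil_params)

result = minimize(objective, x0=np.zeros(8))
print("optimized coil parameters:", np.round(result.x, 3))
```

The key design point mirrored here is that the free-boundary solver sits inside the optimization loop, so every candidate coil set is evaluated by first computing where the plasma would actually sit.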

One challenge that Hudson and colleagues faced was verifying that each step of the code upgrade was done correctly. Their slow-and-steady approach was crucial to making sure that the code makes accurate calculations. "Let's say you are designing a component that will go on a rocket to the moon," Hudson said. "It's very important that that part works. So you test and test and test."

Updating any computer code calls for a number of interlocking steps:

First, scientists must translate a set of mathematical equations describing the plasma into a programming language that a computer can understand;

Next, scientists must determine the mathematical steps needed to solve the equations;

Finally, the scientists must verify that the code produces correct results, either by comparing the results with those produced by a code that has already been verified, or by using the code to solve simple equations whose answers are easy to check -- as in the sketch following this list.
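As a concrete illustration of that last step, the sketch below verifies a simple numerical integrator against an equation with a known answer: dx/dt = -x with x(0) = 1, whose exact solution is e^(-t). The solver and the convergence check are generic examples, not PPPL's actual test suite.

```python
import numpy as np

# Verification against a known solution: dx/dt = -x, x(0) = 1
# has the exact answer x(t) = exp(-t).

def euler_decay(t_end, steps):
    """Integrate dx/dt = -x with the explicit Euler method."""
    dt = t_end / steps
    x = 1.0
    for _ in range(steps):
        x += dt * (-x)
    return x

exact = np.exp(-1.0)
for steps in (10, 100, 1000, 10000):
    approx = euler_decay(1.0, steps)
    print(f"{steps:6d} steps: error = {abs(approx - exact):.2e}")

# The error should shrink in rough proportion to the step size;
# if it doesn't, something in the implementation is wrong.
```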

Hudson and colleagues performed the calculations with widely different methods. They used pencil and paper to determine the equations and solution steps, and powerful PPPL computers to verify the results. "We demonstrated that the code works," Hudson said. "Now it can be used to study current experiments and design new ones."

Credit: 
DOE/Princeton Plasma Physics Laboratory

Opioid prescription rates for knee surgery vary, but higher strength dosage common

A new study published in BMJ Open found that opioid prescription rates for outpatient knee surgery vary widely across the country, but the strength of the average prescription in the United States is at a level that has been linked to an increased risk of overdose death.

The nationwide rate at which patients - those who had not already been taking opioids - received an opioid prescription after an arthroscopic knee surgery was more than 70 percent between 2015 and 2019, but the variation at the state level was stark, bottoming out at 40 percent in South Dakota and reaching 85 percent in Nebraska, the study showed. The strength of the typical prescription, moreover, was high: equal to 50 milligrams of morphine per day, the level that the Centers for Disease Control and Prevention has identified as the threshold for increased risk of opioid overdose death.

"We found massive levels of variation in the proportion of patients who are prescribed opioids between states, even after adjusting for nuances of the procedure and differences in patient characteristics," said the study's senior author, M. Kit Delgado, MD, an assistant professor of Emergency Medicine and Epidemiology in the Perelman School of Medicine at the University of Pennsylvania. "We've also seen that the average number of pills prescribed was extremely high for outpatient procedures of this type, particularly for patients who had not been taking opioids prior to surgery."

The latter is of particular concern because of its potential to contribute to the opioid epidemic: giving a high-dosage prescription to patients who have never taken opioids before has been associated with a transition to long-term opioid use, larger numbers of leftover pills, and even higher rates of overdose among family members. The research team - which included lead author Benjamin Ukert, MD, then a post-doctoral researcher at Penn and now an assistant professor of Health Policy and Management at Texas A&M - chose arthroscopic knee surgery as the lens through which to examine this issue because arthroscopies are among the three most common outpatient procedures in the United States.

To gauge prescription rates, the researchers accessed a large, national database of insurance claims. They were able to identify nearly 100,000 patients who had arthroscopic knee surgery and had not used any opioid prescriptions in the six months before the surgery.

The team found that, nationwide, 72 percent of patients filled an opioid prescription within three days of their procedure. There was very little variation in the fill rate between less invasive procedures (such as the removal of torn cartilage) and invasive procedures such as ACL repair, which require cutting or drilling into bone.

Significant differences in prescribing rates were found from state to state. High prescription rates (77 percent or above) extended across the Midwest into the Rocky Mountain region, from Ohio to Utah, and into Arizona and Washington state. Lower rates, below 70 percent, tended to appear on either coast, but also included the Dakotas, Texas, and other states. There was also wide variation in the number of tablets in each single prescription, ranging from 24 (in Vermont) to 45 (in Oklahoma).

"Some factors that may contribute to state variation are policies, such as mandates to check prescription drug monitoring program data, which have shown to affect the opioid prescribing rate," Ukert said. "However, most state policies are aimed at patients with a history of opioid use, and our study focuses on patients who do not have that history. Thus, practice and organizational styles may be more important factors for this population."

While the prescription rate varied greatly, the average prescription translated to roughly 250 milligrams of morphine equivalent over a five-day period - 50 milligrams per day, which sits at the CDC's threshold for increased overdose risk. Approximately 25,000 patients - 36 percent of those who filled a prescription - received this dosage level or more.
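For context, morphine milligram equivalents (MME) are computed by multiplying the amount of drug dispensed by a standard conversion factor and dividing by the days' supply. A minimal sketch using CDC conversion factors follows; the example prescription is hypothetical:

```python
# CDC conversion factors: MME per milligram of drug.
MME_FACTOR = {"hydrocodone": 1.0, "oxycodone": 1.5, "morphine": 1.0}

def mme_per_day(drug, mg_per_tablet, tablets, days_supply):
    """Average daily morphine milligram equivalents for a prescription."""
    total_mme = mg_per_tablet * tablets * MME_FACTOR[drug]
    return total_mme / days_supply

# Hypothetical prescription: 30 tablets of 5 mg oxycodone over 5 days.
dose = mme_per_day("oxycodone", mg_per_tablet=5, tablets=30, days_supply=5)
print(f"{dose:.0f} MME/day")   # 45 MME/day, just under the 50 MME threshold
```

By the same arithmetic, the study's average of 250 MME over five days works out to 50 MME per day, exactly at the threshold.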

With that in mind, a separate study involving Delgado and another co-author of this study, Brian Sennett, MD, chief of Sports Medicine at Penn Medicine, is currently underway. Using automated text messages to check in with patients directly, it examines how many opioid tablets patients actually take from their prescriptions after knee surgery. So far, they are observing that most patients take fewer than 10 tablets, which aligns with what other groups have found.

"These studies suggest that current prescribing patterns are still resulting in a significant number of opioid tablets in the community that could be misused and potentially diverted to others," Delgado said. "The data we've collected show that there's ample opportunity to reduce excessive prescribing for this common outpatient procedure."

Moving forward, both Ukert and Delgado feel there needs to be more definitive work done to nail down the right prescription size - one that prevents pain while also protecting against potential dependence and overdose.

"Given that most arthroscopies are not invasive, there seems to be room to reduce the prescribing rate and the strength of the prescription," Ukert said.

Credit: 
University of Pennsylvania School of Medicine