Tech

Researchers create roadmap for eliminating defects in health care value

CLEVELAND -- A new paper published in NEJM Catalyst states that the U.S. health care system spends in excess of $1.3 trillion annually on suboptimal behaviors and outlines a roadmap for reducing costs by eliminating defects in health care value. The paper, titled "Making a Dent in the Trillion-Dollar Problem: Toward Zero Defects," can be found here: https://catalyst.nejm.org/doi/full/10.1056/CAT.19.1064.

Lead author Peter Pronovost, MD, PhD, Chief Quality and Clinical Transformation Officer at University Hospitals and Clinical Professor of Anesthesiology and Critical Care Medicine at Case Western Reserve University in Cleveland, developed the concept of "defects in value." He and co-first author John W. Urwin, MD, of Beth Israel Deaconess Medical Center, Harvard Medical School, and the Department of Medical Ethics and Health Policy at the University of Pennsylvania, wrote that their model "offers a hopeful path forward for improving value in health care."

Using the success achieved at University Hospitals (UH) in Cleveland -- where annual costs per patient in the UH Accountable Care Organization (ACO) were reduced by 9 percent over the course of 12 months -- the researchers demonstrated that deploying a framework in which specific 'defects in value' are eliminated could not only save money but also improve the overall care value proposition. At UH, they applied this framework to the 37,000 members of its employee health plan as well as the 580,000 patients in its ACO, who collectively represent almost half of the 1.2 million individuals for whom UH provides health care.

"In general, health care harms too many patients because it costs too much and learns and improves too slowly," explained Dr. Pronovost. "We deliberately chose the word 'defects' in order to convey an intolerance for suboptimal care and decisions and to hopefully drive a cultural shift emphasizing that these 'defects' are unacceptable and preventable."

To assess the scale of the problem and the potential for improvement, the researchers examined how decisions and behaviors in health care lead to suboptimal care and suboptimal value across all stages of disease, from maintaining good health in the healthy patient, to managing chronic disease in the chronically ill, to treating acute illness both in and out of the hospital.

Through their research, they deduced that these suboptimal behaviors and a misalignment of incentives cost the U.S. health care system more than $1.3 trillion annually.

The authors classified patients into three categories: Staying Well, Getting Well and Getting Better. For individuals in the first group -- "Staying Well: Preventing Illness in the Healthy Patient" -- the researchers note that these people have the power to avoid unhealthy habits and adhere to recommended health maintenance guidelines designed to preserve health and prevent the onset of disease. For example, in the U.S. today, three-fourths of adults are overweight or obese, one-fifth smoke cigarettes, and one in 10 consumes more than 10 drinks of alcohol daily.

"Indeed, 22 percent of all health care spending is driven by a mere 10 modifiable risk factors. Those costs represented over $750 billion annually in the U.S.," said Dr. Pronovost.

The second category they called "Getting Well: Managing Chronically Ill Patients." In this group, the authors see underdiagnosis and misdiagnosis as enormous problems. For example, while 1 in 3 Americans is living with either diabetes or pre-diabetes, 36 percent of patients with diabetes and 88 percent of those with pre-diabetes remain undiagnosed. In addition, 1 in 3 Americans has hypertension, and more than a third of them are unaware that they have the condition and thus are not receiving treatment. Additionally, 1 in 5 adults experiences a mental illness each year, but only 43 percent of those patients receive care. Moreover, 35 percent of patients with serious mental illnesses, such as schizophrenia or bipolar disorder, have not received any mental health care in the past year. These defects cost the health care system billions of dollars.

The third group they defined was "Getting Better: Treating Acutely Ill Patients." These patients often present to inefficient points of care when they could be cared for in higher-value settings. Moreover, they often receive suboptimal medical care, resulting in costly complications. Instead of seeing a physician in the office (at a cost of around $150 per visit), many patients present to the emergency department (at a cost of around $1,500 per visit). Hospitalization costs even more, averaging $3,000 to $5,000 per day. The authors wrote, "While estimates of the frequency of improper emergency department use have varied widely...there is a general consensus that at least 20 percent of emergency department visits, and likely 40 percent, are preventable."

"We call for a paradigm shift with a focus on eliminating low-value behaviors and incentives--not just those that are most egregious or visible--in order to sustain a meaningful impact on bending the healthcare cost curve," said Eric Beck, DO, MPH, Chief Operating Officer at UH and a co-author of the paper.

Dr. Beck said health systems can achieve this shift in focus by embracing four fundamental priorities: alignment around a common purpose and definition of value; a common framework and analytical platform for measuring -- and making transparent -- defects in value, along with a disciplined management system to reduce them; incentives that move all stakeholders toward the common purpose; and an appropriate population of attributable persons for whom this system change and alignment is measurable, so that the organization accountable for a given population realizes the benefits of the investment.

Pronovost explained that UH "aligned around a purpose of improving value, defined as the quality and experience of care divided by the annual cost of care. This purpose was communicated across the organization as a new narrative--specifically, that success is defined as keeping people healthy at home rather than healing them in the hospital."
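The definition quoted above can be written schematically (our notation; the paper states the definition in words rather than as an equation):

\[ \text{Value} \;=\; \frac{\text{quality and experience of care}}{\text{annual cost of care}} \]

On this reading, value rises when the quality or experience of care improves, or when the annual cost of care falls.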

UH developed a set of key principles and a checklist for eliminating defects in value. To eliminate these defects, the system worked to align incentives among employees and clinicians and created a management system to guide its efforts.

Before this effort, the per-member-per-year cost of care for the employee plan had increased by 6 percent from 2017 to 2018; it then decreased by 1 percent from 2018 to 2019. As the framework continued to be implemented, these costs fell by a further 13 percent from Q1 2019 to Q1 2020.

Among its 57,000 Medicare Shared Savings Program beneficiaries, the per-member-per-year cost decreased by 9 percent, whereas the cost of care for the U.S. as a whole increased by approximately 6 percent. Contributing to these cost reductions were decreases in hospital discharges (by 15 percent), skilled nursing facility admissions (by 31 percent), length of stay (by 28 percent), post-acute spending (by 32 percent), and emergency department visits (by 13 percent). The health system also saw improvements in several other measures.

The study's limitations include its observational design, which precludes causal inferences about the relationship between the interventions and the improved outcomes.

"Although we are early in the journey and our framework is only 25 percent deployed, we believe that this model offers a hopeful path forward for improving value," concluded Dr. Pronovost.

Credit: 
University Hospitals Cleveland Medical Center

Antiepileptic drug reduces motor neuron excitability in ALS

BOSTON - The antiepileptic drug ezogabine reduced the pathologic excitability of cortical and spinal motor neurons, an early sign of clinical dysfunction in people with amyotrophic lateral sclerosis (ALS), according to a study conducted by the Neurological Clinical Research Institute of Massachusetts General Hospital (MGH). In addition to providing a clearer understanding of motor neuron excitability as an important disease pathway in ALS, the multi-site study, published in JAMA Neurology, represents the first clinical investigation of ALS (also known as Lou Gehrig's disease) using a drug identified through an induced pluripotent stem cell (iPSC) model.

"The stem cell approach allowed us to capture the hyperexcitability of motor neurons -- a prominent disease phenotype -- and to then show ezogabine was able to reduce it in people with ALS," says lead author Brian Wainger, MD, PhD>, of the Healey Center for ALS at MGH. "Our findings could have important implications for the field of ALS research both by demonstrating the effect of ezogabine on excitability in people with the disease and by showing that the metrics of cortical and spinal motor neuron excitability may be used as drug biomarkers in multi-site clinical trials."

ALS is a progressive neurodegenerative disorder that leads to the death of neurons in the brain and spinal cord that control speech, swallowing and limb movements. The disease is named after the famous baseball player Lou Gehrig, who was diagnosed with it in 1939. Around 20,000 people in the U.S. are living with ALS, and another 5,000 cases are newly diagnosed each year. Currently, there are three approved drugs in the U.S. for treating ALS, each with limited benefit, creating an urgent need for new therapies that could change the course of this fatal disease.

The MGH study of ezogabine was not designed to assess the long-term effects of the drug on the neurodegenerative disorder, but rather to unravel the biological processes that go awry and identify novel molecular targets for drug intervention. To that end, the ten-week, phase 2 study of 65 participants with ALS at 12 U.S. sites investigated the feasibility of using neuron excitability metrics as predictors of disease progression. "We demonstrated for the first time that these neurophysiological assays can be effectively deployed across multiple study sites, which is important in trials of diseases like ALS where investigators rely on many sites for recruitment," explains Wainger. "That finding could be useful in evaluating other drugs to treat ALS, or even for other diseases where motor neuron metrics could serve as key biomarkers."

Ezogabine (also known as retigabine) had previously been approved by the U.S. Food and Drug Administration (FDA) for treating epilepsy. It has a unique mechanism of action: it facilitates the opening of potassium channels in cell membranes, which play a central role in controlling neuronal excitability and are particularly important in the control of seizures. Researchers from MGH's Neurological Clinical Research Institute began evaluating the drug's potential in the context of ALS, using transcranial magnetic stimulation (TMS) and threshold tracking nerve conduction studies (TTNCS) to measure the effects of ezogabine on motor neuron excitability. They found that ezogabine did indeed calm the excitability of motor neurons.

"Further studies are needed to determine if longer treatment will sustain the effects of reduced excitability and, if so, whether that may slow disease progression," says Wainger. "Through our study we've hopefully established a new research paradigm for using iPSC-based in vitro models for identifying novel disease targets and compounds, and rapidly repurposing drugs for clinical trials."

Credit: 
Massachusetts General Hospital

New serological assay provides rapid, accurate testing for SARS-CoV-2 antibodies

image: Rebecca DuBois in her lab at UC Santa Cruz.

Image: 
Carolyn Lagattuta

Researchers at UC Santa Cruz have developed a novel serological assay for the detection of antibodies to SARS-CoV-2, the coronavirus that causes COVID-19.

Rebecca DuBois, associate professor of biomolecular engineering at UC Santa Cruz, said the new method her team developed is as accurate as the most reliable antibody tests currently available, but is less complex and can be performed much faster.

The gold standard for serological testing is a complex laboratory method called ELISA (enzyme-linked immunosorbent assay) that takes four to six hours to run and provides quantitative results indicating the strength of the immune response. Simpler assays using test strips provide rapid results, but are less reliable and do not quantify antibody levels.

The new method, called biolayer interferometry immunosorbent assay (BLI-ISA), provides complete quantitative results in less than 20 minutes. DuBois and her colleagues described the assay in a paper published December 10 in Scientific Reports.

"Our assay is as sensitive if not better than other assays in detecting low levels of antibodies, and the specificity [false-positive rate] is as good as the best antibody tests out there," DuBois said. "It combines the advantages of the test strips that take 20 minutes with the quantitative results and higher performance of ELISA."

A positive antibody test indicates prior infection with the virus. These tests are not used to diagnose active infections, however; that requires a different test, one that detects the virus's genetic material or antigens. Antibodies are proteins made by the immune system that recognize and bind to viral antigens.

Serological testing is important for understanding the spread of the coronavirus by determining how many people in a population have been infected. The tests are also used to evaluate the responses to experimental vaccines in both people and laboratory animals. Quantitative information about antibody levels could be especially important in the future if scientists are able to determine that a certain level of particular antibodies is needed to provide protection against infection with the coronavirus.

"That is still to be determined, but we do know that people who have been infected with SARS-CoV-2 have very diverse levels of antibodies, and it would not be surprising to find that below some baseline level they might not be protective," DuBois said. "So it's really useful to have that quantitative ability to know what someone's antibody status is, whether it's from a past infection or a vaccination."

The new BLI-ISA method uses biolayer interferometry, an optical technique for measuring the interactions between molecules by detecting the binding of molecules to the tip of a fiber-optic biosensor. The instruments required to perform biolayer interferometry are increasingly common in research laboratories. The BLI instrument at UCSC is used by several research groups, and DuBois has been using it in her research to study antibody-antigen interactions.

The new serological assay involves several steps performed by the instrument in an automated "dip-and-read" format. In the first step, the biosensor tip is dipped into a solution containing the antigen (a viral protein) that is recognized by the antibody to be tested for. As the antigen binds to the biosensor tip, it generates a signal that can be used for quality control to ensure consistency in the antigen loading step. Next, after dipping into a wash solution, the biosensor is dipped into the blood plasma sample, generating a signal as antibodies bind to the antigen.

The immune system makes different types of antibodies, called isotypes, including IgM, which is produced early in the infection and declines later, and IgG, which is produced later and persists longer. In the antibody binding step, the BLI instrument detects any type of antibody that binds to the antigen.

"Depending on when the blood sample was taken after infection, there might be different types of antibodies, and this step gives us the total antibody level," DuBois said.

The next step detects and quantifies IgG antibodies specifically by measuring the binding of anti-IgG antibodies. The assay thereby provides quantitative measurements of both total antibodies and IgG antibodies, and it can be designed to measure different isotypes as well.
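As a toy illustration of the dip-and-read sequence described above, the following Python sketch walks through the steps in order; the step names follow the article, but the signal values (in nanometres of spectral shift) are invented for illustration and are not instrument output:

    # Toy model of the BLI-ISA "dip-and-read" sequence.
    # All signal values below are invented for illustration.
    sequence = [
        ("load antigen onto biosensor tip", 1.20),  # quality-control signal
        ("wash", 0.00),
        ("dip into plasma sample", 0.85),           # total antibody binding
        ("dip into anti-IgG reagent", 0.40),        # IgG-specific binding
    ]

    antigen_qc = sequence[0][1]
    total_antibody = sequence[2][1]
    igg_specific = sequence[3][1]

    # The antigen-loading signal serves as a quality-control check
    # before the sample steps are interpreted.
    assert antigen_qc > 1.0, "antigen loading below QC threshold"
    print(f"total antibody: {total_antibody} nm; IgG-specific: {igg_specific} nm")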

In response to infection with the coronavirus, the immune system makes antibodies against a range of different coronavirus antigens. Most serological testing methods for SARS-CoV-2 are designed to detect antibodies to structural proteins of the virus such as the spike protein, which studs the surface of the virus and enables it to bind to and enter human cells. Antibodies to one part of the spike protein--the receptor binding domain, or RBD--are of particular interest because they are effective at neutralizing the virus in laboratory assays.

DuBois's team developed and tested the new assay using RBD antigens, but she noted that it can be used to detect a wide range of antigens. The BLI instrument allows easy "multiplexing," meaning that multiple tests can be run in parallel on the same blood sample to detect antibodies to different viral antigens as well as different antibody isotypes.

Because it requires laboratory equipment, BLI-ISA could not be used as a point-of-care test at doctor's offices or pharmacies. But it allows high-throughput processing of samples and is faster and less labor intensive than other quantitative laboratory tests such as ELISA, immunofluorescent assays, and chemiluminescent immunoassays. The amount of blood needed for the test can be obtained with a finger prick.

Another advantage of BLI-ISA over other methods is the ease of standardizing the assay so that a sample gives the same quantitative results in repeated tests at different times or in different laboratories. ELISA requires an enzyme-based signal amplification step, which can vary depending on temperature and other factors, resulting in much more test-to-test variability in the quantitative measurements.

First author John Dzimianski, a postdoctoral researcher in DuBois's lab, said the new assay could be especially useful for researchers in vaccine development. "This method provides a standardized way to quantify antibody levels, which could be used to compare antibody responses to different vaccine candidates. In addition, running the assay itself is straightforward, requiring little more than the push of a button," he said. "It's a simple but powerful tool."

DuBois's team is also interested in using the assay to conduct a surveillance study to assess the prevalence of coronavirus infections in certain groups in the Santa Cruz community. The team is exploring funding opportunities for this research project.

Credit: 
University of California - Santa Cruz

Fans may relieve breathlessness associated with advanced cancers

Blowing air from a fan into the face of patients with advanced cancer who are experiencing breathlessness, along with other nonpharmacologic interventions, may offer symptom relief, according to new research directed by Johns Hopkins Kimmel Cancer Center investigators. By contrast, the investigators found that medications, such as opioids, had limited impact on improving breathlessness.

In a systematic review of 29 randomized clinical trials of breathlessness interventions involving 2,423 adults with advanced cancer, researchers found that several nonpharmacological interventions were associated with improved breathlessness, including fan therapy and bilevel ventilation (air pressure delivered through a face mask covering the mouth and nose). These results were published in the Nov. 19 issue of the journal JAMA Oncology.

A longer companion report, including results from both nonpharmacological and pharmacological interventions for breathlessness, was published the same day by the Agency for Healthcare Research and Quality (AHRQ), which sponsored this study along with the American Society of Clinical Oncology (ASCO) and the Patient-Centered Outcomes Research Institute. These findings are informing an upcoming ASCO Clinical Practice Guideline on the management of breathlessness in patients with advanced cancer.

"Breathlessness, or dyspnea, is a common and distressing symptom in patients with advanced cancer," says lead author of the JAMA Oncology article, Arjun Gupta, M.D., chief medical oncology fellow at the Johns Hopkins Kimmel Cancer Center. "Breathlessness can be associated with and made worse by accompanying anxiety, and can severely impact quality of life and exercise capacity. In patients with advanced cancer, treating the underlying cause of breathlessness (such as the cancer itself) may provide incomplete symptom relief or may not be feasible. In these scenarios, treating the symptom of breathlessness may be indicated. However, patients in this situation are often vulnerable with limited time to recover, and a question that comes up for clinicians is, are the potential benefits of this intervention likely to outweigh the harms?"

"Traditionally, in the ward, medications such as opioids and benzodiazepines are often used to treat breathlessness. However, we did not know how well they really worked in patients with advanced cancer. Some of the data was extrapolated from patients with other conditions such as lung and heart disease. Medications can also cause side effects such as drowsiness and constipation. Therefore, we performed a comprehensive review of interventions (both nonpharmacological and pharmacological) to improve breathlessness."

Fan therapy and bilevel ventilation brought relief lasting for a few minutes to a few hours for patients admitted to the hospital, the review found. On the outpatient side, acupressure and reflexology, as well as multicomponent interventions (combining activity and rehabilitation with behavioral, psychosocial and integrative medicine), brought relief lasting for a few weeks to months. Harms associated with nonpharmacological interventions were minimal.

In the accompanying AHRQ report of 50 studies, the authors found that opioids were not more effective than placebo or anti-anxiety drugs for improving breathlessness. The opioid studies reviewed showed no differences in effectiveness between different doses or routes of administration. In addition, anti-anxiety drugs were not more effective than placebo for improving breathlessness.

"We believe these data should catalyze a shift in how we approach and treat breathlessness, away from a medicalized approach using drugs, to a more comprehensive assessment and attempting nonpharmacologic interventions, such as fan therapy, first," Gupta says. "Clinical guidelines and practice should evolve to represent these novel findings."

Credit: 
Johns Hopkins Medicine

Making cheaper, biocompatible E-skin electrodes

image: Professor Sungwon Lee (right) led a team of scientists, including Gihyeok Gwon (left) and Wooseong Jeong (center) of DGIST, to improve the conductivity of PEDOT:PSS electrodes.

Image: 
DGIST

Scientists around the world are working to develop electronic skins that attach to the body and monitor vital signs. These E-skins need to be comfortable, breathable, and flexible for everyday use. Gold is typically used to fabricate the electrodes that conduct electric signals in these applications. But gold is expensive, involves a complicated manufacturing process, and must be sterilized for use on the human body.

Among promising alternative materials for electrodes is the polymer PEDOT:PSS. It is biocompatible with human skin, flexible, relatively cheap, and can be easily manufactured and made into an electrode. Unfortunately, it doesn't conduct electricity as well as gold. Scientists have found ways to improve its conductivity, but these methods involve toxic products, like acids, which can leave residues and are therefore not ideal for E-skin applications.

Daegu Gyeongbuk Institute of Science & Technology (DGIST) researchers have found a non-toxic method that dramatically improves the polymer's conductivity. "We developed a hydrothermal treatment, involving humidity and heat, that enhanced the conductivity of PEDOT:PSS films by a factor of 250," says DGIST materials scientist Sungwon Lee, who led the study.

Specifically, the researchers found that applying 80% humidity and more than 60°C heat to a PEDOT:PSS thin film led to structural changes within the material that enhanced its ability to conduct electricity.

PEDOT:PSS is made up of water-insoluble, conductive PEDOT molecules and water-soluble, insulating PSS molecules. Adding humidity to a thin film of PEDOT:PSS separated the two types of molecules with a screen of water, while adding heat expanded the PEDOT chains, increasing the material's overall crystallinity. These structural changes improved the material's conductivity from 0.495 to 125.367 siemens per centimetre (S/cm).
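As a quick arithmetic check, those endpoint values are consistent with the factor-of-250 improvement quoted above:

\[ \frac{125.367\ \text{S/cm}}{0.495\ \text{S/cm}} \approx 253 \]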

The scientists then made electrodes from the enhanced PEDOT:PSS material and found that they conducted electricity stably when exposed to air, heat, bending, and stretching. They also found that the electrodes worked well when sprayed onto E-skin devices used for monitoring joint movements, skin temperature, and the heart's electrical activity.

Further improvements are still needed, as treating PEDOT:PSS with acids can improve its electrical conductivity all the way up to 2,244 S/cm. "Our results are, nonetheless, noteworthy," says Lee, "with our novel hydrothermal treatment showing significant potential for use in biomedical applications."

Credit: 
DGIST (Daegu Gyeongbuk Institute of Science and Technology)

Kids gain weight when new convenience stores open nearby

video: An innovative new study in the Journal of the Academy of Nutrition and Dietetics provides data on how changes in the food environment around low-income and high-ethnic/racial minority populations impact childhood obesity over time. It shows that increased availability of small grocery stores selling a selection of healthy items in close proximity to children's homes improves their weight status, whereas increased availability of convenience stores selling predominantly unhealthy foods is likely to be detrimental.

Image: 
Journal of the Academy of Nutrition and Dietetics

Philadelphia, December 10, 2020 - A new study in the Journal of the Academy of Nutrition and Dietetics, published by Elsevier, found that changes in the food environment around low-income and high-ethnic/racial minority populations over time impact childhood obesity. Increased availability of small grocery stores selling a selection of healthy items in close proximity to children's homes improves their weight status over time, whereas increased availability of convenience stores selling predominantly unhealthy foods is likely to be detrimental.

"Childhood obesity has a complex multifaceted etiology. In this study we found that community food environment, particularly small neighborhood stores, can significantly influence children's weight status. Our findings are useful for designing future interventions and public policies," explained lead author and co-director of the research, Punam Ohri-Vachaspati, PhD, RD, Professor, College of Health Solutions, Arizona State University, Phoenix, AZ, USA.

One of the few prospective longitudinal studies to examine the influence of a comprehensive set of food outlets, both large and small, the study followed two groups of 3- to 15-year-old children in four New Jersey cities -- Camden, New Brunswick, Newark, and Trenton. These cities were known to be initiating policy and environmental changes aimed at childhood obesity prevention. The first group was studied from 2009-10 to 2014-15, the second from 2014 to 2016-17.

"Our research design allowed us to examine the patterns of relationship between changes in children's weight status and changes in the food environment over several meaningful distances and lengths of exposure. We found that community food environment in urban neighborhoods matters for children's weight outcomes, especially as it relates to small stores located near children's homes," commented Michael Yedidia, senior author and co-director of the study, Professor, Center for State Health Policy, Institute for Health, Health Care Policy and Aging Research, Rutgers University, New Brunswick, NJ, USA.

Children's exposure to changes in the food environment in each city was calculated for every month during the study. The researchers looked at changes in the number of food outlets across various proximities (quarter mile, half mile, and one mile around a child's home) over different lengths of time (12 months, 18 months, and 24 months before the final interview). Changes included store openings and closings, family moves from one neighborhood to another, and upgrades to existing food stores, fostered by community initiatives to improve offerings at convenience stores.
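As a rough sketch of how such an exposure measure can be computed, the following Python fragment counts the outlets of one type within a given radius of a child's home at two points in time; the coordinates and helper names are hypothetical, not taken from the study:

    import math

    def distance_miles(a, b):
        # Great-circle (haversine) distance between two (lat, lon) points, in miles
        r = 3958.8  # Earth's radius in miles
        lat1, lon1 = map(math.radians, a)
        lat2, lon2 = map(math.radians, b)
        h = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(h))

    def exposure(home, outlets, radius_miles):
        # Number of outlets of one type open within the radius in a given month
        return sum(1 for o in outlets if distance_miles(home, o) <= radius_miles)

    home = (40.725, -74.170)                             # made-up home location
    stores_24mo_ago = [(40.726, -74.168)]                # outlets open 24 months ago
    stores_now = [(40.726, -74.168), (40.722, -74.173)]  # outlets open at follow-up

    # Change in exposure within half a mile over 24 months: prints 1
    print(exposure(home, stores_now, 0.5) - exposure(home, stores_24mo_ago, 0.5))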

Food outlets were categorized as supermarkets, small grocery stores, convenience stores, pharmacies, full-service restaurants, or limited service restaurants. Stores were classified as small grocery stores if they sold a specific selection of healthy items such as five different types of fruits, five different types of vegetables, lower fat milk, and fresh or frozen meat. Convenience stores participating in "healthy corner store" initiatives were classified as upgraded convenience stores.

Children's weight outcomes worsened when their exposure to convenience stores increased over time. For example, exposure to an additional convenience store within a mile of a child's home over 24 months was associated with an 11.7 percent greater likelihood of the child being in a higher body mass index range compared to other children of the same sex and age at the end of the study. In contrast, exposure to an additional small grocery store within a mile over 24 months was associated with 37.3 percent lower odds of being in a higher body mass index category. No consistent patterns were found for changes in exposure to supermarkets, restaurants, or pharmacies.

"While we found no consistent results for supermarkets, our findings do not suggest that large stores are not an important feature of the food environment. Rather, most of our sample living in dense urban cities had access to supermarkets at baseline and did not experience significant change in access to supermarkets over time," observed Dr. Ohri-Vachaspati.

Investigators suggest that elevating the healthfulness of the food offered at upgraded convenience stores to levels similar to those of small grocery stores, through community initiatives, has the potential to improve children's weight status.

The researchers noted that the study design allowed for consideration of a child's experience growing up in a community where the food environment is dynamic and multiple changes occur simultaneously. Dr. Yedidia observed, "The need for a more refined understanding of the impact of local food environments on children's weight status and health has become more evident during the COVID-19 pandemic, which has been accompanied by increased food insecurity among low-income populations."

Credit: 
Elsevier

How does eye position affect 'cocktail party' listening?

image: Several acoustic studies have shown that the position of your eyes determines where your visual spatial attention is directed, which automatically influences your auditory spatial attention. Researchers are currently exploring its impact on speech intelligibility.

Image: 
Virginia Best

MELVILLE, N.Y., December 9, 2020 -- Several acoustic studies have shown that the position of your eyes determines where your visual spatial attention is directed, which automatically influences your auditory spatial attention. Researchers are currently exploring its impact on speech intelligibility.

During the 179th Meeting of the Acoustical Society of America, which will be held virtually Dec. 7-10, Virginia Best, of Boston University, will describe her work to determine whether there is a measurable effect of eye position within cocktail party listening situations. Her poster session, "An effect of eye position in cocktail party listening," will start at 9:30 a.m. on Wednesday, Dec. 9.

A "cocktail party" in their work refers to four competing talkers in addition to the talker for whom you are trying to pay attention.

"Our primary motivation was an intuition that eye position may be especially critical within these situations, where there is substantial energetic and informational masking," said Best. "A secondary motivation was our interest in visually guided beamforming, where the eyes are used to steer a highly directional hearing aid."

Best and colleagues presented participants with sequences of digits from five loudspeakers positioned in front of the listener with a spacing of 15 degrees and asked them to repeat back the digits presented from one target loudspeaker.

In some cases, participants were asked to visually fixate on the target loudspeaker. In others, participants were asked to visually fixate on a nontarget loudspeaker.

During these tasks, participants' head position was stabilized on a neck rest, and their eye position was monitored with an eye tracker. Performance was best when eye fixation was on target, and it suffered when eye fixation was off target.

This shows an influence of eye position within multi-talker situations, even without visual information such as lip reading, according to Best. It suggests optimal performance depends on the spatial alignment of auditory and visual attention.

"Our task is theoretically applicable to any situation in which there are competing voices, including parties, restaurants, and meeting rooms," Best said. "The reason we spend a lot of time studying these situations is because they are extremely difficult for people with hearing impairment and hearing aids."

Credit: 
Acoustical Society of America

Charles Darwin was right about why insects are losing the ability to fly

Most insects can fly.

Yet scores of species have lost that extraordinary ability, particularly on islands.

On the small islands that lie halfway between Antarctica and continents like Australia, almost all the insects have done so.

Flies walk, moths crawl.

"Of course, Charles Darwin knew about this wing loss habit of island insects," says PhD candidate Rachel Leihy, from the Monash University School of Biological Sciences.

"He and the famous botanist Joseph Hooker had a substantial argument about why this happens. Darwin's position was deceptively simple. If you fly, you get blown out to sea. Those left on land to produce the next generation are those most reluctant to fly, and eventually evolution does the rest. Voilà."

But since Hooker expressed his doubt, many other scientists have too.

In short, they have simply said Darwin got it wrong.

Yet almost all of these discussions have ignored the place that is the epitome of flight loss - those 'sub-Antarctic' islands. Lying in the 'roaring forties' and 'furious fifties', they're some of the windiest places on Earth.

"If Darwin really got it wrong, then wind would not in any way explain why so many insects have lost their ability to fly on these islands," said Rachel.

Using a large, new dataset on insects from sub-Antarctic and Arctic islands, Monash University researchers examined every idea proposed to account for flight loss in insects, including Darwin's wind idea.

Reporting today in Proceedings of the Royal Society B, they show that Darwin was right for this 'most windy of places'. None of the usual ideas (such as those proposed by Hooker) explain the extent of flight loss in sub-Antarctic insects, but Darwin's idea does, albeit in a slightly varied form that is in keeping with modern ideas on how flight loss actually evolves.

Windy conditions make insect flight more difficult and energetically costly. Thus, insects stop investing in flight and its expensive underlying machinery (wings, wing muscles) and redirect the resources to reproduction.

"It's remarkable that after 160 years, Darwin's ideas continue to bring insight to ecology," said Rachel, the lead author of the paper.

Professor Steven Chown, also from the School of Biological Sciences, added that the Antarctic region is an extraordinary laboratory in which to resolve some of the world's most enduring mysteries and test some of its most important ideas.

Credit: 
Monash University

Hidden symmetry could be key to more robust quantum systems, researchers find

Researchers have found a way to protect highly fragile quantum systems from noise, which could aid in the design and development of new quantum devices, such as ultra-powerful quantum computers.

The researchers, from the University of Cambridge, have shown that microscopic particles can remain intrinsically linked, or entangled, over long distances even if there are random disruptions between them. Using the mathematics of quantum theory, they discovered a simple setup where entangled particles can be prepared and stabilised even in the presence of noise by taking advantage of a previously unknown symmetry in quantum systems.

Their results, reported in the journal Physical Review Letters, open a new window into the mysterious quantum world that could revolutionise future technology by preserving quantum effects in noisy environments; the loss of these effects to noise is the single biggest hurdle to developing such technology. Harnessing this capability will be at the heart of ultrafast quantum computers.

Quantum systems are built on the peculiar behaviour of particles at the atomic level and could revolutionise the way that complex calculations are performed. While a normal computer bit is an electrical switch that can be set to either one or zero, a quantum bit, or qubit, can be set to one, zero, or both at the same time. Furthermore, when two qubits are entangled, an operation on one immediately affects the other, no matter how far apart they are. This dual state is what gives a quantum computer its power. A computer built with entangled qubits instead of normal bits could perform calculations well beyond the capacities of even the most powerful supercomputers.
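As a minimal numerical sketch of those two ideas, superposition and entanglement, here is a plain NumPy illustration (no quantum-computing library is assumed):

    import numpy as np

    zero = np.array([1.0, 0.0])  # qubit state |0>
    one = np.array([0.0, 1.0])   # qubit state |1>

    # Superposition: a qubit set to "one, zero, or both at the same time"
    plus = (zero + one) / np.sqrt(2)
    print(plus ** 2)             # measurement probabilities: [0.5, 0.5]

    # Entangled Bell state of two qubits: (|00> + |11>) / sqrt(2)
    bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

    # Probabilities of the joint outcomes 00, 01, 10, 11: the two qubits
    # always agree when measured, no matter how far apart they are.
    print(bell ** 2)             # [0.5, 0.0, 0.0, 0.5]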

"However, qubits are extremely finicky things, and the tiniest bit of noise in their environment can cause their entanglement to break," said Dr Shovan Dutta from Cambridge's Cavendish Laboratory, the paper's first author. "Until we can find a way to make quantum systems more robust, their real-world applications will be limited."

Several companies - most notably, IBM and Google - have developed working quantum computers, although so far these have been limited to fewer than 100 qubits. They require near-total isolation from noise, and even then, their qubits survive for only a few microseconds. Both companies have plans to develop 1,000-qubit quantum computers within the next few years, although unless the stability issues are overcome, quantum computers will not reach practical use.

Now, Dutta and his co-author Professor Nigel Cooper have discovered a robust quantum system where multiple pairs of qubits remain entangled even with a lot of noise.

They modelled an atomic system in a lattice formation, where atoms strongly interact with each other, hopping from one site of the lattice to another. The authors found that if noise was added in the middle of the lattice, it did not affect entangled pairs spanning the left and right sides. This surprising feature results from a special type of symmetry that conserves the number of such entangled pairs.

"We weren't expecting this stabilised type of entanglement at all," said Dutta. "We stumbled upon this hidden symmetry, which is very rare in these noisy systems."

They showed that this hidden symmetry protects the entangled pairs and allows their number to be controlled from zero to a large maximum value. Similar conclusions apply to a broad class of physical systems and can be realised with ingredients that already exist in experimental platforms, paving the way to controllable entanglement in a noisy environment.

"Uncontrolled environmental disturbances are bad for survival of quantum effects like entanglement, but one can learn a lot by deliberately engineering specific types of disturbances and seeing how the particles respond," said Dutta. "We've shown that a simple form of disturbance can actually produce - and preserve - many entangled pairs, which is a great incentive for experimental developments in this field."

The researchers are hoping to confirm their theoretical findings with experiments within the next year.

Credit: 
University of Cambridge

Life expectancy and healthcare costs for patients with rheumatoid arthritis

A new study published in Arthritis & Rheumatology suggests that recent advances in the treatment of rheumatoid arthritis have prolonged patients' lives but also increased healthcare costs.

For the study, investigators examined medical claims data from Taiwan's National Health Insurance, identifying 29,352 new cases of rheumatoid arthritis from 2003 to 2016.

After disease-modifying antirheumatic drugs called biologics became available in 2003, life expectancy after rheumatoid arthritis diagnosis was 26.3 years, and the lifetime healthcare cost was estimated at US $72,953.

"After the availability of biologics, rheumatoid arthritis patients appeared to live longer with higher lifetime expenditures that should be monitored and evaluated for cost-effectiveness," the authors wrote.

Credit: 
Wiley

The use of wild mammals in traditional medicine

image: In an analysis of published research, investigators identified 565 mammalian species that have been used to source products used in traditional medicine around the world, especially in Asia, Africa, and Latin America.

Image: 
Itamar Barbosa

In an analysis of published research, investigators identified 565 mammalian species that have been used to source products used in traditional medicine around the world, especially in Asia, Africa, and Latin America. The analysis, which is published in Mammal Review, also found that 155 of these mammalian species are considered threatened (vulnerable, endangered, or critically endangered), and a further 46 are near threatened.

The findings suggest that overexploitation for medicinal use could be an overlooked source of threat for mammalian species.

"Our study revealed that an impressive mammalian species richness--9% of the 6,399 known species--is used in traditional medical systems worldwide. We also highlight that closely related species are used to treat similar diseases," said lead author Rômulo Romeu Nóbrega Alves, PhD, of the Universidade Estadual da Paraíba, in Brazil. "The widespread utilization of mammals in traditional medicine (including threatened species) is evidence of the importance of understanding such uses in the context of mammal conservation. Sanitary aspects of the use of wild mammals by humans, and their implications for public health, are also key aspects to consider."

Credit: 
Wiley

Evolution may be to blame for high risk of advanced cancers in humans

Compared to chimpanzees, our closest evolutionary cousins, humans are particularly prone to developing advanced carcinomas -- the type of tumors that include prostate, breast, lung and colorectal cancers -- even in the absence of known risk factors, such as genetic predisposition or tobacco use.

A recent study led by researchers at University of California San Diego School of Medicine and Moores Cancer Center helps explain why. The study, published December 9, 2020 in FASEB BioAdvances, suggests that an evolutionary genetic mutation unique to humans may be at least partly to blame.

"At some point during human evolution, the SIGLEC12 gene -- and more specifically, the Siglec-12 protein it produces as part of the immune system -- suffered a mutation that eliminated its ability to distinguish between 'self' and invading microbes, so the body needed to get rid of it," said senior author Ajit Varki, MD, Distinguished Professor at UC San Diego School of Medicine and Moores Cancer Center. "But it's not completely gone from the population -- it appears that this dysfunctional form of the Siglec-12 protein went rogue and has now become a liability for the minority of people who still produce it."

Ajit Varki, who is also co-director of both the Glycobiology Research and Training Center and Center for Academic Research and Training in Anthropogeny, led the study with Nissi Varki, MD, professor of pathology at UC San Diego School of Medicine.

In a study of normal and cancerous tissue samples, the researchers discovered that the approximately 30 percent of people who still produce Siglec-12 proteins are at more than twice the risk of developing an advanced cancer during their lifetimes, compared to people who cannot produce Siglec-12.

Normally, genes that encode such dysfunctional proteins are eliminated by the body over time, and approximately two-thirds of the global human population has stopped producing the Siglec-12 protein. Where the gene still hangs around in humans, it was long thought to be of no functional relevance, and there have been very few follow-up studies over the two decades since it was discovered. Meanwhile, chimpanzees still produce functioning Siglec-12.

When Nissi Varki's team set out to detect the Siglec-12 protein in non-cancerous tissue samples using an antibody against the protein, approximately 30 percent of the samples were positive, as expected from the genetic information. In contrast, the majority of advanced cancer samples from the same populations were positive for the Siglec-12 protein.

Looking at a different population of patients with advanced stage colorectal cancer, the researchers found that more than 80 percent had the functional form of the SIGLEC12 gene, and those patients had a worse outcome than the minority of patients without it.

"These results suggest that the minority of individuals who can still make the protein are at much greater risk of having an advanced cancer," Nissi Varki said.

The researchers also validated their findings in mice by introducing tumor cells engineered to produce Siglec-12. Compared with control tumor cells lacking functional Siglec-12, the resulting cancers grew much faster and turned on many biological pathways known to be involved in advanced cancers.

According to Ajit Varki, this information is important because it could be leveraged for future diagnostics and treatments. The team got a jump start by developing a simple urine test that could be used to detect the presence of the dysfunctional protein, and "we might also be able to use antibodies against Siglec-12 to selectively deliver chemotherapies to tumor cells that carry the dysfunctional protein, without harming non-cancerous cells," he said.

Credit: 
University of California - San Diego

Microbes and plants: A dynamic duo

image: Mature sorghum plants before harvest. Sorghum can be used for animal feed, cereal grain, and syrup.

Image: 
Devin Coleman-Derr

Drought stress has been a major roadblock to crop success, and this obstacle will not disappear anytime soon. Luckily, a dynamic duo, like Batman and Robin, is here to help: certain root-associated microbes and the plants they inhabit.

Plants and animals have a close connection to the microbes, such as bacteria, that live on and in them. The microbes, the creatures they inhabit, and the environment they create together all play a critical role for life on Earth.

"We know that microbiomes, which are the communities of microorganisms in a given environment, are very important for the health of plants," said Devin Coleman-Derr.

Coleman-Derr, a scientist at the University of California, Berkeley, studies how drought impacts the microbiome of sorghum. He recently presented his research at the virtual 2020 ASA-CSSA-SSSA Annual Meeting.

Findings show that certain bacteria living in the roots of sorghum, a crop commonly grown for animal feed, work together with the plant to reduce drought stress. This unique pairing leads to overall plant success.

"Plants have hormones, which help plants decide how to spend their energy," says Coleman-Derr. "Microbes can manipulate the system and cause the decision-making process of plants to be altered."

Some bacteria and fungi are destined to inhabit certain plants. And bacteria want the roots they inhabit to be their dream homes: if a bacterium partners with a plant to help it grow during dry weather, it is essentially building a better home for itself.

Virtually all aspects of the plant's life are connected to the microbes present. When a plant gets thirsty, it can send the entire microbiome into action.

Drought causes dramatic changes in how bacteria and plant partners interact. Additional bacteria may be recruited to help the plant survive the dry weather. These microbes can influence the plant's hormones to encourage more root growth, which will help the plant reach more water.

"We want to know if we can control this," said Coleman-Derr. "Is there the possibility to manipulate the microbiome present to help sorghum cope with drought stress?"

The resiliency of crops to environmental stress is of growing concern to both researchers and farmers, especially as the global climate changes. New research findings are important for developing crops that can maintain productivity, even in harsher conditions.

"We recognize that the microbiome is dynamic and changes over time," said Coleman-Derr. "While the jury is still out on if we can control sorghum microbiomes, several labs have shown that some bacteria present during drought stress lead to positive outcomes for plants."

Understanding plant microbiomes is a large part of understanding the factors that determine crop productivity. Fortunately, plants are excellent models for studying microbiomes.

The next step in this quest is to determine if microbiomes can be manipulated and used as a solution for drought in crop production systems.

"By determining if we can alter the microbiome, we can work towards achieving our goal of creating better producing crops with less inputs," said Coleman-Derr.

Credit: 
American Society of Agronomy

Can sting rays and electric rays help us map the ocean floor?

image: Researchers fixed acoustic transmitters to electric rays and sting rays and were able to map their movements as they swam close to the ocean floor.

Image: 
RIKEN

Researchers at the RIKEN Center for Biosystems Dynamics Research (BDR) in Japan have completed a feasibility study indicating that electric rays and sting rays equipped with pingers will be able to map the seabed through natural exploration.

The ocean is a big place full of natural resources including fossil fuels, minerals, and of course, fish. The problem is that many of these resources are on the ocean floor in places we have yet to find. Ocean exploration is therefore necessary, and currently automated vehicles, sonar, and satellites are all used with varying advantages and disadvantages. At RIKEN BDR, scientists led by Yo Tanaka are developing a completely different system that relies on the natural swimming behavior of electric rays and sting rays.

"Electric rays and sting rays are benthic animals, meaning that they spend most of their time swimming around the ocean floor in deep places," explains Tanaka. "By combining simple pinger technology and digital cameras with this natural behavior, we think we can use rays to map the ocean floor, and at the same time collect meaningful data about ocean wildlife, biota, and resources." Additionally, this method could be much more cost effective as Tanaka and his team have already shown that electric rays can use their own electricity to power the small pingers.

A pinger is a device that emits an ultrasonic sound. When a pinger's sound is picked up by several receivers, the positions of the receivers and the times when the sound is detected can be used to calculate the position of the pinger. By placing cameras on rays and linking the timing of the recorded video to the timings and locations determined by the pingers, the researchers believe they can create accurate maps of the ocean floor. In their proof-of-concept study, the team conducted two experiments showing that their idea to use rays is feasible.
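A simplified sketch of that position calculation is below: a least-squares solve for the pinger's position and (unknown) emission time given four receivers. All numbers are invented, and this is our illustration of the general technique rather than the team's actual algorithm:

    import numpy as np
    from scipy.optimize import least_squares

    c = 1500.0  # approximate speed of sound in seawater, m/s

    # Hypothetical receiver positions (x, y, z) in metres
    receivers = np.array([[0, 0, 0], [40, 0, 0], [0, 40, 0], [40, 40, -5]], float)

    # Simulate arrival times for a "true" pinger position and emission time
    true_pos, true_t0 = np.array([12.0, 25.0, -20.0]), 0.3
    arrivals = true_t0 + np.linalg.norm(receivers - true_pos, axis=1) / c

    def residuals(params):
        # params = (x, y, z, t0): predicted minus measured arrival times
        pos, t0 = params[:3], params[3]
        return t0 + np.linalg.norm(receivers - pos, axis=1) / c - arrivals

    fit = least_squares(residuals, x0=[20.0, 20.0, -10.0, 0.0])
    print(np.round(fit.x, 2))  # recovers roughly [12.0, 25.0, -20.0, 0.3]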

The first experiment took place in a large water tank. A setup with cameras in three planes -- front, side, and top -- verified that both types of ray swam near the bottom of the tank. The images taken by the cameras allowed 3-D reconstruction of the rays' movements over time. The researchers also verified that a camera could be attached to the rays to record video of their exploration. With these positive results, the team was ready to test the system out in the real world -- an area off the coast of Okinawa in Japan. As this was a proof-of-concept experiment, they chose an area with a relatively flat seabed.

They attached pingers to both sting rays and electric rays and lowered them into the ocean from a large boat along with four ultrasound receivers. The depth of the ocean was about 20 m (60 ft), and the rays were allowed to swim about 40 m (120 ft) out from the boat. The researchers recorded the pinger-derived positions as the rays swam near the boat for about two hours. Afterward, they compared the data with an existing seabed map of the area and confirmed that the rays' positions were within about 10 cm of those in the public map. Similar results from both types of ray were important because rays are seasonal animals, meaning that different species may be available at different times of the year.

"In our ocean experiment, in addition to the pinger positioning, we were able to confirm that electric rays actually move around the seabed," says Tanaka. "In the near future we will test the system for long-term monitoring." Long-term monitoring will require pingers that the electric rays can self-charge as well as wearable battery packs for the sting rays. The next test will also monitor an area with a more varied seabed with complex geometry.

Credit: 
RIKEN

Green pandemic recovery essential to close climate action gap - UN report

Nairobi, 9 December 2020 - A green pandemic recovery could cut up to 25 per cent off predicted 2030 greenhouse gas emissions and bring the world closer to meeting the 2°C goal of the Paris Agreement on Climate Change, a new UN Environment Programme (UNEP) report finds.

UNEP's annual Emissions Gap Report 2020 finds that, despite a dip in 2020 carbon dioxide emissions caused by the COVID-19 pandemic, the world is still heading for a temperature rise in excess of 3°C this century.

However, if governments invest in climate action as part of pandemic recovery and solidify emerging net-zero commitments with strengthened pledges at the next climate meeting - taking place in Glasgow in November 2021 - they can bring emissions to levels broadly consistent with the 2°C goal.

By combining a green pandemic recovery with swift moves to include new net-zero commitments in updated Nationally Determined Contributions (NDCs) under the Paris Agreement, and following up with rapid, stronger action, governments could still attain the more-ambitious 1.5°C goal.

"The year 2020 is on course to be one of the warmest on record, while wildfires, storms and droughts continue to wreak havoc," said Inger Andersen, UNEP's Executive Director. "However, UNEP's Emissions Gap report shows that a green pandemic recovery can take a huge slice out of greenhouse gas emissions and help slow climate change. I urge governments to back a green recovery in the next stage of COVID-19 fiscal interventions and raise significantly their climate ambitions in 2021."

Each year, the Emissions Gap Report assesses the gap between anticipated emissions and levels consistent with the Paris Agreement goals of limiting global warming this century to well below 2°C and pursuing 1.5°C. The report finds that in 2019 total greenhouse gas emissions, including land-use change, reached a new high of 59.1 gigatonnes of CO2 equivalent (GtCO2e). Global greenhouse gas emissions have grown 1.4 per cent per year since 2010 on average, with a more rapid increase of 2.6 per cent in 2019 due to a large increase in forest fires.

As a result of reduced travel, lower industrial activity and lower electricity generation this year due to the pandemic, carbon dioxide emissions are predicted to fall up to 7 per cent in 2020. However, this dip only translates to a 0.01°C reduction of global warming by 2050. Meanwhile, NDCs remain inadequate.

Green recovery critical

A green pandemic recovery, however, can cut up to 25 per cent off the emissions we would expect to see in 2030 based on policies in place before COVID-19. A green recovery would put emissions in 2030 at 44 GtCO2e, instead of the predicted 59 GtCO2e - far outstripping emission reductions foreseen in unconditional NDCs, which leave the world on track for a 3.2°C temperature rise.
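Those two figures are consistent with the quoted cut:

\[ \frac{59\ \text{GtCO}_2\text{e} - 44\ \text{GtCO}_2\text{e}}{59\ \text{GtCO}_2\text{e}} \approx 0.25 \]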

Such a green recovery would put emissions within the range that gives a 66 per cent chance of holding temperatures to below 2°C, but would still be insufficient to achieve the 1.5°C goal.

Measures to prioritize in green fiscal recovery include direct support for zero-emissions technologies and infrastructure, reducing fossil fuel subsidies, no new coal plants, and promoting nature-based solutions - including large-scale landscape restoration and reforestation.

So far, the report finds, action on a green fiscal recovery has been limited. Around one-quarter of G20 members have dedicated shares of their spending, up to 3 per cent of GDP, to low-carbon measures.

There nonetheless remains a significant opportunity for countries to implement green policies and programmes. Governments must take this opportunity in the next stage of COVID-19 fiscal interventions, the report finds.

The report also finds that the growing number of countries committing to net-zero emissions goals by mid-century is a "significant and encouraging development". At the time of report completion, 126 countries covering 51 per cent of global greenhouse gas emissions had adopted, announced or were considering net-zero goals.

To remain feasible and credible, however, these commitments must be urgently translated into strong near-term policies and action, and reflected in NDCs. The levels of ambition under the Paris Agreement must still roughly triple for the 2°C pathway and increase at least fivefold for the 1.5°C pathway.

Reforming consumption behaviour critical

Each year the report also looks at the potential of specific sectors. In 2020, it considers consumer behaviour and the shipping and aviation sectors.

The shipping and aviation sectors, which account for 5 per cent of global emissions, also require attention. Improvements in technology and operations can increase fuel efficiency, but projected increases in demand mean this will not result in decarbonisation and absolute reductions of CO2. Both sectors need to combine energy efficiency with a rapid transition away from fossil fuel, the report finds.

The report finds that stronger climate action must include changes in consumption behaviour by the private sector and individuals. Around two-thirds of global emissions are linked to private households, when using consumption-based accounting.

The wealthy bear the greatest responsibility: the emissions of the richest one per cent of the global population account for more than twice the combined share of the poorest 50 per cent. This group will need to reduce its footprint by a factor of 30 to stay in line with the Paris Agreement targets.

Possible actions to support and enable lower-carbon consumption include replacing domestic short-haul flights with rail, incentives and infrastructure to enable cycling and car-sharing, improving the energy efficiency of housing, and policies to reduce food waste.

Credit: 
UNEP Division of Public Communication and Information