
New cognitive bias affecting evaluation processes: Generosity-erosion effect

Researchers at the University of Barcelona, together with colleagues from the University of Zurich (Switzerland) and Brown University (United States), have analysed more than 10,000 evaluations of candidates seeking permanent public teaching positions in Catalonia. The objective was to study how the evaluation committee's decision is affected by the position each candidate holds in the list of people to be assessed. The study, published in the journal Science Advances, identifies a new cognitive bias that the researchers have named the "generosity-erosion effect": once evaluators have scored one candidate generously, they tend to judge subsequent candidates more harshly.

The researchers treated a score of exactly 5.00, the minimum passing grade, as a generous gesture, since it lets candidates who are on the verge of failing pass the exam; moreover, it is hard for evaluators to assess a candidate's merit to one decimal place. Under this definition, the study shows that a candidate's likelihood of success decreases by 7.7% for each preceding candidate who received the minimum passing score needed to continue in the hiring process.
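On one reading of that figure, the effect compounds multiplicatively: a candidate preceded by k generous passes would succeed with probability roughly p0 times (1 - 0.077) to the power k. The sketch below illustrates this under stated assumptions; the baseline probability is an invented value, not a figure from the paper.

```python
# Hedged sketch (ours, not the study's model): how a 7.7% per-candidate
# decrease compounds. The baseline pass probability p0 is an invented
# illustrative value.
def success_probability(p0: float, prior_generous_passes: int) -> float:
    """Pass probability after k earlier candidates received the minimum pass of 5.00."""
    return p0 * (1 - 0.077) ** prior_generous_passes

p0 = 0.30  # hypothetical baseline likelihood of success
for k in range(4):
    print(f"{k} prior generous passes -> {success_probability(p0, k):.3f}")
```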

The authors offer several explanations for this generosity-erosion effect. One is guilt aversion: evaluators tend to be generous and overgrade candidates who are on the verge of failing in order to avoid feeling guilty, but once they have given a score of 5 to some candidates, that guilt is assuaged and they are likely to judge the rest more harshly. "We observed that the mechanism affecting the final score is not fatigue, nor the contrast with the previous candidate, nor the expectations of the examination board, as stated in other studies: it is mainly guilt, or the generosity-erosion effect," notes Jordi J. Teixidó, a researcher at the Faculty of Economics and Business of the UB and co-author of the article.

The study, co-authored by UB researcher Tania Fernández, Marc Lluís Vives (Brown University, United States) and Miquel Serra-Burriel (University of Zurich, Switzerland), uses tools from game theory to interpret the results of the analysis. The teacher selection process was well suited to the study, since candidates were listed in random order and decisions were made by a committee rather than by a single individual, an arrangement that is becoming more common in selection processes.

Credit: 
University of Barcelona

Consistent use of food pantries needed to address food insecurity, related health issues

image: Food Insecurity at Record Levels in the United States

Image: 
UT Southwestern Medical Center

DALLAS - April 21, 2021 - Food banks should be used more consistently rather than only during emergencies to better address food insecurity and related health issues, a joint study by researchers at UT Southwestern Medical Center and economists at the University of Dallas shows.

"The main discovery in our research is that encouraging clients and making it easier for clients to receive food frequently improves their food security, health, and well-being," says Sandi Pruitt, Ph.D., associate professor of population and data sciences at UT Southwestern, and senior author of the study. "The food banking system is predicated on the assumption that people need food pantries for emergencies only. But this is a common misconception, as many families and individuals experience food insecurity for months or years at a time and it's more of a chronic condition."

The researchers calculated that a 10 percentage-point increase in the frequency of food pantry visits led to a 5.7 percent reduction in the likelihood of food insecurity and a 6.2 percent reduction in the likelihood of poor health.

In 2018, 11.1 percent of U.S. households reported being food insecure, defined as inconsistent access to adequate food due to lack of financial or other resources, the researchers report. Food insecurity across the country has increased to new historical highs during the COVID-19 pandemic.

"Food insecurity rates often spike during an economic recession. Following the 2007-2009 recession, the nationwide food insecurity rate took 11 years to return to the pre-recession level," says lead author Tammy Leonard, Ph.D., an economist at the University of Dallas. "This is an indication that we need to rethink our systems and processes for addressing this fundamental need."

Researchers evaluated an innovative model used at Crossroads Community Services, a food pantry and distribution system located in southern Dallas County that focuses on nourishing families through nutritious food items to power dietary change for improved health. Crossroads partners with smaller community organizations to distribute food at multiple locations such as public housing facilities, churches, and community centers, and requires clients to pre-enroll for monthly pickups.

"What we observed in Dallas is that Crossroads has made food more accessible to clients and has also encouraged clients to come back regularly, and that signals that they should be seeking support. We hope that this can be a policy change in food distribution settings across the U.S.," Pruitt says. "The food banking system and everything in the food assistance sector is more important than ever right now, and we really need to maximize the importance of these food assistance programs on clients to ensure that everyone has enough food to eat."

Credit: 
UT Southwestern Medical Center

Bypassing broken genes

UNIVERSITY PARK, Pa. -- A new approach to gene editing using the CRISPR/Cas9 system bypasses disease-causing mutations in a gene, enabling treatment of genetic diseases linked to a single gene, such as cystic fibrosis, certain types of sickle cell anemia, and other rare diseases. The method, developed and tested in mice and human tissue cultures by researchers at Penn State, involves inserting a new, fully functional copy of the gene that displaces the mutated gene.

A proof-of-concept for the approach is described in a paper appearing online April 20 in the journal Molecular Therapy.

The CRISPR/Cas9 system has enabled promising new gene therapies that can target and correct disease-causing mutations in a gene. In this process, Cas9--a bacterial protein--cuts DNA at a specific location, where the genetic sequence can then be edited or trimmed, or a new sequence inserted, before the DNA is repaired. However, there are two main limitations to current repair strategies. First, the most common repair strategy, called "homology-directed repair," requires specific proteins within the cell that are present only during cell division, which means it cannot be used in most adult tissues, where cell division occurs rarely.

"The second challenge stems from the fact that even when a disease is caused by a single gene, it can result from a variety of different mutations within that gene," said Douglas Cavener, professor of biology at Penn State and senior author of the paper. "With homology-directed repair, we'd need to design and test the strategy for each and every one of those mutations, which can be expensive and time-intensive. In this study, we designed an approach called Co-opting Regulation Bypass Repair (CRBR), which can be used in both dividing and non-dividing cells and tissues and for a spectrum of mutations within a gene. This approach is especially promising for rare genetic diseases caused by a single gene, where limited time and resources typically preclude design and testing for the many possible disease-causing mutations."

CRBR takes advantage of the CRISPR/Cas9 system and a cellular repair pathway called "non-homologous end joining" to insert a genetic sequence between a mutated gene's promoter region--the genetic sequence that controls when and where the gene is functional--and the mutated portion of the gene. The newly inserted sequence contains a condensed version of the normal gene that is used in place of the mutated version. A terminator sequence at the end of the inserted sequence prevents the remaining downstream mutated gene from being used. Because CRBR does not rely on the proteins required by homology-directed repair, it can be used in all types of adult tissues.
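As a toy illustration of that layout (our sketch, not the authors' notation), the edited locus can be pictured as a linear assembly in which transcription starts at the native promoter and halts at the inserted terminator, so the downstream mutated sequence is never read:

```python
# Toy sketch of a CRBR-edited locus; element names are illustrative only.
NATIVE_PROMOTER = "PROMOTER"         # native regulatory region, left intact
FUNCTIONAL_CDS = "CONDENSED_GENE"    # inserted, fully functional copy of the gene
TERMINATOR = "TERMINATOR"            # stops transcription before the mutant sequence
MUTANT_DOWNSTREAM = "MUTATED_EXONS"  # still present in the DNA, but never expressed

edited_locus = [NATIVE_PROMOTER, FUNCTIONAL_CDS, TERMINATOR, MUTANT_DOWNSTREAM]

def transcript(locus):
    """Transcription runs downstream of the promoter and halts at the terminator."""
    out = []
    for element in locus[1:]:  # the promoter itself is regulatory, not transcribed
        if element == TERMINATOR:
            break
        out.append(element)
    return out

print(transcript(edited_locus))  # ['CONDENSED_GENE'] - the mutant exons are bypassed
```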

"Our approach co-opts the native promoter for a gene," said Jingjie Hu, a graduate student at Penn State and first author of the paper. "This means that the newly inserted gene will be expressed at the same times and at appropriate levels within the cell as the gene it is replacing. This is an advantage to other types of gene therapies, which rely on an external promoter to drive high levels of expression of the gene that could lead to negative effects if too much is produced or if essential regulation response is missing under certain physiological conditions."

The research team conducted a series of proof-of-concept experiments to demonstrate the utility of this method. They first focused on the PERK gene, mutations in which can lead to a rare disease called Wolcott-Rallison syndrome. The syndrome results when copies of the gene inherited from both parents have mutations--it is a "recessive" disease--and can cause neonatal diabetes, skeletal problems, growth delay, and other symptoms.

The researchers inserted the PERK gene in healthy mice using CRBR and bred them with mice that had a mutation in the gene. The resulting mice, which contained one CRBR-edited PERK gene and one mutated PERK gene, did not have the typical abnormalities associated with the syndrome, indicating that the CRBR-edited gene can rescue PERK gene function in a mouse model of the Wolcott-Rallison syndrome.

Next, the researchers tested the CRBR method in human tissues in the laboratory, in this case focusing on the insulin gene. They inserted a green fluorescent protein marker gene sequence between the insulin gene promoter and the insulin coding sequence in human cadaver cells. This experiment resulted in the expression of the fluorescent protein in insulin-secreting cells, but not in other cell types, which suggests that the new sequence was strictly regulated under the control of the insulin promoter.

"Our results demonstrated that CRBR gene repair can restore PERK gene function in mice and revealed the potential utility of CRBR for gene repair in human tissue," said Hu.

The researchers note that gene editing with CRISPR/Cas9 can be error prone. For example, homology-directed repair-based strategies could potentially produce damaging mutations in the coding sequence if repair does not happen properly. With CRBR, the researchers target the insertion within a region of the gene that does not code for protein, which should be more tolerant of these errors.

"Gene therapies such as CRBR that utilize CRISPR/Cas9 continue to face the challenge of delivery repair machinery into the cells of interest," said Cavener. "One promising development is to isolate cells or tissues from the afflicted patient, repair a mutant gene in the laboratory, and then transplant the repaired cells or tissues back into the patient. We hope that, as researchers continue to improve delivery methods, CRBR can be utilized to treat Wolcott-Rallison syndrome as well other human genetic diseases."

Credit: 
Penn State

Solar panels are contagious - but in a good way: Study

The number of solar panels in the immediate vicinity of a house is the most important factor in determining the likelihood of that house having a solar panel, ahead of a host of socio-economic and demographic variables. This is shown in a new study by scientists using satellite and census data for the city of Fresno in the US, analysed with machine learning. Although peer effects are known to be relevant for sustainable energy choices, very high-resolution data combined with artificial intelligence techniques were necessary to single out the paramount importance of proximity. The finding is relevant for policies that aim at broad deployment of solar panels to replace unsustainable fossil-fueled energy generation.

"It's almost like if you see a solar panel from out of your window, you decide to put one on your own roof as well," says study author Leonie Wenz from the Potsdam Institute for Climate Impact Research (PIK) in Germany. "Of course, one might think that other factors are more relevant, for instance income or educational background, or word-of-mouth within the same social network such as a school district. So we compared all these different options, and we've been amazed by the outcome. It turns out that, no, geographical distance really is the most important factor. The more panels there are within a short radius around my house, the more likely I'm of having one, too."

Peer effect halves over the distance of a football field

"The likelihood of putting a solar panel on your roof roughly halves over the distance of a football field", says Anders Levermann from PIK and Columbia University's LDEO in New York who is also an author of the study. "The contagion effect is strongest for a short radius around a home with a solar panel and decreases exponentially the farther away the panels are. It is a remarkable robust feature that is most pronounced in low-income neighborhoods.

The scientists just made the data speak. "We combined population census data for every district with high-resolution satellite data that is able to identify all the solar panels in Fresno," explains study author Kelsey Barton-Henry from PIK. "Then we trained several machine learning algorithms to find the relation between people's socio-economic setting and their likelihood of having a solar panel."
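A minimal sketch of that workflow, with a hypothetical input file, made-up feature names, and a generic classifier standing in for the several algorithms the team actually trained:

```python
# Hedged sketch of the study's workflow; the file name, features, and model
# choice are our assumptions, not the study's actual data or algorithms.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("fresno_households.csv")  # hypothetical census + satellite merge
features = ["median_income", "pct_college_educated", "median_age", "panels_within_100m"]
X, y = df[features], df["has_solar_panel"]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# Feature importances show which variable drives predictions; in the study,
# proximity to existing panels dominated the socio-economic variables.
print(dict(zip(features, model.feature_importances_)))
```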

"Seeding solar panels where few exist may flip a community"

"The findings suggest that seeding solar panels in areas where few exist, may flip a community," concludes Levermann. "If more solar panels lead to more solar panels that may generate a kind of tipping point - a good one this time. The climate system has a number of extremely dangerous tipping points from the West Antarctic ice sheet to the North Atlantic Current." Wenz adds: "Hence, researching climate decisions to identify positive social tipping points, both small and big ones, is important to ensure a safe tomorrow for all."

Credit: 
Potsdam Institute for Climate Impact Research (PIK)

How SARS coronaviruses reprogram host cells to their own benefit

Coronavirus researchers led by Professor Rolf Hilgenfeld of the University of Luebeck and PD Dr. Albrecht von Brunn of the Ludwig-Maximilians-Universitaet (LMU) in Munich have discovered how SARS viruses enhance the production of viral proteins in infected cells, so that many new copies of the virus can be generated. Notably, coronaviruses other than SARS-CoV and SARS-CoV-2 do not use this mechanism, which may help explain the much higher pathogenicity of the SARS viruses. The findings appear in the EMBO Journal.

Coronaviruses that cause harmless colds in humans were discovered more than 50 years ago. When it emerged in 2002/2003, the SARS coronavirus was the first coronavirus found to cause severe pneumonia in infected people. Comparisons of the RNA genomes of innocuous coronaviruses with those of the SARS coronavirus permitted researchers to identify a region that only occurred in the latter, and was called the "SARS-unique domain" (SUD). Such genomic regions and their protein products might be linked to the extraordinary pathogenicity of SARS coronavirus and its cousin, the COVID-19 virus SARS-CoV-2. 

The research groups led by Hilgenfeld and von Brunn showed that the SUD proteins of these two viruses interact with a human protein called Paip-1, which is involved in the first steps of protein synthesis. Together with Paip-1 and other proteins in human cells, SUD apparently binds to the ribosomes, the molecular machines that are responsible for protein synthesis in cells. This would lead to an enhancement of the production of all proteins, both those of the host cell and those of the virus. However, in cells infected with SARS-CoV or SARS-CoV-2, the messenger RNA molecules that code for host proteins are selectively destroyed by a viral protein named Nsp1. As a result of this complicated process, the infected cell predominantly produces viral proteins, so that many new copies of the virus can be created.

Albrecht von Brunn's research group discovered the interaction between the proteins SUD and Paip-1 several years ago. "Being an experienced coronavirologist, I knew that one has to inspect the special regions of the SARS genome when trying to understand this virus," he says.

The discovery made by the Munich researchers was of great interest to Hilgenfeld, whose research group had already elucidated the three-dimensional structure of the SUD protein some years previously. The two research groups teamed up. Dr. Jian Lei of Hilgenfeld's group, now a group leader at Sichuan University in Chengdu (China), succeeded in crystallizing the complex formed by SUD and Paip-1 and determining its three-dimensional structure by X-ray crystallography. Co-first author Dr. Yue "Lizzy" Ma-Lauer of von Brunn's group characterized the complex of the two proteins and its function using a variety of cell-biological and biophysical methods.

"Interaction studies of this kind between coronavirus proteins and proteins of the infected human cell will help us understand how the viruses change key functions of the cell to their own benefit," says Hilgenfeld. The project was supported by the German Federal Ministry of Education and Research (BMBF) and by the German Center for Infection Research (DZIF).

Credit: 
Ludwig-Maximilians-Universität München

Cracking open the mystery of how many bubbles are in a glass of beer

After pouring beer into a glass, streams of little bubbles appear and start to rise, forming a foamy head. As the bubbles burst, the released carbon dioxide gas imparts the beverage's desirable tang. But just how many bubbles are in that drink? By examining various factors, researchers reporting in ACS Omega estimate that between 200,000 and nearly 2 million of these tiny spheres can form in a gently poured lager.

Worldwide, beer is one of the most popular alcoholic beverages. Lightly flavored lagers, which are especially well-liked, are produced through a cool fermentation process, converting the sugars in malted grains to alcohol and carbon dioxide. During commercial packaging, more carbonation can be added to get a desired level of fizziness. That's why bottles and cans of beer hiss when opened and release micrometer-wide bubbles when poured into a mug. These bubbles are important sensory elements of beer tasting, similar to sparkling wines, because they transport flavor and scent compounds. The carbonation also can tickle the drinker's nose. Gérard Liger-Belair had previously determined that about 1 million bubbles form in a flute of champagne, but scientists don't know the number created and released by beer before it's flat. So, Liger-Belair and Clara Cilindre wanted to find out.

The researchers first measured the amount of carbon dioxide dissolved in a commercial lager just after pouring it into a tilted glass, as a server would do to reduce its surface foam. Next, using this value and a standard tasting temperature of 42 F, they calculated that dissolved gas would spontaneously aggregate into streams of bubbles wherever crevices and cavities in the glass were wider than 1.4 μm. Then, high-speed photographs showed that the bubbles grew in volume as they floated to the surface, capturing and transporting additional dissolved gas to the air above the drink. As the remaining gas concentration decreased, the bubbling would eventually cease. The researchers estimated that between 200,000 and 2 million bubbles could be released before a half-pint of lager would go flat. Surprisingly, defects in a glass influence beer and champagne differently: more bubbles form in beer than in champagne when larger imperfections are present, the researchers say.
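For intuition, a back-of-the-envelope estimate (ours, not the authors' full model) divides the volume of CO2 that escapes as bubbles by the volume of a typical bubble; every parameter below is an assumed round number:

```python
import math

# Hedged back-of-the-envelope estimate of bubbles in a half-pint of lager.
# All parameter values are illustrative assumptions, not figures from the paper.
beer_volume_l = 0.25          # half-pint, ~250 mL
co2_g_per_l = 5.0             # typical dissolved CO2 in lager (assumed)
fraction_as_bubbles = 0.15    # much CO2 escapes by surface diffusion instead (assumed)
bubble_diameter_m = 0.5e-3    # ~0.5 mm bubble at the surface (assumed)

co2_mol = beer_volume_l * co2_g_per_l / 44.0  # molar mass of CO2 is 44 g/mol
gas_volume_m3 = co2_mol * 0.024               # ~24 L/mol near room temperature
bubble_volume_m3 = (4 / 3) * math.pi * (bubble_diameter_m / 2) ** 3

n_bubbles = fraction_as_bubbles * gas_volume_m3 / bubble_volume_m3
print(f"~{n_bubbles:,.0f} bubbles")  # on the order of 10^6, as the paper reports
```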

Credit: 
American Chemical Society

Researchers identify potential subtype of PTSD

(Boston)--A major obstacle in understanding and treating posttraumatic stress disorder (PTSD) is its clinical and neurobiological heterogeneity. In order to better treat the condition and address this barrier, the field has become increasingly interested in identifying subtypes of PTSD based on dysfunction in neural networks alongside cognitive impairments that may underlie the development and maintenance of symptoms.

VA and BU researchers have now found a marker of PTSD in brain regions associated with emotional regulation. "This marker was strongest in those with clinically impaired executive function or the ability to engage in complex goal-directed behavior," explained corresponding author Audreyana Jagger-Rickels, PhD, a post-doctoral scientist in the Boston Attention and Learning Lab (BALLAB) at the VA Boston Healthcare System.

The study included 271 Veteran participants in the Translational Research Center for TBI and Stress Disorders (TRACTS) at VA Boston, who had been deployed to post-9/11 conflicts and completed a functional MRI scan that measures the communication between brain regions. The Veterans also completed tests that measured PTSD and cognitive (neuropsychological) functioning, including executive functioning.

The researchers found that Veterans with greater PTSD severity had an increased disruption between their cognitive control network (frontal parietal control network) and their emotional processing network (limbic network). Upon further investigation, they found that those with clinically impaired executive function had the greatest disruption to this brain marker of PTSD.

"This study provides preliminary evidence for a "neurocognitive" subtype of PTSD, specifically that a combination of cognitive and brain signatures may identify a subset of people with PTSD that could be unique," said senior author Michael Esterman, PhD, principal investigator, National Center for PTSD at VA Boston Healthcare System and associate professor of psychiatry at Boston University School of Medicine.

According to the researchers, these findings suggest that someone who presents with PTSD and impaired executive function may also have a unique brain marker related to emotional regulation. "These individuals may respond best to specific treatment strategies but may also have difficulty engaging in treatments that require high levels of emotional regulation and executive functioning," added Jagger-Rickels.

The researchers hope this study will help identify those who will benefit from specific treatments for PTSD and may lead to innovative new treatments that target cognitive and brain functioning. "Ultimately, diagnosing and treating individuals based on their own unique clinical and biological profile, rather than simply based on a broad diagnosis, would be the goal," said Esterman.

Credit: 
Boston University School of Medicine

Freshwater salt pollution threatens ecosystem health and human water security

image: Image of the Bull Run River that feeds the Occoquan Reservoir, an important source of water supply to Fairfax Water, a water utility serving about 2 million people in Northern Virginia and the home of Virginia Tech's Occoquan Watershed Monitoring Lab. Photo courtesy of Peter Vikesland for Virginia Tech.

Image: 
Virginia Tech

Water touches virtually every aspect of human society, and all life on earth requires it. Yet, fresh, clean water is becoming increasingly scarce -- one in eight people on the planet lack access to clean water. Drivers of freshwater salt pollution such as de-icers on roads and parking lots, water softeners, and wastewater and industrial discharges further threaten freshwater ecosystem health and human water security.

"Inland freshwater salt pollution is rising nationwide and worldwide, and we investigated the potential conflict between managing freshwater salt pollution and the sustainable practice of increasing water supply through the addition of highly treated wastewater to surface waters and groundwaters," said Stanley Grant, professor of civil and environmental engineering in the Virginia Tech College of Engineering. "If we don't figure out how to reverse this trend of salt pollution soon, it may become one of our nation's top environmental challenges."

Grant and his collaborators have recently published their findings in the journal Nature Sustainability.

A recent modeling study predicted that salt pollution will increase by more than 50 percent in over half of U.S. streams by 2100. Freshwater salt pollution is associated with declines in biodiversity and critical freshwater habitat and with the loss of safe drinking water.

"We found there are numerous opportunities that exist to reduce the contribution of salt pollution in the highly treated wastewater discharged to the Occoquan Reservoir and freshwater pollution more generally," said Peter Vikesland, professor in the Department of Civil and Environmental Engineering and affiliated faculty member in the Global Change Center, housed within Fralin Life Sciences Institute at Virginia Tech. "These efforts will require deliberative engagement with a diverse community of watershed stakeholders and careful consideration of the local political, social and environmental context."

From time-series data collected over 25 years, the researchers quantified the contributions of three salinity sources -- highly treated wastewater and outflows from two rapidly urbanizing watersheds in Northern Virginia -- to the rising concentration of sodium, a major ion associated with freshwater pollution.

The Occoquan Reservoir, a regionally important drinking-water reservoir in the mid-Atlantic United States, is located approximately 19 miles southwest of Washington, D.C., in Northern Virginia, and is one of two primary sources of water supply for nearly 2 million people in Fairfax County, Virginia, and surrounding communities. On an annual basis, approximately 95% of the water flowing into the reservoir comes from its Occoquan River and Bull Run tributaries.
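One simple way to picture a source apportionment like this (our sketch; the study's statistical approach is more sophisticated) is a flow-weighted mass balance over the reservoir's inflows, with invented flows and concentrations:

```python
# Hedged sketch: flow-weighted mixing of sodium from three inflow sources.
# Flows (million gallons/day) and sodium concentrations (mg/L) are invented
# illustrative numbers, not values estimated in the study.
sources = {
    "treated_wastewater": (40.0, 120.0),  # low flow, high sodium
    "occoquan_river":     (300.0, 25.0),
    "bull_run":           (200.0, 35.0),
}

total_flow = sum(q for q, _ in sources.values())
mixed_na = sum(q * c for q, c in sources.values()) / total_flow
print(f"flow-weighted reservoir sodium: {mixed_na:.1f} mg/L")

for name, (q, c) in sources.items():
    share = (q * c) / (mixed_na * total_flow)
    print(f"{name}: {share:.0%} of the sodium load")
```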

"This study exemplifies the power of combining historical data and new computational tools; it underscores the incredible value of long-term monitoring," said Grant who is the Co-Director of the Occoquan Watershed Monitoring Lab and an affiliated faculty member in the Center for Coastal Studies at Virginia Tech. "It is a testimony to the vision of Virginia Tech and the Occoquan Watershed Monitoring Lab and their collaboration with stakeholders in the watershed, including Fairfax Water and the Upper Occoquan Service Authority, over the past two decades."

The researchers found that rising salt pollution in the reservoir is primarily from watershed runoff during wet weather and highly treated wastewater during dry weather.

Across all timescales evaluated, sodium concentration in the treated wastewater is higher than in outflow from the two watersheds. Sodium in the treated wastewater originates from chemicals added during wastewater treatment, industrial and commercial discharges, human excretion and down-drain disposal of drinking water and sodium-rich household products.

"Our study is unique because it brings together engineers, ecologists, hydrologists, and social scientists to investigate and tackle one of the greatest threats to the world's water quality," said Sujay Kaushal, a co-author on the paper, professor of geology at the University of Maryland, and an international expert on freshwater salinization.

The researchers envision at least four ways in which salt pollution can be reduced: limit watershed sources of sodium that enter the water supply (such as from deicer use), enforce more stringent pre-treatment requirements on industrial and commercial dischargers, switch to low-sodium water and wastewater treatment methods, and encourage households to adopt low-sodium products.

Because drinking water supply and sewage collection systems are linked, sources that contribute salt to the former ultimately contribute salt to the latter as well.

"Citizens can start today or tomorrow by thinking more critically about what they put down the drain and how that harms the environment, and in turn, their own drinking water supply," said Vikesland.

This research aligns with the One Water vision used nationally and globally by multiple water resource sectors, and it catalyzes robust stakeholder-driven decision making under seemingly conflicting objectives.

This research was part of a partnership between Virginia Tech, the University of Maryland, Vanderbilt University, and North Carolina State University. It was funded by a recent multimillion-dollar grant that Grant and his collaborators received from the National Science Foundation to address freshwater salt pollution, under the NSF's Growing Convergence Research (GCR) program, which aims to catalyze solutions to societal grand challenges by merging ideas, approaches, and technologies from widely diverse fields of knowledge to stimulate innovation and discovery. Experience gained and lessons learned from this research will be upscaled nationally and globally in partnership with The Water Research Foundation.

"The collaborative effort by this highly interdisciplinary team exemplifies the type of paradigm shifting science that we seek to catalyze and promote," said William Hopkins, professor in the College of Natural Resources and Environment, director of the Global Change Center, and associate executive director of the Fralin Life Sciences Institute. "Freshwater salt pollution has become a major focus for diverse researchers at Virginia Tech because the problem is so widespread, getting worse, and affects both the environment and society. Fortunately, the team's research advances our understanding of important sources of salt pollution so that evidence-based interventions can be identified and implemented. The study has far reaching implications globally as we try to solve this complex environmental problem."

This study reflects the exciting convergent approach the NSF-funded project is taking.

"While the biophysical findings are front-and-center here, it acknowledges the complex socio-political contexts in which that information will be applied and foreshadows the collaborative, multi-stakeholder approaches to tackling the freshwater salt pollution problem that we are currently advancing," said Todd Schenk, assistant professor in the School of Public and International Affairs in the College of Architecture and Urban Studies and affiliated faculty member of the Global Change Center and Center for Coastal Studies.

Credit: 
Virginia Tech

For scleroderma, algorithm helps better screen for fatal complication

image: A cartoon showing some of the effects of scleroderma.

Image: 
Michigan Medicine

Screening for a sometimes fatal condition among patients with a rare autoimmune disease could soon - thanks to a computer algorithm - become even more accurate.

Researchers at Michigan Medicine found that an internet-based application improved their ability to spot pulmonary arterial hypertension in patients with systemic sclerosis, or scleroderma. The unpredictable condition is marked by tightening of the skin and can damage internal organs.

The algorithm, aptly named DETECT, outperformed standard methods used to identify the form of high blood pressure in the lungs that causes the heart to weaken and fail.

"We've been advocating for a long time that every scleroderma patient should be screened on an annual basis using DETECT, and this data supports that," says Dinesh Khanna, M.B.B.S., M.Sc., senior author of the study and director of Michigan Medicine's Scleroderma Program. "Pulmonary arterial hypertension is a leading cause of death for these patients, and we want to diagnose them early."

DETECT is a two-step algorithm. The first step uses six clinical variables to determine whether a patient requires an echocardiogram, or ultrasound, of the heart. The second step then informs whether the patient should be referred for a right heart catheterization.
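In outline, the screening logic resembles the sketch below. The variable names, weights, and cut-offs are invented placeholders; the actual DETECT inputs and thresholds come from the originally published algorithm and are not listed in this article.

```python
# Hedged sketch of a two-step screening rule in the style of DETECT.
# Feature names, weights, and cut-offs are placeholders, not the real algorithm.
STEP1_WEIGHTS = {  # six clinical variables (hypothetical names)
    "lung_function_ratio": 1.2,
    "skin_findings": 0.8,
    "autoantibody_status": 0.7,
    "cardiac_biomarker": 1.5,
    "serum_urate": 0.9,
    "ecg_finding": 1.1,
}
STEP1_CUTOFF, STEP2_CUTOFF = 3.0, 2.0

def step1_refer_for_echo(clinical):
    """Step 1: six clinical variables -> refer for echocardiogram?"""
    score = sum(w * clinical[name] for name, w in STEP1_WEIGHTS.items())
    return score >= STEP1_CUTOFF, score

def step2_refer_for_rhc(step1_score, echo):
    """Step 2: step-1 score plus echo findings -> refer for right heart catheterization?"""
    score = 0.5 * step1_score + 1.0 * echo["tricuspid_velocity"] + 0.8 * echo["atrial_area"]
    return score >= STEP2_CUTOFF
```

A screening rule like this is deliberately tuned for sensitivity: missing a true case is costlier than an extra referral, a trade-off the study's results illustrate below.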

Researchers found the algorithm correctly identified all 10 patients with pulmonary arterial hypertension in a study of 68 subjects.

"It didn't miss a single patient; it can't get better than that," Khanna says. "This is a highly sensitive screening tool and can be very useful."

However, of the patients whom DETECT flagged and who underwent right heart catheterization during the study, only 20% actually had the debilitating condition. Khanna says it's better to be cautious.

"That's the trade-off of having such a sensitive test," he says. "The right heart catheterization is invasive, but because the mortality of [pulmonary arterial hypertension] is so high, and the prevalence is so high, the benefits outweigh the risks."

Around 10% of patients with scleroderma, which affects around 70,000 people in the U.S., develop pulmonary hypertension. Under current guidelines, physicians screening scleroderma patients for the condition rely on an annual echocardiogram.

While it's an effective diagnostic tool for symptomatic patients, the ultrasounds don't predict the condition accurately in asymptomatic people or early in the disease, Khanna says. That inaccuracy motivated both him and principal investigator and rheumatologist Amber Young, M.D., to conduct the study.

"These ultrasounds miss around one in three patients who may have pulmonary arterial hypertension," he says. "And by the time we diagnose a patient so late, the story is over - the patient will likely die in the next two or three years," he says.

This study was the first that compared the algorithm to echocardiogram guidelines published in 2015. The research team hopes more physicians will consider using DETECT, allowing them to treat the complication earlier. And Khanna expects more studies will conclude with similar recommendations.

"I'm sure people around the globe will be doing this work and validating it," Khanna says. "Early diagnosis and treatment of pulmonary arterial hypertension will lead to better outcomes, including improved quality of life and survival in people with scleroderma."

Credit: 
Michigan Medicine - University of Michigan

21st century medical needles for high-tech cancer diagnostics

image: By coupling ultrasound waves to a medical needle, researchers were able to make the tip of a medical needle vibrate 30,000 times per second. The new technology could improve cancer management.

Image: 
Aalto University

The diagnosis of diseases like cancer almost always requires a biopsy - a procedure in which a clinician removes a piece of suspect tissue from the body to examine it, typically under a microscope. Many areas of diagnostic medicine, especially cancer management, have seen huge advances in technology, with genetic sequencing, molecular biology and artificial intelligence all rapidly increasing doctors' ability to work out what's wrong with a patient. However, the technology of medical needles hasn't changed dramatically in 150 years, and - in the context of cancer management - needles are struggling to provide adequate tissue samples for new diagnostic techniques. Now researchers have shown that making the biopsy needle vibrate rapidly, 30,000 times per second, not only provides sufficient material for 21st century diagnostic needs, but is also potentially less painful and less traumatic for patients.

"Biopsy yields – the amount of tissue extracted – are often inadequate, with some studies showing that up to a third of fine-needle biopsies struggle to get enough tissue for a reliable diagnosis,” says Professor Heikki Nieminen, at Aalto University, Department of Neuroscience and Biomedical Engineering. “A biopsy can be painful, and the wait for the results from a diagnostic test can be a highly distressing time for the patient and family, especially if diagnosis needs re-biopsies to be conclusive. We wanted to make the procedure more gentle for the patient, and increase the certainty that the test will be able to give us an answer on the first attempt.” Professor Nieminen was visiting the University of Toronto, Canada, to work with Professor Kenneth Pritzker, a Pathologist at Mount Sinai Hospital, Toronto, as well as a university researcher in the Temerty Faculty of Medicine. It was while they were at lunch one day that Pritzker suggested that maybe the solution to the problem could be addressed with the help of ultrasound.

One of the least painful biopsy methods is called 'fine-needle biopsy', which uses a needle of the same thickness as in many other medical procedures. However, for more advanced diagnostic techniques - like those used in cancer - fine needles alone don't routinely collect enough material, so the current practice is often to use a much thicker needle, called a core needle. "They are painful for the patient and can also cause bleeding - you don't want to use a core needle unless you have to," says Pritzker. "At body temperature, human tissue exists as something that behaves part-way between being a solid and a liquid. The breakthrough here is that by making the needle tip vibrate ultrasonically, we're able to make the tissue flow more like a liquid, which allows us to extract more of it through a narrow needle."

Feels like a regular needle

In a new paper, published in Scientific Reports, the team shares with the wider world how well these ultrasonic vibrating needles work. "The vibrations provide energy to the tissue to make it more fluid-like," explains the first author of the paper, Emanuele Perra, who works in Nieminen's group at Aalto University. "The vibrations are localised to just the tip, so they don't affect any tissue except a small region around the needle. We were able to show that the ultrasonic vibrations increase the biopsy yield by 3 to 6 times compared to the same needle without ultrasound, which was even greater than we hoped for." The vibrations are far above the human hearing range, and the amplitude of the waves is small enough that the procedure shouldn't feel much different from a normal blood test.

The big increase in the amount of tissue extracted in the biopsy makes the technique very useful for the growing trend toward high-tech cancer treatment. One such example is molecular diagnostics, which examines the chemical makeup of tumours to allow doctors to target treatment more effectively to a specific cancer type. "Molecular diagnostics is an expensive process, and it is a waste of money to have it fail because the quality of the material gathered in the biopsy wasn't good enough," explains Pritzker.

The technology that powers the needle is non-linear acoustics, where vibrations passing through a material have such large amplitude that they interact with the material itself. These interactions allowed the needle's designers to focus all the energy at just the tip of the needle and to measure its effects. "We've been able to characterise the vibrations at the end of the needle really well. We've used high-speed cameras that have allowed us to study the physical effects of the vibrating needle on boundaries between fluids, solids and air in unprecedented detail," says Nieminen. "The rich understanding we've managed to get of the physics allowed us to design the medical device and understand how it could be used for different medical purposes."

Medical trials getting underway

The needle is expected to move into studies with real cancer patients soon, although for the time being only four-legged ones. A specialist veterinary hospital in Canada is expected to begin trialing the device on domestic pets with cancer, and if all goes as expected, the team hopes that their needles will be used in human patients soon after.

"Modern oncology doesn't just take a biopsy at the beginning of treatment", explains Nieminen. "Increasingly, oncologists want to be able to take multiple biopsies to track how the tumors are changing and responding over the course of the treatment. We want the tools for these biopsies to be as effective and painless as possible."

While the team is preparing the needles for real-world biopsies, they are also excited about future applications that they are still researching. "The effect that ultrasonic vibrations have on tissue might also work the other way," explains Perra. "The vibrations might make it easier to deliver pharmaceuticals in a targeted way to tissue like the liver. They might also be able to break up small hard objects in soft tissue, like kidney stones, or even small tumours - all minimally invasively." By combining experts in acoustics physics with experts in medical technology, the team hopes that many more innovations will arise from their 21st century upgrade of the humble medical needle.

Credit: 
Aalto University

Humungous flare from sun's nearest neighbor breaks records

image: Artist's conception of the violent stellar flare from Proxima Centauri discovered by scientists in 2019 using nine telescopes across the electromagnetic spectrum, including the Atacama Large Millimeter/submillimeter Array (ALMA). Powerful flares eject from Proxima Centauri with regularity, impacting the star's planets almost daily.

Image: 
NRAO/S. Dagnello

Scientists have spotted the largest flare ever recorded from the sun's nearest neighbor, the star Proxima Centauri.

The research, which appears today in The Astrophysical Journal Letters, was led by the University of Colorado Boulder and could help to shape the hunt for life beyond Earth's solar system.

CU Boulder astrophysicist Meredith MacGregor explained that Proxima Centauri is a small but mighty star. It sits just four light-years, or more than 20 trillion miles, from our own sun and hosts at least two planets, one of which may look something like Earth. It's also a "red dwarf," the name for a class of stars that are unusually petite and dim.

Proxima Centauri has roughly one-eighth the mass of our own sun. But don't let that fool you.

In their new study, MacGregor and her colleagues observed Proxima Centauri for 40 hours using nine telescopes on the ground and in space. In the process, they got a surprise: Proxima Centauri ejected a flare, or a burst of radiation that begins near the surface of a star, that ranks as one of the most violent seen anywhere in the galaxy.

"The star went from normal to 14,000 times brighter when seen in ultraviolet wavelengths over the span of a few seconds," said MacGregor, an assistant professor at the Center for Astrophysics and Space Astronomy (CASA) and Department of Astrophysical and Planetary Sciences (APS) at CU Boulder.

The team's findings hint at new physics that could change the way scientists think about stellar flares. They also don't bode well for any squishy organism brave enough to live near the volatile star.

"If there was life on the planet nearest to Proxima Centauri, it would have to look very different than anything on Earth," MacGregor said. "A human being on this planet would have a bad time."

Active stars

The star has long been a target for scientists hoping to find life beyond Earth's solar system. Proxima Centauri is nearby, for a start. It also hosts one planet, designated Proxima Centauri b, that resides in what researchers call the "habitable zone"--a region around a star that has the right range of temperatures for harboring liquid water on the surface of a planet.

But there's a twist, MacGregor said: Red dwarfs, which rank as the most common stars in the galaxy, are also unusually lively.

"A lot of the exoplanets that we've found so far are around these types of stars," she said. "But the catch is that they're way more active than our sun. They flare much more frequently and intensely."

To see just how much Proxima Centauri flares, she and her colleagues pulled off what approaches a coup in the field of astrophysics: They pointed nine different instruments at the star for 40 hours over the course of several months in 2019. Those eyes included the Hubble Space Telescope, the Atacama Large Millimeter Array (ALMA) and NASA's Transiting Exoplanet Survey Satellite (TESS). Five of them recorded the massive flare from Proxima Centauri, capturing the event as it produced a wide spectrum of radiation.

"It's the first time we've ever had this kind of multi-wavelength coverage of a stellar flare," MacGregor said. "Usually, you're lucky if you can get two instruments."

Crispy planet

The technique delivered one of the most in-depth anatomies of a flare from any star in the galaxy.

The event in question was observed on May 1, 2019, and lasted just 7 seconds. While it didn't produce a lot of visible light, it generated a huge surge in both ultraviolet and radio, or "millimeter," radiation.

"In the past, we didn't know that stars could flare in the millimeter range, so this is the first time we have gone looking for millimeter flares," MacGregor said.

Those millimeter signals, MacGregor added, could help researchers gather more information about how stars generate flares. Currently, scientists suspect that these bursts of energy occur when magnetic fields near a star's surface twist and snap with explosive consequences.

In all, the observed flare was roughly 100 times more powerful than any similar flare seen from Earth's sun. Over time, such energy can strip away a planet's atmosphere and even expose life forms to deadly radiation.

That type of flare may not be a rare occurrence on Proxima Centauri. In addition to the big boom in May 2019, the researchers recorded many other flares during the 40 hours they spent watching the star.

"Proxima Centauri's planets are getting hit by something like this not once in a century, but at least once a day if not several times a day," MacGregor said.

The findings suggest that there may be more surprises in store from the sun's closest companion.

"There will probably be even more weird types of flares that demonstrate different types of physics that we haven't thought about before," MacGregor said.

Credit: 
University of Colorado at Boulder

Insurance isn't enough for women at high risk of breast cancer

Women at high risk of breast cancer face cost-associated barriers to care even when they have health insurance, a new study has found.

The findings suggest the need for more transparency in pricing of health care and policies to eliminate financial obstacles to catching cancer early.

The study led by researchers at The Ohio State University included in-depth interviews with 50 women - 30 white, 20 Black - deemed at high risk of breast cancer based on family history and other factors. It appears in the Journal of Genetic Counseling.

The researchers considered it a given that women without any insurance would face serious barriers to preventive care including genetic counseling and testing, prophylactic mastectomy and advanced breast imaging.

But they wanted to better understand the nuances - how finances played into decision-making in other ways and for women who had insurance.

"Financial barriers seem to regularly impede access to critical information that high-risk women can only get through genetic counseling and testing, and keep them from using regular screenings that could catch cancers in the earliest and most treatable stages," said co-lead author Tasleem Padamsee, assistant professor of health services management and policy at Ohio State.

"For women at the highest levels of risk, financial impediments can also put the most effective preventive surgeries and medications entirely out of reach," said Padamsee, who is also part of Ohio State's Comprehensive Cancer Center.

The study provided several new insights about barriers to care, including:

- Financial constraints affect the health care and prevention choices of more than just low-income or uninsured women. Across the financial spectrum, women reported worrying about the financial impacts of prevention choices and avoiding steps they can't afford or don't know whether they can afford.

- When women decide whether they can afford a procedure or test, they aren't just considering the expense of that specific care - they are balancing these costs against other financial demands they face, from medical debt to child care to other illnesses they may be paying to treat for themselves or a family member. Competing demands play a unique role in cancer prevention care, the authors said.

- Financial considerations are influenced by more than the financial realities of women's lives. They are also shaped by broader social and political issues such as lack of price transparency on the part of insurance companies, which often leaves women having to guess which services are covered and which are not.

"Underinsurance was a really big factor - even for those women who have private insurance, they come across a lot of hurdles with requesting coverage for genetic testing, counseling, risk-reducing surgeries and enhanced breast screening," said study co-lead author Rachel J. Meadows, who worked on the research as a doctoral student at Ohio State's College of Public Health.

"These women are managing other priorities, including weighing paying for care for chronic diseases they currently have against managing a future risk. And they have other financial demands, including raising children and supporting other family members," said Meadows, who currently works at the Center for Outcomes Research at the JPS Health Network in Fort Worth, Texas.

Many high-risk women also worry about the risk of future discrimination if they have genetic testing, she said, although current law prevents genetic discrimination.

Often, studies simply look at the association of income and insurance status with use of health care services, but this work's detailed conversations with women can help advocates, providers and others better understand the subtleties of decision-making, the researchers said.

"All this information is critical to our ability to improve care. Knowing that a wide range of high-risk women are affected by financial constraints suggests that they might be better served by providers who are trained and ready to share information about insurance coverage, costs and financial assistance programs alongside information about potentially helpful tests and procedures," Padamsee said.

The findings from the study also suggest a need for regulatory changes such as long-term guarantees against genetic discrimination and stronger requirements that insurance companies disclose their full benefits and co-pays in more transparent and comprehensible ways, she said.

"These changes could improve women's ability to access high-risk care, reduce the number and severity of future cancers, and avoid future cancer treatment costs for both patients and payers."

Another new study involving the same group of women found that 45% of participants - and only 21% of Black participants - were aware of their options for taking medications to reduce their risk of developing breast cancer. Women were more likely to have heard of these drugs, usually tamoxifen or raloxifene, if they had access to care from a specialist. The study appears in the journal BMC Women's Health.

"Lack of chemoprevention awareness is a critical gap in women's ability to make health-protective choices," Padamsee said.

Credit: 
Ohio State University

Verbal fluency deficits in multiple sclerosis may reflect impaired language ability

image: Dr. Strober, senior research scientist in the Center for Neuropsychology and Neuroscience Research, focuses on cognitive effects of multiple sclerosis, and its impact on quality of life.

Image: 
Kessler Foundation/Jody Banks

East Hanover, NJ. April 21, 2021. Kessler Foundation researchers showed that people with multiple sclerosis (MS) experience subtle language impairments that standard neuropsychological tests may incorrectly attribute to impaired executive functions. The article, "The role of language ability in verbal fluency of individuals with multiple sclerosis" (doi: 10.1016/j.msard.2021.102846) was published on February 16, 2021, in Multiple Sclerosis and Related Disorders.

The authors are Nancy D. Chiaravalloti, PhD, director of the Centers for Neuropsychology, Neuroscience, and Traumatic Brain Injury Research at Kessler Foundation; Lauren B. Strober, PhD, senior research scientist at the Center for Neuropsychology and Neuroscience Research; and Amy L. Lebkuecher, MS, of Pennsylvania State University, formerly of Kessler Foundation. Drs. Chiaravalloti and Strober also have research faculty appointments at Rutgers New Jersey Medical School.

Assessing language ability in people with MS is a complicated endeavor, given the vast spectrum of individual clinical experiences within this population. Yet the ability to identify any form of language impairment, not just severe language disorder, is essential to fully understanding a patient's cognitive profile and providing optimal interventions.

While some early research suggested that language ability is largely intact in people with MS, newer studies indicate that milder language impairments may exist but are too subtle to be quantified by standard neuropsychological tests. As a result, verbal fluency deficits observed in MS are often attributed to impaired processing speed and executive functions rather than language ability. Because individuals with MS have been presumed to have intact language ability, more comprehensive tests are rarely performed, according to lead author Dr. Lebkuecher.

In this study, the Kessler research team challenged the assumption that impaired verbal fluency of individuals with MS solely reflects executive dysfunction. The team analyzed pre-existing data from 74 individuals with MS to evaluate the contribution of various cognitive factors to verbal fluency, including language ability, oral-motor speed, processing speed, and executive functions. They conducted linear multiple regression analyses with letter and category verbal fluency--which relate to a person's ability to produce words starting with a given letter or within a semantic category--as outcome variables.
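For readers who want to see the shape of such an analysis, here is a minimal sketch with synthetic data standing in for the study's dataset of 74 individuals; the coefficients are invented to loosely echo the reported pattern, not taken from the study.

```python
# Hedged sketch: multiple linear regression predicting letter fluency from
# cognitive predictors. Data and effect sizes are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 74
vocabulary = rng.normal(size=n)        # proxy for language ability
processing_speed = rng.normal(size=n)
oral_motor_speed = rng.normal(size=n)
executive_function = rng.normal(size=n)
letter_fluency = 0.5 * vocabulary + 0.3 * processing_speed + rng.normal(scale=0.8, size=n)

X = sm.add_constant(np.column_stack(
    [vocabulary, processing_speed, oral_motor_speed, executive_function]))
print(sm.OLS(letter_fluency, X).fit().summary())
```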

The results showed that vocabulary and processing speed predicted letter fluency, while only vocabulary predicted category fluency. Although further research is needed to better understand the relationship between verbal fluency and vocabulary and processing speed, the results suggest the observed verbal fluency deficits in MS may reflect both impaired language ability and processing speed.

"Our results indicate that language ability and domain-general factors were predictive of verbal fluency in individuals with MS," summarized Dr. Chiaravalloti. "Specifically, language ability played a significant role. This could indicate that verbal fluency deficits in MS reflect underlying language impairment," she added, "Our findings further demonstrate the need for more comprehensive examination of language in people with MS."

Credit: 
Kessler Foundation

Record-breaking flare from Sun's nearest neighbor

image: An artist's impression of a flare from Proxima Centauri, modeled after the loops of glowing hot gas seen in the largest solar flares. An artist's impression of the exoplanet Proxima b is shown in the foreground.

Image: 
Roberto Molar Candanosa / Carnegie Institution for Science, NASA/SDO, NASA/JPL.

Washington, DC-- A team of astronomers including Carnegie's Alycia Weinberger and former Carnegie postdoc Meredith MacGregor, now an assistant professor at the University of Colorado Boulder, spotted an extreme outburst, or flare, from the Sun's nearest neighbor--the star Proxima Centauri.

Their work, which could help guide the search for life beyond our Solar System, is published in The Astrophysical Journal Letters.

Proxima Centauri is a "red dwarf" with about one-eighth the mass of our Sun, which sits just four light-years, or almost 25 trillion miles, from the center of our Solar System and hosts at least two planets, one of which may look something like Earth.

In a worldwide campaign carried out over several months, the researchers observed Proxima Centauri using nine ground- and space-based telescopes. They caught the extreme flare on May 1, 2019, with five telescopes that traced its timing and energy in unprecedented detail.

"The star went from normal to 14,000 times brighter when seen in ultraviolet wavelengths over the span of a few seconds," said MacGregor.

Stellar flares happen when a shift in the star's magnetic field accelerates electrons to speeds approaching that of light. The accelerated electrons interact with the highly charged plasma that makes up most of the star, causing an eruption that produces emission across the entire electromagnetic spectrum.

"Proxima Centauri is of similar age to the Sun, so it's been blasting its planets with high energy flares for billions of years," said Weinberger. "Studying these extreme flares with multiple observatories lets us understand what its planets have endured and how they might have changed."

Like many red dwarfs--the most common stars in the galaxy and hosts to many of the thousands of known exoplanets--Proxima Centauri is very lively.

"If there was life on the planet nearest to Proxima Centauri, it would have to look very different than anything on Earth," MacGregor said. "A human being on this planet would have a bad time."

To see just how much Proxima Centauri flares, the researchers pulled off what approaches a coup in the field of astrophysics: They pointed nine different instruments at the star for 40 hours over the course of several months in 2019. Those eyes included the duPont Telescope at Carnegie's Las Campanas Observatory in Chile, the Hubble Space Telescope, the Atacama Large Millimeter Array (ALMA), and NASA's Transiting Exoplanet Survey Satellite (TESS). Five of them recorded the massive May 1 flare from Proxima Centauri, capturing the event as it produced a wide spectrum of radiation. This marked the first time astronomers have ever had this kind of multi-wavelength coverage of a stellar flare. Usually, it's considered lucky to get observations from two instruments.

"Now we know these very different observatories operating at very different wavelengths can see the same fast, energetic impulse," Weinberger said.

The technique delivered one of the most in-depth anatomies of a flare from any star in the galaxy. While the flare didn't produce much visible light, it generated a huge surge in both ultraviolet and millimeter-wavelength radio radiation. These signals could help researchers gather more information about how stars generate flares.

They also suggest that there may be more surprises in store from the Sun's "next door" neighbor.

Going forward, "there will probably be even more weird types of flares that demonstrate different types of physics that we haven't thought about before," MacGregor concluded.

Credit: 
Carnegie Institution for Science

Does listening to calming music at bedtime actually help you sleep?

A new study published in the Journal of the American Geriatrics Society has found that listening to music can help older adults sleep better.

Researchers from the National Cheng Kung University Hospital in Taiwan combined the results of past studies to understand the effect that listening to music can have on the quality of older adults' sleep. Their work suggests that:

- Older adults (ages 60 and up) living at home sleep better when they listen to music for 30 minutes to one hour at bedtime.

- Calm music improves older adults' sleep quality more than rhythmic music does.

- Older adults should listen to music for more than four weeks to see the most benefit.

Why Older Adults Have Trouble Getting a Good Night's Sleep

As we age, our sleep cycles change, making a good night's sleep harder to achieve. What does it really mean to get a good night's sleep? If you wake up rested and ready to start your day, you probably slept deeply the night before. But if you're tired during the day, need coffee to keep you going, or wake up several times during the night, you may not be getting the deep sleep you need.[1] According to the National Institute on Aging, older adults need seven to nine hours of sleep each night.[2]

But studies have shown that 40 to 70 percent of older adults have sleep problems and over 40 percent have insomnia, meaning they wake up often during the night or too early in the morning. Sleep problems can make you feel irritable and depressed, can cause memory problems, and can even lead to falls or accidents.

How the Researchers Studied the Effect of Music on Older Adults' Quality of Sleep

For their study, the researchers searched for past studies that tested the effect of listening to music on older adults with sleep problems who live at home. They looked at five studies with 288 participants. Half of these people listened to music; the other half received usual treatment or no treatment for their sleep problems. People who were treated with music listened to either calming or rhythmic music for 30 minutes to one hour, over a period ranging from two days to three months. (Calming music has a slow tempo of 60 to 80 beats per minute and a smooth melody, while rhythmic music is faster and louder.) All participants answered questions about how well they thought they were sleeping. Each participant ended up with a score between 0 and 21 for the quality of their sleep.

The researchers looked at the difference in average scores for the following comparisons (a brief illustrative sketch follows the list):

- people who listened to music compared to people who did not listen to music;

- people who listened to calm music compared to people who listened to rhythmic music;

- and people who listened to music for less than four weeks compared to people who listened to music for more than four weeks.
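To make these comparisons concrete, here is a minimal sketch of how between-group differences in average scores can be pooled across studies using inverse-variance weighting, a standard approach when combining results of this kind. Every number below is hypothetical, none comes from the five studies in the review, and the assumption that lower scores mean better sleep mirrors common 0-to-21 sleep questionnaires.

```python
# Minimal sketch of pooling between-group differences in average sleep
# scores across studies (inverse-variance weighting). All numbers are
# hypothetical; we assume lower scores indicate better sleep.
import numpy as np

# (mean_music, mean_control, pooled_sd, n_music, n_control) per study
studies = [
    (6.2, 8.9, 3.1, 30, 30),
    (7.0, 8.1, 2.8, 25, 26),
    (5.5, 8.4, 3.4, 40, 41),
]

diffs, weights = [], []
for m_music, m_ctrl, sd, n1, n2 in studies:
    diffs.append(m_music - m_ctrl)       # negative difference favours music
    var = sd**2 * (1.0 / n1 + 1.0 / n2)  # variance of the mean difference
    weights.append(1.0 / var)            # more precise studies weigh more

pooled = np.average(diffs, weights=weights)
se = (1.0 / np.sum(weights)) ** 0.5
print(f"Pooled mean difference: {pooled:.2f} (SE {se:.2f})")
```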

What the Researchers Learned

Listening to calming music at bedtime improved sleep quality in older adults, and calming music was much better at improving sleep quality than rhythmic music. The researchers said that calming music may improve sleep by slowing your heart rate and breathing and lowering your blood pressure.[3] This, in turn, helps lower your levels of stress and anxiety.

Researchers also learned that listening to music for longer than four weeks is better at improving sleep quality than listening to music for a shorter length of time.

Limits of the Study

- Researchers only looked at studies published in English and Chinese, meaning they may have missed studies in other languages on the effect of listening to music on sleep in older adults.

- Results may not apply to older adults with Alzheimer's disease or Parkinson's disease.

- In the studies researchers used, people who listened to music received more attention from researchers than did people who got standard or no treatment for their sleep problems. This means that sleep improvements in the music therapy group could be due to that extra attention.

- Since the different studies used different kinds of music, researchers could not single out which type of calming music improved sleep the most.

- All of the people in the study had similar kinds of sleep problems. This means listening to music may not help people with other kinds of sleep problems.

What this Study Means for You

If you're having trouble sleeping, listening to music can be a safe, effective, and easy way to help you fall asleep and stay asleep. It may also reduce your need for sleep medication.

Credit: 
American Geriatrics Society