Poor oral health linked to cognitive decline, perceived stress, Rutgers studies find

Oral health is an essential part of psychological well-being and overall health in older adults. Poor oral health is associated with decreased quality of life, depression, hypertension, and cognitive decline. Two Rutgers studies, co-authored by Darina Petrovsky, Bei Wu, and Weiyu Mao, and published in the Journal of the American Geriatrics Society, explored the relationship between poor oral health and cognitive decline and the effects of perceived stress and social support on dry mouth among older Chinese Americans.

Researchers interviewed more than 2,700 Chinese Americans aged 60 and older and found that nearly 50 percent of study participants reported experiencing tooth symptoms and 25.5 percent reported dry mouth. In the first study, those who reported tooth symptoms experienced declines in cognition and episodic memory, often precursors to dementia. In the second study, the researchers found that stress increased symptoms of dry mouth, leading to poorer overall oral health.

"Racial and ethnic minorities are particularly vulnerable to the negative consequences of poor oral health," said XinQi Dong, director of Rutgers University's Institute for Health, Health Care Policy and Aging Research. "Minorities have less access to preventive dental care that is further exacerbated by language barriers and low socioeconomic status. Older Chinese Americans are at particular risk for experiencing oral health symptoms due to lack of dental insurance or not visiting a dental clinic regularly."

According to Dong, the increasing oral health disease burden among older Chinese immigrants points to the need to investigate psychosocial factors, given that oral health research currently emphasizes physical diseases and health behaviors.

"Efforts must be made to increase social support to alleviate stress and the resulting dry mouth issues reported by our study participants," Dong continued. "These efforts can help preserve older adults' health and well-being and limit cognitive decline."

Key findings:

47.8 percent of older Chinese Americans reported having teeth symptoms; participants who reported teeth symptoms at baseline experienced declines in global cognition and episodic memory.

18.9 percent of older Chinese Americans reported gum symptoms.

15.6 percent of older Chinese Americans reported teeth and gum symptoms.

25.5 percent of older Chinese Americans reported dry mouth.

More perceived stress was associated with higher odds of dry mouth.

"These studies demonstrate the importance of examining immigrant oral health outcomes later in life to understand the specific type of outcomes of different cultural groups," said Dong. "The studies further serve as a call to action for policymakers to develop programs aimed at improving oral health preventative and dental care services in this high-risk population. Darina Petrovsky, first author, added, "Examining current oral health practices among older Chinese Americans is crucial for developing culturally-tailored interventions to promote oral health and ultimately mitigate cognitive decline."

"Poor oral health is a top concern among older Chinese Americans. In our study, the prevalence rate of dry mouth is followed by diabetes and heart disease. Our findings demonstrate the importance of studying the linkage between stress and dry mouth in this vulnerable population." said author Weiyu Mao, Assistant Professor, School of Social Work, University of Nevada, Reno.

"Support from family and friends could be protective against dry mouth symptoms in relation to stress; however, the potential overload of such support could be detrimental to oral health outcomes among older Chinese Americans." Mao continued. "Intervention strategies need to expand beyond the common risk factors, such as health conditions and health behaviors, and account for the psychosocial determinants, including stress and social support, to better promote oral health and reduce oral health disparities in this population."

"Our research raises critical awareness for dental and healthcare providers of the role of perceived stress in dry mouth symptoms," added Dong. "Working collaboratively, dental, and healthcare providers can better identify oral health symptoms as risk factors of cognitive decline in this fast-growing vulnerable population. The primary focus should include promoting optimal oral health and improving the quality of life."

Credit: 
Rutgers University

New insight into motor neuron death mechanisms could be a step toward ALS treatment

CORVALLIS, Ore. - Researchers at Oregon State University have made an important advance toward understanding why certain cells in the nervous system are prone to breaking down and dying, which is what happens in patients with ALS and other neurodegenerative disorders.

The study of the role that a protein known as heat shock protein 90 plays in intracellular signaling is a key step toward figuring out why some motor neurons in the spinal cord die and others do not.

Findings, which could eventually lead to therapies to counter motor neuron death, were published in Experimental Biology and Medicine.

Neurons are cells in the nervous system that carry information to muscles, glands and other nerves. Motor neurons are large neurons in the spinal cord and brain stem, with long axons extending outside the nervous system to contact muscles and control their movements via contraction.

Researchers led by Alvaro Estevez and Maria Clara Franco of the OSU College of Science have shown that a ubiquitous "protein chaperone," heat shock protein 90, is particularly sensitive to inhibition in motor neurons that depend for survival on "trophic factors" - small proteins that serve as helper molecules.

Trophic factors attach to docking sites on the surface of nerve cells, setting in motion processes that help keep a cell alive. Research in animal models has shown trophic factors may have the ability to salvage dying neurons.

"It is well known that there are some motor neuron subpopulations resistant to degeneration in ALS, and other subpopulations that are highly susceptible to degeneration," said Estevez, associate professor of biochemistry and biophysics and the corresponding author on this research. "Understanding the mechanisms involved in these different predispositions could provide new insight into how ALS progresses and open new alternatives for the development of novel treatments for the disease."

In this study, a motor-neuron-specific pool of heat shock protein 90, also known as Hsp90, repressed activation of a key cellular receptor and thus was shown to be critical to neuron survival; when Hsp90 was inhibited, motor neuron death was triggered.

The Hsp90 inhibitor used in this research was geldanamycin, an antitumor antibiotic used in chemotherapy. Findings suggest the drug may have the unintended consequence of decreasing motor neurons' trophic pathways and thus putting those nerve cells at risk.

"The inhibition of Hsp90 as a therapeutic approach may require the development of inhibitors that are more selective so the cancer cells are targeted and healthy motor neurons are not," said Franco, assistant professor of biochemistry and biophysics.

ALS, short for amyotrophic lateral sclerosis and also known as Lou Gehrig's disease, is caused by the deterioration and death of motor neurons in the spinal cord. It is progressive, debilitating and fatal.

ALS was first identified in the late 1800s and gained international recognition in 1939 when it was diagnosed in a mysteriously declining Gehrig, ending the Hall of Fame baseball career of the New York Yankees first baseman. Known as the Iron Horse for his durability - he hadn't missed a game in 15 seasons - Gehrig died two years later at age 37.

Credit: 
Oregon State University

How sepsis care program saves lives and reduces costs

MAYWOOD, IL - A sepsis care quality improvement program saves lives, shortens hospital stays and reduces healthcare costs, according to a study by researchers at Loyola Medicine and Loyola University Chicago.

The study is published in the journal Critical Care Medicine.

Loyola's sepsis care quality improvement program includes a multidisciplinary sepsis committee, an education campaign, electronic health record tools and an early warning system.

First author Majid Afshar, MD, MSCR, and colleagues examined records of 13,877 adult Loyola patients with suspected infections. Researchers compared outcomes of patients treated before and after the quality program was implemented.

Among patients treated after the quality improvement program began, the in-hospital death rate was 30 percent lower and patients were discharged from the hospital 25 percent sooner. The program also was associated with a savings of $272,645 among patients with suspected infections.

Sepsis occurs when an infection triggers an extreme response throughout the body. Sepsis can lead to septic shock, a catastrophic drop in blood pressure that can lead to respiratory or heart failure, stroke, failure of other organs and death.

Sepsis is the leading cause of in-hospital mortality and the most expensive condition treated in the United States. It costs the country more than $24 billion per year.

In 2015, the Centers for Medicare and Medicaid Services (CMS) adopted guidelines for treating sepsis and septic shock that included early resuscitation and timely administration of antibiotics. For patients with severe sepsis, the hospital should obtain blood cultures, measure the patient's blood lactate levels and administer antibiotics within three hours of diagnosis. Hospitals should follow the same steps for patients with septic shock, plus take additional steps within six hours.
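To make the bundle logic concrete, here is a minimal sketch of the kind of timing check an electronic record system could run. This is a hypothetical illustration, not Loyola's actual tooling or the CMS specification; the step names, function and timestamps are invented:

```python
from datetime import datetime, timedelta

THREE_HOURS = timedelta(hours=3)

def bundle_compliant(diagnosed_at, events):
    """Check the three-hour severe-sepsis bundle: blood cultures drawn,
    lactate measured, and antibiotics given within three hours of diagnosis."""
    required = ("blood_culture", "lactate_measured", "antibiotics_given")
    return all(
        events.get(step) is not None
        and events[step] - diagnosed_at <= THREE_HOURS
        for step in required
    )

# Example: diagnosis at 10:15; all three steps completed by 12:30 -> compliant
events = {
    "blood_culture": datetime(2019, 9, 4, 10, 40),
    "lactate_measured": datetime(2019, 9, 4, 10, 55),
    "antibiotics_given": datetime(2019, 9, 4, 12, 30),
}
print(bundle_compliant(datetime(2019, 9, 4, 10, 15), events))  # True
```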

Loyola implemented its sepsis care quality improvement program in anticipation of the CMS three-hour and six-hour guidelines. Loyola's program includes education about the CMS guidelines; a sepsis early warning system in the electronic health records; the hiring of a sepsis coordinator; real-time physician feedback; and built-in features in the electronic health record system designed to improve adherence to the guidelines.

Loyola's program "may serve as a benchmark for other institutions to improve health outcomes and provide cost-effective care in patients with suspected infection or sepsis," Dr. Afshar and colleagues concluded.

Credit: 
Loyola Medicine

Emoji buttons gauge emergency department sentiments in real time

image: This is one of the patient-facing button terminals in an Emergency Department near the exit.

Image: 
Penn Medicine

Simple button terminals -- featuring "emoji" reflecting a range of emotions and sentiments -- stationed around emergency departments (EDs) are effective in monitoring doctor and patient sentiments in real time, a Penn Medicine study found. Traditionally, surveys are mailed or sent electronically to evaluate patient experiences, but response rates can be low and those who respond do so well after the visit. Using touch terminals could help inform immediate adjustments in the ED that would serve not just patients but also the clinicians treating them. The findings of this trial were published Sept. 4 in Annals of Emergency Medicine.

"This study begins to shed light on simple, fast ways of identifying trends and providing high-level information on how patients and providers are feeling in real time," said Anish Agarwal, MD, an assistant professor of Emergency Medicine at Penn Medicine. "We wanted to see if people would even notice these buttons, which they did -- and they pushed them a lot."

The terminals Agarwal and his fellow researchers used are similar to those used to gauge visitor satisfaction in sports arenas and airports. Each terminal features four buttons, ranging from very positive (green, with a happy face) to very negative (red, with a frowning face). They were established in three different locations in an urban ED to capture the sentiments of three specific groups: doctors, nurses, and patients. As such, one was set up near physician workstations, another at nurses' workstations, and the last at the patient exit.

During the five-month study period in 2018, nearly 14,000 sentiments were recorded across the three terminals, with roughly 68 percent coming from the provider-facing terminals. The nurses' station terminal recorded the most sentiments, accounting for 53 percent of those captured for the study. Across all three terminals, an average of 108 sentiments was recorded per day, numbers the researchers found encouraging.

"This work suggests that we can collect real-time provider and patient feedback that we haven't previously been able to identify," said the study's senior author, Raina Merchant, MD, the director of the Penn Medicine Center for Digital Health and an associate professor of Emergency Medicine. "This can allow for support when things are going well and addressing challenges when they occur."

Although primarily a feasibility study, the researchers did uncover some associations between the recorded sentiments and factors such as respondent group and patient volume. For instance, high satisfaction numbers were associated with the patient exit terminal, where positive responses outnumbered negative ones each day by 25 percent, on average.

On the negative side, doctor and nurse sentiments were moderately associated with a higher number of patients waiting to be seen, and strongly correlated with an increased number of patients being boarded in the ED. This, Agarwal believes, could be tied to capacity strains that limit providers' ability to see patients.

"I think our staff are happier when things are working smoothly, times where we can quickly treat people and keep things moving," Agarwal said.

Now, with evidence that button terminals see heavy use in the ED, Agarwal hopes to dig deeper into the findings in future work, to determine why sentiment shifts happen and how best to respond to them quickly.

"Frictionless feedback is key in helping drive quality improvement in healthcare, and we need to explore more ways in which we can expand it," Agarwal said. "I would argue that the day-to-day frustrations -- and joy -- clinicians experience likely contributes to their long-term satisfaction or burnout. So we need to rethink how we engage with providers and patients."

Credit: 
University of Pennsylvania School of Medicine

Heart failure deaths are highest in the poorest US counties

DALLAS, Sept. 4, 2019 -- People living in counties with high rates of poverty are more likely to die from heart failure compared to people living in more affluent areas, according to new research published in Journal of the American Heart Association, the Open Access Journal of the American Heart Association.

"When you look at a map of the United States, you will see that the poorer counties have the highest death rates from heart failure," said Khansa Ahmad, M.D., the study's lead author and a preventive cardiology fellow at the Alpert Medical School of Brown University in Providence, Rhode Island. Heart failure is a chronic, progressive condition in which the heart muscle is unable to pump enough blood to meet the body's needs for blood and oxygen.

To explore geographical differences, the researchers reviewed data from 3,000 United States counties, looking for potential links between heart failure deaths and county-level poverty, education, unemployment and health insurance status. Data were pulled from Centers for Disease Control and Prevention and Census Bureau databases. They found:

County-level poverty had the strongest correlation with heart failure deaths.

Heart failure deaths increased by about five deaths per 100,000 for each 1% increase in county poverty status (see the regression sketch after this list).

A difference of approximately 250 deaths per 100,000 between the poorest and the most affluent counties was observed.

About two-thirds of the relationship between county poverty and heart failure deaths was explained by the prevalence of diabetes and obesity across the counties.

The link between poverty and heart failure deaths was the strongest in counties in the southern census region.
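The dose-response finding above amounts to a county-level regression slope, and it squares with the reported spread: roughly five extra deaths per 100,000 per percentage point of poverty, applied across an approximately 50-point gap between the poorest and most affluent counties, yields the observed difference of about 250 deaths per 100,000. A minimal sketch of such a regression on synthetic data (the study's actual model and data are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
poverty = rng.uniform(5, 55, size=3000)               # county poverty rate, %
deaths = 100 + 5 * poverty + rng.normal(0, 20, 3000)  # HF deaths per 100,000

# Ordinary least squares fit: deaths ~ poverty
slope, intercept = np.polyfit(poverty, deaths, 1)
print(f"{slope:.1f} extra deaths per 100,000 per 1-point rise in poverty")
```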

"Interventions to help people in poor areas, where obesity is more common, to attain and maintain a healthy body weight should be investigated in developing policies to improve heart failure outcomes across the US counties," said Wen-Chih Wu, M.D., study co-author and chief of cardiology and research health science at the Providence VA Medical Center and associate professor of medicine at Brown University.

"This study underscores the disparities in healthcare faced by many Americans. As healthcare providers, we need to understand the barriers to a healthy lifestyle faced by patients, such as living in areas with no access to healthy food or safe places to walk. Understanding these barriers and helping our patients overcome them is the first step towards building trust and better serving our under-resourced communities," said Jennifer Ellis, M.D., M.B.A., chief of Cardiothoracic Surgery at NYC Health + Hospitals/Bellevue in New York, New York and advisor to the American Heart Association's EmPOWERED to Serve ™ , a platform for people who are passionate about closing disparity gaps through health justice initiatives in their community.

The American Heart Association is working to break these links through community action plans that address specific issues, like obesity, food deserts and access to care in poor communities across the U.S.

Credit: 
American Heart Association

School district secessions in the South have deepened racial segregation between school systems

WASHINGTON, D.C., September 4, 2019--Since 2000, school district secessions in the South have increasingly sorted white and black students, and white and Hispanic students, into separate school systems, weakening the potential to improve school integration, according to a new study published today in AERA Open, a peer-reviewed journal of the American Educational Research Association.

The study, conducted by Kendra Taylor (Sanametrix), Erica Frankenberg (Pennsylvania State University), and Genevieve Siegel-Hawley (Virginia Commonwealth University), is the first to systematically explore whether, and to what extent, new school district boundaries segregate students and residents in those counties in the South where school district secessions have taken place.

From 2000 to 2017, 47 school districts in the United States successfully seceded from a larger school district. These secessions have occurred in 13 counties across the U.S., seven of which are in the South. During this time period, 18 new school districts formed in these seven counties in the South. The authors analyzed trends in school and residential segregation during 2000-2015 for the seven Southern counties.

In the counties studied by the authors, the proportion of school segregation due to school district boundaries has increased. That has been especially true since 2010, when three of the seven counties first experienced district secession. In other words, after school district secession, district boundaries played a larger role in school segregation at the county level.

"Our findings show that after district secessions, students are increasingly being sorted into different school districts by race," said study coauthor Erica Frankenberg, a professor of education and demography at Pennsylvania State University. "Given the relative scarcity of students crossing district lines, the implications of this trend are profound. School segregation is becoming more entrenched, with potential long-term effects for residential integration patterns as well."

The authors examined racial segregation, at the school level and residential level, in the seven Southern counties where school district secessions occurred during 2000-2015: Jefferson, Marshall, Mobile, Montgomery, and Shelby counties in Alabama; East Baton Rouge Parish, Louisiana; and Shelby County, Tennessee. They compared how much of overall school segregation was the result of segregation between districts versus segregation between the individual schools within districts.
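The article does not name the index behind this decomposition; a standard choice in this literature is Theil's information-theory index H, which splits county-level segregation additively into between-district and within-district components. A minimal sketch under that assumption, with invented enrollment counts:

```python
import numpy as np

def entropy(counts):
    """Shannon entropy (racial diversity) of an enrollment-count vector."""
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def between_district_share(districts):
    """Theil's H for a county and the share due to district boundaries.
    `districts` is a list of districts; each district is a list of schools;
    each school is an np.array of enrollment counts per racial group."""
    schools = [s for d in districts for s in d]
    county = sum(schools)
    T, E = county.sum(), entropy(county)

    # Total segregation: enrollment-weighted diversity deficit of each school
    H_total = sum(s.sum() * (E - entropy(s)) for s in schools) / (T * E)

    # Between-district component: collapse each district into a single unit
    H_between = 0.0
    for d in districts:
        dc = sum(d)  # district-wide racial composition
        H_between += dc.sum() * (E - entropy(dc))
    H_between /= T * E

    return H_total, H_between / H_total

# Toy county: a mostly white splinter district and a "left-behind" district;
# each school's counts are [white, black].
county = [
    [np.array([180, 20]), np.array([160, 40])],
    [np.array([30, 170]), np.array([50, 150])],
]
H, share = between_district_share(county)
print(f"H = {H:.2f}; {share:.0%} of segregation lies between districts")
```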

In 2000, school district boundaries accounted for, on average, 57.7 percent of multiracial school segregation, a figure that grew to 63.8 percent by 2015. In 2000, school district boundaries contributed, on average, to 59.9 percent of the school segregation for black and white students; that number increased to 70.3 percent in 2015. For Hispanic and white students, the number increased from 37.1 percent in 2000 to 65.1 percent in 2015.

School district secession in the seven Southern counties has resulted in splinter districts that typically enroll higher percentages of white students than most of the "left-behind" county districts. In turn, most "left-behind" districts had higher percentages of black and Hispanic students.

"This means that within each school district, there was less racial diversity, and therefore racial sorting between schools within one district became relatively less important to overall segregation," said Frankenberg. "Instead, racial sorting between school districts has become more important."

The authors found that in 2000, school districts were, on average, 32.9 percent less diverse for black and white students than the county they were in, but by 2015, this figure had increased to 37.7 percent. Much larger increases occurred for Hispanic and white students as well as Asian and white students. For example, in 2000, school districts were, on average, 9.2 percent less diverse for white and Hispanic students; by 2015, this figure had increased to 23.9 percent.

"The bottom line is that school segregation has remained persistently high and school boundaries are accounting for an increasing share of the existing segregation," said Frankenberg. "If this trend continues, students of color increasingly will be sorted into schools with fewer resources, segregation will become more ingrained, and all students will have fewer opportunities to experience the educational benefits of a diverse learning environment."

The authors found that secession has occurred in large Southern school systems that have substantially lower shares of white students--roughly 33 percent white on average--than are typical of Southern schools overall--43 percent--"suggesting that racial threat and competition may be at work," said Frankenberg.

"It's hard not to look at many of these instances of secession and see them as a modern-day effort by Southern whites to avoid diverse schools," said Siegel-Hawley, an associate professor of educational leadership, policy, and justice at Virginia Commonwealth University. "This is especially true given the obstacles to comprehensive cross-district integration policies."

"The short- and long-term effects of secession need to be thoroughly considered and evaluated by the public and policymakers," Frankenberg said. "Specifically, policymakers should consider whether to implement more thorough review systems that consider the potential impact on county-level segregation before secessions are allowed to occur."

The study found that, on average, the creation of new district boundaries was not associated with a rise in residential segregation, at least in the short term. However, in the three counties experiencing a long history of school district secession, the authors found that school district boundaries did contribute substantially to residential segregation of the county population, including among residents without children in public schools.

"Although the link between school and residential segregation in Southern communities impacted by secession is less clear-cut in the short term, trends in places with long-standing secession experience suggest that neighborhoods will become more divided along with their schools," said coauthor Kendra Taylor, senior research analyst at Sanametrix.

Credit: 
American Educational Research Association

Electronic glove offers 'humanlike' features for prosthetic hand users

image: An electronic glove, developed by Purdue University researchers, offers 'humanlike' features for prosthetic hand users.

Image: 
Purdue University/Chris Adam

WEST LAFAYETTE, Ind. - People with hand amputations experience difficult daily life challenges, often leading to lifelong use of prosthetic hands and services.

An electronic glove, or e-glove, developed by Purdue University researchers can be worn over a prosthetic hand to provide humanlike softness, warmth, appearance and sensory perception, such as the ability to sense pressure, temperature and hydration. The technology is published in the Aug. 30 edition of NPG Asia Materials.

While a conventional prosthetic hand helps restore mobility, the new e-glove advances the technology by offering realistic, humanlike features for daily activities and life roles, with the potential to improve users' mental health and wellbeing by helping them integrate more naturally into social contexts. A video about the technology is available at https://youtu.be/lF1VYzKagNo.

The e-glove uses thin, flexible electronic sensors and miniaturized silicon-based circuit chips on a commercially available nitrile glove. The e-glove is connected to a specially designed wristwatch, allowing for real-time display of sensory data and remote transmission to the user for further data processing.

Chi Hwan Lee, an assistant professor in Purdue's College of Engineering, in collaboration with other researchers at Purdue, the University of Georgia and the University of Texas, worked on the development of the e-glove technology.

"We developed a novel concept of the soft-packaged, sensor-instrumented e-glove built on a commercial nitrile glove, allowing it to seamlessly fit on arbitrary hand shapes," Lee said. "The e-glove is configured with a stretchable form of multimodal sensors to collect various information such as pressure, temperature, humidity and electrophysiological biosignals, while simultaneously providing realistic human hand-like softness, appearance and even warmth."

Lee and his team hope that the appearance and capabilities of the e-glove will improve the well-being of prosthetic hand users by allowing them to feel more comfortable in social contexts. The glove is available in different skin tones and has lifelike fingerprints and artificial fingernails.

"The prospective end user could be any prosthetic hand users who have felt uncomfortable wearing current prosthetic hands, especially in many social contexts," Lee said.

The fabrication process of the e-glove is cost-effective and manufacturable in high volume, making it an affordable option for users, unlike other emerging technologies that embed mind, voice and muscle control within the prosthetic at a high cost. Additionally, those emerging technologies do not provide the humanlike features that the e-glove does.

Lee and Min Ku Kim, an engineering doctoral student at Purdue and a co-author on the paper, have worked to patent the technology with the Purdue Research Foundation Office of Technology Commercialization. The team is seeking partners to collaborate in clinical trials or experts in the prosthetics field to validate the use of the e-glove and to continue optimizing the design of the glove. For more information on licensing a Purdue innovation, contact the Office of Technology Commercialization at otcip@prf.org.

"My group is devoted to developing various wearable biomedical devices, and my ultimate goal is to bring these technologies out of the lab and help many people in need. This research represents my continued efforts in this context," Lee said.

The work aligns with Purdue's Giant Leaps celebration of the university's global advancements in health as part of Purdue's 150th anniversary. That is one of the four themes of the yearlong celebration's Ideas Festival, designed to showcase Purdue as an intellectual center solving real-world issues.

Credit: 
Purdue University

Emergency department openings and closures impact resources for heart attack patients

A new study has found that hospital emergency room closures can adversely affect health outcomes for heart attack patients at neighboring hospitals that are near or at full capacity. Conversely, when a new emergency department opens, health outcomes for patients at those so-called "bystander" hospitals improve.

The national study, believed to be the first to evaluate the impact of emergency department openings and closures on bystander emergency departments, looked specifically at outcomes for heart attack patients. But researchers said the findings have implications for all patients, particularly in communities where inadequate health resources contribute to disproportionately poor health outcomes.

The study, funded by the National Heart, Lung, and Blood Institute (NHLBI), part of the National Institutes of Health, was released today in the September issue of Health Affairs.

"A hospital closure or opening impacts the quality of care that the neighboring hospital can provide to its new patient population," said Nicole Redmond, M.D., Ph.D., M.P.H., a medical officer in the Division of Cardiovascular Sciences at NHLBI. "Hospital closures stress the healthcare infrastructure, especially if the hospital is already caring for a socially and medically complex patient population and working at full capacity. As a result, such closures may inadvertently increase the health disparities that we are trying to mitigate."

Scientists used Medicare data from 2001 to 2013 to examine treatment and health outcomes for more than 1 million patients across 3,720 hospitals--including in rural areas--that had been affected by the closure or opening of an emergency department. The authors said they focused on heart attacks because of the known benefits of timely treatment.

The primary measures of health outcomes were 30-day, 90-day, and one-year mortality rates, as well as 30-day readmission rates. Researchers also examined if a patient received an angioplasty and/or stent to open a narrowed or blocked blood vessel that supplies blood to the heart--procedures that can be affected by delayed care or constrained hospital resources.

Researchers found that when the closure of an emergency department was particularly onerous--that is, it resulted in an increased travel time of 30 minutes or more to get to another hospital--health outcomes for patients in the bystander hospitals worsened. The one-year mortality rate for patients in those hospitals increased by 8% and the 30-day readmission rate increased by 6%. The likelihood of the same patients receiving the cardiac procedure declined by 4%.

On the other hand, researchers found that when an emergency department opened and reduced that driving time by at least 30 minutes, the patients in the bystander hospitals experienced a reduction in one-year mortality by 5%. Researchers also found that the likelihood of these patients receiving the cardiac procedure improved by 12%.

The findings from the study are significant and sobering, according to Renee Hsia, M.D., the lead study author, who is also an emergency physician at Zuckerberg San Francisco General Hospital and Trauma Center and a professor of emergency medicine and health policy at the University of California, San Francisco.

"We now have evidence that hospital closures affect other hospitals, and they do so in different ways," said Hsia. "Hospitals that are already crowded will likely be unable to maintain the same quality when a nearby emergency department closes."

She noted that opening hospitals, specifically in areas of high need, could be a potential way to improve outcomes.

Still, to achieve long-lasting improvements that benefit patients, Hsia said policymakers need to address some of the problems that can occur in a market-driven healthcare system.

"Patients will go to other hospitals when they experience healthcare crises," she said. "It is crucial that we provide solutions that can help equitably serve all Americans."

Credit: 
NIH/National Heart, Lung and Blood Institute

Researchers develop a tool for rapid breakdown of cellular proteins

Cellular functions depend on the proper functioning of proteins, and these functions are disturbed in disease. A core aim of cell biological research is to determine the functions of individual proteins and how their disturbance results in disease.

One way to study protein functions is to examine the effects of rapidly removing them from cells. In recent years, researchers have developed several techniques to achieve this. One of these techniques is known as AID, or auxin-inducible degron. This method utilises the signalling of a class of plant hormones known as auxins to rapidly deplete individual proteins from cells.

The research group headed by Academy Professor Elina Ikonen at the University of Helsinki increased the speed and improved the hormone-dependency of the AID technique in human cells. The researchers were able to degrade the targeted cellular proteins within minutes.

In addition, the researchers expanded the potential uses of the technique to encompass several types of proteins. The method can also be employed in the acute degradation of proteins whose long-term absence cannot be tolerated by cells.

The study was published in the distinguished journal Nature Methods.

"The technique we have developed is useful primarily in research, but thanks to advances in gene technology it also has potential for novel diagnostic and therapeutic methods," Elina Ikonen states.

Credit: 
University of Helsinki

Automated text analysis: The next frontier of marketing innovation

Researchers from the University of Pennsylvania, Northwestern University, the University of Maryland, Columbia University, and Emory University published a new article in the Journal of Marketing that provides an overview of automated textual analysis and describes how it can be harnessed to generate marketing insights.

The study, forthcoming in the January issue of the Journal of Marketing, is titled "Uniting the Tribes: Using Text for Marketing Insights" and authored by Jonah Berger, Ashlee Humphreys, Wendy Moe, Oded Netzer, and David Schweidel.

Online reviews, customer service calls, press releases, news articles, marketing communications, and other interactions create a wealth of textual data companies can analyze to optimize services and develop new products. By some estimates, 80-95% of all business data is unstructured, with most of that being text. This text has the potential to provide critical insights about its producers, including individuals' identities, their relationships, their goals, and how they display key attitudes and behaviors. This text can be aggregated to create insights about organizations and social institutions and how attitudes vary over cultural contexts, demographics, groups, and time.

Berger explains that "The digitization of information has made a wealth of textual data readily available. But by itself, all this data is just that. Data. For data to be useful, researchers have to be able to extract underlying insight--to measure, track, understand, and interpret the causes and consequences of marketplace behavior."

But how can marketers do that? The research team explains how researchers and managers can use text to better understand the individuals and organizations who produce it. The article also explores how the content of text affects various audiences. For example, whether consumers are influenced to change their behaviors, or brands to attend to issues raised by consumers, depends in large part on the content of the text. Moe adds that "Automated text analysis opens the black box of interactions, allowing researchers to directly access what is being said and how it is said in marketplace communication."

Given the volume of text data available, automated text analysis methods are critical, but they need to be handled carefully. Researchers should avoid over-fitting and weigh the importance of features in order to glean the right predictors from text. Thus, the article also provides an overview of the methodologies and metrics used in text analysis, along with a set of guidelines and procedures for marketing researchers and scholars. Understanding these methods helps us understand how text is used and processed. For example, virtual assistants are currently under scrutiny because humans listen to the audio recordings; however, this process is necessary to train the machines used for automated text analysis.
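As a concrete illustration of that advice (a minimal sketch, not the authors' method): featurize the text, fit a regularized classifier, and let held-out performance, rather than training fit, justify the predictors. The toy texts and labels below are invented:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical customer comments labeled by a business outcome (1 = churned)
churn = ["arrived late and broken", "terrible, want a refund",
         "stopped working after a week", "poor quality, very disappointed"] * 3
stay = ["quick shipping, great quality", "love it, works perfectly",
        "exceeded my expectations", "excellent support, very happy"] * 3
texts, labels = churn + stay, [1] * len(churn) + [0] * len(stay)

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2),  # word and bigram features
    LogisticRegression(C=1.0, max_iter=1000),       # L2 penalty curbs over-fitting
)

# Cross-validation scores generalization, not memorization of the training text
scores = cross_val_score(model, texts, labels, cv=3, scoring="roc_auc")
print(f"held-out AUC: {scores.mean():.2f}")
```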

The goal of this article is to further the collective understanding of text analysis and how it can be used for insights. Researchers and marketers can use this article to create frameworks, establish and communicate policies, and strengthen cross-functional collaboration with teams working on textual analytics projects.

Credit: 
American Marketing Association

New viruses discovered in endangered wild Pacific salmon populations

image: Pacific Salmon Foundation researcher dissects tissue samples from overwintering Chinook to test for the presence of infectious agents, Quatsino Sound, BC, March 2019.

Image: 
Amy Romer.

Three new viruses--including one from a group of viruses never before shown to infect fish--have been discovered in endangered Chinook and sockeye salmon populations.

While the impact of the viruses on salmon health isn't yet known, all three are related to viruses that cause serious disease in other species.

"We were surprised to find viruses which had never before been shown to infect fish," said Gideon Mordecai, researcher at UBC's department of earth, ocean and atmospheric sciences. "Although there's no risk to humans, one of the viruses is evolutionarily related to respiratory coronaviruses, and is localized to the gills. That suggests it has a similar infection strategy to its distant relatives that infect mammals."

UBC and Fisheries and Oceans Canada researchers used DNA sequencing followed by tests specific to each virus to screen more than 6,000 salmon from along the B.C. coast, including wild, hatchery and aquaculture fish.

"We found the new viruses widely distributed in dead and dying farmed salmon and in wild salmon," said UBC virologist Curtis Suttle. "It emphasizes the potential role that viral disease may play in the population dynamics of wild fish stocks, and the threat that these viruses may pose to aquaculture."

One new virus, detected more commonly in salmon hatcheries, infected more than 15 per cent of all hatchery Chinook tested.

Another new virus was detected in 20 per cent of Chinook from fish farms -- but was only found in adult or sub-adult salmon. In general, the new viruses were more commonly found in cultured fish populations than in wild ones.

"It's essential that we determine whether these viruses are important factors in the decline of Chinook and sockeye salmon stocks," said Suttle. "The research highlights the need for robust surveillance to improve our understanding of how viruses might impact the health of wild Pacific salmon populations."

Over the past 30 years, steady declines in Chinook and sockeye salmon populations have been of great concern to Indigenous peoples, commercial and recreational fishers, and the general public. While much of the focus has been on the impact of piscine orthoreovirus (PRV), the new findings highlight how little is known about other viruses endemic to salmon populations.

"Being able to screen so many fish for these viruses was an exciting breakthrough, and meant we were able to identify hotspots of infection," adds Mordecai.

"One of the viruses was relatively common in juvenile migratory salmon as they enter the ocean--a period thought to be critical to their survival into adulthood."

Credit: 
University of British Columbia

How California wildfires can impact water availability

image: Berkeley Lab researchers built a numerical model of the Cosumnes River watershed, extending from the Sierra Nevada mountains to the Central Valley, to study post-wildfire changes to the hydrologic cycle.

Image: 
Berkeley Lab

In recent years, wildfires in the western United States have occurred with increasing frequency and scale. Climate change scenarios in California predict prolonged periods of drought, with the potential for conditions even more amenable to wildfires. The Sierra Nevada Mountains provide up to 70% of the state's water resources, yet little is known about how wildfires will impact those resources in the future.

A new study by scientists at Lawrence Berkeley National Laboratory (Berkeley Lab) uses a numerical model of an important watershed in California to shed light on how wildfires can affect large-scale hydrological processes, such as stream flow, groundwater levels, and snowpack and snowmelt. The team found that post-wildfire conditions resulted in greater winter snowpack and subsequently greater summer runoff as well as increased groundwater storage.

The study, "Watershed Dynamics Following Wildfires: Nonlinear Feedbacks and Implications on Hydrologic Responses," was published recently in the journal, Hydrological Processes.

"We wanted to understand how changes at the land surface can propagate to other locations of the watershed," said the study's lead author, Fadji Maina, a postdoctoral fellow in Berkeley Lab's Earth & Environmental Sciences Area. "Previous studies have looked at individual processes. Our model ties it together and looks at the system holistically."

The researchers modeled the Cosumnes River watershed, which extends from the Sierra Nevadas, starting just southwest of Lake Tahoe, down to the Central Valley, ending just north of the Sacramento Delta. "It's pretty representative of many watersheds in the state," said Berkeley Lab researcher Erica Woodburn, co-author of the study. "We had previously constructed this model to understand how watersheds in the state might respond to climate change extremes. In this study, we used the model to numerically explore how post-wildfire land cover changes influenced water partitioning in the landscape over a range of spatial and temporal resolutions."

Using high-performance computing to simulate watershed dynamics over a period of one year, and assuming a 20% burn area based on historical occurrences, the researchers identified the regions in the watershed that were most sensitive to wildfire conditions, as well as the hydrologic processes that were most affected.

Some of the findings were counterintuitive, the researchers said. For example, evapotranspiration, or the loss of water to the atmosphere from soil, leaves, and through plants, typically decreases after wildfire. However, some regions in the Berkeley Lab model experienced an increase due to changes in surface water runoff patterns in and near burn scars.

"After a fire there are fewer trees, which leads to an expectation of less evapotranspiration," Maina said. "But in some locations we actually saw an increase. It's because the fire can change the subsurface distribution of groundwater. So there are nonlinear and propagating impacts of changing the land cover that leads to opposite trends than what you might expect from altering the land cover."

Changing the land cover leads to a change in snowpack dynamics. "That will change how much and when the snow melts and feeds the rivers," Woodburn said. "That in turn will impact groundwater. It's a cascading effect. In the model we quantify how much it moves in space and time, which is something you can only do accurately with the type of high resolution model we've constructed."

She added: "The changes to stream flow and groundwater levels following a wildfire are especially important metrics for water management stakeholders, who largely rely on this natural resource but have little way of understanding how they might be impacted given wildfires in the future. The study is really illustrative of the integrative nature of hydrologic processes across the Sierra Nevada-Central Valley interface in the state."

Berkeley Lab researchers are also studying how the 2017 Sonoma County wildfires have affected the region's water systems, including the biogeochemistry of the Russian River watershed. "Developing a predictive understanding of the influence of wildfire on both water availability and water quality is critically important for California water resiliency," said Susan Hubbard, the Associate Laboratory Director of Earth and Environmental Sciences at Berkeley Lab. "High-performance computing allows our scientists to numerically explore how complex watersheds respond to a range of future scenarios, and the associated downgradient impacts that are important for water management."

Credit: 
DOE/Lawrence Berkeley National Laboratory

New insulation technique paves the way for more powerful and smaller chips

image: Seamless filling of nanoscale trenches with a porous metal-organic framework enabled by solvent-free conversion of a dense metal oxide film.

Image: 
© Ameloot Group

Researchers at KU Leuven and imec have successfully developed a new technique to insulate microchips. The technique uses metal-organic frameworks, a new type of material containing structured nanopores. In the long term, this method can be used for the development of even smaller and more powerful chips that consume less energy. The team has received an ERC Proof of Concept grant to further their research.

Computer chips are getting increasingly smaller. That's not new: Gordon Moore, one of the founders of chip manufacturer Intel, predicted it back in 1965. Moore's law states that the number of transistors in a chip, or integrated circuit, doubles about every two years. This prediction was later adjusted to 18 months, but the principle still stands. Chips are getting smaller and their processing power is increasing. Nowadays, a chip can have over a billion transistors.
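Written as a formula (a standard textbook formulation, not taken from the article), with N0 transistors at a reference time and doubling period T:

```latex
N(t) = N_0 \cdot 2^{\,t/T}
```

At T = 2 years, a single decade therefore multiplies the transistor count by 2^5 = 32.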

But this continued reduction in size also brings with it a number of obstacles. The switches and wires are packed together so tightly that they generate more resistance. This, in turn, causes the chip to consume more energy to send signals. To have a well-functioning chip, you need an insulating substance that separates the wires from each other, and ensures that the electrical signals are not disrupted. However, that's not an easy thing to achieve at the nanoscale level.
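In circuit terms (a textbook approximation, not detail from the article), neighboring wires form parasitic capacitors, and both the signal delay and the energy spent per switching event grow with that capacitance, which in turn scales with the dielectric constant k of the insulator between the wires:

```latex
C \approx \frac{k\,\varepsilon_0 A}{d}, \qquad \tau \approx RC, \qquad E_{\text{switch}} \approx \tfrac{1}{2}\,C V^{2}
```

A porous insulator lowers the effective k, since the pores are mostly empty space with k close to 1, which is why nanoporous materials such as MOFs promise faster, more energy-efficient interconnects.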

Nanoporous crystals

A study led by KU Leuven professor Rob Ameloot (Department of Microbial and Molecular Systems) shows that a new technique might provide the solution. "We're using metal-organic frameworks (MOFs) as the insulating substance. These are materials that consist of metal ions and organic molecules. Together, they form a crystal that is porous yet sturdy."

For the first time, a research team at KU Leuven and imec managed to apply the MOF insulation to electronic material. An industrial method called chemical vapour deposition was used for this, says postdoctoral researcher Mikhail Krishtab (Department of Microbial and Molecular Systems). "First, we place an oxide film on the surface. Then, we let it react with vapour of the organic material. This reaction causes the material to expand, forming the nanoporous crystals."

"The main advantage of this method is that it's bottom-up," says Krishtab. "We first deposit an oxide film, which then swells up to a very porous MOF material. You can compare it to a soufflé; that puffs up in the oven and becomes very light. The MOF material forms a porous structure that fills all the gaps between the conductors. That's how we know the insulation is complete and homogeneous. With other, top-down methods, there's always still the risk of small gaps in the insulation."

Powerful and energy efficient

Professor Ameloot's research group has received an ERC Proof of Concept grant to further develop the technique, in collaboration with Silvia Armini from imec's team working on advanced dielectric materials for nanochips. "At imec, we have the expertise to develop wafer-based solutions, scaling technologies from lab to fab and paving the way to realising a manufacturable solution for the microelectronics industry."

"We've shown that the MOF material has the right properties," Ameloot continues. "Now, we just have to refine the finishing. The surface of the crystals is still irregular at the moment. We have to smoothen this to integrate the material in a chip."

Once the technique has been perfected, it can be used to create powerful, small chips that consume less energy. Ameloot: "Various AI applications require a lot of processing power. Think of self-driving cars and smart cities. Technology companies are constantly looking for new solutions that are both quick and energy efficient. Our research can be a valuable contribution to a new generation of chips."

Credit: 
KU Leuven

Brown trout genome will help explain species' genetic superpowers

Better conservation and management of fish stocks is on the horizon, after the completion of the brown trout reference genome by scientists at the Wellcome Sanger Institute and their collaborators. The genome will help settle a longstanding debate about whether the physically-varied brown trout is actually a single species or several, and give insights into their ability to quickly adapt to multiple environments.

The newly-sequenced brown trout genome will allow scientists and conservationists to better understand the genetic roots of this highly specialised species. It will enable researchers to identify any sub-species currently classified as brown trout, facilitating conservation efforts targeted at specific populations during a period of rapid climatic change.

Brown trout (Salmo trutta) are one of the most genetically diverse vertebrates. Taxonomists once classified the species as up to 50 distinct species*. Different populations have adapted to exploit particular biological niches, with some living their whole lives within a 200 metre stretch of freshwater stream while others migrate from the stream where they were born to the open sea.

These different life strategies help to explain the genetic 'superpowers' of brown trout - in particular, the adaptation that allows them to move between marine and freshwater environments. As a result of this trait, brown trout was one of the first species to recolonise previously frozen freshwater areas from the sea at the end of the last ice age.

However, scientists believe we may have been too hasty in lumping all brown trout populations into one species - the consequence being that effective conservation and management is limited without knowing a species' precise life cycle and habits.

Professor Paolo Prodohl, of the School of Biological Sciences at Queen's University Belfast, who can now compare brown trout DNA he believes to be from distinct species against the reference genome, said: "The new brown trout reference genome is a game-changer for us - we'll finally be able to settle the debate about how many species of brown trout there are. If you think in terms of conservation, if you're managing different species as one single species, it actually undermines what you're trying to do. Because you cannot protect what you don't know exists."

The brown trout reference genome will enable scientists to sample and decode DNA from different populations and compare to the whole genome sequence, providing the data required to answer questions of sub-speciation and to learn how particular genetic variations allow certain trout to live in habitats that would be fatal to others. Pinpointing genetic variations that allow Scottish loch trout to adapt to living in relatively acidic waters, for example, may be useful in guiding conservation efforts to protect populations affected by increasing acidity in rivers and oceans as a result of global heating.

Due to the genetic complexity and diversity of the brown trout, which has 38 to 40 chromosomes and multiple copies of those chromosomes within its genome*, specimens with only one set of chromosomes were specially bred by Norway's Institute of Marine Research. Scientists at Wellcome Sanger Institute extracted DNA from these specimens and used PacBio SMRT® Sequencing technology to generate the first, high-quality brown trout reference genome.

Principal scientist Tom Hansen, of the Institute of Marine Research in Norway, who bred the fish used in the genome sequencing, said: "Given the variability of brown trout in the wild, it was important that we could create a number of genetically identical individuals to build the reference genome. Now that we have the genome, we can begin to learn more about how trout adapt to different conditions, helping the management of wild and farmed fish stocks in future."

The brown trout is one of the 25 UK species to have been sequenced as part of the Sanger Institute's 25th anniversary 25 Genomes Project. The 25 Genomes Project** includes species such as grey and red squirrels, golden eagle, blackberry and robin. The project has laid the groundwork for the ambitious Darwin Tree of Life Project***, which will sequence all 60,000 complex species in the UK.

The high-quality genomes will open doors for scientists to discover how UK species are responding to environmental pressures, and what secrets their genetics hold that enable them to flourish, or flounder.

Professor Mark Blaxter, Programme Lead for the Tree of Life programme at the Wellcome Sanger Institute, said: "It's fantastic that we can contribute the genome of such an interesting species as the brown trout to our growing store of knowledge. It will help scientists and conservationists the world over to discover the genetic secrets that make this species so unique. With every species we sequence, we're learning lessons that will help us step up to the challenge of creating genomes for all complex species in the UK."

Credit: 
Wellcome Trust Sanger Institute

MD Anderson study confirms protein as potential cause of most common type of pancreatic cancer

image: This is Ronald DePinho, M.D.

Image: 
MD Anderson Cancer Center

HOUSTON -- Researchers at The University of Texas MD Anderson Cancer Center have confirmed a protein as an oncogene responsible for the most common and lethal form of pancreatic cancer, known as pancreatic ductal adenocarcinoma (PDAC). The team's findings, which validated ubiquitin specific protease 21 (USP21) as a frequently amplified gene and a potential druggable target, appear in the Sept. 5 online issue of Genes & Development.

"The USP family is the largest group of enzymes known as cysteine proteases, which play an important role in tumor development and cancer stem cell biology," said Ronald DePinho, M.D., professor of Cancer Biology. "Genomic analysis identified frequent amplification of USP21 in PDAC. This overexpression correlated with cancer progression in PDAC patient samples, drove malignant transformation of human pancreas cells, and promoted mouse tumor growth."

The researchers also found that depleting USP21 impairs pancreatic tumor growth, an effect achieved through USP21's ability to deubiquitinate and stabilize TCF7, a transcription factor that promotes cancer cell stemness. Protein ubiquitination is one of the most common post-translational modifications and can affect protein function in several ways, including by regulating protein stability.

The findings are important, given that current therapeutic options are ineffective in PDAC. Previous genomic profiling of PDAC has provided a comprehensive atlas of recurrent genetic aberrations that promote PDAC tumorigenesis, said DePinho.

"These genetic events include known oncogenes and tumor suppressor genes, as well as numerous novel genetic aberrations," he said. "Moreover, classification of PDAC based on molecular signatures suggests the existence of distinct potential oncogenic drivers for different PDAC subtypes."

These observations prompted the team to explore newly characterized genetic alterations in PDAC with the goal of identifying and understanding new oncogenes that may expand therapeutic strategies for PDAC.

"Moreover, USP21 knockout mice are normal, suggesting that targeting USP21 may represent a cancer-specific vulnerability," said DePinho.

Credit: 
University of Texas M. D. Anderson Cancer Center