Culture

Gum disease, inflammation, hardened arteries may be linked to stroke risk

DALLAS, Feb. 12, 2020 -- Gum disease was associated with a higher rate of strokes caused by hardening of large arteries in the brain, as well as with severe artery blockages that have not yet caused symptoms, according to preliminary research to be presented at the American Stroke Association's International Stroke Conference 2020, Feb. 19-21 in Los Angeles, a premier world meeting for researchers and clinicians dedicated to the science of stroke and brain health.

Two studies raise the possibility that treating gum disease alongside other stroke risk factors might reduce the severity of artery plaque buildup and narrowing of brain blood vessels that can lead to a new or a recurrent stroke. However, these two studies could not conclusively confirm a cause-and-effect relationship between gum disease and artery blockage or stroke risk.

"Gum disease is a chronic bacterial infection that affects the soft and hard structures supporting the teeth and is associated with inflammation. Because inflammation appears to play a major role in the development and worsening of atherosclerosis, or 'hardening' of blood vessels, we investigated if gum disease is associated with blockages in brain vessels and strokes caused by atherosclerosis of the brain vessels," said Souvik Sen, M.D., M.S., M.P.H., author of both studies and professor and chair of the Department of Neurology at the University of South Carolina School of Medicine in Columbia.

Periodontal disease association with large artery atherothrombotic stroke (Oral Presentation 85)

Researchers examined 265 patients (average age of 64; 49% white; 56% male) who experienced a stroke between 2015 and 2017, analyzing whether gum disease was associated with specific types of stroke.

They found:

Large artery strokes due to intracranial atherosclerosis were twice as common in patients with gum disease as in those without gum disease;

Patients with gum disease were three times as likely to have a stroke involving blood vessels in the back of the brain, which controls vision, coordination and other vital bodily functions; and

Gum disease was more common in patients who had a stroke involving large blood vessels within the brain, yet not more common among those who had a stroke due to blockage in blood vessels outside the skull.

Role of periodontal disease on intracranial atherosclerosis (Oral Presentation 136)

In 1,145 people who had not experienced a stroke, selected from the Dental Atherosclerosis Risk in Communities (DARIC) Study, researchers used two magnetic resonance images to measure blockages in arteries inside the brain. Participants were an average age of 76; 78% were white, and 55% were female. Periodontal examinations were used to classify the presence and severity of gum disease.

Researchers found:

Arteries in the brain were severely blocked (50% or more) in 10% of participants;

People with gingivitis, inflammation of the gums, were twice as likely to have moderately severe narrowed brain arteries from plaque buildup compared to those with no gum disease; and

After adjusting for risk factors such as age, high blood pressure and high cholesterol, people with gingivitis were 2.4 times as likely to have severely blocked brain arteries.

"It's important for clinicians to recognize that gum disease is an important source of inflammation for their patients and to work with patients to address gum disease," Sen said.

The study excluded people who had gum disease serious enough to have resulted in tooth loss.

"We are working on a current study to evaluate if treatment of gum disease can reduce its association with stroke," Sen said.

Credit: 
American Heart Association

How plants in the cabbage family look inward when sulfur is scarce

image: Researchers at Kyushu University's Department of Bioscience and Biotechnology found that disrupting the production of two enzymes in thale cress -- a relative of cabbage, broccoli, and cauliflower -- reduces the breakdown of health-beneficial glucosinolates at the expense of plant growth in sulfur-deficient environments.

Image: 
William J. Potscavage Jr., Kyushu University

New research from Kyushu University in Japan provides a better understanding of how chemicals thought to impart unique health benefits to plants in the cabbage family are broken down to promote growth when sulfur is scarce. The findings could aid the future development of broccoli and cabbage that are even healthier for you.

Researchers from the Department of Bioscience and Biotechnology at Kyushu University reported that disrupting the production of two enzymes in thale cress plants--a relative of cabbage--reduced the conversion of chemicals called glucosinolates into simpler compounds and further slowed growth when the plants did not receive sufficient sulfur from their environment.

Produced by plants in the Brassicaceae family, which includes cabbage, broccoli, cauliflower, and mustard, glucosinolates are sulfur-containing compounds that give the vegetables their unique flavor and smell, and some studies indicate that glucosinolates may also be beneficial for preventing cancer and cardiovascular diseases.

However, the plants are known to break down glucosinolates in environments deficient in sulfur, an essential nutrient for plant growth. While this mechanism appears to be a strategy for sustaining growth under such unfavorable conditions, current knowledge of how the process occurs and contributes to adaptation to sulfur deficiency is still limited.

A group of researchers led by Akiko Maruyama-Nakashita has now published in Plant and Cell Physiology a deeper understanding of this mechanism through the study of genetically modified model plants.

"While we had previous evidence suggesting two particular enzymes may be key based on their increased presence when sulfur is deficient, our new results show that removing these enzymes through genetic modification dramatically disrupts this breakdown," says Maruyama-Nakashita.

Maruyama-Nakashita and her group studied thale cress plants--a member of the Brassicaceae family and the first plant to have its genome completely sequenced--modified through the insertion of DNA from bacteria to prevent one of two enzymes from being produced. By cross-fertilizing these plants, obtained from the Arabidopsis Biological Resource Center, the researchers created plants lacking both enzymes, called BGLU28 and BGLU30.

While all of the plants had similar glucosinolate levels in sulfur-sufficient conditions, levels were significantly higher in plants missing both enzymes than in unmodified plants and plants missing only one enzyme when grown in sulfur-deficient conditions.

Furthermore, growth was dramatically stunted in the plants missing both enzymes relative to the other plants when sulfur was scarce, indicating that the breakdown of glucosinolates contributes greatly to sustaining plant growth in sulfur-deficient environments. Thus, one role of glucosinolates in these plants may be to serve as a store of sulfur that can be released when needed.

"The knowledge obtained here deepens our understanding of plant adaptation strategies to sulfur deficient environments, and thus provides implications for promoting effective sulfur utilization in modern agriculture," comments Liu Zhang, the first author on the paper reporting the results.

"We hope that characterization of key enzymes that regulate glucosinolate breakdown will shed light on designing strategies to improve the content of these functional compounds in Brassica crops," she adds.

Credit: 
Kyushu University

Researchers describe new condition involving numerous GI polyps in cancer survivors

BOSTON - In a paper published online today, Dana-Farber Cancer Institute researchers provide new details about a recently discovered condition in which childhood cancer survivors develop numerous colorectal growths called polyps despite not having a hereditary susceptibility to the condition.

The condition, known as therapy-associated polyposis, or TAP, was first described by Dana-Farber scientists in 2014 based on a group of five patients. The new study presents a deeper look at the condition, based on data from 34 patients at eight cancer centers across the U.S.

The development of colorectal polyps - abnormal bump-like growths of tissue - in any individual is a risk factor for colorectal cancer. Polyposis, a condition in which many polyps grow in the intestinal tract, often signals a predisposition to colorectal cancer and other malignancies. People diagnosed with polyposis, as well as their relatives, are typically advised to undergo increased screening and other invasive procedures to detect the abnormal growths at the earliest stage possible. By knowing the specific signs of TAP - and knowing that it isn't part of a familial syndrome - physicians can spare family members unnecessary screenings and ensure patients receive proper treatment.

"Survivors of cancer in childhood or young adulthood have an increased risk for a variety of cancers and noncancerous conditions, including colorectal cancers and polyps, in the years after treatment," says Leah Biller, MD, physician-researcher at Dana-Farber Cancer Institute and Brigham and Women's Hospital (BWH), first author of the study, published online today by the journal Cancer Prevention Research. "When they develop polyposis, we often are concerned about a hereditary cause and recommend testing to see if they have an inherited link to the condition. TAP patients, however, develop polyposis without a known hereditary susceptibility. This suggested that while their condition mimicked the symptoms of hereditary polyposis syndromes, it was a separate phenomenon."

To pin down the characteristics of TAP - and potentially distinguish it from hereditary polyposis - researchers needed data from numerous patients with the condition. Aided by colleagues at Dana-Farber, Brigham and Women's and seven other treatment centers across the U.S., they gathered data from 34 patients with TAP who didn't have a hereditary or known genetic link to the condition but had been treated with chemotherapy and/or radiation therapy for childhood cancers.

The patients with TAP developed polyposis a median of 27 years after their cancer treatment. Investigators also found that 35% of the patients had more than 50 colorectal polyps, and 94% had multiple types of polyps, including adenomas, serrated polyps, hyperplastic polyps, and hamartomas. This contrasts with other hereditary polyposis syndromes in which all the polyps are generally of the same type.

Investigators also found that 74% of the patients had experienced other complications associated with cancer treatment: 50% had been diagnosed with cancerous conditions outside the colon; and 47% had been diagnosed with non-cancerous conditions indicative of prior cancer treatment. These findings suggest that people who develop TAP may be especially susceptible to treatment-related conditions in general, the study authors say.

"TAP appears to be an acquired condition that imitates various familial colorectal cancer syndromes but is biologically distinct from them," says Matthew Yurgelun, MD, an oncologist at Dana-Farber specializing in gastrointestinal cancer and Director of the Lynch Syndrome Center at Dana-Farber. "The fact that it takes different forms and involves different types of polyps suggests that there may be multiple biological pathways involved in its development. We're working to better understand these pathways in order to improve treatment of it and other treatment-related conditions."

Credit: 
Dana-Farber Cancer Institute

UTSA examines reporters' portrayal of US border under Trump

(San Antonio -- February 12, 2020) Social scientists analyzed journalistic stories over three years, in the run-up to and during the Trump campaign. The researchers found that the long-held implicit beliefs that tend to shape American thought about others, sovereignty and immigration seeped into the national news narratives that reporters reproduced.

According to UTSA's researchers, it wasn't until much later into the Trump administration that reporters developed stories to reflect a broader view of the Texas border as more than just a nefarious zone.

"I don't think the problem is necessarily journalism as an industry but rather the assumptions that are built into how most Americans think about the U.S.-Mexico border, including journalists," said Jill Fleuriet, an associate professor in UTSA's Department of Anthropology and lead author of the study. "What we saw--whether it was MSNBC or Breitbart, outlets with different political messages--is that they said the same thing about the border: It's remote. It's far away. It's dangerous and corrupt. And that picture of the border is incomplete and simplistic."

The researchers analyzed close to 800 news articles published in 2015, 2016 and 2017, drawn from 11 national outlets including The New York Times, Fox News Network and National Public Radio. In effect, reporters circulated Trump's limited definition of the U.S. border as alarmingly porous to people and drugs, including the majority of Central American asylum seekers, and as overly susceptible to corruption. Specifically, the Texas borderlands, which span 1,240 miles yet form only about 16% of the entire U.S. land border, became the epicenter of a perceived gateway to economic and social disruption.

"Our work contributes to scholarship that connects discursive regimes and statecraft with life in borderlands to lay bare underlying social tensions and potential violence," added Fleuriet.

The south Texas borderlands, according to Fleuriet and former student and co-author Mari Castellano, fall under a bigger American idea of "the border." The authors argue that "the border" is what's known as a concept-metaphor--a cognitive and linguistic device used across cultures to reference a shared idea so people can talk about it, but it's not very good at reflecting reality. The UTSA authors argue that "the border" is an American concept-metaphor used by the Trump political platform and by national news media to communicate insecurity and lawlessness that must be controlled, rather than a reference to a geopolitical borderline with complex, nuanced communities.

"Why is it that we don't think of New York City as the border? It's because we are socialized to think of the U.S.-Mexico area as the only 'border' and as a threat. Reporters tend to walk into these stories with the same unquestioned assumptions. I would love it if those who read this article or the upcoming book walked away thinking more critically about ideas that are used to shape our beliefs about who is different and who is not. Who belongs and who doesn't. Critically questioning our assumptions about the world, especially in times of surging nationalism, is one thing that makes anthropology relevant to all of us," said Fleuriet.

"Media, place-making, and concept-metaphors: the US-Mexico border during the rise of Donald Trump" was published in the latest edition of the journal Media, Culture and Society.

Fleuriet has a forthcoming book that further examines news framing and the transformation of concept-metaphors in a process known as "bordering-debordering-rebordering." The book draws on four years of interviews, focus groups and observations in South Texas border communities, including with activists and longtime civic leaders, to reveal a broader view of the border than merely a gateway into the feared "Global South." She also includes local journalistic stories that are generally very different from the national news depictions.

Credit: 
University of Texas at San Antonio

Nutrition a key ingredient for psychological health in Canadian adults

A new study investigating factors that contribute to psychological distress in adults has found that risk of malnourishment is linked to psychological distress among Canadians aged 45 years and older.

"These findings are consistent with other research which has found links between poor quality diet, and depression, bipolar disorder, and psychological distress," says study lead Dr. Karen Davison, Health Science faculty member at Kwantlen Polytechnic University in Surrey, BC. "Collectively, they indicate that nutrition may be an important consideration in mental health care."

Adults who have insufficient appetite, face challenges in preparing food, or consume low-quality diets are considered at risk of malnourishment. Indicators of a poor diet that the study found to be associated with psychological distress included low fruit and vegetable intake and higher levels of chocolate consumption.

Given that lower grip strength is a measure of poor nutrition, the researchers also explored the relationship between grip strength and psychological health. Men with low grip strength had 57% higher odds of psychological distress.

"This finding is consistent with previous studies, which suggest that psychological problems such as depression are associated with an increased risk of frailty," says co-author Shen (Lamson) Lin, a doctoral student at the University of Toronto's Factor Inwentash Faculty of Social Work (FIFSW).

Other factors associated with psychological distress among older Canadians

In addition to nutrition indicators, other factors found to be associated with psychological distress include chronic pain, multiple physical health problems, poverty and immigrant status.

One in five older adults with three or more chronic health problems was in distress, compared to one in 17 of those without any chronic conditions. One-third of women and one-quarter of men in chronic pain were in distress.

"Distress is common among those experiencing uncontrollable and chronic pain. Furthermore, dealing with multiple physical health problems can be upsetting and can make day-to-day activities, work and socializing much more difficult," says senior author Esme Fuller-Thomson, professor at FIFSW and director of the Institute for Life Course & Aging. Fuller-Thomson is also cross-appointed to the Department of Family and Community Medicine and the Faculty of Nursing.

The prevalence of distress was highest among the poorest respondents: one in three older adults with a household income under $20,000 per year was in distress.

"It is not surprising that those in poverty were in such high levels of distress: Poverty is a chronic and debilitating stressor. It can often be challenging even to pay one's rent and put healthy food on the table. Poverty may also result in poorer housing and neighborhood quality, and greater residential turnover which are also stress-inducing," says co-author Yu Lung, a doctoral student at FIFSW.

The study also found that immigrant women living in Canada less than 20 years had a higher prevalence of distress than women who were Canadian-born residents (21% vs 14%).

"Unfortunately, this survey did not identify the reasons for the greater distress among immigrant women, but we hypothesize that it may be due to the difficulties of resettling in a new country, such as language barriers, financial strain, complications of having one's qualifications recognized, distance from family and other social support networks, and perceived discrimination," says co-author Hongmei Tong, Assistant Professor of Social Work at MacEwan University in Edmonton.

"Although immigrant men also face many of these settlement problems, they were not at elevated risk of distress compared to their Canadian-born peers," says co-author Karen Kobayashi, Professor in the Department of Sociology and a Research Affiliate at the Institute on Aging & Lifelong Health at the University of Victoria. "One idea we hope to explore in future research is whether these gender differences could be due to the fact that the husbands initiated the immigration process and the wives may have had limited or no say in the decision to leave their homeland."

The study team analyzed data from the Canadian Longitudinal Study on Aging which included 25,834 men and women aged 45-85 years. The article was published this month in the Journal of Affective Disorders.

"The team's findings suggest that policies and health care practices should aim to reduce nutrition risk, improve diet quality, address chronic pain and health problems and poverty among those experiencing poor mental health," adds Dr. Davison. "Given that mental health conditions place a large burden of disability worldwide, such program and policy changes are becoming critically important."

Credit: 
University of Toronto

The use of jargon kills people's interest in science, politics

image: You can say "laparoscopy" - or you could say "minimally invasive surgery."

Image: 
Ohio State University

COLUMBUS, Ohio - When scientists and others use their specialized jargon terms while communicating with the general public, the effects are much worse than just making what they're saying hard to understand.

In a new study, people exposed to jargon when reading about subjects like self-driving cars and surgical robots later said they were less interested in science than others who read about the same topics, but without the use of specialized terms.

They were also less likely to think they were good at science, felt less informed about science and felt less qualified to discuss science topics.

Crucially, it made no difference if the jargon terms - like "vigilance decrement" and "laparoscopy" - were defined in the text: Even when the terms were defined, readers still felt just as disengaged as readers who read jargon that wasn't explained.

The problem is that the mere presence of jargon sends a discouraging message to readers, said Hillary Shulman, lead author of the study and assistant professor of communication at The Ohio State University.

"The use of difficult, specialized words is a signal that tells people that they don't belong," Shulman said.

"You can tell them what the terms mean, but it doesn't matter. They already feel that this message isn't for them."

This new study is the latest in a series by Shulman and her colleagues that shows how complex language in politics, as well as science, can lead people to tune out.

"Politics is where I started," Shulman said.

"We have found that when you use more colloquial language when talking to people about issues like immigration policy, they report more interest in politics, more ability to understand political information and more confidence in their political opinions."

Shulman and colleagues have now studied language and public engagement involving about 20 different political and science topics, all with the same results.

"We can get citizens to engage with complex political and scientific issues if we communicate to them in language that they understand," she said.

The latest study was published online recently in the Journal of Language and Social Psychology and will appear in a future print edition.

In the study, 650 adults participated online. They read a paragraph about each of three science and technology topics: self-driving cars, surgical robots and 3D bio-printing.

About half of them read versions of the paragraphs with no jargon and half read versions with jargon.

For example, one of the sentences in the high-jargon version of the surgical robots paragraph read: "This system works because of AI integration through motion scaling and tremor reduction."

The no-jargon version of that same sentence read: "This system works because of programming that makes the robot's movements more precise and less shaky."

Half of the people who read the high-jargon versions were also offered the opportunity to see the jargon terms defined. When they held their computer mouse over an underlined jargon term, a text box appeared with the definition - the exact language used in the no-jargon version.

After reading each paragraph, participants rated how easy it was to read.

After they read all three paragraphs, participants completed a variety of measures examining issues like their interest in science and how much they thought they knew about science.

As expected, participants who read the high-jargon paragraphs thought they were more difficult to read than did those who read the no-jargon descriptions - even if they had the definitions available to them.

"What we found is that giving people definitions didn't matter at all - it had no effect on how difficult they thought the reading was," Shulman said.

Being exposed to jargon had a variety of negative effects on readers, the study showed.

"Exposure to jargon led people to report things like 'I'm not really good at science,' 'I'm not interested in learning about science,' and 'I'm not well qualified to participate in science discussions,'" Shulman said.

But people who read no-jargon versions felt more empowered.

"They were more likely to say they understood what they read because they were a science kind of person, that they liked science and considered themselves knowledgeable."

There's an even darker side, though, to how people react when they are exposed to jargon-filled explanations of science.

In an earlier study with these same participants, published in the journal Public Understanding of Science, the researchers found that reading jargon led people to not believe the science.

"When you have a difficult time processing the jargon, you start to counter-argue. You don't like what you're reading. But when it is easier to read, you are more persuaded and you're more likely to support these technologies," she said.

"You can see how important it is to communicate clearly when you're talking about complex science subjects like climate change or vaccines."

Shulman said that the use of jargon is appropriate with scientific audiences. But scientists and science communicators who want to communicate with the public need to modify their language, beginning with eliminating jargon.

"Too many people think that if they just define their terms, everything is set. But this work shows that is not the case."

Credit: 
Ohio State University

Gene associated with autism also controls growth of the embryonic brain

image: Image showing brain cells with lower levels of Foxp1 function (at left) and higher levels (right). Apical radial glia are stained in green and secondary progenitors and neurons stained in red.

Image: 
UCLA Broad Stem Cell Research Center/Cell Reports

A UCLA-led study reveals a new role for a gene that's associated with autism spectrum disorder, intellectual disability and language impairment.

The gene, Foxp1, has previously been studied for its function in the neurons of the developing brain. But the new study reveals that it's also important in a group of brain stem cells -- the precursors to mature neurons.

"This discovery really broadens the scope of where we think Foxp1 is important," said Bennett Novitch, a member of the Eli and Edythe Broad Center of Regenerative Medicine and Stem Cell Research at UCLA and the senior author of the paper. "And this gives us an expanded way of thinking about how its mutation affects patients."

Mutations in Foxp1 were first identified in patients with autism and language impairments more than a decade ago. During embryonic development, the protein plays a broad role in controlling the activity of many other genes related to blood, lung, heart, brain and spinal cord development. To study how Foxp1 mutations might cause autism, researchers have typically analyzed its role in the brain's neurons.

"Almost all of the attention has been placed on the expression of Foxp1 in neurons that are already formed," said Novitch, a UCLA professor of neurobiology who holds the Ethel Scheibel Chair in Neuroscience.

In the new study published in Cell Reports, he and his colleagues monitored levels of Foxp1 in the brains of developing mouse embryos. They found that, in normally developing animals, the gene was active far earlier than previous studies have indicated -- during the period when neural stem cells known as apical radial glia were just beginning to expand in numbers and generate a subset of brain cells found deep within the developing brain.

When mice lacked Foxp1, however, there were fewer apical radial glia at early stages of brain development, as well as fewer of the deep brain cells they normally produce. When levels of Foxp1 were above normal, the researchers observed more apical radial glia and an excess of those deep brain cells that appear early in development. In addition, continued high levels of Foxp1 at later stages of embryonic development led to unusual patterns of deep-layer neuron production by apical radial glia even after the mice were born.

"What we saw was that both too much and too little Foxp1 affects the ability of neural stem cells to replicate and form certain neurons in a specific sequence in mice," Novitch said. "And this fits with the structural and behavioral abnormalities that have been seen in human patients."

Some people, he explained, have mutations in the Foxp1 gene that blunt the activity of the Foxp1 protein, while others have mutations that change the protein's structure or make it hyperactive.

The team also found intriguing hints that Foxp1 might be important for a property specific to the developing human brain. The researchers also examined human brain tissue and discovered that Foxp1 is present not only in apical radial glia, as was seen in mice, but also in a second group of neural stem cells called basal radial glia.

Basal radial glia are abundant in the developing human brain, but absent or sparse in the brains of many other animals, including mice. However, when Novitch's team elevated Foxp1 function in the brains of mice, cells resembling basal radial glia were formed. Scientists have hypothesized that basal radial glia also are connected to the size of the human brain cortex: Their presence in large quantities in the human brain may help explain why it is disproportionately larger than those of other animals.

Novitch said that although the new research does not have any immediate implications for the treatment of autism or other diseases associated with Foxp1 mutations, it does help researchers understand the underlying causes of those disorders.

In future research, Novitch and his colleagues are planning to study what genes Foxp1 regulates in apical radial glia and basal radial glia, and what roles those genes play in the developing brain.

Credit: 
University of California - Los Angeles Health Sciences

Many teens are victims of digital dating abuse; boys get the brunt of it

image: Sameer Hinduja, Ph.D., lead author and a professor in the School of Criminology and Criminal Justice within FAU's College for Design and Social Inquiry, and co-director of the Cyberbullying Research Center.

Image: 
Alex Dolce, Florida Atlantic University

With February being Teen Dating Violence Awareness Month, new research is illuminating how this problem is manifesting online. "Digital dating abuse," as it has been termed, involves using technology to repetitively harass a romantic partner with the intent to control, coerce, intimidate, annoy or threaten them. Given that youth in relationships today are constantly in touch with each other via texting, social media and video chat, more opportunities for digital dating abuse can arise.

A researcher from Florida Atlantic University, in collaboration with the University of Wisconsin-Eau Claire, conducted a study to clarify the extent to which youth are experiencing digital forms of dating abuse, as well as to identify what factors are linked to those experiences.

Research on this phenomenon is still emerging; indeed, this study is the first to examine these behaviors with a large, nationally representative sample of 2,218 middle and high school students (12 to 17 years old) in the United States who have been in a romantic relationship.

Results of the study, published in the Journal of Interpersonal Violence, showed that more than one-quarter (28.1 percent) of teens who had been in a romantic relationship at some point in the previous year said they had been the victim of at least one form of digital dating abuse. These included: a significant other looking through the contents of their device without permission; keeping them from using their device; threatening them via text; posting something publicly online to make fun of, threaten, or embarrass them; and posting or sharing a private picture of them without permission.

In addition, more than one-third (35.9 percent) had been the victim of at least one form of traditional (offline) dating abuse (i.e., they were pushed, grabbed or shoved; hit or threatened to be hit; called names or criticized; or prevented from doing something they wanted to do).

Interestingly, males were significantly more likely to have experienced digital dating abuse (32.3 percent) than females (23.6 percent). They were also more likely to experience every type of digital dating abuse, and even more likely to experience physical aggression. No other differences emerged with respect to demographic characteristics such as sexual orientation, race and age.

"Specific to heterosexual relationships, girls may use more violence on their boyfriends to try to solve their relational problems, while boys may try to constrain their aggressive impulses when trying to negotiate discord with their girlfriends," said Sameer Hinduja, Ph.D., lead author and a professor in the School of Criminology and Criminal Justice within FAU's College for Design and Social Inquiry, and co-director of the Cyberbullying Research Center. "It's unfortunate to be thinking about dating abuse as we approach one of the most romantic days of the year, Valentine's Day. However, it is clear that digital dating abuse affects a meaningful proportion of teenagers, and we need to model and educate youth on what constitutes a healthy, stable relationship and what betrays a dysfunctional, problematic one."

The researchers also found a significant connection between digital and traditional forms of dating abuse: the vast majority of students who had been abused online had also been abused offline. Specifically, 81 percent of the students who had been the target of digital dating abuse had also been the target of traditional dating abuse.

Students victimized offline were approximately 18 times more likely to have also experienced online abuse compared to those who were not victimized offline. Similarly, most of the students who had been the victim of offline dating violence also had been the victim of online dating violence, though the proportion (63 percent) was lower.

A number of risk factors were significantly associated with digital dating abuse. Students who reported depressive symptoms were about four times as likely to have experienced digital dating abuse. Those who reported that they had sexual intercourse were 2.5 times as likely to have experienced digital dating abuse. Most notably, those students who had sent a "sext" to another person were nearly five times as likely to be the target of digital dating abuse as compared to those who had not sent a sext. Finally, those who had been the target of cyberbullying also were likely to have been the target of digital dating abuse.
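Effect sizes like "four times as likely" are typically reported as odds ratios from a logistic regression. As a minimal illustration of how such a ratio is computed (using made-up counts, not the study's data), consider a 2x2 table of exposure versus victimization:

```python
# Hypothetical 2x2 table (NOT the study's data): rows = risk factor
# (e.g., sent a sext: yes/no), columns = outcome (digital dating abuse: yes/no).
def odds_ratio(exposed_yes, exposed_no, unexposed_yes, unexposed_no):
    """Odds of the outcome among the exposed divided by odds among the unexposed."""
    odds_exposed = exposed_yes / exposed_no
    odds_unexposed = unexposed_yes / unexposed_no
    return odds_exposed / odds_unexposed

# Illustrative counts chosen so the ratio lands near the reported "five times":
print(odds_ratio(50, 50, 100, 500))  # 5.0
```

An odds ratio of 1.0 would mean the risk factor carries no association with victimization.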

"As we observe 'Teen Dating Violence Awareness Month,' we are hopeful that our research will provide more information on the context, contributing factors, and consequences of these behaviors," said Hinduja. "Gaining a deeper understanding of the emotional and psychological mind-set and the situational circumstances of current-day adolescents may significantly inform the policy and practice we need to develop to address this form and all forms of dating abuse."

Credit: 
Florida Atlantic University

Research reverses the reproductive clock in mice

Researchers have lifted fertility rates in older female mice with small doses of a metabolic compound that reverses the ageing process in eggs, offering hope for some women struggling to conceive.

The University of Queensland study found a non-invasive treatment could maintain or restore the quality and number of eggs and alleviate the biggest barrier to pregnancy for older women.

A team led by UQ's Professor Hayden Homer found the loss of egg quality through ageing was due to lower levels of a particular molecule in cells critical for generating energy.

"Quality eggs are essential for pregnancy success because they provide virtually all the building blocks required by an embryo," Professor Homer said.

"We investigated whether the reproductive ageing process could be reversed by an oral dose of a 'precursor' compound - used by cells to create the molecule."

The molecule in question is known as NAD (nicotinamide adenine dinucleotide) and the 'precursor' as NMN (nicotinamide mononucleotide).

Professor Homer said fertility in mice starts to decline from around one year of age due to defects in egg quality similar to changes observed in human eggs from older women.

"We treated the mice with low doses of NMN in their drinking water over four weeks, and we were able to dramatically restore egg quality and increase live births during a breeding trial," Professor Homer said.

Professor Homer said poor egg quality had become the single biggest challenge facing human fertility in developed countries.

"This is an increasing issue as more women are embarking on pregnancy later in life, and one in four Australian women who undergo IVF treatment are aged 40 or older," he said.

"IVF cannot improve egg quality, so the only alternative for older women at present is to use eggs donated by younger women.

"Our findings suggest there is an opportunity to restore egg quality and in turn female reproductive function using oral administration of NAD-boosting agents - which would be far less invasive than IVF. It is important to stress, however, that although promising, the potential benefits of these agents remains to be tested in clinical trials".

Credit: 
University of Queensland

Frailty can affect how well older adults fare following emergency surgery

Frailty is the medical term for becoming weaker or experiencing lower levels of activity/energy. Becoming frail as we age increases our risk for poor health, falls, disability, and other serious concerns. This can be especially true for older people facing surgery, up to half of whom are classified as frail.

Studies show that frail people may have a higher risk of complications, longer hospital stays, and a higher risk for death within 30 days of their surgery. This is a special concern when frail older adults face emergency surgery for abdominal conditions such as bleeding ulcers and bowel perforations (the medical term for developing a hole in the wall of your intestines). This is because there is no time to help someone facing emergency surgery get stronger before their procedure.

Right now, experts have information on how well frail people do within 30 days of surgery. However, they don't yet know how well frail older adults do 30 days later and beyond. This information is important so that healthcare providers can inform patients about risks and help them set expectations for recovery after surgery.

A new study in the Journal of the American Geriatrics Society sought to gain more information about how frailty affects older adults in the months after surgery. The research team wanted to test their theory that these people would have a higher risk for death a year after surgery, have higher rates of being sent to long-term care facilities rather than to their homes, and have poorer health one year after surgery.

The research team used Medicare claims to measure frailty in patients 65 years old or older who had one of five types of emergency abdominal surgeries associated with the highest risk for death. These surgeries included emergency colon removal or surgical treatment of a bleeding stomach ulcer. The researchers assigned the patients to one of four groups: non-frail, pre-frail, mildly frail, and moderately to severely frail.

The researchers studied 468,459 older Medicare beneficiaries who underwent the surgeries. Of these patients, 37 percent were pre-frail, 12 percent were mildly frail, and 4 percent were moderately to severely frail. Patients with mild and moderate to severe frailty were older, mostly female, and white; one-fifth were admitted to the hospital from another healthcare facility.

Overall, almost 16 percent of all participants died within 30 days of surgery. Twenty-five percent died within 180 days, and 30 percent had died at one year following surgery. People with moderate to severe frailty had the highest rates of death, followed by those with mild frailty and pre-frailty, compared to non-frail patients.

The study found that frail older patients spent six to 14 fewer weeks at home after being discharged from the hospital compared to non-frail patients. The researchers also noted that frail older adults who had abdominal surgery experienced four to six times more hospital encounters (such as an emergency department visit or a hospitalization) after they were discharged from the hospital post-surgery.

According to the researchers, these findings suggest that the initial hospitalization for emergency surgery is the best time for surgeons (and non-surgeons who are part of the frail patient's care team) to discuss patients' expectations about their future following surgery. Since these patients are at high risk of death or needing future hospital care, it is important for the healthcare team to have conversations about their care preferences during hospitalization and before surgery. This can also help make sure that any post-operative treatments are in line with the patients' preferences.

Credit: 
American Geriatrics Society

Mass General Hospital researchers identify new 'universal' target for antiviral treatment

BOSTON - As the coronavirus outbreak shows, viruses are a constant threat to humanity. Vaccines are regularly developed and deployed against specific viruses, but that process takes a lot of time, doesn't help everyone who needs protection, and still leaves people exposed to new outbreaks and new viruses.

Now, researchers at Massachusetts General Hospital (MGH) have uncovered a novel potential antiviral drug target that could lead to treatments protecting against a host of infectious diseases - creating a pan-viral, or universal, treatment. Their work suggests that the protein Argonaute 4 (AGO4) is an "Achilles heel" for viruses.

AGO4 is one of a family of AGO proteins. Until now, there has been little evidence of whether they have individual roles. The researchers, led by Kate L. Jeffrey, PhD, and her collaborators, found that AGO4 plays a key role in protecting cells against viral infections.

Specifically, this protein is uniquely antiviral in mammalian immune cells. The group studied the antiviral effects of several Argonaute proteins and found that only cells deficient in AGO4 were "hyper-susceptible" to viral infection. In other words, low levels of AGO4 make mammalian cells more likely to become infected.

This study was published today by Cell Reports.

The MGH researchers suggest that boosting levels of AGO4 could shore up the immune system to protect against multiple viruses. "The goal is to understand how our immune system works so we can create treatments that work against a range of viruses, rather than just vaccines against a particular one," says Jeffrey.

Mammals have four Argonaute proteins (AGO1-4), which act by silencing genes and which are remarkably conserved across many living things, including plants. They are RNA interference (RNAi) and microRNA effector proteins, and RNAi is the major antiviral defense strategy in plants and invertebrates. Studies of influenza-infected mice have shown that AGO4-deficient animals have significantly higher levels of the virus.

The next steps are to "determine how broad spectrum this is to any virus type," says Jeffrey. "Then we need to discover how to boost AGO4 to ramp up protection against viral infections."

Credit: 
Massachusetts General Hospital

Making 3-D printing smarter with machine learning

image: A screenshot from PrintFixer shows the predicted variations in a printed shape, with expanded areas highlighted in red and smaller areas marked in blue.

Image: 
Nathan Decker

3-D printing is often touted as the future of manufacturing. It allows us to directly build objects from computer-generated designs, meaning industry can manufacture customized products in-house, without outsourcing parts. But 3-D printing has a high degree of error, such as shape distortion. Each printer is different, and the printed material can shrink and expand in unexpected ways. Manufacturers often need to try many iterations of a print before they get it right.

What happens to the unusable print jobs? They must be discarded, presenting a significant environmental and financial cost to industry.

A team of researchers from USC Viterbi School of Engineering is tackling this problem, with a new set of machine learning algorithms and a software tool called PrintFixer, to improve 3-D printing accuracy by 50 percent or more, making the process vastly more economical and sustainable.

The work, recently published in IEEE Transactions on Automation Science and Engineering, describes a process called "convolution modeling of 3-D printing." It's among a series of 15 journal articles from the research team covering machine learning for 3-D printing.

The team, led by Qiang Huang, associate professor of industrial and systems engineering, chemical engineering and materials science, along with Ph.D. students Yuanxiang Wang, Nathan Decker, Mingdong Lyu, Weizhi Lin and Christopher Henson, has so far received $1.4 million in funding support, including a recent $350,000 NSF grant. Their objective is to develop an AI model that accurately predicts shape deviations for all types of 3-D printing and makes 3-D printing smarter.

"What we have demonstrated so far is that in printed examples the accuracy can improve around 50 percent or more," Huang said. "In cases where we are producing a 3-D object similar to the training cases, overall accuracy improvement can be as high as 90 percent."

"It can actually take industry eight iterative builds to get one part correct, for various reasons," Huang said, "and this is for metal, so it's very expensive."

Every 3-D printed object results in some slight deviation from the design, whether this is due to printed material expanding or contracting when printed, or due to the way the printer behaves.

PrintFixer uses data gleaned from past 3-D printing jobs to train its AI to predict where the shape distortion will happen, in order to fix print errors before they occur.
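The paper's actual "convolution modeling" approach is more sophisticated, but the predict-then-compensate idea can be sketched with a simple stand-in model (illustrative only; the toy data and function names below are hypothetical, not PrintFixer's API): fit the deviation between nominal and measured geometry from past prints, then pre-warp new designs in the opposite direction.

```python
# Illustrative sketch only -- NOT the PrintFixer algorithm. Idea: learn
# deviation = f(nominal geometry) from measured past prints, then pre-compensate
# new designs by subtracting the predicted deviation.
import numpy as np

def fit_radial_deviation(nominal_radii, measured_radii, degree=2):
    """Fit a polynomial mapping nominal radius -> printed deviation."""
    deviations = np.asarray(measured_radii) - np.asarray(nominal_radii)
    return np.polyfit(nominal_radii, deviations, degree)

def compensate(nominal_radii, coeffs):
    """First-order correction: shrink/grow the design against the predicted error."""
    predicted_dev = np.polyval(coeffs, nominal_radii)
    return np.asarray(nominal_radii) - predicted_dev

# Toy training data: a printer that prints everything 2% too large.
nominal = np.array([5.0, 10.0, 20.0, 40.0])
measured = nominal * 1.02
coeffs = fit_radial_deviation(nominal, measured)
print(compensate(nominal, coeffs))  # each radius reduced by ~2%
```

A real system would learn from full 3-D meshes rather than a single radius, but the train-on-past-prints, correct-before-printing loop is the same.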

Huang said that the research team had aimed to create a model that produced accurate results using the minimum amount of 3-D printing source data.

"From just five to eight selected objects, we can learn a lot of useful information," Huang said. "We can leverage small amounts of data to make predictions for a wide range of objects."

The team has trained the model to work with the same accuracy across a variety of applications and materials - from metals for aerospace manufacturing, to thermal plastics for commercial use. The researchers are also working with a dental clinic in Australia on the 3-D printing of dental models.

"So just like a when a human learns to play baseball, you'll learn softball or some other related sport much quicker," said Decker, who leads the software development effort development in Huang's group. "In that same way, our AI can learn much faster when it has seen it a few times."

"So you can look at it," said Decker, "and see where there are going to be areas that are greater than your tolerances, and whether you want to print it."

He said that users could opt to print with a different, higher-quality printer and use the software to predict whether that would provide a better result.

"But if you don't want to change the printer, we also have incorporated functionality into the software package allowing the user to compensate for the errors and change the object's shape - to take the parts that are too small and increase their size, while decreasing the parts that are too big," Decker said. "And then, when they print, they should print with the correct size the first time."

The team's objective is for the software tool to be available to everyone, from large scale commercial manufacturers to 3-D printing hobbyists. Users from around the world will also be able to contribute to improving the software AI through sharing of print output data in a database.

"Say I'm working with a MakerBot 3-D printer using PLA (a bioplastic used in 3-D Printing), I can put that in the database, and somebody using the same model and material could take my data and learn from it," Decker said.

"Once we get a lot of people around the world using this, all of a sudden, you have a really incredible opportunity to leverage a lot of data, and that could be a really powerful thing," he said.

Credit: 
University of Southern California

Pilot program aims to improve reproducibility, utility, and ethics of biomedical research

Addressing the widespread concern over transparency and reproducibility in biomedical research, one of the largest institutions in German science has begun to provide a framework, interventions, and incentives for improving the quality and value of translational research. The program is described by its leader, Ulrich Dirnagl of Berlin Institute of Health (BIH), and colleagues in a new article publishing on February 11 in the open-access journal PLOS Biology.

Despite the progress of modern biomedical science and approval of new drugs, there is a wide and growing recognition that the practice of biomedical research has significant weaknesses that lead to exorbitant waste, failed "breakthrough" treatments, and inability to replicate "landmark" findings. To address these concerns, the Berlin Institute of Health founded the QUEST Center for Transforming Biomedical Research, which developed the quality improvement program. The QUEST Center is implementing the program in BIH's two member institutions, Charité - Universitätsmedizin Berlin and the Max Delbrück Center for Molecular Medicine.

The program offers training, tools, and incentives to researchers to improve the quality of research, based on the principles of trustworthiness, usefulness, and ethics. For instance, the Center offers courses in reducing bias in research design, provides a guide to publishing negative or inconclusive findings, and offers financial rewards for making data publicly available. Program evaluations are ongoing and will be published in the near future.

"Conceptually, we are conducting and evaluating a large-scale behavior change intervention," said Dirnagl. "While such changes at a single institution can have little effect by themselves, we hope this program can provide a model for widespread adoption by other research institutions globally."

Credit: 
PLOS

More than just a carnival trick: Researchers can guess your age based on your microbes

image: The makeup of microbial communities residing on your skin and elsewhere, your 'microbiota age,' can now be correlated with your biological age.

Image: 
UC San Diego Health Sciences

Our microbiomes -- the complex communities of microbes that live in, on and around us -- are influenced by our diets, habits, environments and genes, and are known to change with age. In turn, the makeup of our microbiomes, particularly in the gut, is well-recognized for its influence on our health. For example, gut microbiome composition has been linked to inflammatory bowel disease, autoimmune disease, obesity, even neurological disorders, such as autism.

Given a microbiome sample (skin, mouth or fecal swab), researchers have demonstrated they can now use machine learning to predict a person's chronological age, with varying degrees of accuracy. Skin samples provided the most accurate prediction, estimating age correctly to within approximately 3.8 years, compared to 4.5 years with an oral sample and 11.5 years with a fecal sample. The types of microbes living in the oral cavity or within the gut of young people (age 18 to 30 years old) tended to be more diverse and abundant than in comparative microbiomes of older adults (age 60 years and older).
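In general terms, an age-prediction pipeline of this kind regresses chronological age on a matrix of microbial feature abundances and scores itself by mean absolute error, the metric behind figures like "within approximately 3.8 years." The sketch below uses synthetic data and plain least-squares regression, not the study's actual models or datasets:

```python
# Illustrative sketch with synthetic data -- NOT the study's data or model.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_taxa = 300, 50
ages = rng.uniform(18, 90, n_samples)          # chronological ages
abundances = rng.random((n_samples, n_taxa))   # fake microbial feature table
abundances[:, :5] += ages[:, None] / 90.0      # a few taxa drift with age

# Hold out the last 100 samples, then fit ordinary least squares with intercept.
train, test = slice(0, 200), slice(200, 300)
X_train = np.c_[abundances[train], np.ones(200)]
coef, *_ = np.linalg.lstsq(X_train, ages[train], rcond=None)

X_test = np.c_[abundances[test], np.ones(100)]
mae = np.mean(np.abs(X_test @ coef - ages[test]))
print(f"mean absolute error: {mae:.1f} years")
```

The actual study used far larger cohorts and more flexible models; the point of the sketch is only the shape of the problem: features are microbe abundances, the label is age, and accuracy is reported in years.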

The predictive tool, described in a paper published February 11, 2020 by mSystems, was developed as a collaboration between researchers at University of California San Diego and IBM.

"This new ability to correlate microbes with age will help us advance future studies of the roles microbes play in the aging process and age-related diseases, and allow us to better test potential therapeutic interventions that target microbiomes," said co-senior author Zhenjiang Zech Xu, PhD, who was at the time of the study a postdoctoral researcher in the UC San Diego School of Medicine lab of co-senior author Rob Knight, PhD, professor and director of the UC San Diego Center for Microbiome Innovation.

The team's ultimate goal is to create similar machine learning models to correlate microbiome and clinical conditions, such as inflammation in autoimmune conditions. This approach could someday form the basis for a noninvasive microbiome-based test that potentially helps clinicians better diagnose or assess a person's risk for a disease.

In a 2014 study, Washington University researchers compared "microbial age" -- age as predicted by the fecal microbiome -- and actual chronological age in the context of malnourished infants during the first months of life. The researchers noted that the difference between chronological and microbial age was associated with the degree of the children's developmental maturity. In the new study, UC San Diego researchers took this idea a step further to see if this association could apply to adults, and how well it generalized to other body sites.

According to Xu, one of the most important requirements for a good statistical model is a large sample size and a representative population. To do that, the researchers mined microbiome sequencing data available from the public databases of several citizen science projects, such as the American Gut Project, in which participants mail in fecal, saliva or skin swabs, receive their personalized microbiome readouts, and contribute their anonymized data to the scientific community.

The study relied on a total of 4,434 fecal samples from the US and China, 2,550 saliva samples from the US, Canada, UK and Tanzania, and 1,975 skin samples from the US and UK. The participants whose data were used in the study ranged in age from 18 to 90 years old, with body mass indices of 18.5 to 30, did not have inflammatory bowel disease or diabetes, and had not used antibiotics for at least one month prior to sampling. The study also excluded pregnant, hospitalized, disabled or critically ill individuals.

"This was the most comprehensive investigation of microbiome and age to date," said first author Shi Huang, PhD, a postdoctoral researcher in Knight's lab and the UC San Diego Center for Microbiome Innovation.

The team found gender-specific differences in gut microbiome results, but no difference between males and females when it came to oral and skin microbiome results. Despite the diversity of microbes living on different sites across the human body, it also made no difference whether the skin samples had been collected from foreheads or hands, meaning future skin microbiome studies can boost their statistical power by combining collection sites and genders.

One potential reason the microbes living on our skin change so consistently as we age, the researchers said, is the predictable changes in skin physiology that everyone experiences, such as decreased sebum production and increased dryness.

"The accuracy of our results demonstrate the potential for applying machine learning and artificial intelligence techniques to better understand human microbiomes," said co-author Ho-Cheol Kim, PhD, program director of the Artificial Intelligence for Healthy Living Program, a collaboration between IBM Research and UC San Diego under the IBM AI Horizons Network. "Applying this technology to future microbiome studies could help unlock deeper insights into the correlation between how microbiomes influence our overall health and a wide range of diseases and disorders from neurological to cardiovascular and immune health."

According to co-author Yoshiki Vázquez-Baeza, PhD, associate director of bioinformatic integration at the UC San Diego Center for Microbiome Innovation, age prediction is a particularly attractive method for training predictive models because participants don't need to meet special criteria in order to become a sample donor, and assessing age typically does not require a visit to a hospital.

"Other studies that focus on one particular condition, such as inflammatory bowel disease, often struggle to get enough participants who meet the study criteria and who are willing to participate in order to be able to draw meaningful conclusions," Vázquez-Baeza said. "But here, the wide applicability of age-prediction allowed us to explore the limits of microbial modeling at an unprecedented scale."

"Learning how to create accurate and robust microbiome-based models will open the door to a number of biotechnological applications, and help us better understand the relationship of certain bacteria with outcomes of interest," Knight said.

Credit: 
University of California - San Diego

Combining viral genomics and public health data revealed new details about mumps outbreaks

In 2016 and 2017, a surge of mumps cases at Boston-area universities prompted researchers to study mumps virus transmission using genomic data, in collaboration with the Massachusetts Department of Public Health and local university health services. As the outbreaks unfolded, the teams analyzed mumps virus genomes collected from patients, revealing new links between cases that first appeared unrelated and other details about how the disease was spreading that weren't apparent from the epidemiological investigation.

The teams shared their sequencing data and findings in real-time during the outbreaks, with both each other and the broader scientific community, and now report their conclusions in PLOS Biology.

Analyzing viral genomes from an outbreak can show how a virus is evolving and being transmitted -- data that can help public health officials slow and stop the spread of disease.

"High-resolution genomic data about a virus, gathered from patient samples, allows us to reconstruct parts of an outbreak that aren't evident at first," said co-senior author Pardis Sabeti, institute member at the Broad Institute, professor at Harvard University, and Howard Hughes Medical Institute investigator. "The better we understand transmission chains in situations like this, the better we can inform efforts to control outbreaks and devise strategies to predict and stop them in the future."

In Massachusetts, the typical rate of mumps is less than 10 cases per year -- but more than 250 cases were reported in 2016 and more than 170 in 2017, despite high rates of vaccination. Many of the cases were from 18 colleges and universities in the state, including Harvard University, University of Massachusetts Amherst, and Boston University (these three universities met certain criteria to ensure patient privacy protection in this study and agreed to be named in the paper). Other outbreaks flared elsewhere in Boston and across the country around the same time.

These patterns of cases raised questions about how much the virus was circulating in the Massachusetts and US populations. To learn more, the research teams paired traditional epidemiological data with analysis of mumps virus whole genome sequences from 201 infected individuals, focusing primarily on the Massachusetts university communities.

Mumps insight

The viral genomic data revealed details about the Boston-area outbreaks that could not be reconstructed by relying solely on more traditional approaches. For example, the researchers found a clear link between cases at Harvard and an outbreak in East Boston, which were classified as distinct outbreaks during the initial public health investigation.

Public health officials first thought the cases in these two communities were unrelated based on several pieces of evidence: epidemiological data, the different demographic makeup of the two populations (older adults with no obvious university connection versus mostly college-aged students), and a long gap between the apparent end of the outbreak at Harvard and the cases in the local community.

However, the genomic data indicated that the mumps viruses in the East Boston cases were genetically similar to those in the Harvard virus samples. This finding enabled the teams to identify contacts and transmission links between the university and the wider community.

"Even though the two outbreaks were occurring at different places and different times, we were able to show connections between these outbreaks that were operationally informative," explained senior co-author Bronwyn MacInnis, associate director of malaria and viral genomics in the Infectious Disease and Microbiome Program and co-lead of the Global Health Initiative at Broad. "The public health teams could determine that they were essentially dealing with one problem, not two."

Understanding such transmission routes can help guide the outbreak response -- for example, by determining whether efforts should be focused more on controlling transmission within a single community or between different ones.

"Whole-genome sequencing of patient samples helps us reconstruct the progression of an outbreak," said co-first author Shirlee Wohl, formerly a Harvard graduate student in the Sabeti lab and now a postdoctoral fellow at Johns Hopkins University. "Traditional outbreak surveillance efforts can help identify possible sources of infection, but whole-genome sequencing can confirm these links and even suggest new, unexplored connections."

The team emphasized that this study was made possible by the close partnerships it had with the Massachusetts Department of Public Health and the health services teams at several universities. "I am proud to be part of the Massachusetts higher education community," Sabeti added. "They worked together and demonstrated the necessity of transparency in outbreak response. This is not a story of mumps at these universities, but of outstanding mumps reporting."

Mutating mumps?

Another question of particular interest to the local teams was whether a new mutation in the mumps virus -- for example, one that allows it to evade the immune system in a vaccinated individual -- might have sparked the outbreak. Of the infected individuals, 65 percent had received the recommended two doses of the MMR vaccine. However, given the available data, the researchers found no evidence that genetic variants arising specifically during this outbreak contributed to the disease spread. This finding suggests that, in the Boston area, the virus wasn't evolving into one that could dodge vaccine-induced immunity.

In addition to the findings related to the Boston-area outbreaks, the study's broader geographic analysis suggested that the mumps virus has been circulating continuously at a low rate around the US, only rarely flaring up into notable outbreaks as in 2016 and 2017.

"This whole endeavor demonstrated the value of genetic data to the epidemiological health response, and of data-sharing among collaborating teams," Sabeti said. "One of our goals is to build this capacity in many areas around the world so that public health officials can rapidly mobilize and do this type of analysis whenever they need to."

Credit: 
Broad Institute of MIT and Harvard