Surrey's simplified circuit design could revolutionise how wearables are manufactured

Researchers have demonstrated the use of a ground-breaking circuit design that could transform manufacturing processes for wearable technology.

Silicon-based electronics have rapidly become smaller and more efficient over a short period of time, leading to major advances in devices such as mobile phones. However, large-area electronics, such as display screens, have not seen similar advances because they rely on a device called the thin-film transistor (TFT), which has serious limitations.

In a study published in the IEEE Sensors Journal, researchers from the University of Surrey, the University of Cambridge and the National Research Institute in Rome have demonstrated a pioneering circuit design that uses an alternative type of device, the source-gated transistor (SGT), to create compact circuit blocks.

In the study, the researchers showed that two SGTs can deliver the same functionality as the roughly 12 TFTs today's devices would typically require - improving performance, reducing waste and making the new process far more cost-effective.

The research team believe that the new fabrication process could result in a generation of ultralightweight, flexible electronics for wearables and sensors.

Dr Radu Sporea, lead author of the study and Lecturer in Semiconductor Devices at the University of Surrey, said: "We are entering what may be another golden age of electronics, with the arrival of 5G and IoT enabled devices. However, the way we have manufactured many of our electronics has increasingly become overcomplicated and has hindered the performance of many devices.

"Our design offers a much simpler build process than regular thin-film transistors. Source-gated transistor circuits may also be cheaper to manufacture on a large scale because their simplicity means there is less waste in the form of rejected components. This elegant design of large area electronics could result in future phones, fitness tracker or smart sensors that are energy efficient, thinner and far more flexible than the ones we are able to produce today."

Credit: 
University of Surrey

Why is stroke so deadly for people of African descent?

image: "Given the undue burden that people of African ancestry endure from stroke and other cerebrovascular disease, the lack of investigation of risk factors in this group has been a substantial gap," said researcher Bradford B. Worrall, MD, a neurologist at UVA Health.

Image: 
UVA Health

African-Americans have up to three times the risk of dying from stroke as people of European descent, yet there has been little investigation of whether and how genetic variants contribute to their elevated stroke risk. Until now.

A large international team of scientists has completed the largest analysis of stroke-risk genes ever undertaken in individuals of African descent. The new study examined the genomes of more than 22,000 people of African ancestry, identifying important genetic contributors to stroke risk. These findings will help doctors better understand stroke risk, identify those at high risk and prevent the debilitating condition.

"Given the undue burden that people of African ancestry endure from stroke and other cerebrovascular disease, the lack of investigation of risk factors in this group has been a substantial gap," said researcher Bradford B. Worrall, MD, a neurologist at UVA Health. "Our work is an important step toward filling that gap, albeit with much more work to be done. These findings will provide greater insight into ethnic-specific and global risk factors to reduce the second leading cause of death worldwide."

Understanding Stroke Risk

Stroke is the leading cause of adult disability in the United States. But strokes strike African-Americans more often and at younger ages than people of European descent. In addition, African-Americans who survive strokes often face greater disability.

Family history is a major risk factor for stroke, suggesting our genes play a significant role in our stroke risk. But most genetic stroke studies, until now, have primarily focused on people of European descent. And the results have not always held true in African-Americans.

The new meta-analysis comes from the Consortium of Minority Population genome-wide Association Studies of Stroke (COMPASS). The researchers revisited previous studies to identify genetic risk factors specific to people of African descent. In total, they examined the genomes of 3,734 people who had suffered strokes and more than 18,000 who had not.

The researchers discovered that a common variation near the HNF1A gene was strongly associated with increased stroke risk in those of African ancestry. The gene has previously been associated with both stroke and cardiovascular disease.

While that variant had the strongest link to stroke risk, the researchers identified 29 other variants that also appear likely to influence stroke risk.

The variants occur at 24 different locations on our chromosomes. Sixteen of the "loci," as the locations are known, appeared also to influence stroke risk in other populations, the researchers report.

"Studies of this nature are critical given the paucity of genetic studies focused on people of African descent and other minority populations and the substantial health disparities related to stroke in these groups," said Keith Keene, PhD, a former UVA researcher and frequent collaborator of Worrall's who now leads the Center for Health Disparities at East Carolina University's Brody School of Medicine. "Furthermore, we increasingly recognize the power of looking at genetic risk factors across different race ethnic groups, known as transethnic analyses, for unlocking the underlying biology of diseases like stroke. If we understand the biology, we can develop new treatment and prevention strategies."

In a paper outlining their findings, the researchers note the importance of such studies in understanding stroke risk among minorities. These studies have "huge potential to provide insight into the mechanisms underlying stroke disparities," the researchers write. "Our study identified novel associations for stroke that might not otherwise be detected in primarily European cohort studies. Collectively, this highlights the critical nature and importance of genetic studies in a more diverse population with a high stroke burden."

Credit: 
University of Virginia Health System

Racial discrimination linked to suicide

image: University of Houston professor Rheeda Walker is reporting that racial discrimination is so painful that it is linked to the ability to die by suicide, a presumed prerequisite for being able to take one's own life, and certain mental health tools - like reframing an incident - can help.

Image: 
University of Houston

In this age of racial reckoning, new research findings indicate that racial discrimination is so painful that it is linked to the ability to die by suicide, a presumed prerequisite for being able to take one's own life. However, the ability to emotionally and psychologically reframe a transgression can mitigate its harmful effects.

Over the last decade, suicide rates in the United States have increased dramatically among racial and ethnic minorities, and Black Americans in particular. For Black young adults ages 15-24 years, suicide is the third leading cause of death, with approximately 3,000 Black Americans dying by suicide each year.

Two studies conducted independently tell a compelling story.

"Our findings demonstrate that for Black adults, perceived discrimination serves as a sufficiently painful experience that is directly associated with higher capability to overcome one's inherent fear of death and achieve an increased capacity for self-harm," reports Rheeda Walker, professor of psychology and director of the University of Houston's Culture, Risk and Resilience Lab. As author of the newly released "The Unapologetic Guide to Black Mental Health," Walker is one of the leading researchers in the U.S. specializing in culture, race, mental health and suicide.

The studies were led by Jasmin Brooks, a doctoral student in the research lab, and published in the journals Suicide and Life-Threatening Behavior and Cultural Diversity and Ethnic Minority Psychology, premier journals in suicide science and cultural psychology, respectively.

Capability for suicide: Discrimination as a painful and provocative event

In this study, the research team measured the relationship between a person's experiences of discrimination and their level of capability for suicide. The study included 173 Black and 272 white college students, who responded to questionnaires about their experiences.

The findings suggest that while perceived discrimination creates emotional disturbance for white adults, it is a uniquely painful event for Black adults.

"For Black adults, perceived discrimination accounted for statistically significant variance above and beyond both feelings of depression and non-discriminatory stressors in predicting suicide capability. For white adults, perceived discrimination was not uniquely associated with capability for suicide," reports Walker.

In a separate, but timely study, Walker and her team examined how some of the effects of racism could be mitigated.

The moderating effect of dispositional forgiveness on perceived racial discrimination and depression for African American adults

While perceived racial discrimination is associated with depression for African American adults, insight into what protects against racism-related depression in this population is limited. In this study, 101 African American college students reported their personal experiences and feelings, and Walker's team investigated whether dispositional forgiveness is associated with less depression. Dispositional forgiveness, the ability to reframe an incident, is not the same as excusing an offense, encouraging reconciliation, or freeing an offender from the consequences of their actions.

"Using internal coping strategies is vital for marginalized populations that experience racial discrimination daily. The results of this study suggest that dispositional forgiveness, a robust internal coping mechanism, can serve as a helpful coping strategy associated with fewer depressive symptoms for African American adults who have experienced racial discrimination," reports Walker.

Walker said the findings could have important clinical implications, in that dispositional forgiveness - specifically, the ability to engage in cognitive restructuring and reframing - prevents prolonged rumination.

"In a better, more inclusive world, racism would not exist. Until that happens, psychological tools are critical for mitigating acute and long-term emotional consequences of racial discrimination in African American individuals," said Walker.

Credit: 
University of Houston

Child sleep problems associated with impaired academic and psychosocial functioning

Philadelphia, August 3, 2020--Whether children have ongoing sleep problems from birth through childhood or do not develop sleep problems until they begin school, a new study by researchers at Children's Hospital of Philadelphia (CHOP) has found that sleep disturbances at any age are associated with diminished well-being by the time the children are 10 or 11 years old. The findings, which were published in the Journal of Child Psychology and Psychiatry, suggest health care providers should screen children for sleep problems at every age and intervene early when a sleep problem is identified.

"Our study shows that although those with persistent sleep problems have the greatest impairments when it comes to broad child well-being, even those with mild sleep problems over time experience some psychosocial impairments," said Ariel A. Williamson, PhD, a psychologist in the Sleep Center and faculty member at PolicyLab and the Center for Pediatric Clinical Effectiveness at CHOP. "The range of impairments across academic and psychosocial domains in middle childhood indicate that it is important to screen for sleep problems consistently over the course of a child's development, especially to target children who experience persistent sleep problems over time."

The researchers examined data from an Australian birth cohort involving more than 5,000 patients. Caregivers reported on whether their children had sleep problems at multiple points in time, from birth through 10 or 11 years of age. To assess child well-being, which included psychosocial measures like self-control and emotional/behavioral health as well as academic performance measures, the researchers used a combination of reports from caregivers and teachers along with child-completed assessments.

In analyzing caregiver-reported sleep behaviors, the researchers found five distinct sleep problem trajectories, or patterns that characterized child sleep problems over time: persistent sleep problems through middle childhood (7.7%), limited infant/preschool sleep problems (9.0%), increased middle childhood sleep problems (17.0%), mild sleep problems over time (14.4%) and no sleep problems (51.9%).
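
As a quick arithmetic check, the five caregiver-reported trajectories partition the cohort; a minimal Python sketch, using only the percentages reported above (the dictionary keys are just labels for this illustration), confirms they account for the whole sample:

    # Sleep-problem trajectory shares reported in the CHOP study (percent of cohort)
    trajectories = {
        "persistent sleep problems": 7.7,
        "limited infant/preschool sleep problems": 9.0,
        "increased middle childhood sleep problems": 17.0,
        "mild sleep problems over time": 14.4,
        "no sleep problems": 51.9,
    }
    print(f"Total: {sum(trajectories.values()):.1f}%")  # -> Total: 100.0%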

Using those with no sleep problems as a benchmark, the researchers found that children with persistent sleep problems had the greatest impairments across all outcomes except in their perceptual reasoning skills. Children with increased middle childhood sleep problems also experienced greater psychosocial problems and worse quality of life, but did not score lower on academic achievement. Children with limited infant/preschool sleep problems or mild increases in sleep problems over time also demonstrated psychosocial impairments and had worse caregiver-reported quality of life, but the effects were smaller than for the other sleep trajectories.

While the researchers found impairments related to all of the sleep problem trajectories, they note the possibility that for certain trajectories, the relationship could be bidirectional - that is, psychosocial issues like anxiety could lead to sleep issues, and vice versa, particularly in children who develop sleep problems later in childhood.

"Although this study cannot answer whether minor, early or persistent sleep problems represent a marker for the onset of behavioral health or neurodevelopmental conditions, our findings support consistently integrating questions about sleep into routine developmental screenings in school and primary care contexts," Williamson said.

Credit: 
Children's Hospital of Philadelphia

Chlamydia: Greedy for glutamine

image: Resting Chlamydia (left; bright circles), cultured without glutamine. After the addition of glutamine (right), the bacteria enter the division stages (darker circles).

Image: 
Chair of Microbiology / University of Wuerzburg

Chlamydia are bacteria that cause venereal diseases. In humans, they can survive only inside cells - the only place where they find the metabolites necessary for their reproduction. That reproduction happens in a relatively simple way: the bacteria create a small bubble in the cell and divide within it over several generations.

Which decisive step initiates the reproduction of the bacteria? Until now, this has not been known. Researchers from Julius-Maximilians-Universität Würzburg (JMU) in Bavaria, Germany, have now discovered it. This is important because the first step in the reproduction of the pathogens is likely to be a good target for drugs.

Glutamine import into the host cell increases

In the case of Chlamydia, the first step is to reprogram the metabolism of their human host cells. The cells then increasingly import the amino acid glutamine from their environment. If this does not work, for example because the glutamine import system is out of order, the bacterial pathogens are no longer able to proliferate. This was reported in the journal Nature Microbiology by a JMU team led by Dr. Karthika Rajeeve, who has since been awarded a professorship at Aarhus University in Denmark, and Professor Thomas Rudel.

"Chlamydiae need a lot of glutamine to synthesize the ring-shaped molecule peptidoglycan," explains Professor Rudel, who heads the Chair of Microbiology at the JMU Biocenter. In bacteria, this ring molecule is generally a building material of the cell wall. Chlamydiae use it for the construction of a new wall that is drawn into the bacterial cell during division.

Next, the JMU team hopes to clarify the importance of glutamine metabolism in chronic Chlamydia infections. This might provide information that helps to better understand how severe diseases develop as a result of the infection.

Facts about Chlamydia

Chlamydiae cause most venereal diseases in Germany. The bacteria are sexually transmitted and can cause inflammation in the urethra, vagina or anal area. If an infection is detected in time, it can be treated effectively with antibiotics.

Around 130 million people worldwide are infected with Chlamydia. The biggest problem is that the infection usually proceeds without noticeable symptoms. This makes it easier for the pathogen to spread and can lead to severe or chronic diseases such as cervical and ovarian cancer.

Credit: 
University of Würzburg

Large international study pinpoints impact of TP53 gene mutations on blood cancer severity

Considered the "guardian of the genome," TP53 is the most commonly mutated gene in cancer. TP53's normal function is to detect DNA damage and prevent cells from passing this damage on to daughter cells. When TP53 is mutated, the protein made from this gene, called p53, can no longer perform this protective function, and the result can be cancer. Across many cancer types, mutations in TP53are associated with worse outcomes, like disease recurrence and shorter survival.

As with all our genes, TP53 exists in duplicate in our cells. One copy we get from our mothers, the other we get from our fathers. Up until now, it has not been clear whether a mutation was needed in one or both copies of TP53 to affect cancer outcomes. A new study led by researchers at Memorial Sloan Kettering definitively answers this question for a blood cancer called myelodysplastic syndrome (MDS), a precursor to acute myeloid leukemia.

"Our study is the first to assess the impact of having one versus two dysfunctional copies of TP53 on cancer outcomes," says molecular geneticist Elli Papaemmanuil, a member of MSK's Epidemiology and Biostatistics Department and the lead scientist on the study, published August 3 in the journal Nature Medicine. "From our results, it's clear that you need to lose function of both copies to see evidence of genome instability and a high-risk clinical phenotype in MDS."

The consequences for cancer diagnosis and treatment are immediate and profound, she says.

A Large, Definitive Study

The study analyzed genetic and clinical data from 4,444 patients with MDS who were being treated at hospitals all over the world. Researchers from 25 centers in 12 countries were involved in the study, which was conducted under the aegis of the International Working Group for the Prognosis of MDS, whose goal is to develop new international guidelines for the treatment of this disease. The findings were independently validated using data from the Japanese MDS working group, led by Seishi Ogawa's team at Kyoto University.

"Currently, the existing MDS guidelines do not consider genomic data such as TP53 and other acquired mutations when assessing a person's prognosis or determining appropriate treatment for this disease," says Peter Greenberg, Director of Stanford University's MDS Center, Chair of the National Comprehensive Cancer Network Practice Guidelines Panel for MDS, and a co-author on the study. "That needs to change."

Using new computational methods, the investigators found that about one-third of MDS patients with TP53 mutations had only one mutated copy of the gene. These patients had outcomes similar to those of patients without a TP53 mutation -- a good response to treatment, low rates of disease progression, and better survival rates. On the other hand, the two-thirds of TP53-mutated patients who had two mutated copies had much worse outcomes, including treatment-resistant disease, rapid disease progression, and low overall survival. In fact, the researchers found that TP53 mutation status -- zero, one, or two mutated copies of the gene -- was the most important variable when predicting outcomes.

"Our findings are of immediate clinical relevance to MDS patients," Dr. Papaemmanuil says. "Going forward, all MDS patients should have their TP53 status assessed at diagnosis."

As for why it takes two "hits" to TP53 to see an effect on cancer outcomes, the study's first author Elsa Bernard, a postdoctoral scientist in the Papaemmanuil lab, speculates that one normal copy is enough to provide adequate protection against DNA damage. This would explain why having only one mutated copy was not associated with genome instability or any worse survival rates than having two normal copies.

Given the frequency of TP53 mutations in cancer, these results make a case for examining the impact of one versus two mutations on other cancers as well. They also reveal the need for clinical trials designed specifically with these molecular differences in mind.

"With the increasing adoption of molecular profiling at the time of cancer diagnosis, we need large, evidence-based studies to inform how to translate these molecular findings into optimal treatment strategies," Dr. Papaemmanuil says.

Credit: 
Memorial Sloan Kettering Cancer Center

Study shows demolishing vacant houses can have positive effect on neighbor maintenance

image: Daniel Kuhlmann, assistant professor of community and regional planning at Iowa State University

Image: 
Iowa State University

AMES, Iowa -- New research suggests that demolishing abandoned houses may lead nearby property owners to better maintain their homes.

This study, by Daniel Kuhlmann, assistant professor of community and regional planning at Iowa State University, was published recently in the peer-reviewed Journal of Planning Education and Research. He examined whether the demolition of dilapidated and abandoned housing affects the maintenance decisions of nearby homeowners.

In the wake of the 2008 recession, many cities experienced an increase in the number of vacant and abandoned houses. Some cities, such as Cleveland and Detroit, received federal funding to acquire vacant properties through land bank programs. While land banks were able to remodel and sell some of these properties, for the most distressed houses, demolition was the only option.

Kuhlmann wondered how effective those policies and demolitions had been.

"Demolition programs have two goals. The first is to get nuisance properties out of neighborhoods because they can be dangerous," he said. "The second goal is to help stabilize declining neighborhoods."

Past research showed that demolitions have little effect on neighboring property values. But what about the physical condition of nearby homes?

Kuhlmann looked at changes to houses over time, including the presence of boarded or broken windows, dumping or yard debris, and damage to roofs, paint, siding, gutters and porches.

Using the results of two property condition surveys and administrative records on demolitions in some of the most distressed neighborhoods in Cleveland, Kuhlmann found that properties near demolitions were more likely to show signs of improvement between the two surveys and less likely to deteriorate.

Kuhlmann recognizes the longstanding disinvestment in some of these neighborhoods, many of which are doubly affected by racial inequities. This fact makes studies like Kuhlmann's "challenging because even if distressed housing contributes to decline, it is certainly also a symptom of it." He suggests that future research should look at demolitions' long-term effects on a neighborhood.

"Community-wide, residents tend to see demolitions as a good idea in specific instances, but they would like larger investments," he said. "It can't end with demolitions."

These findings are useful for planners, policymakers and academics concerned about damages caused by abandoned and deteriorating housing, Kuhlmann says.

"My research in general focuses on the extremes of decline, but I do think these types of properties exist in more cities than we might expect," he said.

Credit: 
Iowa State University

Study reveals less connectivity between key brain regions in people with the FXTAS premutation

image: Sensorimotor test stimuli and custom fiber-optic transducer (C; Neuroimaging Solutions, Gainesville, Florida). Participants pressed when the red bar (A) turned green (B) in order to move the white bar up to the target green bar. They were instructed to maintain their force level at the level of the green bar as steadily as possible.

Image: 
McKinney, et al.

LAWRENCE -- A new paper in the journal NeuroImage: Clinical from researchers at the University of Kansas reveals a possible early indicator of Fragile X-associated tremor/ataxia syndrome, or FXTAS. The disease afflicts some older people who carry a "premutation" of the gene known as FMR1, which can lead to impairments in movement and cognition -- while other people who carry the premutation are unaffected.

Among people with the FMR1 premutation, scientists have struggled to find biomarkers to indicate who might develop FXTAS.

The new study of 16 people with the FMR1 premutation and 18 healthy controls recorded participants' brain activity with functional magnetic resonance imaging while they performed a test of sensorimotor control. Participants were asked to manipulate images on a screen using a grip-force controller while the fMRI machine recorded the small changes in blood flow that occur when different parts of the brain become more active.

"It's one of the first studies we know about to use fMRI to look at brain system function during motor behavior in a patient population at risk for developing motor deterioration and motor degeneration where they show a loss of balance, increased shaking or tremor as they reach their 50s, 60s or 70s," said Matthew Mosconi, KU associate professor of clinical child psychology and associate scientist at KU's Life Span Institute, who oversaw the investigation in his BRAIN Lab. "But we know very little about which premutation carriers will develop FXTAS. We know males are at greater risk than females. Otherwise, we don't know a whole lot about which premutation carriers are going to get it. And we don't know a whole lot about what's going on in the brain functionally."

The investigators were able to identify brain processes specifically linked to sensorimotor issues in aging people with the FMR1 premutation.

"We found the functional connectivity of cerebellum - a brain region that controls our movement accuracy and timing -- and the extrastriate cortex, a brain area critically involved in processing visual information, is reduced in aging FMR1 premutation carriers," said Walker McKinney, lead author of the new paper and a KU doctoral student in clinical child psychology. "In some people, these longer connections -- like highways between the different parts of the brain -- aren't communicating as efficiently. Each part may be firing, but they're not firing together."

Significantly, the researchers found very little overlap in the functional connectivity of this pathway between premutation carriers and healthy controls in the study, suggesting that connectivity levels between the cerebellum and extrastriate cortex could serve as an early emerging indicator of FXTAS and help predict which FMR1 premutation carriers will develop the characteristic symptoms before they appear.

"When studies get reported, oftentimes we're talking about a 'mean difference' between groups -- there's always overlap with healthy people and there's variability there," Mosconi said. "With our study, the fact that there's minimal overlap between premutation carriers and controls suggests that this may be what we would call a biomarker. What we need to do now is follow this measure and these people over time to determine who gets FXTAS and who doesn't. In other words, this seems like a clear target for understanding brain degeneration in FXTAS and identifying it early in its course."

Credit: 
University of Kansas

For solar boom, scrap silicon for this promising mineral

ITHACA, N.Y. - When it comes to the future of solar energy cells, say farewell to silicon and hello to calcium titanium oxide - the compound mineral better known as perovskite.

Cornell University engineers have found that photovoltaic wafers in solar panels with all-perovskite structures outperform photovoltaic cells made from state-of-the-art crystalline silicon, as well as perovskite-silicon tandem cells, which are stacked pancake-style cells that absorb light better.

In addition to offering a faster return on the initial energy investment than silicon-based solar panels, all-perovskite solar cells mitigate climate change because they consume less energy in the manufacturing process, according to Cornell research published in Science Advances.

"Layered tandem cells for solar panels offer more efficiency, so this is a promising route to widespread deployment of photovoltaics," said Fengqi You, Professor in Energy Systems Engineering at Cornell.

The paper, "Life Cycle Energy Use and Environmental Implications of High-Performance Perovskite Tandem Solar Cells," compares energy and life-cycle environmental impacts of modern tandem solar cells made of silicon and perovskites.

Perovskite needs less processing, and much less heat and pressure, during the fabrication of solar panels, You said.

Silicon photovoltaics require an expensive initial energy outlay, and the best ones take about 18 months to get a return on that investment. A solar cell wafer with an all-perovskite tandem configuration, according to the researchers, offers an energy payback on the investment in just four months. "That's a reduction by a factor of 4.5, and that's very substantial," You said.
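
The factor of 4.5 is simple arithmetic on the two payback times quoted above; a minimal Python sketch of that calculation (variable names are illustrative only):

    # Energy payback times for the two technologies, as reported above (months)
    silicon_payback_months = 18    # best current silicon photovoltaics
    perovskite_payback_months = 4  # all-perovskite tandem configuration
    ratio = silicon_payback_months / perovskite_payback_months
    print(f"Payback reduced by a factor of {ratio:.1f}")  # -> 4.5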

But solar panels don't last forever. After decades of service, silicon solar panels become less efficient and must be retired. And as in the manufacturing phase, breaking down silicon panels for recycling is energy intensive. Perovskite cells can be recycled more easily.

"When silicon-based solar panels have reached the end of their efficiency lifecycle, the panels must be replaced," You said. "For silicon, it's like replacing the entire automobile at the end of its useful life," while replacing perovskite solar panels is akin to installing a new battery.

Adopting materials and processing steps to make perovskite solar cell manufacturing scalable is also critical to developing sustainable tandem solar cells, You said.

"Perovskite cells are promising, with a great potential to become cheaper, more energy-efficient, scalable and longer lasting," You said. "Solar energy's future needs to be sustainable."

Credit: 
Cornell University

Study: Experiencing childhood trauma makes body and brain age faster

Children who suffer trauma from abuse or violence early in life show biological signs of aging faster than children who have never experienced adversity, according to research published by the American Psychological Association. The study examined three different signs of biological aging--early puberty, cellular aging and changes in brain structure--and found that trauma exposure was associated with all three.

"Exposure to adversity in childhood is a powerful predictor of health outcomes later in life--not only mental health outcomes like depression and anxiety, but also physical health outcomes like cardiovascular disease, diabetes and cancer," said Katie McLaughlin, PhD, an associate professor of psychology at Harvard University and senior author of the study published in the journal Psychological Bulletin. "Our study suggests that experiencing violence can make the body age more quickly at a biological level, which may help to explain that connection."

Previous research found mixed evidence on whether childhood adversity is always linked to accelerated aging. However, those studies looked at many different types of adversity--abuse, neglect, poverty and more--and at several different measures of biological aging. To disentangle the results, McLaughlin and her colleagues decided to look separately at two categories of adversity: threat-related adversity, such as abuse and violence, and deprivation-related adversity, such as physical or emotional neglect or poverty.

The researchers performed a meta-analysis of almost 80 studies, with more than 116,000 total participants. They found that children who suffered threat-related trauma such as violence or abuse were more likely to enter puberty early and also showed signs of accelerated aging on a cellular level -- including shortened telomeres, the protective caps at the ends of our strands of DNA that wear down as we age. However, children who experienced poverty or neglect did not show either of those signs of early aging.

In a second analysis, McLaughlin and her colleagues systematically reviewed 25 studies with more than 3,253 participants that examined how early-life adversity affects brain development. They found that adversity was associated with reduced cortical thickness - a sign of aging because the cortex thins as people age. However, different types of adversity were associated with cortical thinning in different parts of the brain. Trauma and violence were associated with thinning in the ventromedial prefrontal cortex, which is involved in social and emotional processing, while deprivation was more often associated with thinning in the frontoparietal, default mode and visual networks, which are involved in sensory and cognitive processing.

These types of accelerated aging might originally have descended from useful evolutionary adaptations, according to McLaughlin. In a violent and threat-filled environment, for example, reaching puberty earlier could make people more likely to be able to reproduce before they die. And faster development of brain regions that play a role in emotion processing could help children identify and respond to threats, keeping them safer in dangerous environments. But these once-useful adaptations may have grave health and mental health consequences in adulthood.

The new research underscores the need for early interventions to help avoid those consequences. All of the studies looked at accelerated aging in children and adolescents under age 18. "The fact that we see such consistent evidence for faster aging at such a young age suggests that the biological mechanisms that contribute to health disparities are set in motion very early in life. This means that efforts to prevent these health disparities must also begin during childhood," McLaughlin said.

There are numerous evidence-based treatments that can improve mental health in children who have experienced trauma, McLaughlin said. "A critical next step is determining whether these psychosocial interventions might also be able to slow down this pattern of accelerated biological aging. If this is possible, we may be able to prevent many of the long-term health consequences of early-life adversity," she says.

Credit: 
American Psychological Association

Properly equipped laypersons can potentially reverse opioid overdose mortality

Without timely reversal, opioid overdose causes respiratory depression that may deteriorate into apnea, leading to brain injury and even death. Naloxone, a medication designed to rapidly reverse opioid overdose, can quickly restore normal respiration to a person whose breathing has slowed or stopped as a result of overdose with heroin or prescription opioid pain medications.

One of the major challenges in decreasing lethal opioid overdose is ensuring that naloxone reaches those in need at short notice. For opioid overdose, as for out-of-hospital cardiac arrest (OHCA), layperson response is a key link in the "chain of survival", the complex relationship between bystanders, emergency services, and hospitals. Locating a nearby volunteer with naloxone presents various challenges that may be addressed by means of collective mobilization.

In a paper published today in The Lancet's EClinicalMedicine journal, researchers from Bar-Ilan University and Drexel University report the results of the first observational cohort study of community members equipped with naloxone and a smartphone application to signal and respond to opioid overdoses. The cohort was composed of individuals who lived and/or worked in a neighborhood with a high incidence of opioid overdose in the US city of Philadelphia. After tracking the group for more than a year, the researchers showed that laypersons, including people who use opioids, can effectively signal and respond to overdose incidents and administer nasal naloxone in advance of emergency medical service (EMS) arrival.

Volunteers were trained in recognizing opioid overdose, the use of intranasal naloxone, and the use of a dedicated smartphone app to signal and/or respond to a suspected overdose alert. The app was activated by volunteers witnessing an overdose to signal other nearby volunteers. The researchers looked for three possible responses when volunteers received an alert: choosing to respond and help, explicitly declining to respond, or missing/ignoring the alert. The witnessing volunteer was connected to speak with 9-1-1 dispatch through a semi-automated telephone call. The primary outcome was layperson-initiated overdose reversal before the arrival of EMS/first responders.

"We observed 202 layperson-initiated overdose true alerts with a rate of layperson naloxone administration of 36?6% (74/202) and found that naloxone-based reversal was initiated over five minutes prior to EMS arrival in 59?6% of these cases," said Prof. David Schwartz, of Bar-Ilan University's Graduate School of Business Administration. "We observed layperson support behaviors, including contacting EMS and remaining with the victim until recovery, that are consistent with American Heart Association guidelines and that strengthen the chain of survival that begins in the community," added Schwartz, who led the study with Dr. Stephen Lankenau, of Drexel University's Dornsife School of Public Health.

Equipping laypersons with naloxone and an emergency response community app to signal suspected opioid overdose and alert other nearby volunteers to provide naloxone can result in naloxone administration prior to EMS arrival and overdose reversal, potentially reducing mortality in opioid overdose. The findings support further study of smartphone-based naloxone intervention to strengthen the chain of survival starting at the community level.

There are striking parallels in emergency healthcare delivery between opioid overdose in the community and out-of-hospital cardiac arrest. Studies have shown that CPR and early defibrillation by a layperson, in advance of EMS, contribute to positive outcomes after OHCA. "It is time to recognize that opioid use disorder patients can benefit from similar forms of community support that we advance for OHCA," write the authors. Locating a nearby volunteer with naloxone presents some unique challenges but is not inherently different from locating a nearby defibrillator. Creating and studying smartphone-based emergency response communities for naloxone provision can help address this important challenge.

Credit: 
Bar-Ilan University

Speech processing hierarchy in the dog brain

image: A dog and researchers (Márta Gácsi (left), Attila Andics, Anna Gábor (right)) at the scanner.

Image: 
Enikő Kubinyi / Eötvös Loránd University

Dog brains, just like human brains, process speech hierarchically - intonations at lower stages, word meanings at higher stages - according to a new study by Hungarian researchers at the Department of Ethology, Faculty of Science, Eötvös Loránd University (ELTE), using functional MRI on awake dogs. The study, which reveals exciting speech processing similarities between us and a speechless species, will be published in Scientific Reports.

Humans keep talking to dogs, whose sensitivity to human communicative signals is well known. Both the words we say and the intonation with which we say them carry information for dogs. For example, when we say 'sit', many dogs can sit down. Similarly, when we praise dogs in a high-pitched voice, they may notice the positive intent. We know very little, however, about what is going on in their brains during these interactions.

In this study, the Hungarian researchers measured awake, cooperative dogs' brain activity via functional magnetic resonance imaging (fMRI). Dogs listened to known praise words (clever, well done, that's it) and unknown, neutral words (such, as if, yet), each in both praising and neutral intonation.

"Exploring speech processing similarities and differences between dog and human brains can help a lot in understanding the steps that led to the emergence of speech during evolution. Human brains process speech hierarchically: first, intonations at lower-, next, word meanings at higher stages. Some years ago, we discovered that dog brains, just as human brains, separate intonation and word meaning. But is the hierarchy also similar? To find it out, we used a special technique this time: we measured how dog brain activity decreases to repeatedly played stimuli. During brain scanning, sometimes we repeated words, sometimes intonations. Stronger decrease in a given brain region to certain repetitions shows the region's involvement" - Anna Gábor, postdoctoral researcher at the MTA-ELTE 'Lendület' Neuroethology of Communication Research Group, lead author of the study explains.

The results show that dog brains, just like human brains, process speech hierarchically: intonation at lower stages (mostly in subcortical regions) and known words at higher stages (in cortical regions). Interestingly, older dogs distinguished words less well than younger dogs did.

"Although speech processing in humans is unique in many aspects, this study revealed exciting similarities between us and a speechless species. The similarity does not imply, however, that this hierarchy evolved for speech processing" - says Attila Andics, principal investigator of the MTA-ELTE 'Lendület' Neuroethology of Communication Research Group. "Instead, the hierarchy following intonation and word meaning processing reported here and also in humans may reflect a more general, not speech-specific processing principle. Simpler, emotionally loaded cues (such as intonation) are typically analysed at lower stages; while more complex, learnt cues (such as word meaning) are analysed at higher stages in multiple species. What our results really shed light on is that human speech processing may also follow this more basic, more general hierarchy."

Credit: 
Eötvös Loránd University

Ancient part of immune system may underpin severe COVID

NEW YORK, NY (Aug. 3, 2020) -- One of the immune system's oldest branches, called complement, may be influencing the severity of COVID-19, according to a new study from researchers at Columbia University Irving Medical Center.

Among other findings linking complement to COVID, the researchers found that people with age-related macular degeneration -- a disorder caused by overactive complement -- are at greater risk of developing severe complications and dying from COVID.

The connection with complement suggests that existing drugs that inhibit the complement system could help treat patients with severe disease.

The study was published on Aug. 3 in Nature Medicine.

The authors also found evidence that clotting activity is linked to COVID severity and that mutations in certain complement and coagulation genes are associated with hospitalization of COVID patients.

"Together these results provide important insights into the pathophysiology of COVID-19 and paint a picture for the role of complement and coagulation pathways in determining clinical outcomes of patients infected with SARS-CoV-2," says Sagi Shapira, PhD, MPH, who led the study with Nicholas Tatonetti, PhD, both professors at Columbia University Vagelos College of Physicians and Surgeons.

Findings Stem from Study of Coronavirus Mimicry

The idea to investigate the role of coagulation and complement in COVID began with a sweeping survey of viral mimicry across all viruses on earth -- over 7,000 in all.

"Viruses have proteins that can mimic certain host proteins to trick the host's cells into aiding the virus with completing its life cycle," Shapira says. "Beyond the fundamental biological questions that we were interested in addressing, based on our previous work and the work of others, we suspected that identifying those mimics could provide clues about how viruses cause disease."

Coronaviruses, the survey found, are masters of mimicry, particularly with proteins involved in coagulation and proteins that make up complement, one of the oldest branches of the human immune system.

Complement proteins work a bit like antibodies and help eliminate pathogens by sticking to viruses and bacteria and marking them for destruction. Complement can also increase coagulation and inflammation in the body. "Unchecked, these systems can also be quite detrimental," says Shapira.

"The new coronavirus -- by mimicking complement or coagulation proteins -- might drive both systems into a hyperactive state."

Macular Degeneration Associated with Greater COVID Mortality

If complement and coagulation influence the severity of COVID, people with pre-existing hyperactive complement or coagulation disorders should be more susceptible to the virus.

That led Shapira and Tatonetti to look at COVID patients with macular degeneration, an eye disease caused by overactive complement, as well as common coagulation disorders like thrombosis and hemorrhage.

Among the roughly 11,000 patients who came to Columbia University Irving Medical Center with suspected COVID-19, the researchers found that over 25% of those with age-related macular degeneration died, compared with the cohort's average mortality rate of 8.5%, and roughly 20% required intubation. The greater mortality and intubation rates could not be explained by differences in the age or sex of the patients.
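
To put the two mortality figures side by side, here is a minimal Python sketch; the roughly threefold ratio is derived here from the reported rates rather than taken from the paper:

    # Mortality rates in the ~11,000-patient suspected-COVID-19 cohort, as reported above
    amd_mortality = 0.25       # patients with age-related macular degeneration ("over 25%")
    average_mortality = 0.085  # cohort-wide average mortality
    print(f"Relative mortality: {amd_mortality / average_mortality:.1f}x")  # -> ~2.9x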

"Complement is also more active in obesity and diabetes," Shapira says, "and may help explain, at least in part, why people with those conditions also have a greater mortality risk from COVID."

People with a history of coagulation disorders also were at increased risk of dying from COVID infection.

Coagulation and Complement Pathways Activated

The researchers then examined how gene activity differed in people infected with the coronavirus.

That analysis revealed a signature in COVID-infected patients indicating that the virus engages and induces robust activation of the body's complement and coagulation systems.

"We found that complement is one of the most differentially expressed pathways in SARS-CoV-2 infected patients," Tatonetti says. "As part of the immune system, you would expect to see complement activated, but it seems over and above what you'd see in other infections like the flu."

Some Coagulation and Complement Genes are Associated with Hospitalization

More evidence linking severe COVID with coagulation and complement comes from a genetic analysis of thousands of COVID patients from the U.K. Biobank, which contains medical records and genetic data on half a million people.

The authors found that variants of several genes that influence complement or coagulation activity are associated with more severe COVID symptoms that required hospitalization.

"These variants are not necessarily going to determine someone's outcome," Shapira says. "But this finding is another line of evidence that complement and coagulation pathways participate in the morbidity and mortality associated with COVID-19."

Targeting Coagulation and Complement

Physicians treating COVID patients have noticed coagulation issues since the beginning of the pandemic, and several clinical trials are underway to determine the best way to use existing anti-coagulation treatments.

Complement inhibitors are currently used in relatively rare diseases, but at least one clinical trial is testing the idea with COVID patients.

"I think our findings provide a stronger foundation for the idea that coagulation and complement play a role in COVID," Tatonetti says, "and will hopefully inspire others to evaluate this hypothesis and see if it's something that can be useful for fighting the ongoing pandemic."

Credit: 
Columbia University Irving Medical Center

Baby boomers show concerning decline in cognitive functioning

COLUMBUS, Ohio - In a reversal of trends, American baby boomers scored lower on a test of cognitive functioning than did members of previous generations, according to a new nationwide study.

Findings showed that average cognition scores of adults aged 50 and older increased from generation to generation, beginning with the greatest generation (born 1890-1923) and peaking among war babies (born 1942-1947).

Scores began to decline in the early baby boomers (born 1948-1953) and decreased further in the mid baby boomers (born 1954-1959).

While the prevalence of dementia has declined recently in the United States, these results suggest those trends may reverse in the coming decades, according to study author Hui Zheng, professor of sociology at The Ohio State University.

"It is shocking to see this decline in cognitive functioning among baby boomers after generations of increases in test scores," Zheng said.

"But what was most surprising to me is that this decline is seen in all groups: men and women, across all races and ethnicities and across all education, income and wealth levels."

Results showed lower cognitive functioning in baby boomers was linked to less wealth, along with higher levels of loneliness, depression, inactivity and obesity, and less likelihood of being married.

The study was published online recently in the Journals of Gerontology: Social Sciences.

Zheng analyzed data on 30,191 Americans who participated in the 1996 to 2014 waves of the Health and Retirement Study, conducted by the University of Michigan. People over 51 years old were surveyed every two years.

As part of the study, participants completed a cognitive test in which they had to recall words they had heard earlier, count down from 100 by 7s, name objects they were shown and perform other tasks.

Other research has shown that overall rates of mortality and illness have increased in baby boomers, but generally found that the highly educated and wealthiest were mostly spared.

"That's why it was so surprising to me to see cognitive declines in all groups in this study," Zheng said. "The declines were only slightly lower among the wealthiest and most highly educated."

Zheng also compared cognition scores within each age group across generations so that scores would not be skewed by older people, who tend to have poorer cognition. Even in this analysis, the baby boomers came out at the bottom.

"Baby boomers already start having lower cognition scores than earlier generations at age 50 to 54," he said.

The question, then, is what has happened to baby boomers? Zheng looked for clues across the lifetimes of those in the study.

Increasing cognition scores in previous generations could be tied to beneficial childhood conditions - conditions that were similar for baby boomers, Zheng said.

Baby boomers' childhood health was as good as or better than previous generations and they came from families that had higher socioeconomic status. They also had higher levels of education and better occupations.

"The decline in cognitive functioning that we're seeing does not come from poorer childhood conditions," Zheng said.

The biggest factors linked to lower cognition scores among baby boomers in the study were lower wealth, higher levels of self-reported loneliness and depression, lack of physical activity and obesity.

Living without a spouse, being married more than once in their lives, having psychiatric problems and cardiovascular risk factors including strokes, hypertension, heart disease and diabetes were also associated with lower cognitive functioning among people in this generation.

"If it weren't for their better childhood health, move favorable family background, more years of education and higher likelihood of having a white-collar occupation, baby boomers would have even worse cognitive functioning," Zheng said.

There were not enough late baby boomers (born in 1960 or later) to include in this study, but Zheng said he believes they will fare no better. The same might be true for following generations unless we find a solution for the problems found here, he said.

While many of the problems linked to lower cognitive functioning are symptoms of modern life, like less connection with friends and family and growing economic inequality, other problems found in this study are unique to the United States, Zheng said. One example is the lack of universal access to health care and its high cost.

"Part of the story here are the problems of modern life, but it is also about life in the U.S.," he said.

One of the biggest concerns is that cognitive functioning when people are in their 50s and 60s is related to their likelihood of having dementia when they are older.

"With the aging population in the United States, we were already likely to see an increase in the number of people with dementia," Zheng said.

"But this study suggests it may be worse than we expected for decades to come."

Credit: 
Ohio State University

New study on development of Parkinson's disease is 'on the nose'

image: Ning Quan, Ph.D., a neuroscientist from Florida Atlantic University's Schmidt College of Medicine and a faculty member of the FAU Brain Institute (I-BRAIN).

Image: 
Florida Atlantic University

The loss of a sense of smell is known to be one of the earliest signs of Parkinson's disease (PD) and can even appear years before the characteristic tremors and loss of motor function are seen. Some scientists believe that olfactory dysfunction may not just be a sign of broader neural damage, but rather may have a more direct linkage to the generation of the disorder itself. In support of this idea, deposits of a protein called alpha-synuclein that form Lewy bodies can be found in olfactory areas, as well as in dying dopamine neurons whose loss triggers PD, and mutations in the gene encoding alpha-synuclein produce PD.

In the central nervous system, the sensory neurons that line the nasal epithelium are particularly susceptible to neuroinflammatory attack due to their accessibility to toxic agents inhaled from the environment. Indeed, the olfactory system is directly exposed to a barrage of environmental toxins arising from bacteria, viruses, mold, dust, pollen and chemicals. These toxins lead to local inflammatory responses inside the nose where olfactory neurons send their sensitive endings, and inflammation can spread to promote activation of inflammatory cells called microglia deeper in the brain.

Since mounting evidence indicates that neuroinflammation contributes to the development and progression of PD and other degenerative diseases, scientists have proposed that the initial impact of environmental toxins inhaled through the nose may induce inflammation in the brain, triggering the production of Lewy bodies that can then be spread to other brain regions. However, the relationship linking olfactory dysfunction and PD development remains unclear.

Ning Quan, Ph.D., a neuroscientist from Florida Atlantic University's Schmidt College of Medicine and a faculty member of the FAU Brain Institute (I-BRAIN), is among a team of researchers with new findings that add weight to this theory and identify a critical signaling molecule that may be key to the domino effect kicked off by nasal inflammation.

Results of the study, published in the journal Brain Pathology, showed that application of an irritating component of a bacterium's cell wall induces inflammation exactly where the olfactory neurons project, in an area called the olfactory bulb. Moreover, these areas show the hallmark signs of PD: depositions of alpha-synuclein, the core component of Lewy bodies. PD is characterized by progressive motor and non-motor symptoms linked to alpha-synuclein pathology and the loss of dopaminergic neurons in the nigrostriatal system. Toxic aggregates of alpha-synuclein can arise from overexpression of the protein, changes in protein modifications, or hereditary mutations.

Quan and collaborators from China's Xuzhou Medical University, Nanjing University of Information Science and Technology, and First Affiliated Hospital of Soochow University, demonstrate that inflammation induced in the nasal epithelium leads to overexpression of toxic forms of alpha-synuclein both in the olfactory system and in the dopamine neurons, which then degenerate and trigger Parkinson's-like behaviors in mice. Using a mouse model developed by Quan, the researchers demonstrate that these effects require activation of a single receptor protein for the inflammatory signal, interleukin 1 beta.

"Data from our study show that the bacterial trigger does not move across the blood-brain barrier," said Quan. "Rather, a sequential inflammatory activation of the olfactory mucosa triggers a subsequent expression of inflammatory molecules within the brain, propagating the inflammation."

According to the Parkinson's Foundation, approximately 60,000 Americans are diagnosed with PD each year, and more than 10 million people worldwide are living with PD. The incidence of PD increases with age; however, an estimated 4 percent of people with PD are diagnosed before age 50. People with PD may experience tremor, bradykinesia, limb rigidity, gait and balance problems, as well as cognitive impairment.

"Parkinson's disease is a devastating neurodegenerative disorder," said Randy Blakely, Ph.D., executive director of FAU's I-BRAIN. "Currently, there is no cure for the disease and current medications have significant side-effects. These new findings may ultimately lead to potential therapies that could shut down the origins and progression of this debilitating disease."

Credit: 
Florida Atlantic University