
Cellular metabolism regulates the fate decision between pathogenic and regulatory T cells

Image: Laurie Harrington. Credit: UAB

BIRMINGHAM, Ala. - Patients with autoimmune diseases like multiple sclerosis, inflammatory bowel disease and rheumatoid arthritis have an imbalance between two types of immune system T cells. Destructive Th17 cells that mediate chronic inflammation are elevated, and regulatory T cells, or Treg cells, which suppress inflammatory responses and play a protective role in autoimmune disorders, are diminished.

Both cell types differentiate from the same precursors -- naïve CD4 T cells -- and their conversion to either Th17 or Treg cells begins with the same signal. Subsequently, a fate decision occurs, like a fork in the road, steering the changing CD4 cells to become either inflammatory T cells or regulatory T cells.

New preclinical research, led by Laurie Harrington, Ph.D., associate professor in the Department of Cell, Developmental and Integrative Biology at the University of Alabama at Birmingham, shows a pivotal role for cellular metabolism in regulating that fate decision, which occurs very early in the activation of CD4 T cells. This raises the possibility that manipulating the cellular metabolism of T cells may provide a promising new therapeutic intervention to modulate the balance between pathogenic Th17 and Treg cells in chronic autoimmune disorders. The research is published in the journal Cell Reports.

T cells were known to rapidly increase metabolism upon activation, including glycolysis and mitochondrial oxidative phosphorylation, or OXPHOS, to meet the energetic demands of differentiation. But the precise contribution of OXPHOS to Th17 differentiation had not been defined.

The UAB researchers, together with a colleague at New York University, found that ATP-linked mitochondrial respiration during Th17 differentiation was essential for upregulating glycolysis and TCA cycle metabolism. Strikingly, it was also essential for Th17 cells to promote inflammation of the central nervous system, as shown in a mouse model of multiple sclerosis.

In the mouse model, experimental autoimmune encephalomyelitis, Th17 cells drive disease progression. For the experiment, harvested CD4 T cells were differentiated using a combination of Th17-polarizing cytokines. One group was the polarized control, and one group was polarized in the presence of oligomycin, an inhibitor of mitochondrial OXPHOS. The T cells were then transferred into experimental mice. Mice receiving the T cells treated with oligomycin during polarizing conditions showed significantly delayed disease onset and reduced disease severity. Both groups of T cells proliferated robustly after transfer.

In mechanistic experiments, the researchers detailed the early molecular events that differ between cells polarized in the presence or absence of oligomycin. These included gene sets that are upregulated or downregulated, presence or absence of Th17 or Treg cell markers, expression of signature transcription factors needed for Th17 differentiation, and expression of gene products that play a role in T cell receptor signaling.

A surprise was found in the timing of the fate decision. In one experiment, CD4 T cells were exposed to Th17-polarizing conditions with oligomycin present only during the first 24 hours. They were then washed and allowed to continue differentiation under the polarizing conditions. This brief exposure to oligomycin yielded T cells that lacked Th17 markers and instead showed hallmarks of Treg cells, including expression of Foxp3. Thus, the brief early exposure to oligomycin imprinted the Foxp3 fate decision.

Overall, Harrington said, "inhibition of mitochondrial OXPHOS ablates Th17 pathogenicity in a mouse model of multiple sclerosis and results in generation of functionally suppressive Treg cells under Th17 conditions."

Credit: 
University of Alabama at Birmingham

Video game-like intervention shows promise in improving attention of children with ADHD

A four-week randomised controlled trial of 348 children aged 8-12 years, published in The Lancet Digital Health, suggests that a digital intervention for paediatric attention deficit hyperactivity disorder (ADHD) might help to improve inattention with minimal adverse effects. Further research is needed to confirm the clinical meaningfulness of the observed changes, but the digital nature of the intervention could help to improve access for some patients.

ADHD is a childhood-onset disorder estimated to affect around 5% of people worldwide. It is characterised by persistent impaired attention, hyperactivity or impulsivity. Recommended treatments include both medication and evidence-based behaviour therapy, but both have limitations. Access to behavioural interventions is limited by a lack of properly trained paediatric mental health specialists and by the availability of services. Medication may not be suitable for some patients because of caregiver preferences or concerns about abuse, misuse, and diversion. Moreover, medication, while highly effective for treating ADHD symptoms, may not be as effective at addressing the day-to-day cognitive and functional impairments faced by patients. Many areas of impairment for children with ADHD, such as social and academic functioning, require more complex skill acquisition over time, and treatments that only address symptoms might not directly improve these kinds of challenges. Digital alternatives to traditional care have shown promise and could help tackle these issues.

Researchers investigated whether a video game-like intervention designed to target attention and cognitive control could improve a validated attention measure, the Test of Variables of Attention (TOVA) Attention Performance Index (API). Between July 2016 and November 2017, 348 children were randomly assigned to receive the digital therapy (n=180) or a control (n=168), which was designed to match the intervention as a challenging and engaging digital word game.

Patients withdrew from any medication for three days so a baseline attention score could be measured before the intervention or control began; this was compared with the results of the same test at the end of the trial. During the trial, participants did not take their ADHD medication. Patients were instructed to use the intervention or control for 25 minutes a day, five days per week. Compliance was monitored electronically, and parents were notified by email if the intervention had not been used in a 48-hour period.

Professor Scott Kollins of the Duke University Medical Center, USA, says: "Our trial is one of only a few randomised controlled investigations into digital interventions for children with ADHD. The improvement observed in attentional functioning in patients who received the active intervention was meaningful, although the full clinical meaningfulness of the findings should be explored in further studies. We do not yet know whether this intervention could be considered as an alternative to current treatments." [1]

On average, patients in the intervention group completed 83 of 100 sessions over the four weeks, and those in the control group completed 96 of 100. Significantly more patients in the intervention group improved on the attention measure. In terms of secondary outcomes, including symptom ratings, both the treatment and control groups improved, but there were no differences between the groups.

There were no serious adverse events or discontinuations in the trial. Only 12 children in the intervention group and three in the control group experienced any treatment-related adverse events, the most common of which were frustration (five in the intervention group) and headaches (three in the intervention group).

According to the authors, the TOVA API attention score was selected as a primary endpoint in this trial because the intervention was designed specifically to target cognitive control and attention. This outcome is different from symptom rating scales that are routinely used as primary outcomes in pharmacological treatment studies of ADHD. These symptom rating scales were used as secondary outcomes in the present study, though there were no differences between the intervention and control groups.

Co-author, Doctor Elena Cañadas, of Akili Interactive Labs, USA, says: "These findings are confirmatory evidence of digital therapy being a safe and easy-to-access intervention that could address issues with delivery of treatment for many patients with ADHD. Further work should investigate the impact of different scheduling and time involved in the treatment sessions as well as looking longer term at optimal benefits." [1]

The authors acknowledge several limitations. Four weeks of treatment is relatively short, and future work should examine longer interventions and test different regimens of the video game-like therapy. The results may not be generalisable to the whole population of children with ADHD, as milder cases whose attention scores did not fall below a certain threshold were excluded, as were children with significant psychiatric comorbidities. Children could not take their usual medication during the trial, which again means the results may not generalise to those who are on medication.

The study did not collect EEG data which could help explain the mechanism for the findings. Further work should explore secondary outcomes around behavioural improvements in more depth as well as the mechanism underlying the effect.

In a linked Comment, Dr Catalá López, scientist at Institute of Health Carlos III, Spain, says: "The results of Kollins and colleagues' study are interesting and highlight the way for further development of digital health interventions for children with ADHD. [...] Because of the chronic course of ADHD, in addition to short-term trials, long-term efficacy should be established in future studies. Thus, further research is needed to examine ways of sustaining treatment effects over the long-term, in the broader population of children with ADHD including those who have comorbidities and receive evidence-based therapies."

Credit: 
The Lancet

Engaging with schizophrenia -- experts argue for new approaches to treatment

A better understanding of the lived experience of people with schizophrenia would enable clinicians to help patients live with their condition, alongside treating symptoms with medication and psychotherapy, say experts at the University of Birmingham.

According to researchers at the University, this approach would involve developing an understanding of 'self-disturbance' in schizophrenia - in which patients' sense of connection to themselves and to their actions is disrupted.

In a new paper, published in The Lancet Psychiatry, researchers assessed existing theories around how this sense of self is constructed by schizophrenia patients. These theories explore the ways in which patients might feel their thoughts do not belong to them, or the irregularities in the way people with schizophrenia might perceive the world.

Rather than attempting to find out which theory is right, the authors argue these different approaches should be drawn together to inform clinical practice.

The authors also argue for the integration of more recent computational 'prediction error' models which attempt to explain delusion and hallucination in terms of a mismatch between expectation and experience.

Dr Clara Humpston, co-lead author of the study, explains: "Clinical intervention frequently focuses on correcting the patient's thoughts and perceptions. We think this effort is misplaced. Instead, well-informed clinicians might focus on how patients can lead a fulfilling life with their symptoms."

"Key to this is acknowledging that what we consider to be 'real' is likely to be different for the clinician and the patient. This conflict is likely to be particularly pronounced in the early stages of the illness, where patients are likely to show a lack of insight into their behaviour or the condition itself. However, 'reality' is still constructed by similar neural and experiential mechanisms for both clinician and patient. Clinicians must not forget that how they approach the discrepancies in reality can make a lasting impact on patients' willingness to engage, as it's often in the early stages of an illness that intervention can be most successful."

Dr Humpston added: "This sort of approach requires clinicians to listen more carefully with an open mind, putting aside what they would interpret as 'real'. With this understanding, clinicians are better able to engage with the patient, share clinical knowledge and come to a mutually understood plan for care and recovery."

Professor Matthew Broome, co-lead author, said: "Despite decades of working in parallel, we're at an important time in mental health research and practice where new approaches in computational neuroscience are engaging meaningfully with detailed accounts of personal experience and phenomenology, allowing science to address the issues that matter most to those who may experience psychosis and schizophrenia."

Credit: 
University of Birmingham

Researchers adapt cognitive assessment for people with intellectual disability

WHAT:
The NIH Toolbox Cognitive Battery--an assessment of cognitive functioning for adults and children participating in neuroscience research--can be adapted to people with intellectual disabilities by modifying some test components and making accommodations for the test-takers' disabilities, according to researchers funded by the National Institutes of Health. The adaptations ensure that the battery can be used to assess the cognitive ability of people with intellectual disabilities who have a mental age of 5 years and above, providing objective measures that could be used in a wide variety of studies.

The research team, led by David Hessl, Ph.D., of the University of California Davis Medical Center, published their findings in Neurology. The work was funded by NIH's Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) and National Center for Advancing Translational Sciences, as well as the Administration for Community Living.

The battery is administered on a computer tablet and measures memory, vocabulary, reading and executive functioning, which includes skills such as the ability to shift from one thought to another, pay attention and control impulses. The researchers adapted the battery by reducing the complexity of the instructions and including developmentally appropriate starting points. They also developed a structured manual to guide test administrators.

The researchers validated the battery and its modifications by assessing 242 people ages 6 through 25 with fragile X syndrome, Down syndrome or other disabilities. They found that the battery produced reliable and valid results for those with a mental age of 5 years and above. The authors called for additional research to adapt the battery to people with lower mental ages and to older adults with intellectual disability who may be experiencing cognitive decline or dementia.

WHO:
Alice Kau, Ph.D., of the NICHD Intellectual and Developmental Disabilities Branch is available for comment.

ARTICLE:
Shields, R., et al. Validation of the NIH Toolbox Cognitive Battery in intellectual disability. Neurology. 2020. http://dx.doi.org/10.1212/WNL.0000000000009131.

Credit: 
NIH/Eunice Kennedy Shriver National Institute of Child Health and Human Development

Neural cells speed up function in 3D bioprinted skeletal muscle constructs

Image: WFIRM 3D bioprinter prints muscle. Credit: Wake Forest Institute for Regenerative Medicine/WFIRM

WINSTON-SALEM, NC, Feb. 24 -- Wake Forest Institute for Regenerative Medicine (WFIRM) scientists have improved upon the 3D bioprinting technique they developed to engineer skeletal muscle as a potential therapy for replacing diseased or damaged muscle tissue, moving another step closer to someday being able to treat patients.

Skeletal muscles are attached to bones by tendons and are responsible for the body's movement. When they are damaged, there is often loss of muscle function because nerve signaling to the muscle is disrupted.

Treatment of extensive muscle defect injuries like those caused by improvised explosive devices (IEDs) on the battlefield, for instance, is difficult and often requires reconstructive surgery with muscle grafts. Effective nerve integration of bioengineered skeletal muscle tissues has been a challenge.

"Being able to bioengineer implantable skeletal muscle constructs that mimic the native muscle to restore function represents a significant advance in treating these types of injuries," said lead author Ji Hyun Kim, PhD, of WFIRM. "Our hope is to develop a therapeutic option that will help heal injured patients and give them back as much function and normalcy as possible."

The work is detailed in a paper published online today by the journal Nature Communications.

Institute scientists previously demonstrated that the Integrated Tissue and Organ Printing System (ITOP), developed in house over a 14-year period, can generate organized printed muscle tissue that is robust enough to maintain its structural characteristics.

Since then, the WFIRM researchers have been developing and testing different types of skeletal muscle tissue constructs to find the right combination of cells and materials to achieve functional muscle tissue. In the current study, they investigated the effects of neural cell integration into the bioprinted muscle construct to accelerate functional muscle regeneration.

"These constructs were able to facilitate rapid nerve distribution and matured into organized muscle tissue that restored normal muscle weight and function in a pre-clinical model of muscle defect injury," said Sang Jin Lee, PhD., co-senior author, also of WFIRM.

This ongoing line of research at WFIRM is supported and federally funded through the Armed Forces Institute of Regenerative Medicine. The goal is to develop clinical therapies to treat wounded warriors that will also benefit the civilian population.

"Continued improvements in 3D bioprinting techniques and materials are helping us advance in our quest to make replacement tissue for patients," said co-senior author Anthony Atala, M.D., who is director of WFIRM.

Credit: 
Atrium Health Wake Forest Baptist

InSight detects gravity waves, devilish dust on Mars

ITHACA, N.Y. - More than a year after NASA's Mars InSight lander touched down in a pebble-filled crater on the Martian equator, the rusty red planet is now serving up its meteorological secrets: gravity waves, surface swirling "dust devils," and the steady, low rumble of infrasound, Cornell and other researchers have found.

"This is entirely new territory we are exploring," said Don Banfield, principal research scientist and the science lead for the Auxiliary Payload Sensor Suite, or APSS, aboard InSight. Banfield is the lead author of "The Atmosphere of Mars as Observed by InSight," published in Nature Geoscience.

While other scientists studying the stationary Martian lander explore what lies beneath the planet's surface, the APSS team keeps track of the meteorology above.

Mars experiences strong daily pressure and temperature fluctuations, "stronger than on Earth," Banfield said. "The atmosphere is so thin that it can heat up and cool down much faster than on Earth."

About a month after landing, InSight endured a large dust storm, which is a periodic global event on Mars that can dramatically change the planet's weather and climate. The scientists also noted daily changing winds controlled by the seasonal freezing and thawing of the carbon dioxide in the polar caps.

The craft features a seismometer for detecting Mars' quakes; sensors for gauging wind and air pressure; a magnetometer for measuring the planet's magnetic forces; and a probe designed to take the planet's temperature.

Banfield and the meteorology team were surprised that their sensors detected gravity waves, which are buoyancy oscillations of air parcels. Such waves on Earth can create linear rows of rolled "morning glory" clouds - white, puffy clouds that look like lofty jelly rolls. "We're still working to understand what these waves can teach us about Mars," Banfield said.

Banfield and his colleagues have also noted "infrasound" - pressure oscillations below 10 hertz detected by the lander's sensors - a low rumbling below what the human ear can detect.

"We expected infrasound would exist, but this is the first direct measurement," Banfield said. "It's still mysterious as to exactly what causes the signals we've heard, but we'll keep studying."

During the Martian daytime, the APSS team has found convective vortices, better known as "dust devils" - small whirlwinds forming into tiny tornadoes, caused by wind shear and convection near the surface. Earth has dust devils, too, formed from dust and sometimes even snow. Banfield said these may be the cause of Mars' constant dustiness.

"We have seen the pressure signature of thousands of dust devils, and we have tried to take images at the right times of day," Banfield said. "We've caught absolutely no dust devils on camera. Other landers have more effortlessly imaged dust devils, so it's surprising that we haven't even captured an image of one."

Credit: 
Cornell University

'Surprising' evolutionary shift in snakes

Image: A juvenile Rhabdophis tigrinus 'keelback' snake from the Japanese island of Ishima takes a defensive posture. Utah State University herpetologist Alan Savitzky and colleagues document an evolutionary example of adaptation in the reptiles to compensate for the absence of defensive compounds following a shift to a new class of prey. Credit: Alan Savitzky

LOGAN, UTAH, USA - In the animal kingdom, survival essentially boils down to eat or be eaten. How organisms accomplish the former and avoid the latter reveals a clever array of defense mechanisms. Maybe you can outrun your prey. Perhaps you sport an undetectable disguise. Or maybe you develop a death-defying resistance to your prey's heart-stopping defensive chemicals that you can store in your own body to protect you from predators.

Such is the case with most snake species of the Rhabdophis genus. Commonly called "keelbacks" and found primarily in southeast Asia, the snakes sport glands in their skin, sometimes just around the neck, where they store bufadienolides, a class of lethal steroids they get from toads, their toxic prey of choice.

"These snakes bend their necks in a defensive posture that surprises unlucky predators with a mouthful of toxins," says Utah State University herpetologist Alan Savitzky, who has long studied the slithery reptiles.

"Scientists once thought these snakes produced their own toxins, but learned, instead, that they obtain them from their food - namely, toads."

In a surprising twist, Savitzky and colleagues discovered not all members of the genus derive their defensive toxin from the same source. The multi-national team, consisting of researchers from USU; Kyoto University, University of the Ryukyus and Nihon University in Japan; the Chinese Academy of Sciences and Leshan Normal University in China; the National Pingtung University of Science and Technology in Taiwan; the University of Sri Jayewardenepura in Sri Lanka; and the Vietnam Academy of Science and Technology, reports a species group of the snakes, found in western China and Japan, shifted its primary diet from frogs (including toads) to earthworms.

The earthworms don't produce the toxins; instead, the snakes also snack on firefly larvae, which produce the same class of toxins as the toads. Their findings appear in the Feb. 24, 2020, early online issue of the Proceedings of the National Academy of Sciences [DOI: 10.1073/pnas.1919065117].

"This is the first documented case of a vertebrate predator switching from a vertebrate prey to an invertebrate prey for the selective advantage of getting the same chemical class of defensive toxin," says Savitzky, professor in USU's Department of Biology and the USU Ecology Center.

Given the distant relationship between toads and fireflies, he says, the dramatic dietary shift most likely involved a chemical cue shared by the toads and fireflies; perhaps the toxins themselves.

"This represents a remarkable evolutionary example of adaptation to compensate for the absence of defensive compounds following a shift to a new class of prey," Savitzky says.

Credit: 
Utah State University

Cardiologists: Big data advances research, but shouldn't do so at the cost of privacy

Image: Health data collected from your apps or wearable devices could revolutionize personalized health care, but the current lack of legal protections related to this technology could lead to personal health information becoming available to unscrupulous third parties, according to a new publication in Circulation. Credit: Michigan Medicine

We know we shouldn't, but most of us have clicked "agree" in a hurry to download an app or sign up for a streaming service without reading the user agreement in detail.

And it might seem pretty safe to add an app that promises to help take control of your health through simple things like tracking your steps, measuring your blood pressure or noting your eating and exercise habits. But doctors who appreciate the research potential of incorporating big data into medical care are also warning about the need to manage the risk of exposing such health data while it's still possible to do so.

That's because the average patient probably has no idea about the complex rules around what happens to their sensitive health data when it is collected by apps. For instance, a 2019 study found 19 out of 24 health-related apps studied shared their users' data. Patient privacy laws preclude providers from sharing information, but the commercially available apps used to collect it don't necessarily have to follow HIPAA, the law that governs privacy about health information.

"Some health data shared with a physician in the context of a health-related interaction is protected, but in a different context that same data is not protected," says Jessica Golbus, M.D., a cardiovascular medicine fellow at the Michigan Medicine Frankel Cardiovascular Center. "The way the U.S. addresses health data focuses on who is using the data, but not the data itself."

This lack of protection could lead to people's sensitive data becoming available to unscrupulous third parties with sales interests, or those making decisions about life or disability insurance or employment, she says.

She authored a new perspective to highlight these consequences, published in the February 25 issue of Circulation, with two members of the University of Michigan's Institute for Healthcare Policy & Innovation: Brahmajee Nallamothu, M.D., MPH, also with the Frankel CVC, and W. Nicholson Price II, J.D., Ph.D., also with the U-M Law School.

Benefits outweigh risks - so far

Imagine getting a text message on a sunny day to remind you to go for a walk if you're trying to keep up your momentum after cardiac rehab, or a notification popping up with a sleep tip if your smartwatch finds you haven't gotten enough ZZZ's this week. Without advanced consumer devices that allow for data collection and synthesis, these kinds of interactions would still be a faraway dream. But these interventions are already here, or just around the corner.

And they have the potential to help patients better manage their own health, the authors believe.

"We're learning how to leverage digital technology, like smartphones and wearables, to engage patients in their health, improve remote monitoring of health conditions and potentially deliver targeted interventions outside of the clinic appointment," Golbus says.

The authors say they don't want to scare patients or providers away from embracing these potential improvements to research and patient care, because the benefits presently outweigh the risks. However, the risks of commercial exploitation and privacy harms should be addressed, they write.

"We simply think it's important to make sure providers who are encouraging their patients to use this technology are also able to have conversations with their patients about data privacy," Golbus says.

The need for education and advocacy

Golbus and colleagues suggest urgently addressing two areas: provider education and legislation. This, they believe, would help researchers continue to harness more data than they've ever had access to before to improve health and disease prevention and treatment, while also protecting their patients' privacy.

"It could be helpful for patients to provide doctors with their heart rate and blood pressure measurements collected using wearable devices," co-author Nallamothu added. "But this means we have a responsibility to educate patients and ourselves about potential downstream uses of that data."

This is a problem that will be difficult for consumers to solve on their own, the authors say.

As in many other commercial areas, data are being collected at large scale with little understanding by the public. Beyond the need for health care providers to learn about and raise awareness of the risks to using commercially available applications, the authors say the government should also ensure better protection of consumer data.

The responsibility shouldn't rest only on the shoulders of consumers trying to keep track of whom they have consented to share their data with, they write.

"The U.S. is living in the past with regard to data protection," co-author Price says. "We've got privacy law from a world where health data means what your doctor writes in your medical chart. We're not in that world anymore. The law needs to change."

Credit: 
Michigan Medicine - University of Michigan

Wildfire cycles and climate change

Wildfire, a natural phenomenon, has existed on Earth for more than 400 million years. However, the mechanisms underlying wildfire-climate interactions are not clear, and wildfire forcing has long been underestimated or overlooked in climate change studies.

A study led by AN Zhisheng from the Institute of Earth Environment (IEE) of the Chinese Academy of Sciences revealed a linkage between glacial cycles and inland Asian high-intensity wildfire events by analyzing high-resolution soot deposition over the last 2.6 million years. The study was published online in PNAS on Feb. 24.

As ice ages have come and gone during the Quaternary period, mountain glaciers on the central Asian plateau have grown and shrunk with climate oscillations.

To determine how the dry glacial periods and wetter interglacial periods affected wildfire events, the researchers reconstructed a unique soot record reflecting regional-to-continental high-intensity fires on the central Asian plateau by measuring soot and char particles in more than 1,300 loess (wind-blown silt) sediment samples.

They measured black carbon in the loess sediment and distinguished soot (submicron-sized particles produced under fast-burning, high-intensity conditions) from char (larger particles produced by lower-intensity, smoldering combustion), which made it possible to investigate specific relationships among fire types and climate variables.

This record is the first to show clear glacial-interglacial cycles of wildfire. The results revealed that high-intensity fires were associated with drier glacial periods, when dust loads in the atmosphere were high and ice-volume-modulated aridification affected wildfire occurrence in the Quaternary climate system.

"Wildfires could act as a source of soluble iron that fertilized the oceans, promoting the drawdown of atmospheric carbon dioxide," said Dr. HAN Yongming, first author of the paper. "These results suggest possible connections between fires, dust, and climate through the iron cycle and potential effects of wildfire on the global climate system."

Recognition of the impact of wildfires on the iron cycle could help us understand climate change not only over orbital glacial-interglacial cycles but also over longer geological timescales and in the future as well.

Credit: 
Chinese Academy of Sciences Headquarters

Simple blood test could help reduce heart disease deaths

image: Professor Konstantinos Stellos, from Newcastle University's Biosciences Institute

Image: 
Newcastle University, UK

Scientists at Newcastle University have revealed how a simple blood test could be used to help identify cardiovascular ageing and the risk of heart disease.

For the first time, experts led by Professor Konstantinos Stellos report that higher levels of amyloid-beta in the blood may be a key indicator of cardiovascular disease.

It is hoped that this research will one day lead to the development of a simple blood test that could be used as a clinical biomarker to identify patients who are most at risk, so that preventative measures can be put in place and death rates reduced.

Key role of amyloid-beta

Amyloid-beta is known to be involved in the development of Alzheimer's disease, yet scientists have now concluded that it may have a key role to play in vascular stiffening, thickening of the arteries, heart failure and heart disease progression.

The work, published today in the Journal of the American College of Cardiology, proposes the existence of a common link between both conditions, which has not been acknowledged before, and could lead to better patient care.

The findings suggest that the higher the level of amyloid-beta in the blood the higher the risk of developing serious heart complications.

Professor Stellos, from Newcastle University's Biosciences Institute, UK, who also works as a consultant cardiologist at Newcastle Hospitals NHS Foundation Trust, led a series of international studies over the last few years, which involved experts from countries such as Greece, Germany, Switzerland and the USA.

He said: "Our work has created and put all the pieces of the puzzle together. For the first time, we have provided evidence of the involvement of amyloid-beta in early and later stages of cardiovascular disease.

"What is really exciting is that we were able to reproduce these unexpected, clinically meaningful findings in patients from around the world. In all cases, we observed that amyloid-beta is a biomarker of cardiovascular ageing and of cardiovascular disease prognosis."

Global health problem

Cardiovascular disease is the number one cause of death around the world, taking almost 18 million lives each year. It includes coronary heart disease, heart attack, heart failure and other conditions.

Professor Stellos' Group, in collaboration with several international scientists, analysed blood samples from more than 6,600 patients from multiple cohort studies in nine countries, and found that patients could be divided into high and low risk categories of heart disease based on their amyloid-beta levels.
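The high/low-risk stratification described above can be sketched as a simple threshold rule. The cut-off value and patient levels below are hypothetical; the actual thresholds used across the cohort studies are not reproduced here:

```python
# Illustrative sketch of stratifying patients into high/low cardiovascular
# risk by blood amyloid-beta level. The threshold and patient values are
# hypothetical, not figures from the Newcastle-led studies.

AB_THRESHOLD_PG_ML = 50.0  # hypothetical amyloid-beta cut-off (pg/mL)

def risk_category(amyloid_beta_pg_ml: float) -> str:
    """Assign a patient to 'high' or 'low' risk by a single cut-off."""
    return "high" if amyloid_beta_pg_ml >= AB_THRESHOLD_PG_ML else "low"

patients = {"A": 32.0, "B": 71.5, "C": 49.9}  # hypothetical levels (pg/mL)
categories = {pid: risk_category(level) for pid, level in patients.items()}
print(categories)  # {'A': 'low', 'B': 'high', 'C': 'low'}
```

In practice such a biomarker would be combined with other predictors (as the GRACE score combines eight factors) rather than used as a lone cut-off.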

In the future, it is hoped that a simple blood test could be added to the current method of patient screening, known as the GRACE score, which assesses heart attack risk and guides patients' treatment plans.

Using the GRACE score, eight factors are used to predict the risk of heart attack, including age, blood pressure, kidney function and elevated biomarkers.

Further research at Newcastle University will focus on clinical trials to establish the use of a bedside blood test in predicting risk of heart attack and/or death and look at the most effective ways to reduce amyloid-beta in the blood.

Professor Stellos said: "I am interested in knowing which of my patients is at risk of death and/or recurrent heart attacks.

"Measuring amyloid-beta reclassified a large proportion of patients who had had a heart attack into the correct risk categories, beyond an established guideline-recommended risk score, in independent clinical studies.

"If blood-based amyloid-beta predicts death in patients with heart disease, does that make it a therapeutic target? Our next step is to investigate this."

Credit: 
Newcastle University

A promising new strategy to help broken bones heal faster

People with diabetes are at a higher risk of fracturing a bone than the general population. And if they do break one, it also takes longer than normal to heal.

In the March issue of Biomaterials, Henry Daniell, Shuying (Sheri) Yang, and colleagues at Penn's School of Dental Medicine share promising findings from an animal model in which a plant-grown protein drug sped healing of a bone fracture. The work, which used the protein insulin-like growth factor-1 (IGF-1), showed that an orally delivered, shelf-stable medication grown in lettuce plants could stimulate the growth of bone-building cells and promote bone regeneration.

"It's amazing how one protein impacted fracture healing," says Daniell, corresponding author on the paper. "The current drug for diabetic patients with a fracture requires repetitive injections and hospital visits and as a result patient compliance is low. Here we gave an oral drug once a day and saw healing to be greatly accelerated."

"Fracture healing is a significant health issue, especially for patients with diabetes," says Yang, the paper's co-corresponding author. "They tend to have reduced bone repair and increased fracture risk, presenting a treatment challenge. Delivering this novel human IGF-1 through eating lettuce is effective, easily delivered, and an attractive option for patients. The study provides a new and ideal therapeutic option for diabetic fracture and other musculoskeletal diseases."

The study employed the plant-based drug production platform that Daniell has developed over many years, which entails introducing the gene for a protein of interest into plant cells, prompting them to express that gene and eventually produce the protein in their leaves, which can be harvested and used in an oral therapy.

In this case, the target was a novel IGF-1, a protein important for bone and muscle health. Lower levels of IGF-1 in the blood are known to be associated with an increased risk of breaking a bone.

From earlier work focused on muscular dystrophy conducted with former Penn Dental Medicine faculty member Elizabeth Barton, now at the University of Florida, the researchers believed that a particular form of IGF, a precursor of the protein that includes a separate component known as an e-peptide, was likely to stimulate regeneration better than mature IGF-1 that lacked the peptide. Current IGF1 used in the clinic not only lacks the e-peptide but is also glycosylated, a less active form.

The team used methods that Daniell has refined to highly express the human version of IGF-1 in plant leaves and remove the antibiotic resistance gene that is used to select for plants growing the target protein, crucial steps to get a therapy ready for clinical use. They paired the IGF-1 precursor protein with another protein, CTB, which helps ferry the fused proteins from the digestive tract into the bloodstream.

After growing the transgenic lettuce plants, they freeze-dried and powdered the leaves, confirming the product was shelf-stable for nearly three years.

"Fundamental to all these projects is we want to make the delivery of this drug affordable, comfortable, and possible to do at home," says Daniell.

In both mouse and human cells, the researchers showed that the plant-derived drug caused a variety of cell types, including oral-tissue cells and osteoblasts, or bone-building cells, to grow and differentiate, developing into a variety of different cell types.

Turning next to investigate the activity of the drug in animal models, the researchers initially showed that feeding mice the plant-based product caused their IGF-1 levels to increase. And finally, in a diabetic mouse model, they discovered that feeding it to animals improved bone volume, density, and area, signs of a more robust healing process.

"We're hoping to find partners to advance this work as there are a lot of people with diabetes who could benefit from a therapy like this," Daniell says.

In future work, the researchers hope to continue developing the plant-grown IGF-1 to move it to the clinic, not only for bone fracture healing but for other musculoskeletal problems as well, including osteoporosis and bone regeneration following cancer.

Credit: 
University of Pennsylvania

PA school nurses on the frontlines of the opioid epidemic

image: Penn Nursing's Catherine C. McDonald, PhD, RN, Assistant Professor and lead investigator of the study.

Image: 
Penn Nursing

PHILADELPHIA (February 24, 2020) - As opioid overdoses continue to grab headlines, more states are providing their communities with easier access to naloxone, which can prevent death by reversing opioid overdoses. But while naloxone may be available at township buildings, libraries, or other community locations, little is known about how schools maintain a supply of naloxone and prepare to treat overdoses.

At the University of Pennsylvania School of Nursing (Penn Nursing), researchers conducted an online survey of 362 Pennsylvania school nurses (elementary, middle, and high school) to better understand how school nurses stock and administer naloxone, and how they perceive keeping it in their schools. The results illustrate that though many nurses have a supply of naloxone in their school, important barriers to access and use of this life-saving medication still exist.

In the study, close to half of those who responded to the survey reported that they did not have a supply of naloxone in their school building. Reasons for not having a supply included lack of school board and/or administration support, belief that the medication was not needed in their school, and lack of funds, among other reasons. Of those who had naloxone in their school, 5% reported that the overdose reversal drug had been used at their school.

"Our survey results show that barriers, particularly those related to lack of support or beliefs that naloxone is not needed in a community, need to be addressed," says Penn Nursing's Catherine C. McDonald, PhD, RN, Assistant Professor and lead investigator of the study. "School nurses are in a position as both health educators and emergency responders regarding opioid overdose and may be an untapped resource to spread the adoption of community access to naloxone through normative behavior."

The study's findings have just been published in Public Health Nursing in an article titled "School Nurse Reported Supply and Administration of Naloxone in Schools." Co-authors of the study include Jennifer Pinto-Martin, PhD, MPH; Peggy Compton, PhD, RN, FAAN; and Madeleine Parikh, BSN, all of Penn Nursing; and Zachary Meisel, MD, MPH, MSHP, of the Perelman School of Medicine at the University of Pennsylvania. This study was supported by a pilot award from the Penn Injury Science Center which is an Injury Control Research Center funded by the Centers for Disease Control and Prevention (R49 CE002474).

Credit: 
University of Pennsylvania School of Nursing

TMS shows promise in treating stroke, dementia and migraines

image: Antonio H. Iglesias, MD

Image: 
Loyola Medicine

MAYWOOD, IL. - Transcranial magnetic stimulation (TMS) has shown significant efficacy in treating major depressive and obsessive compulsive disorders. A newly published literature review by Antonio H. Iglesias, MD, a Loyola Medicine neurologist and assistant professor at the Loyola University Chicago Stritch School of Medicine, highlights the compelling scientific and clinical data supporting further studies into the use of TMS to treat a broader range of common neurological conditions, including stroke, acute migraines and dementia.

A TMS device is made of one or two copper coils positioned on a targeted, external area of a patient's scalp, which produce brief magnetic pulses that penetrate to an estimated depth of approximately 2 to 2.5 centimeters. The magnetic field triggers changes in neuronal activity and communication, which can alter unwanted activity within the brain.

"TMS can work as a stimulant or an inhibitor of cerebral activity, or both," says Dr. Iglesias. In addition, different-sized coils and varying magnetic impulses can affect outcomes, depending on a patient's neuroplasticity, the capacity of neurons and nerve cells to change and compensate for injury and disease.

"Most importantly, TMS is well-tolerated by most patients with few side effects," says Dr. Iglesias.

Transcranial magnetic stimulation is approved by the Food and Drug Administration (FDA) to treat major depression and obsessive compulsive disorder (OCD). According to the article, appearing in the February 4, 2020, issue of Current Neurology and Neuroscience Reports, there are 1,641 studies underway utilizing TMS to treat a broad array of other neurological disorders, including more than 60 trials studying the effects of TMS to diminish or reverse the effects of early dementia. The most promising results are in the treatment of acute migraines and primary progressive aphasia (PPA), and in mitigating the effects of stroke.

"TMS has now opened the field of neurology in multiple areas," says Dr. Iglesias. "And, there are many variables that could be studied and arranged to improve brain functionality and network connections."

Loyola Medicine is already successfully utilizing TMS in the treatment of depression and OCD, and "it is my hope that we can begin to explore utilizing this treatment for dementia, and specifically the early effects of PPA, which can rapidly diminish language and other cognitive skills," says Dr. Iglesias.

Credit: 
Loyola Medicine

The 'Trump effect' is real, prejudice was waiting to come out, say California political science academics

When Donald Trump formally announced his presidential candidacy in a June 2015 speech, he declared, among other comments, that "when Mexico sends its people, they're not sending their best," referred to Mexican immigrants as rapists, and reiterated his intention to build a wall at the border.

What impact did Trump's remarks have on normalizing expressions of prejudice? In the years since the election, many have speculated his racially inflammatory speech empowered people with latent prejudices to finally act on them -- a phenomenon known as the "Trump effect."

Now, a new study from a team of political scientists at the University of California, Riverside, has found empirical support for what they term Trump's "emboldening effect."

The team's findings, published online last week in the British Journal of Political Science, suggest Trump's inflammatory remarks on the campaign trail emboldened particular members of the American public, giving them license to express deeply held prejudices.

From UCR, the study's co-authors include Benjamin Newman, an associate professor of public policy and political science; Jennifer Merolla, a professor of political science; Sono Shah, a doctoral candidate in political science; Loren Collingwood, an associate professor of political science; and Karthick Ramakrishnan, a professor of public policy and political science. Co-author Danielle Casarez Lemi received her doctorate in political science from UCR in 2017.

To better understand the impact of Trump's racially inflammatory rhetoric, the researchers surveyed a total of 997 respondents in two online waves in spring 2016, during presidential nomination season.

In the first wave, respondents provided demographic information and their political orientations. To measure respondents' existing prejudice, the researchers asked them to rate how well the words "intelligent," "lazy," "violent," and "here illegally" describe "most Hispanics" in America. These are measures often used to capture stereotypes people hold about the Latinx population.

During the second wave about a week later, each respondent read one of five articles generated by the researchers. The articles drew on real election content related to either campaign finance reform or immigration and discussed the positions of Hillary Clinton and either Jeb Bush or Donald Trump.

Only some articles featured examples of Trump's racially inflammatory speech; of those, certain articles also included text about other political elites condoning or condemning his remarks.

After reading one of the five articles, each respondent read a short vignette and was asked to rate the acceptability of the behavior depicted by a character in the vignette. The main vignette read:

* "Darren Smith is a middle manager at an accounting firm and has been working at the firm for nearly eight years. One part of Darren's job is to supervise the new interns for the accounting firm. While Darren usually likes the interns, he does not like a new intern named Miguel. Darren regularly throws away Miguel's leftover food in the break-room fridge, claiming that 'Miguel's food is greasy and smells up the fridge.'"

"The purpose of this vignette was to depict a mundane situation in which an individual a) harbors prejudice and b) engages in discriminatory behavior," the researchers explained.

They noted that while most respondents described the behavior featured in the vignette as completely unacceptable (49%) or unacceptable (42%), the remaining 9% considered it normatively neutral or acceptable, denoting tolerance of prejudiced behavior in this instance.

The purpose of the activity was twofold: First, the researchers wanted to see whether respondents' tolerance of the behavior featured in the vignette could be linked to their existing prejudice, as measured during the first wave.

Additionally, the team was interested in determining if exposure to Trump's racially inflammatory remarks would have bearing on whether respondents with existing prejudice would subsequently demonstrate more tolerance of the behavior featured in the vignette.

They found that exposure to Trump's racially inflammatory speech did have an emboldening effect: individuals appeared to feel more comfortable expressing their prejudice.

Moreover, when respondents weren't exposed to such rhetoric, the opposite was true: Prejudiced individuals appeared to suppress their prejudice by actively denouncing prejudiced behavior.

"However, we find that this 'suppression effect' slowly unravels and gives way to tolerance and acceptance of prejudiced behavior following exposure to racially inflammatory speech by a prominent political elite," the researchers wrote.

They said their most stunning finding was that the emboldening effect of Trump's speech on respondents was strongest when other political elites were presented as tacitly condoning his remarks.

Elites staying silent likely gave the impression that a norm -- in this case the norm of racial equality and tolerance -- was breaking down, which could lead members of the public to feel less pressure to conform with it, the researchers noted. They also found that this effect extended to behavior toward Latinx individuals.

"The emboldening effect of an elite like Donald Trump is most pronounced in a context where citizens are given signals that the political system tolerates prejudice by allowing candidates who engage in prejudiced speech to continue their campaigns without sanction," they added. "Last, we find that condemnation by other elites does little to suppress prejudice once it is activated."

Credit: 
University of California - Riverside

Defects add color to quantum systems

In a future built on quantum technologies, planes and spaceships could be fueled by the momentum of light. Quantum computers will crunch through complex problems spanning chemistry to cryptography with greater speed and energy efficiency than existing processors. But before this future can come to pass, we need bright, on-demand, predictable sources of quantum light.

Toward this end, a team of Stanford University materials scientists, physicists and engineers, in collaboration with labs at Harvard University and the University of Technology Sydney, have been investigating hexagonal boron nitride, a material that can emit bright light as a single photon - a quantum unit of light - at a time. And it can do this at room temperature, making it easier to use compared to alternative quantum sources.

Unfortunately, hexagonal boron nitride has a significant downside: It emits light in a rainbow of different hues. "While this emission is beautiful, the color currently can't be controlled," said Fariah Hayee, the lead author and a graduate student in the lab of Jennifer Dionne, associate professor of materials science and engineering at Stanford. "We wanted to know the source of the multi-color emission, with the ultimate goal of gaining control over emission."

By employing a combination of microscopic methods, the scientists were able to trace the material's colorful emission to specific atomic defects. A group led by co-author Prineha Narang, assistant professor of computational materials science at Harvard University, also developed a new theory to predict the color of defects by accounting for how light, electrons and heat interact in the material.

"We needed to know how these defects couple to the environment and if that could be used as a fingerprint to identify and control them," said Christopher Ciccarino, a graduate student in the NarangLab at Harvard University and co-author of the paper.

The researchers describe their technique and different categories of defects in a paper published in the February 24 issue of the journal Nature Materials.

Multiscale microscopy

Identifying the defects that give rise to quantum emission is a bit like searching for a friend in a crowded city without a cellphone. You know they are there, but you have to scan the full city to find their precise location.

By stretching the capabilities of a one-of-a-kind, modified electron microscope developed by the Dionne lab, the scientists were able to match the local, atomic-scale structure of hexagonal boron nitride with its unique color emission. Over the course of hundreds of experiments, they bombarded the material with electrons and visible light and recorded the pattern of light emission. They also studied how the periodic arrangement of atoms in hexagonal boron nitride influenced the emission color.

"The challenge was to tease out the results from what can seem to be a very messy quantum system. Just one measurement doesn't tell the whole picture," said Hayee. "But taken together, and combined with theory, the data is very rich and provides a clear classification of quantum defects in this material."
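The classification the researchers describe, sorting emitters into defect categories by their optical signatures, can be loosely illustrated by binning emitters by peak emission wavelength. The wavelengths, class boundaries, and labels below are hypothetical placeholders, not values from the Nature Materials paper:

```python
# Illustrative sketch: group single-photon emitters into color classes by
# peak emission wavelength, loosely mirroring how spectra can be binned
# into defect categories. All numbers and labels here are hypothetical.

from collections import defaultdict

# Hypothetical emitter peak wavelengths in nm
peaks_nm = [565, 580, 622, 575, 640, 590, 630]

# Hypothetical class boundaries (nm) separating defect categories
bounds = [(550, 600, "class A"), (600, 650, "class B")]

def classify(peak_nm):
    """Assign an emitter to a color class by its peak wavelength."""
    for lo, hi, label in bounds:
        if lo <= peak_nm < hi:
            return label
    return "unclassified"

groups = defaultdict(list)
for p in peaks_nm:
    groups[classify(p)].append(p)

print(dict(groups))
# {'class A': [565, 580, 575, 590], 'class B': [622, 640, 630]}
```

In the actual study the grouping combined electron-microscopy structure, emission spectra, and theory rather than a single wavelength cut, but the idea of mapping each emitter to a discrete defect category is the same.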

In addition to their specific findings about types of defect emissions in hexagonal boron nitride, the process the team developed to collect and classify these quantum spectra could, on its own, be transformative for a range of quantum materials.

"Materials can be made with near atomic-scale precision, but we still don't fully understand how different atomic arrangements influence their opto-electronic properties," said Dionne, who is also director of the Photonics at Thermodynamic Limits Energy Frontier Research Center (PTL-EFRC). "Our team's approach reveals light emission at the atomic-scale, en route to a host of exciting quantum optical technologies."

A superposition of disciplines

Although the focus now is on understanding which defects give rise to certain colors of quantum emission, the eventual aim is to control their properties. For example, the team envisions strategic placement of quantum emitters, as well as turning their emission on and off for future quantum computers.

Research in this field requires a cross-disciplinary approach. This work brought together materials scientists, physicists and electrical engineers, both experimentalists and theorists, including Tony Heinz, professor of applied physics at Stanford and of photon science at the SLAC National Accelerator Laboratory, and Jelena Vučković, the Jensen Huang Professor in Global Leadership in the School of Engineering.

"We were able to lay the groundwork for creating quantum sources with controllable properties, such as color, intensity and position," said Dionne. "Our ability to study this problem from several different angles demonstrates the advantages of an interdisciplinary approach."

Additional Stanford co-authors of this paper include Leo Yu, a postdoctoral scholar in the Heinz lab, and Jingyuan Linda Zhang, who was a graduate student in the Ginzton Laboratory during this research. Other co-authors include researchers from the University of Technology Sydney in Australia. Dionne is also a member of Stanford Bio-X, an affiliate of the Precourt Institute for Energy and a member of the Wu Tsai Neurosciences Institute at Stanford. Vučković is also a professor of electrical engineering and a member of Stanford Bio-X and of the Wu Tsai Neurosciences Institute.

Credit: 
Stanford University