Body

Global study finds each city has unique microbiome fingerprint of bacteria

Each city has its own unique microbiome, a "fingerprint" of viruses and bacteria that identifies it, according to a new study from an international consortium of researchers that included a team from the University of Maryland School of Medicine (UMSOM). The international project, which sequenced and analyzed samples collected from public transit systems and hospitals in 60 cities around the world, was published today in the journal Cell.

The research is considered to be the largest-ever global metagenomic study of urban microbiomes, spanning both the air and the surfaces of multiple cities. It features a comprehensive analysis of all the microbial species identified, including thousands of viruses and bacteria and two newly identified single-cell organisms not found in reference databases.

Study co-author Lynn Schriml, PhD, Associate Professor in the Department of Epidemiology & Public Health, Institute for Genome Sciences (IGS), at UMSOM, led the study sampling efforts for Baltimore's transit systems. "Baltimore's distinct microbial signature reveals a unique, fascinating, and diverse world, providing insights into geographical variation and previously unknown microbial genomes," she said.

Added study senior author Christopher Mason, PhD, a professor at Weill Cornell Medicine and the director of the WorldQuant Initiative for Quantitative Prediction: "Every city has its own 'molecular echo' of the microbes that define it. If you gave me your shoe, I could tell you with about 90 percent accuracy the city in the world from which you came."

The study was conducted before the COVID-19 pandemic shut down cities throughout the world, so the researchers are now looking at how the pandemic affected the microbiome fingerprint of each city. "It's a good question," Schriml said, "and we are addressing this in follow-up research." The consortium launched the MetaCOV project in 2020 to investigate the change in urban metagenomes and detect the presence of the SARS-CoV-2 virus (the virus that causes COVID-19) in urban environments (e.g. ATMs, wastewater, hospitals, transit systems).

Findings in the latest research are based on an analysis of 4,728 samples from cities on six continents taken over the course of three years and represent the first systematic worldwide catalogue of the urban microbial ecosystem. In addition to distinct microbial signatures in various cities, the analysis revealed a core set of 31 species that were found in 97 percent of samples across the sampled urban areas. The researchers identified 4,246 known species of urban microorganisms, but they also found that any subsequent sampling will still likely continue to find species that have never been seen before, which highlights the raw potential for discoveries related to microbial diversity and biological functions awaiting in urban environments.
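The "fingerprint" idea can be illustrated with a toy sketch. Everything here is synthetic and hypothetical: the city names, taxa panel, and nearest-centroid classification are illustrative stand-ins, not the consortium's data or method. Each city's training samples are averaged into a characteristic taxa-abundance profile, and a new sample is assigned to the closest profile.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each city has a characteristic relative-abundance
# profile over a panel of microbial taxa; samples are noisy draws from it.
n_taxa, n_train = 50, 40
city_profiles = {c: rng.dirichlet(np.ones(n_taxa))
                 for c in ["baltimore", "tokyo", "oslo"]}

def sample_from(profile, n_reads=2000):
    """Simulate one sequenced sample as multinomial read counts."""
    counts = rng.multinomial(n_reads, profile)
    return counts / counts.sum()  # relative abundances

# Build per-city centroids from training samples -- the "fingerprints".
centroids = {c: np.mean([sample_from(p) for _ in range(n_train)], axis=0)
             for c, p in city_profiles.items()}

def classify(sample):
    # Assign the sample to the nearest fingerprint in abundance space.
    return min(centroids, key=lambda c: np.linalg.norm(sample - centroids[c]))

test_sample = sample_from(city_profiles["tokyo"])
print(classify(test_sample))
```

In practice the consortium's classifiers work on far larger taxonomic feature vectors, but the principle, matching a sample's microbial composition against per-city reference signatures, is the same.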

In the future, the findings also have many potential practical applications, including the identification of new compounds that could serve as antibiotics, and of small molecules, annotated from biosynthetic gene clusters (BGCs), that hold promise for drug development.

Credit: 
University of Maryland School of Medicine

Serendipitous discovery could lead to treatment for strokes, cardiac arrest

BOSTON - In a surprising discovery, researchers at Massachusetts General Hospital (MGH) identified a mechanism that protects the brain from the effects of hypoxia, a potentially lethal deprivation of oxygen. This serendipitous finding, which they report in Nature Communications, could aid in the development of therapies for strokes, as well as brain injury that can result from cardiac arrest, among other conditions.

However, this study began with a very different objective, explains senior author Fumito Ichinose, MD, PhD, an attending physician in the Department of Anesthesia, Critical Care and Pain Medicine at MGH, and principal investigator in the Anesthesia Center for Critical Care Research. One area of focus for Ichinose and his team is developing techniques for inducing suspended animation, that is, putting a human's vital functions on temporary hold, with the ability to "reawaken" them later. This state of being would be similar to what bears and other animals experience during hibernation. Ichinose believes that the ability to safely induce suspended animation could have valuable medical applications, such as pausing the life processes of a patient with an incurable disease until an effective therapy is found. It could also allow humans to travel long distances in space (which has frequently been depicted in science fiction).

A 2005 study found that inhaling a gas called hydrogen sulfide caused mice to enter a state of suspended animation. Hydrogen sulfide, which has the odor of rotten eggs, is sometimes called "sewer gas." Oxygen deprivation in a mammal's brain leads to increased production of hydrogen sulfide. As this gas accumulates in the tissue, hydrogen sulfide can halt energy metabolism in neurons and cause them to die. Oxygen deprivation is a hallmark of ischemic stroke, the most common type of stroke, and other injuries to the brain.

In the Nature Communications study, Ichinose and his team initially set out to learn what happens when mice are exposed to hydrogen sulfide repeatedly, over an extended period. At first, the mice entered a suspended-animation-like state--their body temperatures dropped and they were immobile. "But, to our surprise, the mice very quickly became tolerant to the effects of inhaling hydrogen sulfide," says Ichinose. "By the fifth day, they acted normally and were no longer affected by hydrogen sulfide."

Interestingly, the mice that became tolerant to hydrogen sulfide were also able to tolerate severe hypoxia. What protected these mice from hypoxia? Ichinose's group suspected that enzymes in the brain that metabolize sulfide might be responsible. They found that levels of one enzyme, called sulfide:quinone oxidoreductase (SQOR), rose in the brains of mice when they breathed hydrogen sulfide several days in a row. They hypothesized that SQOR plays a part in resistance to hypoxia.

There was strong evidence for this hypothesis in nature. For example, female mammals are known to be more resistant than males to the effects of hypoxia--and the former have higher levels of SQOR. When SQOR levels are artificially reduced in females, they become more vulnerable to hypoxia. (Estrogen may be responsible for the observed increase in SQOR, since protection from the adverse effects of hypoxia is lost when a female mammal's estrogen-producing ovaries are removed.) Moreover, some hibernating animals, such as the thirteen-lined ground squirrel, are highly tolerant of hypoxia, which allows them to survive as their bodies' metabolism slows down during the winter. A typical ground squirrel's brain has 100 times more SQOR than that of a similar-sized rat. However, when Ichinose and colleagues "turned off" expression of SQOR in the squirrels' brains, their protection against the effects of hypoxia vanished.

Meanwhile, when Ichinose and colleagues artificially increased SQOR levels in the brains of mice, "they developed a robust defense against hypoxia," explains Ichinose. His team increased the level of SQOR using gene therapy, an approach that is technically complex and not yet practical. On the other hand, Ichinose and his colleagues demonstrated that "scavenging" sulfide with an experimental drug called SS-20 reduced levels of the gas, thereby sparing the brains of mice when they were deprived of oxygen.

Human brains have very low levels of SQOR, meaning that even a modest accumulation of hydrogen sulfide can be harmful, says Ichinose. "We hope that someday we'll have drugs that could work like SQOR in the body," he says, noting that his lab is studying SS-20 and several other candidates. Such medications could be used to treat ischemic strokes, as well as patients who have suffered cardiac arrest, which can lead to hypoxia. Ichinose's lab is also investigating how hydrogen sulfide affects other parts of the body. For example, hydrogen sulfide is known to accumulate in other conditions, such as certain types of Leigh syndrome, a rare but severe neurological disorder that usually leads to early death. "For some patients," says Ichinose, "treatment with a sulfide scavenger might be lifesaving."

Credit: 
Massachusetts General Hospital

Prism adaptation treatment improves rehabilitation outcomes in people with spatial neglect

image: A research participant wearing prism goggles performs a task during prism adaptation treatment.

Image: 
Kessler Foundation/Jody Banks

East Hanover, NJ. May 25, 2021. A team of experts in post-stroke neurorehabilitation confirmed that including prism adaptation treatment in the standard of care for patients with post-stroke spatial neglect improved functional and cognitive outcomes, as measured by the Functional Independence Measure®. The article, "Prism Adaptation Treatment Improves Inpatient Rehabilitation Outcome in Individuals with Spatial Neglect: A Retrospective Matched Control Study" (doi: 10.1016/j.arrct.2021.100130), was published in Archives of Rehabilitation Research and Clinical Translation on May XX, 2021. It is available open access at https://www.sciencedirect.com/science/article/pii/S2590109521000343

The authors are Peii Chen, PhD, and Emma Kaplan, BS, of Kessler Foundation's Center for Stroke Rehabilitation Research; Nicole Diaz-Segarra, MD, of Kessler Institute for Rehabilitation; Kimberly Hreha, EdD, OTR/L, of the University of Texas Medical Branch; and A.M. Barrett, MD, of the Atlanta VA Health Care System. Dr. Chen also has an academic appointment at Rutgers New Jersey Medical School and Dr. Barrett has an academic appointment at Emory University School of Medicine.

Recent research by Dr. Chen's group shows that after a stroke, approximately 30 percent of people experience spatial neglect, a neurological disorder that disrupts a person's 'internal GPS', causing difficulties in navigating their environment. Spatial neglect can also affect people with other types of brain injuries. Symptoms include poor balance and navigation as well as impairments in reading and memory. In addition, spatial neglect slows rehabilitation progress and functional recovery, and increases the risk of injury.

Prism adaptation treatment is a very promising therapy for spatial neglect, with some studies indicating that its beneficial effects may last years. During prism adaptation, people wear goggles equipped with optical prisms that shift their motor movements toward the neglected side. Through a ten-session program, individuals regain some ability to function in the neglected space.

In this video, Dr. Chen demonstrates prism adaptation treatment and how it works.

https://www.youtube.com/watch?v=DvlCWTXh12E

However, prism adaptation treatment has only been tested as an auxiliary therapy, with little data available regarding its effectiveness when integrated into the standard of care for people with spatial neglect.

In this study, researchers analyzed data from patients with spatial neglect at 14 rehabilitation hospitals where Kessler Foundation Prism Adaptation Treatment (KF-PAT®) was implemented in occupational therapy. They compared the outcomes of patients who received 8 to 12 daily sessions of prism adaptation treatment to patients who were untreated. The primary outcome measure was the Functional Independence Measure (FIM®), and the secondary measure was discharge destination.

"Our results clearly demonstrated that prism adaptation treatment enhances rehabilitation outcome," said Dr. Chen, senior research scientist at Kessler Foundation. "The treated group showed reliably higher scores than the untreated group in total functional independence and cognitive functional independence." She added, "This is extremely encouraging evidence that integrating prism adaptation into standard of care for people with spatial neglect is beneficial."

Study results did not show a significant effect in the rate of return-home after discharge, and the authors note that more research on prism adaptation treatment in this population is needed to further clarify how to optimize its effectiveness.

Credit: 
Kessler Foundation

Asthma medication use and exacerbations

Boston, MA-- How does the switch to a high-deductible health plan affect children with asthma? A new study led by researchers at the Harvard Pilgrim Health Care Institute suggests that enrollment in a high-deductible health plan (HDHP) may not be associated with changes in asthma medication use or asthma exacerbations when medications are exempt from the deductible. The findings were published in JAMA Pediatrics on May 10.

To treat asthma, clinical guidelines recommend the use of controller medications, but adherence to these medications is generally suboptimal, putting those affected at risk for asthma exacerbations. High out-of-pocket costs have been associated with decreased controller medication use and adverse asthma outcomes for children and adults. While most evidence about HDHPs has come from studies focused on adult populations, the study team, led by Alison Galbraith, MD, MPH, lead author and Associate Professor in the Department of Population Medicine at Harvard Medical School, examined how enrollment in HDHPs may affect asthma controller medication use and exacerbations in children.

"One challenge of insurance design is balancing affordable coverage with access to necessary care for chronic conditions for both children and adults," said Dr. Galbraith. "Our findings highlight the potential protective effect of exempting asthma medications from the deductible in high-deductible health plans."

The study population, drawn from a large, national, commercial database, included children (ages 4 to 17 years) and adults (ages 18 to 64 years) with persistent asthma who switched from traditional plans to HDHPs during a 24-month period. Compared to those who remained in traditional plans, children switching to HDHPs experienced small decreases in annual 30-day fills for inhaled corticosteroid-long-acting beta agonist medications but not for other controller medications. Adults switching to HDHPs did not have significant reductions in 30-day fills for any controller medications. There were no statistically significant differences in medication adherence, oral steroid bursts, or asthma-related ED visits for children or adults.

Regarding possible next steps, Dr. Galbraith adds, "Asthma is a major cause of preventable disease burden for both children and adults. Policy makers should consider adopting value-based designs and other policies exempting important medications for asthma and other chronic conditions--which might prevent adverse clinical outcomes--from the deductible."

Credit: 
Harvard Pilgrim Health Care Institute

Nonprofits, federal government surpass pharma to lead Alzheimer's drug development

Two articles published online today by Alzheimer's & Dementia: Translational Research & Clinical Interventions, a journal of the Alzheimer's Association, show substantial changes in the focus and funding of clinical trials for Alzheimer's disease therapies. The newly published articles throw a greater spotlight on a decision -- now before the U.S. Food and Drug Administration (FDA) -- that would potentially bring a new drug therapy to Alzheimer's patients for the first time in nearly 20 years.

Researchers analyzed clinicaltrials.gov, the U.S. National Library of Medicine's database, and five years of annual Alzheimer's pipeline reviews published by UNLV School of Integrated Health Sciences research professor Jeffrey L. Cummings and colleagues. The results capture the well-publicized retreat of pharma from Alzheimer's clinical trials, especially early phase human trials, and the emergence of federal agencies and nonprofit organizations as the primary drivers of growth and innovation.

In the first study, "Who Funds Alzheimer's Disease Drug Development?," Cummings and colleagues found that the number of Alzheimer's clinical trials supported by pharmaceutical companies has decreased over the past five years, while trials supported by federal government sources and public-private partnerships (PPP) have increased. The authors observe that pharma companies are not increasing their involvement in Alzheimer's trials and drug development except through PPP, enabling them to distribute the cost and risk. And they largely engage only in late-stage (Phase 3) clinical trials.

The researchers found that the trials gap is increasingly being filled by academic medical centers (AMCs). Trials by AMCs are up 78% over the past five years, primarily funded by the U.S. National Institutes of Health (NIH) and programs of the National Institute on Aging (NIA), Alzheimer's Association, and Alzheimer's Drug Discovery Foundation (ADDF), including the Alzheimer's Association's Part The Cloud initiative.

"Nonprofits and the NIH are making a huge difference in drug development for Alzheimer's and all other dementia," Cummings said. "Recent years have been a time of pharma retrenchment after multiple negative clinical trials, but also a time of innovation in early-stage trials and re-evaluation of previously under-resourced ideas. We found in our review that, in the newer early-stage clinical trials, the therapeutic mechanisms are more diversified, biomarkers are more regularly used, and repurposed agents are being explored -- increasingly led by academic researchers and funded by NIH, the Alzheimer's Association, and ADDF."

A second paper, "Alzheimer's Disease Drug Development Pipeline: 2021," also by Cummings and colleagues, including a student, Justin Bauzon, from the UNLV School of Medicine, reinforces these trends by showing that, despite pharma's retreat from Alzheimer's, the total number of agents in Alzheimer's clinical trials has been relatively steady over the last five years. The total is up slightly from 2020, driven by additional agents in Phase 2 studies. There is also increasing diversity of targets and therapeutic mechanisms of drugs in the Alzheimer's pipeline, driven by innovative Phase 1 and 2 trials.

"Alzheimer's Association funding, partnerships - including the NIA and ADDF - and advocacy for federal Alzheimer's research funding are now the primary drivers of growth in Alzheimer's clinical trials, filling the gap left by pharma's retreat, and growing and diversifying the front end of the drug pipeline," said Maria C. Carrillo, Alzheimer's Association chief science officer.

The NIA now distributes more than $3 billion annually for Alzheimer's and dementia research, up from $500 million just a few years ago. "This great victory is almost completely due to Alzheimer's Association legislative efforts, our grassroots advocates, and our champions in Congress," Carrillo said.

The FDA is reviewing aducanumab (Biogen) for the treatment of Alzheimer's disease. A decision is expected by June 7.

"If pharma companies do not see a clear path to FDA approval, they will continue not to invest in Alzheimer's," Cummings said. "This further highlights the importance of the decision before the FDA at this moment."

There are four drugs approved and commonly used to treat the symptoms of Alzheimer's dementia, plus a combination therapy that includes two of these drugs. There are currently no approved drugs that change the course or delay the progression of the disease or that delay or stop clinical decline. No new drugs have been approved for Alzheimer's since 2003.

The article authors say, "If new therapies are approved by regulatory authorities, more sponsors and more funding may be attracted to Alzheimer's research with accelerated innovation."

The two studies were supported by the Chambers-Grundy Center for Transformative Neuroscience at UNLV, dedicated to advancing clinical trial methods to get better treatments to patients faster.

Credit: 
Alzheimer's Association

COVID-19 news from Annals of Internal Medicine

Vaccination not associated with worsening symptoms or quality of life in patients with persisting symptoms after acute COVID-19
Findings may help those experiencing vaccine hesitancy after infection

Free full text: https://www.acpjournals.org/doi/10.7326/M21-1976

A small case series of patients who received one dose of either the Pfizer-BioNTech or Oxford-AstraZeneca vaccine found that vaccination was not associated with worsening symptoms or quality of life in patients with persisting symptoms after acute COVID-19. These findings may help to assuage vaccine hesitancy in patients with persistent symptoms. The case series is published in Annals of Internal Medicine.

Researchers from North Bristol NHS Trust, Bristol, United Kingdom, studied 163 patients from a single U.K. hospital to describe quality of life and symptoms after vaccination in a series of patients with persistent symptoms 8 months after hospitalization with COVID-19. Prior to vaccination, the patients reported a total of 159 symptoms across organ systems, including fatigue (75%), breathlessness (61%), and insomnia (53%). In addition, quality of life was markedly reduced from the norm.

All participants were reassessed at approximately 1 month after receiving the vaccine, and quality-of-life questionnaires and review of symptoms were repeated, with specific questions on whether symptoms had improved, stayed the same, or worsened. Participants were only asked to confirm vaccination status after symptom assessment to minimize bias due to a perceived association between the assessment and vaccination. They were subsequently asked about adverse effects temporally related to the vaccine.

Among the 44 participants who had received 1 dose of vaccine, 82% reported at least 1 persistent symptom. Among the 159 symptoms reported before vaccination, 23.2% had improved, 5.6% had worsened, and 71.1% had stayed the same. There was no significant worsening in quality-of-life metrics before versus after vaccination. No difference in any outcome measure was identified between the two different vaccines.

According to the researchers, these observations are important because they may provide reassurance to the increasing number of persons experiencing persistent symptoms after acute SARS-CoV-2 infection that receipt of a messenger RNA or adenoviral vector vaccine is not associated with a decrease in quality of life or worsening of symptoms.

Credit: 
American College of Physicians

Low blood flow in the brain may be an early sign of Parkinson's disease

Patients who suffer from REM sleep behaviour disorder have altered blood flow in the brain, which can lead to a lack of oxygen in the brain tissue. In the long term, this may cause symptoms of Parkinson's disease. This is shown by research from Aarhus University and Aarhus University Hospital.

Do you sleep restlessly and flail your arms and kick out in your sleep? This could be a sign of a disorder associated with diseases of the brain. Researchers from AU and AUH have examined whether the sleep disorder RBD - which is also known as Rapid Eye Movement Sleep Behaviour Disorder - may be an early sign of Parkinson's disease.

"We can see complications in the small blood vessels of the brain in patients with RBD, although these patients don't otherwise have any symptoms and the brain doesn't show other signs of disease," says Simon Fristed Eskildsen, who is behind the study.

He explains:

"We believe that the same disease processes that cause disrupted sleep also affect the ability to control the blood flow in the brain, which can lead to a lack of oxygen in the brain tissue. Over time this will gradually break down the brain tissue and cause symptoms that we see in Parkinson's disease."

Monitored while asleep

The changes in the brain are associated with reduced neurotransmitters, meaning that nerves in the brain have trouble controlling the blood vessels.

"A medical treatment would be able to restore the neurotransmitter and control of the blood vessels, thereby helping to maintain the cognitive function of patients who show early signs of Parkinson's disease," explains the researcher.

Twenty RBD patients aged 54-77 years and 25 healthy control subjects aged 58-76 participated in the study. The participants in the study were monitored in a sleep laboratory, where they had their EEG (electrical activity in the brain), EOG (eye movements), EMG (muscle activity) and ECG (electrical activity in the heart) measured during sleep.

"The patients and the control subjects were tested cognitively and MRI scanned, and the results revealed low blood flow and flow disturbances in the small blood vessels in the brain in the patients compared with the control group. In the patients, these flow disturbances seen in the cerebral cortex were associated with language comprehension, visual construction and recognition - this was also associated with reduced cognitive performance," says last author of the study, Nicola Pavese.

The researchers will now investigate whether the reduced blood flow in the brain deteriorates over time and how it is linked to the symptoms of Parkinson's disease. The hope is that it will be possible to use the method to predict the disease in patients with sleep disorders in order to then prevent the symptoms at an early stage.

The results have just been published in the scientific journal Brain.

Parkinson's disease

There are 7,300 patients with Parkinson's disease in Denmark. Symptoms are slow movements, often with shaking, together with muscular rigidity. Parkinson's disease is a chronic condition that continues to worsen over time. About half of patients experience cognitive decline early in the disease. The disease is somewhat more common in men than in women. Parkinson's disease occurs because the brain lacks dopamine. It primarily affects adults, and the first signs most often appear between the ages of 50 and 70.

Credit: 
Aarhus University

Data from smartwatches can help predict clinical blood test results

image: The image shows the heart rate monitor reading on a standard smart watch.

Image: 
Michaela Kane, Duke University

DURHAM, N.C. -- Smartwatches and other wearable devices may be used to sense illness, dehydration and even changes to the red blood cell count, according to biomedical engineers and genomics researchers at Duke University and the Stanford University School of Medicine.

The researchers say that, with the help of machine learning, wearable device data on heart rate, body temperature and daily activities may be used to predict health measurements that are typically observed during a clinical blood test. The study appears in Nature Medicine on May 24, 2021.

During a doctor's office visit, a medical worker usually measures a patient's vital signs, including their height, weight, temperature and blood pressure. Although this information is filed away in a person's long-term health record, it isn't usually used to create a diagnosis. Instead, physicians will order a clinical lab, which tests a patient's urine or blood, to gather specific biological information to help guide health decisions.

These vital measurements and clinical tests can inform a doctor about specific changes to a person's health, like if a patient has diabetes or has developed pre-diabetes, if they're getting enough iron or water in their diet, and if their red or white blood cell count is in the normal range.

But these tests are not without their drawbacks. They require an in-person visit, which isn't always easy for patients to arrange, and procedures like a blood draw can be invasive and uncomfortable. Most notably, these vitals and clinical samples are not usually taken at regular and controlled intervals. They only provide a snapshot of a patient's health on the day of the doctor's visit, and the results can be influenced by a host of factors, like when a patient last ate or drank, stress, or recent physical activity.

"There is a circadian (daily) variation in heart rate and in body temperature, but these single measurements in clinics don't capture that natural variation," said Duke's Jessilyn Dunn, a co-lead and co-corresponding author of the study. "But devices like smartwatches or Fitbits have the ability to track these measurements and natural changes over a prolonged period of time and identify when there is variation from that natural baseline."

To gain a consistent and fuller picture of patients' health, Dunn, an assistant professor of biomedical engineering at Duke, Michael Snyder, a professor and chair of genetics at Stanford, and their team wanted to explore if long-term data gathered from wearable devices could match changes that were observed during clinical tests and help indicate health abnormalities.

The study, which began in 2015 at Stanford with the Integrative Personal Omics Profiling (iPOP) cohort, included 54 patients. Over three years, the iPOP participants wore an Intel Basis smart watch that measured their heart rate, movement, skin temperature and sweat gland activation. The participants also attended regular clinic visits, where researchers used traditional measurement methods to track things like heart rate, temperature, red and white blood cell count, glucose levels, and iron levels.

The experiment showed that there were multiple connections between the smartwatch data and clinical blood tests. For example, if a participant's watch showed lower sweat gland activation, as measured by an electrodermal sensor, that indicated the patient was consistently dehydrated.

"Machine learning methods applied to this unique combination of clinical and real-world data enabled us to identify previously unknown relations between smartwatch signals and clinical blood tests," said Łukasz Kidziński, a co-lead author of the study and a researcher at Stanford.

The team also found that measurements that are taken during a complete blood lab, like hematocrit, hemoglobin, and red and white blood cell count, had a close relationship to the wearables data. A higher sustained body temperature coupled with limited movement tended to indicate illness, which matched up with a higher white blood cell count in the clinical test. A record of decreased activity with a higher heart rate could also indicate anemia, which occurs when there isn't enough iron in a patient's blood.

Although the wearables data isn't specific enough to accurately predict the precise number of red or white blood cells, Dunn and the team are highly optimistic that it could be a noninvasive and fast way to indicate when something in a patient's medical data is abnormal.
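The reported pattern can be sketched in miniature. This is a toy example on entirely synthetic data, using ordinary least squares rather than the study's machine-learning models; the feature names, coefficients, and units are all made up for illustration. The idea is simply that wearable features fitted against a lab value should recover the reported directions: higher sustained temperature and lower activity tracking a higher white-cell count.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical daily wearable features: mean heart rate, skin-temperature
# deviation from the personal baseline (deg C), and step count (activity).
n_days = 200
hr = rng.normal(70, 8, n_days)
temp = rng.normal(0.0, 0.4, n_days)
steps = rng.normal(8000, 2500, n_days)

# Synthetic target mimicking the reported pattern: higher sustained
# temperature and lower activity go with a higher white-cell count.
wbc = (6.0 + 2.5 * temp - 0.0002 * steps + 0.01 * (hr - 70)
       + rng.normal(0, 0.3, n_days))

# Fit a linear model by least squares and check the fitted signs.
X = np.column_stack([np.ones(n_days), hr, temp, steps])
coef, *_ = np.linalg.lstsq(X, wbc, rcond=None)

pred = X @ coef
r2 = 1 - np.sum((wbc - pred) ** 2) / np.sum((wbc - wbc.mean()) ** 2)
print(f"R^2 = {r2:.2f}")
```

On real data the relationships are noisier and nonlinear, which is why the study used machine-learning methods rather than a single linear fit, but the sign checks on the fitted coefficients capture the qualitative finding.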

"If you think about someone just showing up in an emergency room, it takes time to check them in, to get labs going, and to get results back," said Dunn. "But if you were to show up in an ER and you've got an Apple Watch or a Fitbit, ideally you'd be able to pull the long-term data from that device and use algorithms to say, 'this may be what's going on.'

"This experiment was a proof-of-concept, but our hope for the future is that physicians will be able to use wearable data to immediately get valuable information about the overall health of a patient and know how to treat them before the clinical labs are returned," Dunn said. "There is a potential for life-saving intervention there if we can get people the right care faster."

Credit: 
Duke University

AI spots neurons better than human experts

image: The graphic shows an image generated by AO-OCT (top), and the result of the WeakGCSeg algorithm identifying and tracing the shapes of the ganglion cells in the eye (bottom).

Image: 
Sina Farsiu, Duke University

DURHAM, N.C. -- A new combination of optical coherence tomography (OCT), adaptive optics and deep neural networks should enable better diagnosis and monitoring for neuron-damaging eye and brain diseases like glaucoma.

Biomedical engineers at Duke University led a multi-institution consortium to develop the process, which easily and precisely tracks changes in the number and shape of retinal ganglion cells in the eye.

This work appears in a paper published on May 3 in the journal Optica.

The retina of the eye is an extension of the central nervous system. Ganglion cells are one of the primary neurons in the eye that process and send visual information to the brain. In many neurodegenerative diseases like glaucoma, ganglion cells degenerate and disappear, leading to irreversible blindness. Traditionally, researchers use OCT, an imaging technology similar to ultrasound that uses light instead of sound, to peer beneath layers of eye tissue to diagnose and track the progression of glaucoma and other eye diseases.

Although OCT allows researchers to efficiently view the ganglion cell layer in the retina, the technique is only sensitive enough to show the thickness of the cell layer -- it can't reveal individual ganglion cells. This hinders early diagnosis and rapid tracking of disease progression, as large numbers of ganglion cells need to disappear before physicians can see the changes in thickness.

To remedy this, researchers have turned to a recent technology called adaptive optics OCT (AO-OCT), which enables imaging sensitive enough to view individual ganglion cells. Adaptive optics minimizes the effect of the optical aberrations that occur when examining the eye, which are a major limiting factor in achieving high resolution in OCT imaging.

"This higher resolution makes it easier to diagnose neurodegenerative diseases," said Sina Farsiu, Professor of Biomedical Engineering at Duke. "But it also generates such a large amount of data that image analysis has become a major bottleneck in wide utilization of this potentially game-changing technology in eye and brain research."

In their new paper, Farsiu and Somayyeh Soltanian-Zadeh, a postdoctoral researcher in Farsiu's lab, devise a solution to this problem by developing a highly adaptive and easy-to-train deep learning-based algorithm that is the first to identify and trace the shapes of ganglion cells from AO-OCT scans.

To test the accuracy of their approach, which they've dubbed WeakGCSeg, the team analyzed AO-OCT data from retinas of both healthy and glaucoma subjects. Their framework efficiently and accurately segmented ganglion cells from both samples, and identified which samples came from the glaucomatous eyes based on the number and size of ganglion cells present.

"Our experimental results showed that WeakGCSeg is actually superior to human experts, and it's superior to other state-of-the-art networks that can process volumetric biomedical images," said Soltanian-Zadeh.

In addition to diagnostic work, the team is optimistic that WeakGCSeg will make it easier to conduct clinical trials of therapies for neurodegenerative diseases. For example, if a study is testing a therapy for glaucoma, WeakGCSeg can see if the therapy has slowed down cell degeneration compared to the control group. With OCT alone, the first sign of change would require hundreds if not thousands of cells dying, which can take months or years.

"With our technique, you'd be able to quantify the earliest change," said Farsiu. "Your clinical trial may also be shorter because you can see and measure such an early effect, so there's a lot of potential here."

The team plans to continue their collaboration with colleagues at the Food and Drug Administration (FDA), Indiana University, and the University of Maryland to apply their technique to a larger cohort of patients. They are also hoping to extend WeakGCSeg to different cell types, like photoreceptors, and diseases of the eye, like retinitis pigmentosa and inherited retinal diseases.

WeakGCSeg also has the potential to improve diagnosis and tracking the progression of neurological diseases. According to Farsiu, previous studies have shown that changes in the ganglion cell layer are associated with various diseases of the central nervous system, like Alzheimer's disease, Parkinson's disease, and ALS. With their new technique, they can further study this connection and potentially discover helpful biomarkers for improved diagnosis and treatment for these and other neurodegenerative diseases.

"We're incredibly grateful to our collaborators at the FDA and Indiana University for providing us with samples to test WeakCGSeg," said Farsiu. "And this work could not have been possible without the pioneering works of Donald Miller at Indiana University and Zhuolin Liu and Daniel Hammer at FDA in advancing the AO-OCT imaging technology. It is exciting to see the impact of such in vivo single-neuron imaging technologies on healthcare in the next decade."

Credit: 
Duke University

Pain monitoring helps assess the effectiveness of opioid-sparing approaches during surgery

image: A new study has shown that effective opioid-sparing anaesthesia with dexmedetomidine can be guided with NOL pain monitoring technology (Medasense). The study showed that the NOL monitor is able to detect the effect of dexmedetomidine on the patient's pain response and enable administration of less intraoperative opioids.

Image: 
Medasense

A new study has shown that effective opioid-sparing anaesthesia with dexmedetomidine can be guided with NOL pain monitoring technology (Medasense, Israel). The study showed that the NOL monitor is able to detect the effect of dexmedetomidine on the patient's pain response and enable administration of less intraoperative opioids.

Patients undergoing anaesthesia for surgical procedures are traditionally treated with opioids (e.g., remifentanil) to manage intraoperative pain. But clinicians are progressively seeking to reduce opioid use by introducing multimodal analgesia, a technique that involves a combination of medications that often includes a central alpha agonist, such as dexmedetomidine. This approach may offer pain relief, offset potentially adverse effects of individual drugs in larger doses and enable a reduction of opioid use during surgery.1 With this strategy, however, clinicians are not always able to predict whether the impact of the drug combination will be effective or excessive.

The current study addressed this challenge in multimodal analgesia and examined its effectiveness by monitoring pain response (nociception) levels in patients with the NOL monitoring system.

NOL monitoring is a non-invasive multi-sensor AI technology that provides a reliable index to objectively detect and quantify the patient's physiological response to painful stimuli during anaesthesia, when patients can't communicate. This index guides the clinical team in tailored pain medication for each patient. Earlier studies have shown that the NOL index outperforms other indexes2 for monitoring of pain response to surgical stimuli and that NOL-guided analgesia has resulted in reduced intraoperative opioid consumption3 as well as less pain after surgery.4

Led by Dr. Sean Coeckelenbergh of Erasme University Hospital in Brussels, Belgium, the new randomised controlled study examined 58 patients undergoing NOL-guided analgesia. Patients were randomised to receive either placebo or low-dose dexmedetomidine and both groups received intraoperative antinociception with a target-controlled infusion of remifentanil guided by the NOL index. The study, just published in the European Journal of Anaesthesiology,5 has shown that NOL can provide an objective reflection of dexmedetomidine's effects, helping the clinician in the decision-making process when applying multimodal pain relief, with the potential for significant opioid sparing during surgery.

"Multimodal anaesthesia has benefits, but it is limited in that we often do not know the exact depth of anaesthesia and antinociception," explains Dr. Coeckelenbergh. "The NOL index offers us a new way to quantify nociception when combining alpha agonists and opioids," he says.

Credit: 
Medasense Biometrics

Decreased testing could lead to surge in sexually transmitted infections

HERSHEY, Pa. -- Screening and testing for sexually transmitted infections (STIs) decreased by 63% for men and 59% for women during the early months of the COVID-19 pandemic, according to a new study led by Penn State and Quest Diagnostics researchers. The researchers said this may be the result of restrictions placed on direct patient care and shifts to telehealth and could lead to a possible future surge in STI cases.

This is the first national study to explore the impact of the pandemic on STIs since the Centers for Disease Control and Prevention (CDC) shared its analysis showing an all-time high level of cases in the United States in 2019.

Due to social distancing measures and supply constraints, screening guidelines issued by the CDC during the pandemic recommended halting STI tests except for patients exhibiting symptoms. However, the researchers say these recommendations were detrimental because risk-based screenings were paused in favor of symptomatic testing, even though the majority of people (80%) with chlamydia or gonorrhea are asymptomatic.

The research team reviewed more than 18 million STI test results from patients, ages 14-49, from January 2019 through June 2020, and found that the pandemic had an adverse impact on sexual health screening. Data from the study, published on May 19 in the American Journal of Preventive Medicine, indicates that asymptomatic and at-risk individuals may not have received timely testing or treatment for STIs during the pandemic, resulting in missed cases.

"The quickest way for people to spread STIs is to not know that they have one," said Casey Pinto, assistant professor of public health sciences at Penn State College of Medicine and researcher at Penn State Cancer Institute. "The inability to detect asymptomatic cases could have negative repercussions for years to come."

Early evidence shows that people continued to be sexually active with individuals living outside of their households. Once testing returns to pre-pandemic levels, the researchers expect to see an increase in the overall prevalence of STIs, which would likely lead to an increase in adverse health outcomes, such as pelvic inflammatory disease, infertility and other STIs.

Analyzing data from Quest Diagnostics, which accounts for about 20% of pre-pandemic STI case reports in the U.S., the investigators found that in early April, testing decreased by approximately 60%, yet the test positivity rate for chlamydia and gonorrhea infections increased. The researchers said this may be due to the CDC's recommendations to only screen symptomatic patients.

Among male patients, the researchers noted increases in test positivity rates for both chlamydia (18%) and gonorrhea (41%) during the pandemic. Similarly, for female patients during the pandemic, there was an increase in the test positivity rates for chlamydia (10%) and gonorrhea (43%) infections. According to the study, the number of cases varied across the U.S., and may be linked to the number of regional COVID-19 cases, and how both public and private entities handled clinic closures.

The researchers found that despite an increase in the test positivity rate, there was a 26% decline in the number of chlamydia cases and a 17% decline in the number of gonorrhea cases overall from March 2020 through June 2020. According to the researchers, this may be a result of CDC recommendations to solely test symptomatic patients, which means many asymptomatic cases may have been missed.
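The arithmetic behind this finding — testing volume falls sharply, the share of tests coming back positive rises, yet total detected cases still decline — can be illustrated with a toy calculation. The numbers below are hypothetical, chosen only to mirror the rough magnitudes reported, and are not the study's data:

```python
# Toy illustration of the positivity-rate arithmetic: a ~60% drop in testing
# outweighs a ~41% rise in positivity, so detected cases still fall.
# All figures are hypothetical, not Quest Diagnostics data.

def detected_cases(tests, positivity_rate):
    """Cases found = tests run x fraction of tests positive."""
    return tests * positivity_rate

pre_tests, pre_rate = 100_000, 0.05   # hypothetical pre-pandemic month
pandemic_tests = pre_tests * 0.40     # testing down ~60%
pandemic_rate = pre_rate * 1.41       # positivity up ~41%

pre_cases = detected_cases(pre_tests, pre_rate)
pandemic_cases = detected_cases(pandemic_tests, pandemic_rate)
decline = 1 - pandemic_cases / pre_cases  # roughly a 44% drop in found cases
```

The point of the sketch is the researchers' interpretation: the missing cases were not cured, they were simply never tested.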

"This research highlights the importance of maintaining resources for STI management even in the midst of a pandemic," Pinto said. "Moving forward, health care providers should strike a balance between responding to emerging crises and continuing to provide routine sexual health services. In addition, STI treatment and intervention efforts should be considered when allocating resources to manage public health emergencies."

Guangqing Chi from the Department of Agricultural Economics, Sociology, and Education and Social Science Research Institute at Penn State; Barbara Van Der Pol from the University of Alabama at Birmingham; and Justin Niles, Harvey Kaufman, Elizabeth Marlowe, and Damian Alagia from Quest Diagnostics also contributed to this study. Quest Diagnostics provided support in the form of salaries for Justin K. Niles, Harvey W. Kaufman, Elizabeth M. Marlowe, and Damian Alagia but did not have any additional role in the design, collection, analysis or interpretation of data. The researchers cite no specific funding support. Additional financial disclosures can be found in the manuscript.

Credit: 
Penn State

Wearable devices show that physical activity may lower atrial fibrillation and stroke risk

BOSTON - Physical activity that conforms to medical and health association guidelines is associated with a lower risk of atrial fibrillation (Afib) and stroke, according to a study by researchers at Massachusetts General Hospital (MGH), who analyzed nearly 100,000 individuals equipped with wrist-worn accelerometers to measure their movement. The researchers' findings suggest that data from wearables, including a new generation of devices with sensors that allow for Afib detection, could provide an opportunity for the public health community to promote moderate physical activity as an effective way to improve health outcomes. The study was published in the European Heart Journal. [1]

"Although some population-based studies have observed a lower risk of atrial fibrillation with exercise, the link has remained inconclusive in part because those studies relied on self-reporting by participants, a less than exact science," says senior author Steven Lubitz, MD, MPH, an investigator in the Division of Cardiology at MGH. "Wearable accelerometers, on the other hand, provide an objective and reproducible measure of physical activity. What we found was that activity in accordance with guideline recommendations is indeed associated with substantially lower risks of both atrial fibrillation and stroke."

Nearly 100,000 members of the UK Biobank agreed to wear accelerometers - sensor devices that measure body movement and orientation to infer certain activities - for seven days. MGH researchers then compared that data with later diagnoses of atrial fibrillation and stroke among participants, most between 55 and 70 years of age, reported to the Biobank from 2013 to 2020.

"Our findings supported recommendations from the European Society of Cardiology, the American Heart Association, and the World Health Organization for 150 minutes or greater of moderate to vigorous physical activity per week," notes lead author Shaan Khurshid, MD, an investigator in the Division of Cardiology at MGH.

Given the explosive growth of "smart" devices with increasingly sophisticated detection capabilities, the study stressed the exciting opportunities that now exist to link disease prevention programs to atrial fibrillation diagnostics. Those devices include wearables and smartphones able to measure heart rate and thus detect possible arrhythmias and other irregularities through their photoplethysmography (a technique that detects changes in blood flow through sensors on the skin) and electrocardiographic (ECG) capabilities.

"It's not hard to imagine how these devices could be used by physicians and patients to achieve a level of physical activity which we know to be associated with a reduced risk of atrial fibrillation," explains Lubitz. "And by potentially identifying Afib through photoplethysmography and electrocardiography, users could be alerted to seek professional care before a stroke develops."

Lubitz is hopeful that these emerging technologies could be applied to not just Afib and stroke but also to other forms of cardiovascular disease, including hypertension, and to metabolic diseases like diabetes, which might be affected by guideline-adherent physical activity. "Wearable devices capable of objective activity monitoring, motivational messaging, and disease detection could be low-cost, highly effective interventions to improve health outcomes for countless numbers of people."

Credit: 
Massachusetts General Hospital

States' developmental disability services lacking for adults with autism and their families

In the latest National Autism Indicators Report, researchers from Drexel University’s A.J. Drexel Autism Institute examined surveys of family members of autistic adults who use Developmental Disability services, and found needs for additional supports like respite care and assistance to plan for crisis and emergencies, especially among families whose adult lived with them.

Data from the surveys showed over one quarter of families with autistic adults who use Developmental Disability services and live with family do not have enough services or supports for themselves, according to the report. And over half of these families indicated a need for respite care to enable them to take a break from caregiving.

Four in 10 families had not discussed preparation for handling crises or emergencies within the previous year at a care team meeting, whether the autistic adult lived with family or apart from family in a group home or other setting. Researchers noted this may have left families ill-equipped to handle illness and unforeseen changes in caregiving needs during the COVID-19 pandemic.

“During the pandemic, families of autistic adults faced complications related to loss of direct support providers, loss of structure provided by daytime activities and a need for extreme precautions due to increased risk of serious illness in this population,” said Lindsay Shea, DrPH, leader of the Life Course Outcomes Research Program and director of the Policy and Analytics Center at the Autism Institute. “The pandemic highlighted just how dangerous lack of emergency preparation can be for families of autistic adults. Who will care for your loved one if you become sick and require hospitalization?”

Lead author Anne Roux, a research scientist at the Autism Institute, and her team looked at data from several thousand families across states that participated in the Adult Family Survey and the Family/Guardian Survey conducted in 2018-2019 as part of the National Core Indicators – a collaborative effort to collect data to help improve the quality of states’ Developmental Disability services.

Gaps in Resources

This is the first National Autism Indicators Report to examine the needs of families whose loved ones use Developmental Disability services, as little data is available that specifically explores the needs of caregivers. Past reports have shown that households of youth on the autism spectrum were more likely to experience financial hardship. This latest report found that only 37% of families with an adult living with them received payment for the care they provided, despite the fact that they may be less able to work due to the need for supervision and additional caregiving demands.

Among families whose autistic adult did not live with them, 10% reported abuse or neglect within the past year. Many of these individuals live in congregate group settings in which families sometimes feel they have little choice about which staff provide care for the adult.

“This rate seems very concerning. Having competent, trained direct support staff can make a huge difference in the confidence of family members who are relying on hired personnel to provide skilled care and supervision,” said Roux. “The top concern of parents of autistic adults is what will happen to their son or daughter when they are no longer able to care for them. You’re talking about a group of people with disabilities who have high rates of additional physical and mental health conditions and high levels of support needs for managing distressed behavior. Families need to be involved in planning for the delivery of essential services and supports. No family member wants to turn over this level of care to strangers.”

Yet, few policies govern services and supports for families, and not enough planning or resources are devoted to addressing the dilemma of aging caregivers within a growing population of adults on the autism spectrum.

Barriers to Community Participation

Although many families reported their loved one participated in activities in the community, only one-third had any type of paid daytime activities. About 40% of adults who lived with family, and 60% of those who lived apart from family, were engaged in facility-based work in settings that did not include people without disabilities. Hispanic autistic adults were less likely to participate in community-based activities or to have paid work compared to those who were non-Hispanic white, Black or other/mixed race.

About one in every three families felt like their adult did not have enough support to be able to work or volunteer in the community. “Despite this, families reported high levels of satisfaction with the supports and services their adult received,” Roux said. “At the same time families reported barriers to community participation including stigma in the community or not having adequate staffing to support the adult to do activities in the community.”

The report is a snapshot of a segment of autistic adults who are receiving services. The researchers know there is likely a sizable population of adults with autism who don’t receive Developmental Disability services and really need them.

“Some states don’t provide Developmental Disability services for adults with autism unless they also have intellectual disability. These policies ignore the fact that many autistic adults are cognitively-able but still have tremendous challenges navigating the social, organizational and communication demands of adult life,” Roux explained.

“It’s critical that we put in place state policies that appropriately recognize and adequately meet the unique needs of the growing group of adults on the autism spectrum who need state Developmental Disability services, including the family members who are such vital care partners,” said Shea.

The National Autism Indicators Report series is written in a format that can be understood by non-scientists who need the information, particularly decision-makers and policymakers. Autism Institute researchers aim to fill a need and desire for usable information. The purpose of indicators is to describe where people are now and to provide a comparison for measuring changes in the future.

Credit: 
Drexel University

Regular physical activity linked to better organized preteen brains

image: The images show positive effects of physical activity (in orange) and negative effects of BMI (in blue) on local brain networks. Areas where the effects of exercise and BMI overlap are shown in purple.

Image: 
Skylar Brooks, Computational Neuroscience Laboratory, Boston Children’s Hospital

Regular physical activity has positive effects on children's developing brain circuits, finds a Boston Children's Hospital study using neuroimaging data from nearly 6,000 early adolescents. Physical activity of any kind was associated with more efficiently organized, flexible, and robust brain networks, the researchers found. The more physical activity, the more "fit" the brain.

Findings were published in Cerebral Cortex on May 14.

"It didn't matter what kind of physical activity children were involved in - it only mattered that they were active," says Caterina Stamoulis, PhD, principal investigator on the study and director of the Computational Neuroscience Laboratory at Boston Children's. "Being active multiple times per week for at least 60 minutes had a widespread positive effect on brain circuitry."

Specifically, Stamoulis and her trainees, Skylar Brooks and Sean Parks, found positive effects on circuits in multiple brain areas. These circuits play a fundamental role in cognitive function and support attention, sensory processing, motor function, memory, decision-making, and executive control. Regular physical activity also partially offset the effects of unhealthy body mass index (BMI), which was associated with detrimental effects on the same brain circuitry.

Harnessing big data

With support from the National Science Foundation's Harnessing the Data Revolution and BRAIN Initiative, the researchers tapped data from the long-term, NIH-sponsored Adolescent Brain Cognitive Development (ABCD) study. They analyzed functional magnetic resonance imaging (fMRI) data from 5,955 9- and 10-year-olds and crunched these data against data on physical activity and BMI, using advanced computational techniques developed in collaboration with the Harvard Medical School High Performance Computing Cluster.

"Early adolescence is a very important time in brain development," notes Stamoulis. "It's associated with a lot of changes in the brain's functional circuits, particularly those supporting higher-level processes like decision-making and executive control. Abnormal changes in these areas can lead to risk behaviors and deficits in cognitive function that can follow people throughout their lifetime."

Gauging functional brain organization

The functional MRI data were captured in the resting state, when the children were not performing any explicit cognitive task. This allows analysis of the "connectome" -- the architecture of brain connections that determines how efficiently the brain functions and how readily it can adapt to changes in the environment, independently of specific tasks.

The team adjusted the data for age, gestational age at birth, puberty status, sex, race, and family income. Physical activity and sports involvement measures were based on youth and parent surveys collected by the ABCD study.

The analysis found that physical activity was associated with positive brain-wide network properties reflecting the connectome's efficiency, robustness, and clustering into communities of brain regions. These same properties were detrimentally affected by high BMI. Physical activity also had a positive effect on local organization of the brain; unhealthy BMI had adverse impacts in some of the same areas.

"Based on our results, we think physical activity affects brain organization directly, but also indirectly by reducing BMI, therefore mitigating its negative effects," Stamoulis says.

Optimal functional brain structure consists of small regions or "nodes" that are well connected internally and send information to other parts of the brain through strong, but relatively few, long-range connections, Stamoulis explains.

"This organization optimizes the efficiency of information processing and transmission, which is still developing in adolescence and can be altered by a number of risk factors," she says. "Our results suggest that physical activity has a protective effect on this optimization process across brain regions."

Credit: 
Boston Children's Hospital

Built environments don't play expected role in weight gain

People don't gain or lose weight because they live near a fast-food restaurant or supermarket, according to a new study led by the University of Washington. And, living in a more "walkable", dense neighborhood likely only has a small impact on weight.

These "built-environment" amenities have been seen in past research as essential contributors to losing weight or tending toward obesity. The idea appears obvious: If you live next to a fast-food restaurant, you'll eat there more and thus gain weight. Or, if you have a supermarket nearby, you'll shop there, eat healthier and thus lose weight. Live in a neighborhood that makes walking and biking easier and you'll get out, exercise more and burn more calories.

The new study, based on anonymized medical records from more than 100,000 Kaiser Permanente Washington patients, did not find that living near supermarkets or fast-food restaurants had any impact on weight. However, urban density, such as the number of houses in a given neighborhood, which is closely linked to neighborhood "walkability," appears to be the strongest element of the built environment linked to change in body weight over time.

"There's a lot of prior work that has suggested that living close to a supermarket might lead to lower weight gain or more weight loss, while living close to lots of fast-food restaurants might lead to weight gain," said James Buszkiewicz, lead author of the study and a research scientist in the UW School of Public Health. "Our analyses of the food environment and density together suggests that the more people there are in an area -- higher density -- the more supermarkets and fast-food restaurants are located there. And we found that density matters to weight gain, but not proximity to fast food or supermarkets. So, that seems to suggest that those other studies were likely observing a false signal."

The UW-led study, published earlier this month in the International Journal of Obesity, found that people living in neighborhoods with higher residential and population density weigh less and have less obesity than people living in less-populated areas. And that didn't change over a five-year period of study.

"On the whole, when thinking about ways to curb the obesity epidemic, our study suggests there's likely no simple fix from the built environment, like putting in a playground or supermarket," said Buszkiewicz, who did his research for the study while a graduate student in the UW Department of Epidemiology.

Rather than "something magical about the built environment itself" influencing the weight of those individuals, Buszkiewicz said, community-level differences in obesity are more likely driven by systematic factors other than the built environment -- such as income inequality, which is often the determining factor of where people can afford to live and whether they can afford to move.

"Whether you can afford to eat a healthy diet or to have the time to exercise, those factors probably outweigh the things we're seeing in terms of the built environment effect," he said.

The researchers used the Kaiser Permanente Washington records to gather body weight measurements several times over a five-year period. They also used geocodable addresses to establish neighborhood details, including property values to help establish socioeconomic status, residential unit density, population density, road intersection density, and counts of supermarkets and fast-food restaurants accessible within a short walk or drive.

"This study really leverages the power of big data," said Dr. David Arterburn, co-author and senior investigator at Kaiser Permanente Washington Health Research Institute. "Our use of anonymized health care records allows us to answer important questions about environmental contributions to obesity that would have been impossible in the past."

This study is part of a 12-year, joint UW and Kaiser Permanente Washington research project called Moving to Health. The goal of the study, according to the UW's project website, is to provide population-based, comprehensive, rigorous evidence for policymakers, developers and consumers regarding the features of the built environment that are most strongly associated with risk of obesity and diabetes.

"Our next goal is to better understand what happens when people move their primary residence from one neighborhood to another," Arterburn said. "When our neighborhood characteristics change rapidly -- such as moving to a much more walkable residential area -- does that have an important effect on our body weight?"

Credit: 
University of Washington