Tech

New model may explain the mystery of asymmetry in Parkinson's disease

image: A) Schematic representation of important connectome details in Parkinson's disease (PD). B) Brain-first PD: the asymmetric distribution of alpha-synuclein persists into later disease stages. C) Body-first PD: the symmetric alpha-synuclein pathology leads to more symmetric motor symptoms.

Image: 
Journal of Parkinson's Disease.

Amsterdam, April 28, 2021 - Parkinson's disease (PD) is characterized by slowness of movement and tremors, which often appear asymmetrically in patients. The new model of PD described in this review article published in the Journal of Parkinson's Disease may explain these perplexing asymmetrical motor symptoms and other known variations such as different degrees of constipation and sleep disorders.

PD is a heterogeneous disorder. Symptoms, and the speed with which they progress, vary greatly among patients. In three-quarters of patients, motor symptoms initially appear on one side of the body. Some patients develop constipation, loss of smell, sleep disorders, and other symptoms several years before diagnosis, but others do not. Although it is possible to define several subtypes of PD characterized by similar constellations of symptoms, the underlying causes of these differences are poorly understood. Aggregation and neuron-to-neuron spread of the protein alpha-synuclein are thought to be involved.

The alpha-synuclein Origin and Connectome (SOC) model, presented by Per Borghammer, MD, PhD, DMSc, Department of Nuclear Medicine & PET, Aarhus University Hospital, Aarhus, Denmark, is a unifying model that may explain much of this variation among patients. The SOC model rests on two ideas: the location, or origin site, of the first alpha-synuclein aggregates, and the role played by the neural connectome in transmitting the alpha-synuclein pathology to other parts of the nervous system.

The model was developed by integrating already available evidence from clinical and imaging studies of patients, animal models of PD, and postmortem findings in brain tissue from PD patients. It seems capable of explaining not only why PD is often an asymmetric disease in the first place, but also why some patients show more asymmetry than others and some no asymmetry at all. It also explains why certain subtypes of PD seem to exist, including why constipation and sleep disorders emerge prior to diagnosis only in some patients.

"Imaging studies of living PD patients and studies of biopsies and gut and brain tissue from biobanks clearly suggest that PD patients display different profiles of neuronal damage," explained Dr. Borghammer. "In some patients, the brain is damaged before the peripheral nervous system, and in others the opposite pattern is seen. This new model, which is an extended version of the body-first versus brain-first hypothesis we described in this journal in 2019, proposes a simple explanation for motor asymmetry, while simultaneously offering explanations for several other unexplained phenomena in PD."

The model postulates that the first instance of pathological alpha-synuclein typically arises in a single location and then spreads from this origin site along permissible neural connections. The origin site can lie in the enteric nervous system of the gut, leading to a body-first subtype of PD. This type is characterized by early symptoms from the gut and other peripheral organs, as well as sleep symptoms stemming from the lower parts of the brainstem.

In contrast, the first alpha-synuclein pathology can also start inside the brain, leading to a brain-first subtype. "Such patients will often develop motor symptoms quite quickly, whereas sleep symptoms and autonomic symptoms develop only later," noted Dr. Borghammer. "At later disease stages, the two types of patients will have a similar burden of motor and non-motor symptoms, but early on they are very different."

The second component of the SOC model pertains to how neurons are wired to each other, known as neuronal connectivity. One brain hemisphere communicates mainly with itself, with only 1% of neuronal connections crossing the midline to the other hemisphere. Therefore, if the first occurrence of pathological alpha-synuclein arises in a single location in the brain, it will by definition be in either the left or the right hemisphere. The subsequent spreading of pathology will therefore happen primarily via "same-sided" connections, including to the dopamine cells on the same side of the brain. This leads to asymmetric damage of the dopamine neurons, causing asymmetric motor symptoms in patients.

"In short, we think that the motor asymmetry in PD must be understood in a brain-first vs. body-first context. In brain-first PD, the initial pathology starts in one hemisphere and initially damages that hemisphere via the predominantly same-sided connections, leading to marked asymmetry," noted Dr. Borghammer. "With time, the other hemisphere is also involved, evidenced by the increasingly symmetric motor symptoms in the patient."

Dr. Borghammer explained that the model also predicts that body-first PD patients will be different and generally have more symmetric motor symptoms. This is because the pathology spreads from the gut to the brain in a more symmetric fashion due to left/right overlapping connections in the peripheral nervous system. This then leads to a "wave" of spreading pathology inside the brain, which is more symmetric than that seen in brain-first patients. They therefore display more symmetric dopamine loss and motor symptoms, and this is exactly what clinical and imaging studies have been reporting. The model also predicts that at diagnosis, body-first patients already have a larger, more symmetric burden of alpha-synuclein pathology, which in turn promotes faster disease progression and accelerated cognitive decline.
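To make the connectome logic concrete, consider a toy simulation, illustrative only and not Dr. Borghammer's actual model: pathology diffusing over a two-hemisphere network in which roughly 1% of connections cross the midline, seeded either at a single site in one hemisphere (brain-first) or bilaterally (body-first). All node counts, weights and step counts below are invented for illustration.

```python
# Toy sketch (not the SOC model itself): pathology load diffusing on a
# two-hemisphere network where only ~1% of connections cross the midline.
# All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 100                                  # 50 "left" + 50 "right" nodes
side = np.array([0] * 50 + [1] * 50)

# Dense connectivity within a hemisphere, ~100x weaker across the midline
same = side[:, None] == side[None, :]
W = rng.random((n, n)) * np.where(same, 0.02, 0.0002)

def spread(seed, steps=12):
    """Linear accumulation of pathology, saturating at 1 per node."""
    x = seed.astype(float)
    for _ in range(steps):
        x = np.clip(x + W @ x, 0.0, 1.0)
    return x

brain_first = np.zeros(n); brain_first[0] = 1.0      # single left-side origin
body_first = np.zeros(n); body_first[[0, 50]] = 1.0  # bilateral gut input

for name, seed in [("brain-first", brain_first), ("body-first", body_first)]:
    x = spread(seed)
    print(f"{name}: left load {x[side == 0].sum():.1f}, "
          f"right load {x[side == 1].sum():.1f}")
```

Even in this crude setup, the unilateral seed leaves one hemisphere carrying far more pathology than the other at intermediate time steps, while the bilateral seed spreads symmetrically, which is the qualitative asymmetry pattern the SOC model predicts.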

"It is known that PD patients of the body-first type are at an increased risk of developing dementia. According to the SOC model this increased risk follows from the fact that, at the time of a PD diagnosis, the alpha-synuclein pathology is more widespread, more symmetrical, and shows more involvement of certain brainstem neurons, which are themselves involved in cognitive decline and dementia," Dr. Borghammer added.

Asked about the future outlook for the SOC model, Dr. Borghammer responded that it will now be tested in future studies. "A good scientific model should be testable and falsifiable, and the present model lives up to these requirements. The scientific community now needs to study whether the SOC model has more explanatory power than previous models of PD pathogenesis. It is certainly not a complete description of what goes wrong in PD and needs to be further refined," he concluded.

PD is a slowly progressive disorder that affects movement, muscle control, and balance and is characterized by a broad range of motor and non-motor symptoms. It is the second most common age-related neurodegenerative disorder, affecting about 3% of the population by the age of 65 and up to 5% of individuals over 85 years of age.

Credit: 
IOS Press

A path to graphene topological qubits

image: Schematic illustration of the interplay of magnetism and superconductivity in a graphene grain boundary, a potential building block for carbon-based topological qubits

Image: 
Jose Lado/Aalto University

In the quantum realm, electrons can group together to behave in interesting ways. Magnetism is one of these behaviors that we see in our day-to-day life, as is the rarer phenomenon of superconductivity. Intriguingly, these two behaviors are often antagonistic: the existence of one often destroys the other. However, if these two opposite quantum states are forced to coexist artificially, an elusive state called a topological superconductor appears, which is exciting for researchers trying to make topological qubits.

Topological qubits are promising as one of the potential technologies for future quantum computers. In particular, topological qubits provide the basis for topological quantum computing, which is attractive because it is much less sensitive to its surroundings perturbing the measurements. However, designing and controlling topological qubits has remained a critical open problem, ultimately due to the difficulty of finding materials capable of hosting these states, such as topological superconductors.

To overcome the elusiveness of topological superconductors, which are remarkably hard to find in natural materials, physicists have developed methodologies to engineer these states by combining common materials. The basic ingredients for engineering topological superconductors - magnetism and superconductivity - often require combining dramatically different materials. What's more, creating a topological superconducting material requires being able to finely tune the magnetism and superconductivity, so researchers have to prove that their material can be both magnetic and superconducting at the same time, and that they can control both properties. In their search for such a material, researchers have turned to graphene.

Graphene - a single layer of carbon atoms - represents a highly controllable and common material and has been raised as one of the critical materials for quantum technologies. However, the coexistence of magnetism and superconductivity has remained elusive in graphene, despite long-standing experimental efforts that demonstrated the existence of these two states independently. This fundamental limitation represents a critical obstacle towards the development of artificial topological superconductivity in graphene.

In a recent breakthrough experiment, researchers at the UAM in Spain, CNRS in France, and INL in Portugal, together with the theoretical support of Prof. Jose Lado at Aalto University, have demonstrated an initial step along a pathway towards topological qubits in graphene. The researchers demonstrated that single layers of graphene can host simultaneous magnetism and superconductivity, by measuring quantum excitations unique to this interplay. This breakthrough finding was accomplished by combining the magnetism of crystal domains in graphene, and the superconductivity of deposited metallic islands.

'This experiment shows that two key paradigmatic quantum orders, superconductivity and magnetism, can simultaneously coexist in graphene,' said Professor Jose Lado. 'Ultimately, this experiment demonstrates that graphene can simultaneously host the necessary ingredients for topological superconductivity. While in the current experiment we have not yet observed topological superconductivity, building on top of this experiment we can potentially open a new pathway towards carbon-based topological qubits.'

The researchers induced superconductivity in graphene by depositing an island of a conventional superconductor close to grain boundaries, naturally occurring seams in the graphene that have slightly different magnetic properties from the rest of the material. The superconductivity and grain-boundary magnetism were demonstrated to give rise to Yu-Shiba-Rusinov states, which can only exist in a material when magnetism and superconductivity coexist. The phenomena the team observed in the experiment matched the theoretical model developed by Professor Lado, showing that the researchers can fully control the quantum phenomena in their designer hybrid system.

The demonstration of Yu-Shiba-Rusinov states in graphene is the first step towards the ultimate development of graphene-based topological qubits. In particular, by carefully controlling Yu-Shiba-Rusinov states, topological superconductivity and Majorana states can be created. Topological qubits based on Majorana states could drastically overcome the limitations of current qubits, protecting quantum information by exploiting the nature of these unconventional states. The emergence of these states requires meticulous control of the system parameters. The current experiment establishes the critical starting point towards this goal, one that can be built upon to hopefully open a disruptive road to carbon-based topological quantum computers.

Credit: 
Aalto University

National cardiogenic shock initiative results demonstrate increased heart attack survival

image: Henry Ford Hospital in Detroit, Michigan.

Image: 
Henry Ford Health System

DETROIT (April 28, 2021) - The results of a large, national heart attack study show that patients with a deadly complication known as cardiogenic shock survived at a significantly higher rate when treated with a protocol developed by cardiologists at Henry Ford Hospital in collaboration with four metro Detroit hospitals.

Cardiogenic shock is a critical condition in which the heart is unable to pump enough blood to sustain the body's needs, depriving vital organs of blood supply and eventually causing them to cease functioning. The survival rate of this deadly complication of heart attack has historically hovered around 50%.

Led by a cardiology research team based at Henry Ford Hospital in Detroit, the National Cardiogenic Shock Initiative (NCSI) results demonstrated a survival rate of 71% in patients whose heart attack was complicated by cardiogenic shock and were treated with the protocol. Researchers announced the trial results today at the Society for Cardiovascular Angiography and Interventions (SCAI) 2021 Scientific Sessions.

"The National Cardiogenic Shock Initiative is the largest prospective study of therapy for acute myocardial infarction cardiogenic shock done in the United States in the past 20 years," said William O'Neill, M.D., medical director of Henry Ford's Center for Structural Heart Disease and principal investigator of the study.

"We have found that the original observations of the Detroit Cardiogenic Shock Initiative have been reproduced in 80 hospitals throughout the U.S. If implemented across the country, the National Cardiogenic Shock Initiative protocol could save up to 20,000 lives a year. We strongly believe that our results will be validated in the upcoming Recover IV trial, which should commence enrollment late next year."

Eighty hospitals in 29 states participated in NCSI and agreed to treat patients who presented with acute myocardial infarction and cardiogenic shock using a standard protocol, which involved rapid initiation of mechanical circulatory support (MCS) with an Impella 2.5® or Impella CP® heart pump, along with right heart catheterization to assess the status of right and left ventricular function. Patients were enrolled between July 2016 and December 2020.

"More than 400 patients from across the country were enrolled in this study, including a third of patients who had already suffered a cardiac arrest," said Babar Basir, D.O., director of the acute mechanical circulatory support program at Henry Ford and principal investigator of the study. "The survival rate of 71% is significantly higher than any other previous study on cardiogenic shock. The protocol standardizes the process of care provided by nurses, technicians, interventional cardiologists and critical care physicians, providing predictable care in high risk and complex patients. The protocol has already saved many lives and will continue to do so as more hospitals implement its principles."

The treatment algorithm, available at henryford.com/cardiogenicshock, emphasizes quick recognition of the condition, then inserting a temporary straw-sized pump into the heart to keep blood flowing throughout the body. The Impella® heart pump, an FDA-approved device, is inserted through a catheter in the groin as soon as the patient arrives at the hospital. Doctors then treat the cause of the heart attack, either inserting a stent, removing a clot or taking other necessary action.

The NCSI study involved cardiologists at both community hospitals, where many patients with heart attack first present, and large academic centers. Of the more than 1,100 patients who were screened, 406 were enrolled into the study. The study also isolated predictive markers that indicate a patient's condition, an invaluable tool in determining treatment.

To learn more about the NCSI, visit henryford.com/cardiogenicshock.

Credit: 
Henry Ford Health

New frontier for 3D printing: state-of-the-art soft materials able to self-heal

The scientific community is focusing its research on the many applications of hydrogels, polymeric materials that contain a large amount of water and have the potential to reproduce the features of biological tissues. This is particularly significant in the field of regenerative medicine, which has long recognised and exploited the characteristics of these materials. To be used effectively to replace organic tissues, hydrogels must meet two essential requirements: they must possess great geometric complexity and, after suffering damage, they must be able to self-heal independently, exactly like living tissues.

The development of these materials may now be easier, and cheaper, thanks to 3D printing: researchers in the MP4MNT (Materials and Processing for Micro and Nanotechnologies) team of the Department of Applied Science and Technology of the Politecnico di Torino, coordinated by Professor Fabrizio Pirri, have demonstrated, for the first time, the possibility of manufacturing hydrogels with complex architectures that are capable of self-healing following a laceration, thanks to light-activated 3D printing. The research was published in the prestigious journal Nature Communications in an article entitled "3D-printed self-healing hydrogels via Digital Light Processing" (DOI 10.1038/s41467-021-22802-z).

Until now, hydrogels that either had self-healing properties or could be modelled into complex architectures by 3D printing had been created in the laboratory, but not both at once; the solution reported here encompasses both features: architectural complexity and the ability to self-heal following damage.
In addition, the hydrogel was created using materials available on the market and processed with a commercial printer, making the proposed approach extremely flexible and potentially applicable anywhere, and opening new possibilities for development in both the biomedical and soft-robotics fields.

The research was carried out within the HYDROPRINT3D doctoral project, funded by the Compagnia di San Paolo in the framework of the "Joint Research Projects with Top Universities" initiative, by PhD student Matteo Caprioli under the supervision of DISAT researcher Ignazio Roppolo, in collaboration with Professor Magdassi's research group at the Hebrew University of Jerusalem (Israel).

"Since many years", Ignazio Roppolo recounts, "in the MP4MNT group, a research unit coordinated by Dr Annalisa Chiappone and I is specifically devoted to development of new materials that can be processed using 3D printing activated by light. 3D printing is able to offer a synergistic effect between the design of the object and the intrinsic properties of materials, making possible to obtain manufactured items with unique features. From our perspective, we need to take advantage of this synergy to best develop the capabilities of 3D printing, so that this can truly become an element of our everyday life. And this research falls right in line with this philosophy".

This research represents a first step towards the development of highly complex devices, which can exploit both the complex geometries and the intrinsic self-healing properties in various application fields. In particular, once the biocompatibility studies underway at the interdepartmental laboratory PolitoBIOMed Lab of the Politecnico have been refined, it will be possible to use these objects both for basic research into cellular mechanisms and for applications in the field of regenerative medicine.

Credit: 
Politecnico di Torino

Virtual reality could help improve balance in older people

video: Researchers at the University of Bath investigating how virtual reality (VR) can help improve balance believe this technology could be a valuable tool in the prevention of falls.
https://www.bath.ac.uk/announcements/virtual-reality-could-help-improve-...

Image: 
University of Bath

Researchers at the University of Bath investigating how virtual reality (VR) can help improve balance believe this technology could be a valuable tool in the prevention of falls.

As people grow older, losing balance and falling becomes more common, which increases the risk of injury and affects the person's independence.

Falls are the leading cause of non-fatal injuries in people over 65 and account for over 4 million bed days per year in England alone, at an estimated cost of £2 billion.

Humans use three systems to keep their balance: vision, proprioception (physical feedback from muscles and joints) and the vestibular system (feedback from the semicircular canals in the ear). Of these, vision is the most important.

Traditional ways of assessing balance include patient surveys and physical tests such as using a treadmill or testing agility when performing specific movements or exercises.

However, the accuracy of these tests can be affected by age, sex and motivation, and the movements measured aren't necessarily reflective of real-life scenarios.

Therefore, several research studies have explored the use of VR to help assess balance and even help train users to improve their balance.

Dr Pooya Soltani, from the University of Bath, and Renato Andrade, from Clínica do Dragão, Espregueira-Mendes Sports Centre - FIFA Medical Centre of Excellence, Porto (Portugal), reviewed data from 19 separate studies to investigate the validity, reliability, safety, feasibility and efficacy of using head-mounted display systems for assessing and training balance in older adults.

Their review, published in the scientific journal Frontiers in Sports and Active Living, found that VR was effective in assessing balance and could be useful for fall prevention and for improving postural control and gait patterns.

They found these systems also have the capacity to differentiate between healthy and balance-impaired individuals.

Dr Soltani, Studio Engineer at CAMERA, the University of Bath's motion capture research centre, said: "Traditional tests for measuring balance can be inaccurate and sometimes unsafe - for example if the patient is on a treadmill that stops suddenly.

"It may also be difficult to replicate real life situations in a lab. But using VR opens up a huge range of possible scenarios that are more natural and relevant to the real world.

"For example, patients could be asked to cross a busy street and these scenes can be adapted easily to help them gradually improve their balance and build up confidence in their movement.

"Alternatively, VR could be used more like a video game where patients navigate virtually through a maze whilst doing additional cognitive tasks, like solving mathematical problems.

"VR gives us the flexibility to add disorientating effects or resize and remove elements, to test how well participants maintain their balance."

The researchers found that during VR versions of traditional balance tests, older adults generally behaved more cautiously and took more time to complete the tasks. However, they tended to find the tests more enjoyable, which could help encourage participants to stick to a rehabilitation programme.

Dr Soltani said: "Our review shows this technology has great potential, however there is a lot of work to do before it can be used widely in rehabilitation.

"We need to check parameters such as altering frame rate, find which scenarios are most effective, and also reduce the problems some users experience with motion sickness when using VR."

Whilst Covid-19 has temporarily delayed plans to test the technology on volunteers, the researchers are now looking to recruit PhD students to define protocols and develop a robust system that can be tested by users later in the year.

Credit: 
University of Bath

Research gives trees an edge in landfill clean-up

image: A research team from the USDA Forest Service and the University of Missouri has developed a new contaminant prioritization tool that has the potential to increase the effectiveness of environmental approaches to landfill clean-up. Photograph shows an agroforestry phytoremediation buffer system in southeastern Wisconsin.

Image: 
Paul Manley, Missouri University of Science and Technology; used with permission

Rhinelander, Wis., April 28, 2021-- A research team from the USDA Forest Service and the University of Missouri has developed a new contaminant prioritization tool that has the potential to increase the effectiveness of environmental approaches to landfill clean-up.

Phytoremediation - an environmental approach in which trees and other plants are used to control leachate and treat polluted water and soil - hinges on matching the capability of different tree species with the types of contaminants present in soil and water. Identifying the worst contaminants within the dynamic conditions of a landfill has been challenging.

"Thousands of contaminants can be present in landfill leachate, and contamination can vary by location and over time, so it can be difficult to determine what needs to be, or even can be targeted with environmental remediation," said Elizabeth Rogers, a USDA Forest Service Pathways intern and the lead author of the study. "This tool allows site managers to prioritize the most hazardous contaminants or customize the tool to address local concerns."

Rogers and co-authors Ron Zalesny, Jr., a supervisory research plant geneticist with the Northern Research Station, and Chung-Ho Lin, a Research Associate Professor at the University of Missouri's Center for Agroforestry, combined multiple sources of data to develop a pollutant prioritization tool that systematically prioritizes contaminants according to reported toxicity values. The study, "A systematic approach for prioritizing landfill pollutants based on toxicity: Applications and opportunities," is available through the Northern Research Station at: https://www.nrs.fs.fed.us/pubs/62410
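While the published tool's exact scoring scheme is described in the paper itself, the basic idea of ranking leachate contaminants by toxicity can be sketched in a few lines of code. In the hypothetical example below, each contaminant is scored by its reported toxicity weighted by its measured concentration; all names and numbers are invented for illustration.

```python
# Hypothetical sketch of toxicity-based contaminant prioritization.
# Names, toxicity values and concentrations are invented placeholders;
# the published tool uses reported toxicity values and its own scheme.
from dataclasses import dataclass

@dataclass
class Contaminant:
    name: str
    toxicity: float       # relative hazard per unit (higher = worse)
    concentration: float  # measured in leachate, mg/L

leachate = [
    Contaminant("benzene", toxicity=9.0, concentration=0.12),
    Contaminant("ammonia", toxicity=3.5, concentration=40.0),
    Contaminant("arsenic", toxicity=9.5, concentration=0.05),
]

# Simple hazard score: toxicity weighted by how much is actually present
ranked = sorted(leachate, key=lambda c: c.toxicity * c.concentration,
                reverse=True)
for rank, c in enumerate(ranked, start=1):
    print(rank, c.name, round(c.toxicity * c.concentration, 2))
```

A site manager could swap in locally relevant weights, which is the kind of customization Rogers describes above.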

Knowing which contaminants are the most hazardous allows scientists like Zalesny to better match trees and tree placement in landfills. "Phytoremediation research has focused on discovering which trees work best in particular soils and sites," Zalesny said. "The ability to home in on specific contaminants will enhance phytoremediation outcomes."

The pollutant prioritization tool allows for greater transparency on the benefits of phytoremediation. "When you know what you are targeting, you can provide better information to your community on how long remediation will take and how effective it is," Lin said.

Credit: 
USDA Forest Service - Northern Research Station

Using cosmic-ray neutron bursts to understand gamma-ray bursts from lightning

image: A lightning mapper at the High Altitude Water Cherenkov (HAWC) Cosmic Ray Observatory in Mexico unexpectedly observed that gamma rays produce more neutrons than previously known.

Image: 
Jordan Goodman, HAWC Collaboration (NSF.gov)

LOS ALAMOS, N.M., April 28, 2021--Analysis of data from a lightning mapper and a small, hand-held radiation detector has unexpectedly shed light on what a gamma-ray burst from lightning might look like - by observing neutrons generated from soil by very large cosmic-ray showers. The work took place at the High Altitude Water Cherenkov (HAWC) Cosmic Ray Observatory in Mexico.

"This was an accidental discovery," said Greg Bowers, a scientist at Los Alamos National Laboratory and lead author of the study published in Geophysical Research Letters. "We set up this system to study terrestrial gamma-ray flashes - or gamma-ray bursts from lightning - that are typically so bright you can see them from space. The idea was that HAWC would be sensitive to the gamma-ray bursts, so we installed a lightning mapper to capture the anatomy of the lightning development and pinpoint the lightning processes producing them."

The team, including Xuan-Min Shao and Brenda Dingus, also of Los Alamos, used a small, handheld particle detector, expecting that a terrestrial gamma-ray flash would generate a clear gamma-ray signal in it.

"Our system ran for almost two years, and we saw a lot of lightning," said Bowers. But during those storms, they did not observe anything that looked like terrestrial gamma-ray flashes. "We did, however, see large count-rate bursts during clear, fair-weather days, which made us scratch our heads."

HAWC data gathered during these times showed that, in every case, the large array that comprises HAWC had been overwhelmed by extremely large cosmic-ray showers--so large that the Los Alamos researchers couldn't estimate their size.

UC Santa Cruz collaborator David Smith found that these fair-weather bursts had previously been observed by scientists in Russia, who called them "neutron bursts," and determined that they were the result of neutron production in the soil around the impact point of cosmic ray shower cores.

Previous work that simulated these events had only considered hadrons - a type of subatomic particle - in the core of the showers. In addition to hadrons and other particles, cosmic-ray shower cores also contain a lot of gamma rays.

For this work, William Blaine, also of Los Alamos, simulated large cosmic-ray showers that included both hadrons and gamma rays. "We were able to match our observations with the simulations," said Bowers. "We found that the gamma rays produce the same type of neutron burst as the hadrons."

This study suggests that any natural phenomenon that produces a beam of gamma rays pointed towards the ground (such as downward terrestrial gamma-ray flashes) could produce a similar "neutron burst" signature. This is significant for future efforts to model terrestrial gamma-ray flash observations.

"It tell us that you can't just model the gamma rays hitting your detector, you'll also have to consider the neutron burst that's happening nearby," said Bowers.

The HAWC Observatory comprises an array of water-filled tanks high on the flanks of the Sierra Negra volcano in Puebla, Mexico, where the thin atmosphere offers better conditions for observing gamma rays. When gamma rays strike molecules in the atmosphere they produce showers of energetic particles. When some of those particles strike the water inside the HAWC detector tanks, they produce flashes of light called Cherenkov radiation. By studying these Cherenkov flashes, researchers reconstruct the sources of the showers to learn about the particles that caused them.

Credit: 
DOE/Los Alamos National Laboratory

Epilepsy discovery reveals why some seizures prove deadly

image: University of Virginia School of Medicine researchers Eric Wengert (from left), Manoj Patel and Ian Wenker have shed light on the cause of Sudden Unexpected Death in Epilepsy (SUDEP).

Image: 
Courtesy Patel lab

New research from the University of Virginia School of Medicine has shed light on the No. 1 cause of epilepsy deaths, suggesting a long-sought answer for why some patients die unexpectedly following an epileptic seizure.

The researchers found that a certain type of seizure is associated with sudden death in a mouse model of epilepsy and that death occurred only when the seizure induced failure of the respiratory system.

The new understanding will help scientists in their efforts to develop ways to prevent sudden unexpected death in epilepsy (SUDEP). Based on their research, the UVA team has already identified potential approaches to stimulate breathing in the mice and prevent death after a seizure. The team believes this new approach could one day help save lives.

"SUDEP is a major concern for patients with epilepsy and their loved ones," said Manoj Patel, PhD, of UVA's Department of Anesthesiology. "Our study has identified a sequence of events that takes place during a seizure which can progress and lead to death. Furthermore, we show that intervention during a seizure can rescue death in mice with epilepsy. This project is a long time in the making, and we are excited to share it with the scientific community."

Sudden Unexpected Death in Epilepsy

Many people were unfamiliar with sudden unexpected death in epilepsy when it took the life of young Disney Channel star Cameron Boyce in 2019. He was only 20 years old. SUDEP, however, is the most common cause of epilepsy-related death. Estimates suggest it is responsible for 8% to 17% of all epilepsy deaths, increasing to 50% in patients whose seizures do not respond to treatment.

Scientists have suggested a variety of potential causes for SUDEP, but UVA's researchers have brought clarity to why some seizures lead to death while others do not and how we may be able to prevent progression to death.

The researchers found that breathing disruption, known as apnea, began during seizures as muscles started to stiffen. This stiffening included contraction of the diaphragm, the major breathing muscle, which prevented exhalation and stopped the normal breathing process.

Not all instances of this seizure-induced apnea were fatal; it was only when breathing did not recover immediately after the seizure that the mice died, the researchers found. The UVA team reasoned that artificially stimulating breathing would help prevent sudden death after a seizure. Indeed, in a mouse model of epilepsy, they determined that death could be prevented by directly ventilating the mouse.

While their work was in lab mice, they confirmed their findings by monitoring breathing frequency and patterns in a human patient with epilepsy. They found there were breathing disruptions, or apneas, during seizures that were very similar to those seen in their mice.

"These results implicate respiratory arrest as a major factor in SUDEP and give us targets for future research on intervention," said researcher Ian Wenker, PhD.

Using their innovative new SUDEP model, the UVA researchers have identified potential avenues for preventing SUDEP. One might be to target "adrenergic" receptors that regulate the body's response to adrenaline and other neurotransmitters. These receptors, the scientists found, are vital to restarting breathing after a seizure and preventing death.

"By identifying some of the receptors involved in stimulating breathing recovery following a seizure, we believe our findings will fuel other approaches to help reduce the risk of death in epilepsy patients," said researcher Eric Wenger, a graduate student. "We're eager for other researchers to use our new model to expand our understanding and ability to prevent SUDEP."

Credit: 
University of Virginia Health System

Ludwig Cancer Research study shows pancreatic cancer cells reverse to advance malignancy

image: Ludwig Lausanne's Douglas Hanahan

Image: 
Ludwig Cancer Research

APRIL 28, 2021, NEW YORK - A Ludwig Cancer Research study has identified a previously unrecognized mechanism by which cancer cells of a relatively benign subtype of pancreatic tumors methodically revert--or "de-differentiate"--to a progenitor, or immature, state of cellular development to spawn highly aggressive tumors that are capable of metastasis to the liver and lymph nodes.

The study, led by Ludwig Lausanne's Douglas Hanahan and published in Cancer Discovery, a journal of the American Association for Cancer Research, also shows that engagement of the mechanism is associated with poorer outcomes in patients diagnosed with pancreatic neuroendocrine tumors (PanNETs). Further, its findings provide concrete evidence that such cellular de-differentiation, widely observed across cancer types, is not merely a random consequence of cancer cells' other aberrations.

"Our study provides a clear example in a single tumor type that de-differentiation is an independently regulated and separable step in multi-step tumorigenesis," said Hanahan, distinguished scholar at the Ludwig Lausanne Branch. "Moreover, this is not nonspecific de-differentiation, but rather, the result of a precise reversion of a developmental pathway that generated the mature cell type from which the cancer arose."

PanNET tumors originate from the islet beta cells of the pancreas, which produce the hormone insulin. Hanahan and his colleagues had previously reported that these tumors can be divided into two subtypes: a relatively benign, 'well-differentiated' subtype that maintains many features of insulin-producing beta cells, and a more aggressive, poorly differentiated subtype that lacks those features.

Using a PanNET mouse model, they showed in the current study that the 'poorly differentiated' cancer cells have many characteristics of normal islet progenitor cells, and that the progression from benign to aggressive PanNET tumors requires cancer cells to retrace the pathway of beta cell differentiation and maturation to assume the progenitor state.

The researchers also uncovered a molecular circuit in cancer cells that governs this de-differentiation. They report that tumor cells poised to de-differentiate step up their production of a type of RNA molecule that regulates gene expression known as microRNA18. This ultimately causes the activation of Hmgb3, a protein that controls the expression of a suite of genes that pushes the cells into a progenitor state.

The results of this study provide new insights into de-differentiation as part of the puzzle of cancer and furnish preliminary evidence supporting its inclusion as a distinct and separable step, or perhaps sub-step, in the deadly progression toward malignancy.

This study was supported by Ludwig Cancer Research, the Swiss National Science Foundation and the Human Frontier Science Program Organization.

In addition to his Ludwig post, Douglas Hanahan is a professor emeritus at École Polytechnique Fédérale de Lausanne (EPFL).

Credit: 
Ludwig Institute for Cancer Research

Reducing blue light with a new type of LED that won't keep you up all night

image: This prototype device creates a warm white light without the blue hues that can cause health problems.

Image: 
Jakoah Brgoch

To be more energy efficient, many people have replaced their incandescent lights with light-emitting diode (LED) bulbs. However, those currently on the market emit a lot of blue light, which has been linked to eye troubles and sleep disturbances. Now, researchers reporting in ACS Applied Materials & Interfaces have developed a prototype LED that reduces -- instead of masks -- the blue component, while also making colors appear just as they do in natural sunlight.

LED light bulbs are popular because of their low energy consumption, long lifespan and ability to turn on and off quickly. Inside the bulb, an LED chip converts electrical current into high-energy light, including invisible ultraviolet (UV), violet or blue wavelengths. A cap placed on the chip contains multiple phosphors -- solid luminescent compounds that convert high-energy light into lower-energy visible wavelengths. Each phosphor emits a different color, and these colors combine to produce a broad-spectrum white light. Commercial LED bulbs use blue LEDs and yellow-emitting phosphors, which appear as a cold, bright white light similar to daylight.

Continual exposure to these blue-tinted lights has been linked to cataract formation, and turning them on in the evening can disrupt the production of sleep-inducing hormones, such as melatonin, triggering insomnia and fatigue. To create a warmer white LED bulb for nighttime use, previous researchers added red-emitting phosphors, but this only masked the blue hue without getting rid of it. So, Jakoah Brgoch and Shruti Hariyani wanted to develop a phosphor that, when used in a violet LED device, would result in a warm white light while avoiding the problematic wavelength range.

As a proof of concept, the researchers identified and synthesized a new luminescent crystalline phosphor containing europium ((Na1.92Eu0.04)MgPO4F). In thermal stability tests, the phosphor's emission color was consistent between room temperature and the higher operating temperature of commercial LED-based lighting (301 °F, about 150 °C). In long-term moisture experiments, the compound showed no change in the color or intensity of the light produced.

To see how the material might work in a light bulb, the researchers fabricated a prototype device with a violet-light LED covered by a silicone cap containing their luminescent blue compound blended with red-emitting and green-emitting phosphors. It produced the desired bright warm white light while minimizing the intensity across blue wavelengths, unlike commercial LED light bulbs. The prototype's optical properties revealed the color of objects almost as well as natural sunlight does, fulfilling the needs of indoor lighting, the researchers say, though they add that more work needs to be done before it is ready for everyday use.

Credit: 
American Chemical Society

Inactive oil wells could be big source of methane emissions

image: UC undergraduate research assistant Jacob Hoschouer takes samples from an inactive oil well.

Image: 
UC Geology

Uncapped, idle oil wells could be leaking millions of kilograms of methane each year into the atmosphere and surface water, according to a study by the University of Cincinnati.

Amy Townsend-Small, an associate professor of geology and geography in UC's College of Arts and Sciences, studied 37 wells on private property in the Permian Basin of Texas, the largest oil production region on Earth. She found that seven had methane emissions of as much as 132 grams per hour. The average rate was 6.2 grams per hour.

"Some of them were leaking a lot. Most of them were leaking a little or not at all, which is a pattern that we have seen across the oil and gas supply chain," Townsend-Small said. "A few sources are responsible for most of the leaks."

The study, published in the journal Environmental Research Letters, is the first of its kind on methane emissions from inactive oil wells in Texas.

"Nobody has ever gotten access to these wells in Texas," Townsend-Small said. "In my previous studies, the wells were all on public land."

A 2016 study by Townsend-Small found a similar issue in inactive wells she tested in Colorado, Wyoming, Ohio and Utah. Spread across the estimated 3.1 million abandoned wells, the leaking methane is equivalent to burning more than 16 million barrels of oil, according to government estimates.

Five of the inactive wells Townsend-Small studied in Texas were leaking a brine solution onto the ground, in some cases creating large ponds. 

"I was horrified by that. I've never seen anything like that here in Ohio," Townsend-Small said. "One was gushing out so much water that people who lived there called it a lake, but it's toxic. It has dead trees all around it and smells like hydrogen sulfide."

Most of the wells had been inactive for three to five years, possibly because of fluctuations in market demand. Inactive wells could be a substantial source of methane emissions if they are not subject to leak detection and repair regulations, the UC study concluded.

The study was funded in part by a grant from the U.S. Department of the Interior.

Previous studies have found the basin generates 2.7 billion kilograms of methane per year or nearly 4% of the total gas extracted. That's 60% higher than the average methane emissions in oil and gas production regions nationally. This was attributed to high rates of venting and flaring due to a lack of natural gas pipelines and other gas production infrastructure.

Methane is a powerful greenhouse gas that scientists have linked to climate change. If the rate of methane leaks UC observed were consistent across all 102,000 idled wells in Texas, the 5.5 million kilograms of methane released would be equivalent to burning 150 million pounds of coal each year, according to an estimate by the magazine Grist and nonprofit news organization the Texas Observer.
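That extrapolation is easy to reproduce as back-of-envelope arithmetic, using the study's average leak rate and the reported count of idled Texas wells; the sketch below is only a consistency check, not the publications' own calculation.

```python
# Back-of-envelope check of the extrapolation above.
avg_rate_g_per_hr = 6.2      # average methane leak rate per well (study)
wells = 102_000              # idled wells in Texas (reported figure)
hours_per_year = 24 * 365

total_kg_per_year = avg_rate_g_per_hr * hours_per_year * wells / 1000
print(f"{total_kg_per_year / 1e6:.1f} million kg CH4 per year")
# -> about 5.5 million kg, matching the estimate cited above
```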

Townsend-Small and her UC undergraduate research assistant Jacob Hoschouer, a study co-author, came to Texas at the suggestion of the media organizations, which wanted to explore the environmental impact of oil wells, particularly those that are inactive or abandoned. An expert on methane emissions, Townsend-Small has studied releases from oil and natural gas wells across the country.

The journalists arranged with the property owners for Townsend-Small to examine the wells. 

President Joe Biden's administration has pledged $16 billion in its infrastructure plan to cap abandoned oil and gas wells and clean up abandoned mines. Hoschouer said it would be gratifying if their research could help regulators prioritize wells for capping.

In the meantime, regular inspections of inactive wells using infrared cameras to identify leaks could address the problem, the UC study suggested.

Credit: 
University of Cincinnati

Driving behaviors harbor early signals of dementia

April 28, 2021 -- Using naturalistic driving data and machine learning techniques, researchers at Columbia University Mailman School of Public Health and Columbia's Fu Foundation School of Engineering and Applied Science have developed highly accurate algorithms for detecting mild cognitive impairment and dementia in older drivers. Naturalistic driving data refer to data captured through in-vehicle recording devices or other technologies in real-world settings. These data can be processed to measure driving exposure, space and performance in great detail. The findings are published in the journal Geriatrics.

The researchers developed random forest models, a statistical technique widely used in AI for classifying disease status, that performed exceptionally well. "Based on variables derived from the naturalistic driving data and basic demographic characteristics, such as age, sex, race/ethnicity and education level, we could predict mild cognitive impairment and dementia with 88 percent accuracy," said Sharon Di, associate professor of civil engineering and engineering mechanics at Columbia Engineering and the study's lead author.

The investigators constructed 29 variables using the naturalistic driving data captured by in-vehicle recording devices from 2,977 participants in the Longitudinal Research on Aging Drivers (LongROAD) project, a multisite cohort study sponsored by the AAA Foundation for Traffic Safety. At the time of enrollment, the participants were active drivers aged 65-79 years and had no significant cognitive impairment or degenerative medical conditions. Data used in this study spanned the period from August 2015 through March 2019.

Among the 2,977 participants whose cars were instrumented with the in-vehicle recording devices, 33 were newly diagnosed with mild cognitive impairment and 31 with dementia by April 2019. The researchers trained a series of machine learning models for detecting mild cognitive impairment/dementia and found that the model based on driving variables and demographic characteristics was 88 percent accurate, much better than models based on demographic characteristics only (29 percent) or driving variables only (66 percent). Further analysis revealed that age was most predictive of mild cognitive impairment and dementia, followed by the percentage of trips traveled within 15 miles of home, race/ethnicity, minutes per trip chain (i.e., length of trips starting and ending at home), minutes per trip, and number of hard braking events with deceleration rates ≥ 0.35 g.
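For readers who want a concrete picture of this kind of analysis, the sketch below trains a random forest classifier on synthetic stand-ins for the demographic and driving variables. It is not the study's pipeline: every feature name, value and label here is hypothetical, and the LongROAD data are not public.

```python
# Minimal sketch of the random-forest approach described above.
# Features and labels are synthetic placeholders, not LongROAD data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 2977  # cohort size reported in the study

# Hypothetical stand-ins for demographic + driving variables
X = np.column_stack([
    rng.uniform(65, 79, n),    # age at enrollment
    rng.integers(0, 2, n),     # sex (coded)
    rng.uniform(0, 100, n),    # % of trips within 15 miles of home
    rng.uniform(5, 120, n),    # minutes per trip chain
    rng.poisson(2, n),         # hard-braking events >= 0.35 g
])
y = rng.integers(0, 2, n)      # 1 = MCI/dementia label (synthetic)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

# Feature importances give the kind of predictor ranking reported above
clf.fit(X, y)
names = ["age", "sex", "pct_trips_near_home", "min_per_trip_chain",
         "hard_brakes"]
print(dict(zip(names, clf.feature_importances_.round(3))))
```

On real data, the importance ranking is what surfaced age, trips close to home and hard-braking events as top predictors; on the random labels above it will of course be meaningless.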

"Driving is a complex task involving dynamic cognitive processes and requiring essential cognitive functions and perceptual motor skills. Our study indicates that naturalistic driving behaviors can be used as comprehensive and reliable markers for mild cognitive impairment and dementia," said Guohua Li, MD, DrPH, professor of epidemiology and anesthesiology at Columbia Mailman School of Public Health and Vagelos College of Physicians and Surgeons, and senior author. "If validated, the algorithms developed in this study could provide a novel, unobtrusive screening tool for early detection and management of mild cognitive impairment and dementia in older drivers."

Credit: 
Columbia University's Mailman School of Public Health

El Niño can help predict cacao harvests up to 2 years in advance

image: A woman inspects harvested cacao beans drying in the sun in Soppeng, South Sulawesi, Indonesia.

Image: 
Cocoa Care and T. Oberthür

When seasonal rains arrive late in Indonesia, farmers often take it as a sign that it is not worth investing in fertilizer for their crops. Sometimes they opt out of planting annual crops altogether. Generally, they're making the right decision, as a late start to the rainy season is usually associated with the state of El Niño Southern Oscillation (ENSO) and low rainfall in the coming months.

New research published in the Nature journal Scientific Reports shows that ENSO, the weather-shaping cycle of warming and cooling of the Pacific Ocean along the Equator, is a strong predictor of cacao harvests up to two years before a harvest.

This is potentially very good news for smallholder farmers, scientists, and the global chocolate industry. The ability to predict harvest sizes well in advance could shape on-farm investment decisions, improve tropical crop research programs, and reduce risk and uncertainty in the chocolate industry.

Researchers say that the same methods - which pair advanced machine learning with rigorous, short-term data collection on farmer practices and yields - can apply to other rain-dependent crops including coffee and olives.

"The key innovation in this research is that you can effectively substitute weather data with ENSO data," said Thomas Oberthür, a co-author and business developer at the African Plant Nutrition Institute (APNI) in Morocco. "Any crop that shares a production relationship with ENSO can be explored using this method."

About 80 percent of global cropland depends on direct rainfall (as opposed to irrigation), accounting for almost 60 percent of production. But rainfall data is sparse and highly variable in many of these regions, making it difficult for scientists, policymakers and farmers' groups to adapt to the vagaries of the weather.

No weather data? No problem

For the study, researchers used a type of machine learning that did not require weather records for the Indonesian cacao farms that participated in the research.

Rather, they relied on data on fertilizer application, yields and farm type, which they plugged into a Bayesian Neural Network (BNN), and found that ENSO phases predicted 75% of the variation in yields.

In other words, the sea-surface temperature of the Pacific accurately predicted cacao harvests in a large majority of cases for the farms in the study. In some cases, accurate predictions were possible 25 months before the harvest.
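For readers curious what such a model looks like in code, here is a minimal sketch using Monte Carlo dropout, a common inexpensive approximation to a Bayesian neural network. It is not the authors' model: the features, data and network size are synthetic placeholders, and the point is only that keeping dropout active at prediction time yields an uncertainty band alongside the forecast.

```python
# BNN-style sketch using Monte Carlo dropout as a cheap approximation
# to Bayesian inference. NOT the study's model; data are synthetic.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic inputs: ENSO index (sea-surface temperature anomaly),
# fertilizer rate, farm-type code -> cacao yield (arbitrary units)
X = torch.randn(500, 3)
y = (1.5 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * torch.randn(500)).unsqueeze(1)

model = nn.Sequential(
    nn.Linear(3, 32), nn.ReLU(), nn.Dropout(0.2),
    nn.Linear(32, 32), nn.ReLU(), nn.Dropout(0.2),
    nn.Linear(32, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for _ in range(500):                 # plain full-batch training loop
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# Keep dropout active at inference to sample from the approximate
# posterior: the spread across samples is the model's uncertainty.
model.train()
x_new = torch.tensor([[1.2, 0.0, 0.0]])   # e.g., a strong El Niño phase
samples = torch.cat([model(x_new) for _ in range(200)])
mean, std = samples.mean().item(), samples.std().item()
print(f"predicted yield: {mean:.2f} +/- {std:.2f}")
```

The uncertainty band is what would let a farmer or agronomist judge how much confidence to place in a forecast made many months ahead.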

For the uninitiated, a model that can accurately predict 50% of yield variation is usually cause to celebrate, and such long-range predictive accuracy for crop yields is rare.

"What this allows us to do is superimpose different management practices - such as fertilization regimes - on farms and deduce, with a high level of confidence, those interventions that work," said James Cock, a co-author and emeritus researcher at the Alliance of Bioversity International and CIAT. "This is a whole paradigm shift toward operational research."

Cock, a plant physiologist, said that while randomized control trials (RCTs) are generally considered the gold standard in research, these are extremely costly and consequently often impossible to perform in developing tropical agricultural areas. The approach used here is much lower cost, requires no expensive collection of weather records and provides useful guidelines on how to better manage crops under variable weather.

Ross Chapman, a data analyst and the study's lead author, explained some of the key benefits of machine learning methods over conventional data analysis approaches:

"The BNN modeling differs from standard regression modeling because the algorithm takes input variables, such as sea-surface temperature and farm type, and then automatically 'learns' to recognize responses in other variables, such as crop yield," Chapman said. "The learning process uses the same fundamental process that the human mind learns to recognize objects and patterns from real-life experience. In contrast, standard models require manual supervision of different variables via human-generated equations."

The value of shared data

While machine learning may promise better crop yield predictions in the absence of weather data, scientists - or farmers themselves - still need to accurately collect certain production information and have that data readily available if machine-learning models are going to work.

In the case of the Indonesian cacao farms in the study, farmers had been part of a major chocolate company's training program on best practices. They kept track of inputs such as fertilizer application, freely shared that data for analysis, and an organization with a local presence, the International Plant Nutrition Institute (IPNI), kept tidy records for researchers to use.

In addition, scientists had previously divided the farms into ten groups with similar topography and soil conditions. The researchers used data on harvests, fertilizer applications and yields from 2013 to 2018 to build their model.

The knowledge gained gives the cacao growers confidence about how and when to invest in fertilizers. The agronomic skills this vulnerable group has acquired shield them against a loss on their investment, which typically occurs when weather is adverse.

Thanks to their collaboration with the researchers, now their knowledge can be, in a way, shared with growers of other crops in other regions of the world.

"This research could not have happened without dedicated farmers, IPNI and a strong farmers' support organization, Community Solutions International, to pull everyone together," Cock said, emphasizing the importance of multidisciplinary collaboration and balancing stakeholder's different needs.

"What scientists want is to know why something happens," he said. "Farmers want to know what works."

APNI's Oberthür said strong predictive modeling could benefit both farmers and researchers, and fuel further collaboration.

"You need to have tangible results if you're a farmer who is also collecting data, which is a lot of work," Oberthür said. "This modeling, which can provide farmers with beneficial information, may help incentivize data collection since farmers will see that they are contributing to something that provides benefits to them on their farms."

Credit: 
The Alliance of Bioversity International and the International Center for Tropical Agriculture

Study finds green spaces linked to lower racial disparity in COVID infection rates

image: From left, researchers Bin Jiang, Yi Lu and William Sullivan found that a greater supply of green spaces in an area is associated with a lower racial disparity in coronavirus infection rates. Their study is the first to examine the relationship between the supply of green spaces and reduced disparity in infectious disease rates.

Image: 
Photos courtesy Bin Jiang, Yi Lu and William Sullivan

CHAMPAIGN, Ill. -- A higher ratio of green spaces at the county level is associated with a lower racial disparity in coronavirus infection rates, according to a new study. It is the first study to report the significant relationship between the supply of green spaces and reduced disparity in infectious disease rates.

The research team included William Sullivan, a landscape architecture professor at the University of Illinois Urbana-Champaign, and was led by Bin Jiang, a landscape architecture professor at The University of Hong Kong who received his Ph.D. at Illinois, and Yi Lu, an architecture professor at City University of Hong Kong. They reported their findings in the journal Environment International.

Previous studies by Sullivan, Jiang and Lu have shown that green spaces have positive effects on health. Access to green spaces is associated with improved cognitive performance, reduced mental fatigue and stress, reduced impulsiveness and aggressiveness, increased sense of safety, reduced crime rate, increased physical activity and increased social cohesion.

Prior studies also provide strong evidence that green spaces may mitigate racial disparities in health outcomes. However, none have looked at the effect on disparities in infectious diseases. Most studies examining the racial disparity in coronavirus infections have focused on its association with socio-economic status or pre-existing chronic disease factors.

For this study, the researchers identified 135 of the most urbanized counties in the U.S., with a total population of 132,350,027, representing 40.3% of the U.S. population. They collected infection data from county health departments from late January to July 10, 2020, and used it to calculate infection rates for Black and white residents of each county, controlling for differences in income, pre-existing chronic diseases and urban density.

The data showed that the average infection rate for Black residents was nearly twice that of white residents: 988 per 100,000 people for Black individuals versus 497 per 100,000 people for white individuals.

The researchers compared the infection rates of the two populations within each county, rather than across all the counties studied. The county-level comparison is critical, they said, because it minimizes the bias caused by differences in socioeconomic, transportation, climate and policy conditions among counties.

Sullivan, Jiang and Lu said several factors could account for the findings. They proposed that a greater proportion of green spaces in a county makes it more likely that Black and white individuals have more equal access to the green spaces and the accompanying health benefits.

"In many, many counties, Black folks have less access to green space than white folks do. In counties with more green space, that disparity may be less, and it may help account for some of the positive benefits we're seeing," Sullivan said.

The coronavirus is spread through aerosol particles, and the spread is heightened in indoor settings without adequate ventilation. Having access to green spaces attracts people outdoors, where air movement and the ease of social distancing can reduce the spread of the virus.

More access to green spaces is likely to promote physical activity, which may enhance the immune system. Green spaces enhance mental health and reduce stress, which also promotes immune health. They strengthen social ties, which are an important predictor of health and well-being, the researchers said. Green spaces also may decrease infection risk by improving air quality and reducing exposure to air pollutants in dense urban areas.

"We did not measure these things, but we know from previous research that all these things are tied to green spaces and have implications for health and well-being," Sullivan said.

Jiang described green space as preventive medicine: it encourages outdoor physical activity and social ties with neighbors, which boost the immune system and promote the social trust and cooperation that reduce the risk of infection.

While the study looked at infection rates in the U.S., "we also think the racial disparity issue is not just an American issue. It's an international issue," Jiang said.

The research shows how important it is for local and regional governments to invest in the development of green spaces, Sullivan said.

"One of the things the pandemic has helped us understand is that the built environment has real implications for the spread of disease and for our health. The design of landscape in cities, in neighborhoods, in communities also has really important ways it can contribute to or detract from health and well-being," he said. "There is a lot of competition for investment of public dollars. Lots of times, investments in parks and green spaces are prioritized lower. People think it makes a place look pretty and it's a place to go for walks. What we're finding is these kinds of investments have implications for health and well-being."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Soil bacteria evolve with climate change

While evolution is normally thought of as occurring over millions of years, researchers at the University of California, Irvine have discovered that bacteria can evolve in response to climate change in 18 months. In a study published in the Proceedings of the National Academy of Sciences, biologists from UCI found that evolution is one way that soil microbes might deal with global warming.

Soil microbiomes - the collection of bacteria and other microbes in soil - are a critical engine of the global carbon cycle; microbes decompose dead plant material to recycle nutrients back into the ecosystem and release carbon back into the atmosphere. Multiple environmental factors influence the composition and functioning of soil microbiomes, but these responses are usually studied from an ecological perspective, asking which microbial species increase or decrease in abundance as environmental conditions change. In the current study, the UCI team investigated whether bacterial species in the soil also evolve when their environment changes.

"We know that evolution can occur very fast in bacteria, as in response to antibiotics, but we do not know how important evolution might be for bacteria in the environment with ongoing climate change," said Dr. Alex Chase, the lead author of the study and a former graduate student at UCI.

Several inherent characteristics should enable soil microbes to adapt rapidly to new climate conditions. Microbes are abundant and can reproduce in only hours, so a rare genetic mutation that allows for adaptation to new climate conditions might occur by chance over a short time frame. However, most of what is known about bacterial evolution is from controlled laboratory experiments, where bacteria are grown in flasks with artificial food. It was unclear whether evolution happens fast enough in soils to be relevant to the effects of current rates of climate change.

"Current predictions about how climate change will affect microbiomes make the assumption that microbial species are static. We therefore wanted to test whether bacteria can evolve rapidly in natural settings such as soil," explained Dr. Chase.

To measure evolution in a natural environment, the researchers deployed a first-ever bacterial evolution experiment in the field, using a soil bacterium called Curtobacterium. They placed 125 "microbial cages" filled with microbial food made up of dead plant material at sites along an elevation gradient in Southern California, exposing the bacteria to a range of climate conditions. (The cages allow water to pass through, but not other microbes.) The team conducted two parallel experiments over 18 months, measuring both the ecological and the evolutionary responses of the bacteria.

"The microbial cages allowed us to control the types of bacteria that were present, while exposing them to different environmental conditions in different sites. We could then test, for instance, how the warm and arid conditions of the desert site affected the genetic diversity of a single Curtobacterium species," said Dr. Chase.

After 18 months, the scientists sequenced bacterial DNA from the microbial cages. In the first experiment, which contained a diverse soil microbiome, different Curtobacterium species changed in abundance, an expected ecological response. In the second experiment, over the same time frame, the genetic diversity within a single Curtobacterium species changed, revealing an evolutionary response to the same environmental conditions. The authors conclude that both ecological and evolutionary processes have the potential to contribute to how a soil microbiome responds to changing climate conditions.

"The study shows that we can observe rapid evolution in soil microbes, and this is an exciting achievement. Our next goal is to understand the importance of evolutionary adaptation for soil ecosystems under future climate change," said co-author Jennifer Martiny, professor of ecology and evolutionary biology who co-directs the UCI Microbiome Initiative.

Credit: 
University of California - Irvine