How nature tells us its formulas

image: The atom chip (in gold) at TU Wien

Image: TU Wien

Many of the biggest questions in physics can be answered with the help of quantum field theories: they are needed to describe the dynamics of many interacting particles, and so they are just as important in solid-state physics as in cosmology. Often, however, it is extremely complicated to develop a quantum field theoretical model for a specific problem - especially if the system in question consists of many interacting particles.

Now a team from TU Wien and the University of Heidelberg has developed methods with which these models can be obtained directly from experimental measurements. Instead of comparing experimental results with theoretical model predictions, it is, in a certain sense, possible to measure the theory itself. This should shed new light on the complicated field of many-body quantum physics.

Quantum Simulators

In recent years, a new method of studying quantum physical systems has gained importance - the so-called "quantum simulators". "We simply do not have a satisfactory description of some quantum systems, for example high-temperature superconductors. Other systems can just not be observed directly, such as the early universe shortly after the Big Bang. Suppose we still want to learn something about such quantum systems - then we simply choose another system that can be easily controlled in the laboratory and adjust it so that it behaves in a similar way to the system we are actually interested in. For example, we can use experiments on ultracold atoms to learn about systems that we would otherwise not be able to study at all," explains Jörg Schmiedmayer from the Vienna Center of Quantum Science and Technology (VCQ) at TU Wien. This is possible because there are fundamental similarities between the quantum physical descriptions of different systems.

But no matter which quantum system is studied, scientists always come across a fundamental problem: "If there are too many particles involved, the formulas of quantum theory quickly become so complicated that they cannot be solved, not even with the best supercomputers in the world," explains Sebastian Erne. "That's a pity, because systems consisting of many particles are particularly interesting. In everyday life, it is always the case that many particles play a role at the same time."

Getting Rid of the Details

In general, it is not possible to solve the exact quantum theory for a many-particle-system, in which every single particle is considered. One has to find a simplified quantum description that contains all the essential properties, but no longer relies on details about the individual particles. "This is similar to describing a gas," explains Jörg Schmiedmayer. "We're not interested in every single atom, but in more general variables such as pressure and temperature."

But how do you arrive at such theories for many-body systems? Deriving them purely mathematically from the laws of nature that apply to individual particles is extremely complicated. But as it now turns out, this is not necessary. "We have found a method of reading the quantum field theoretical description directly from the experiment," says Schmiedmayer. "In a certain sense, nature provides the formulas with which it must be described all by itself."

We know that every quantum theory has to obey certain formal rules - we talk for example about correlations, propagators, vertices, Feynman diagrams - the basic building blocks of every quantum physical model. The research team of TU Wien and the University of Heidelberg has found a way to make these individual basic building blocks experimentally accessible. The experimental measurements result in an empirically obtained quantum theory for a many-body system, without having to work with paper and pencil.

"For years, we have suspected that this is theoretically possible, but not everyone believed us that it actually works," says Jörg Schmiedmayer. "Now we have shown that we were right - by looking at a special case where the theory can also be found and (in certain limits) solved mathematically. Our measurement results provide exactly the same theory building blocks."

Ultracold Atomic Clouds

The experiment was done with clouds of thousands of ultracold atoms that are trapped in a magnetic trap on an atomic chip. "From the quantum wave patterns of these atomic clouds, we can determine the correlation functions from which the basic building blocks of the appropriate theory can be derived," explains Schmiedmayer.
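In spirit, the first of these building blocks - a connected (equal-time) correlation function estimated from many repeated measurements - can be sketched as follows. This is an illustrative sketch only: the array `phi` is random stand-in data, not the actual measured phase profiles, and the function name is invented.

```python
import numpy as np

# Illustrative sketch: N repeated experimental shots of a fluctuating field
# sampled at M positions. In the real experiment these would be measured
# phase profiles of the atomic cloud; here random data stands in.
rng = np.random.default_rng(0)
phi = rng.normal(size=(1000, 8))  # shape: (shots, positions)

def connected_two_point(phi):
    """Connected correlator <phi(z1)phi(z2)> - <phi(z1)><phi(z2)>,
    averaged over experimental shots."""
    mean = phi.mean(axis=0)
    second_moment = np.einsum('si,sj->ij', phi, phi) / phi.shape[0]
    return second_moment - np.outer(mean, mean)

G2 = connected_two_point(phi)
# For a free (Gaussian) theory, higher-order connected correlators vanish
# and G2 alone fixes the model; deviations signal interaction vertices.
```

Higher-order connected correlators are built the same way from higher moments, which is what lets the interaction terms of the theory be read off from the data.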

The results have now been published in the journal "Physical Review X". The team hopes that this will significantly simplify the study of quantum many-particle systems. Perhaps it will shed some light on some of the big questions in physics.

Credit: Vienna University of Technology

Researchers identify link between decreased depressive symptoms, yoga and the neurotransmitter GABA

(Boston)-- The benefits of yoga have been widely documented by scientific research, but it was previously unclear how yoga exerts its physiologic effects.

Now a new study from Boston University School of Medicine (BUSM) proposes that yoga increases levels of gamma-aminobutyric acid (GABA) in the short term, and that completing one yoga class per week may maintain elevated GABA levels and thereby mitigate depressive symptoms.

Depression is a highly prevalent and disabling disease. According to the World Health Organization, depression affects approximately 16 million people in the U.S. every year and is the leading cause of disability worldwide. Given its high morbidity, extensive research has been done on effective treatment modalities for depression. GABA is an amino acid that acts as a neurotransmitter in the central nervous system and has been associated with decreased depressive symptoms.

A group of 30 clinically depressed patients was randomly divided into two groups. Both groups engaged in Iyengar yoga and coherent breathing, the only difference being the number of 90-minute yoga sessions and home sessions in which each group participated. Over three months, the high-dose group (HDG) was assigned three sessions per week while the low-intensity group (LIG) was assigned two sessions per week. Participants underwent magnetic resonance imaging (MRI) scans of their brains before the first yoga session and after the last, and completed a clinical depression scale to monitor their symptoms.

Results showed that both groups had improved depressive symptoms after three months. MRI analysis found that GABA levels after three months of yoga were elevated (compared to prior to starting yoga) for approximately four days after the last yoga session, but the increase was no longer observed after approximately eight days. "The study suggests that the associated increase in GABA levels after a yoga session is 'time-limited,' similar to that of pharmacologic treatments, such that completing one session of yoga per week may maintain elevated levels of GABA," explained corresponding author Chris Streeter, MD, associate professor of psychiatry at BUSM.

According to the researchers, providing evidence-based data will be helpful in getting more individuals to try yoga as a strategy for improving their health and well-being. "A unique strength of this study is that pairing the yoga intervention with brain imaging provides important neurobiological insight as to the 'how' yoga may help to alleviate depression and anxiety. In this study, we found that an important neurochemical, GABA, which is related to mood, anxiety and sleep, is significantly increased in association with a yoga intervention," said collaborator and co-author Marisa Silveri, PhD, neuroscientist at McLean Hospital and associate professor of psychiatry at Harvard Medical School.

Credit: Boston University School of Medicine

New method for removing oil from water

image: Oil is adsorbed within seconds by a leaf of the floating fern Salvinia and pulled from the water.

Image: (c) W. Barthlott, M. Mail/Uni Bonn

Oil poses a considerable danger to aquatic life. Researchers at the Universities of Bonn and Aachen and the Heimbach-GmbH have developed a new technology for the removal of such contamination: textiles with special surface properties passively skim off the oil and move it into a floating container. The scientists used surfaces from the plant kingdom as a model. The study has now been published in the journal "Philosophical Transactions A".

The video clip is as short as it is impressive: The 18-second sequence shows a pipette from which dark-colored crude oil drips into a glass of water. Then a researcher holds a green leaf against the spot. Within a matter of seconds the leaf sucks the oil from the surface of the water, leaving not even a trace behind.

The star of the movie, the small green leaf, comes from the floating fern Salvinia. The special abilities of its leaves make it highly interesting for scientists, because they are extremely hydrophobic: When submerged, they wrap themselves in an air jacket and remain completely dry. Researchers call this behavior "superhydrophobic", which can be translated as "extremely water repellent".

However, the Salvinia surface loves oil which is, in a way, a flip side of superhydrophobia. "This allows the leaves to transport an oil film on their surface", explains Prof. Dr. Wilhelm Barthlott, emeritus of the University of Bonn and former director of its botanic gardens. "And we have also been able to transfer this property to technically producible surfaces, such as textiles."

Functional textiles as "suction tubes"

Such superhydrophobic substances can then for instance be used to remove oil films from water surfaces efficiently and without the use of chemicals. However, unlike other materials that have been used for this purpose so far, they do not absorb the oil. "Instead, it travels along the surface of the fabric, moved forward solely by its adhesive forces," explains Barthlott. "For example, in the laboratory we hung such fabric tapes over the edge of a container floating on the water. Within a short time they had almost completely removed the oil from the water surface and transported it into the container." Gravity provides the power; the bottom of the container must therefore be below the water surface with the oil film. "The oil is then completely skimmed off - as if using an automatic skimming spoon for meat stock."

This also makes super-hydrophobic textiles interesting for environmental technology. After all, they promise a new approach to solving the acute environmental problem of increasing oil spills on water bodies. Oil films floating on water cause a number of problems. They prevent gas exchange through the surface and are also dangerous on contact for many plants and animals. As oil films also spread quickly over large surfaces, they can endanger entire ecosystems.

Cleaning without chemicals

The new process does not require the use of chemicals. Conventional binding agents simply absorb the oil and can then usually only be burned later. The superhydrophobia method is different: "The oil skimmed into the floating container is so clean that it can be reused," explains Prof. Barthlott.

The procedure is not intended for large-scale oil disasters such as those that occur after a tanker accident. But particularly small contaminations, such as engine oil from cars or ships, heating oil or leaks, are a pressing problem. "Even minor quantities become a danger to the ecosystem, especially in stagnant or slow-flowing waters," emphasizes the biologist. This is where he sees the major application potential of the new method, for which a patent has been filed by the University of Bonn.

Generally speaking, many surfaces exhibit superhydrophobic behavior, albeit to varying degrees. The basic prerequisite is first of all that the material itself is water-repellent, for example due to a wax coating. But that alone is not enough: "Superhydrophobia is always based on certain structures on the surface, such as small hairs or warts - often on a nanotechnological scale," explains the botanist from the University of Bonn. It is also thanks to him that science now knows much more about these relationships than it did a few decades ago.

The research work is funded by the Deutsche Bundesstiftung Umwelt DBU. "This now helps us to develop oil-absorbing materials with particularly good transport properties, in cooperation with RWTH Aachen University," says Barthlott.

Credit: University of Bonn

Flickering light mobilizes brain chemistry that may fight Alzheimer's

video: The hope of flickering light to treat Alzheimer's takes another step forward in this new study, which reveals stark biochemical mechanisms: 40 Hertz stimulation triggers a marked release of signaling chemicals.

Image: Georgia Tech / Evans / Karcz

For over a century, Alzheimer's disease has confounded all attempts to treat it. But in recent years, perplexing experiments using flickering light have shown promise.

Now, researchers have tapped into how the flicker may work. They discovered in the lab that exposure to light pulsing at 40 hertz - 40 flashes per second - causes brains to release a surge of signaling chemicals that may help fight the disease.

Though conducted on healthy mice, this new study is directly connected to human trials, in which Alzheimer's patients are exposed to 40 Hz light and sound. Insights gained in mice at the Georgia Institute of Technology are informing the human trials in collaboration with Emory University.

"I'll be running samples from mice in the lab, and around the same time, a colleague will be doing a strikingly similar analysis on patient fluid samples," said Kristie Garza, the study's first author. Garza is a graduate research assistant in the lab of Annabelle Singer at Georgia Tech and also a member of Emory's neuroscience program.

One of the surging signaling molecules, in particular, is associated with the activation of brain immune cells called microglia, which purge an Alzheimer's hallmark - amyloid beta plaque, junk protein that accumulates between brain cells.

Immune signaling

In 2016, researchers discovered that light flickering at 40 Hz mobilized microglia in mice afflicted with Alzheimer's to clean up that junk. The new study looked for brain chemistry that connects the flicker with microglial and other immune activation in mice, and exposed a surge of 20 cytokines - small proteins secreted by cells that signal to other cells. Accompanying the cytokine release, internal cell chemistry - the activation of proteins by phosphate groups - left behind a strong calling card.

"The phosphoproteins showed up first. It looked as though they were leading, and our hypothesis is that they triggered the release of the cytokines," said Singer, who co-led the new study and is an assistant professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory.

"Beyond cytokines that may be signaling to microglia, a number of factors that we identified have the potential to support neural health," said Levi Wood, who co-led the study with Singer and is an assistant professor in Georgia Tech's George W. Woodruff School of Mechanical Engineering.

The team published its findings in the Journal of Neuroscience on February 5, 2020; a pre-publication version appeared in December. The research was funded by the National Institute of Neurological Disorders and Stroke at the National Institutes of Health, and by the Packard Foundation.

Singer was co-first author on the original 2016 study at the Massachusetts Institute of Technology, in which the therapeutic effects of 40 Hz were first discovered in mice.

Sci-fi surrealness

Alzheimer's strikes, with few exceptions, late in life. It destroys up to 30% of a brain's mass, carving out ravines and depositing piles of amyloid plaque, which builds up outside of neurons. Inside neurons, phosphorylated tau protein forms similar junk known as neurofibrillary tangles suspected of destroying mental functions and neurons.

After many decades of failed Alzheimer's drug trials costing billions, flickering light as a potentially successful Alzheimer's therapy seems surreal even to the researchers.

"Sometimes it does feel like science fiction," Singer said.

The 40 Hz frequency stems from the observation that brains of Alzheimer's patients suffer early on from a lack of what is called gamma, moments of gentle, constant brain waves acting like a dance beat for neuron activity. Its most common frequency is right around 40 Hz, and exposing mice to light flickering at that frequency restored gamma and also appears to have prevented heavy Alzheimer's brain damage.

Adding to the surrealness, gamma has also been associated with esoteric mind expansion practices, in which practitioners perform light and sound meditation. Then, in 2016, research connected gamma to working memory, a function central to a person's train of thought.

Cytokine bonanza

In the current study, the surging cytokines hinted at a connection with microglial activity, and in particular, the cytokine Macrophage Colony-Stimulating Factor (M-CSF).

"M-CSF was the thing that yelled, 'Microglia activation!'" Singer said.

The researchers will look for a causal connection to microglia activation in an upcoming study, but the overall surge of cytokines was a good sign in general, they said.

"The vast majority of cytokines went up, some anti-inflammatory and some inflammatory, and it was a transient response," Wood said. "Often, a transient inflammatory response can promote pathogen clearance; it can promote repair."

"Generally, you think of an inflammatory response as being bad if it's chronic, and this was rapid and then dropped off, so we think that was probably beneficial," Singer added.

Chemical timing

The 40 Hz stimulation did not need long to trigger the cytokine surge.

"We found an increase in cytokines after an hour of stimulation," Garza said. "We saw phosphoprotein signals after about 15 minutes of flickering."

Perhaps about 15 minutes was enough to start processes inside of cells and about 45 more minutes were needed for the cells to secrete cytokines. It is too early to know.

20 Hz bombshell

As controls, the researchers applied three additional light stimuli, and to their astonishment, all three had some effect on cytokines. But stimulating with 20 Hz stole the show.

"At 20 Hz, cytokine levels were way down. That could be useful, too. There may be circumstances where you want to suppress cytokines," Singer said. "We're thinking different kinds of stimulation could potentially become a platform of tools in a variety of contexts like Parkinson's or schizophrenia. Many neurological disorders are associated with immune response."

The research team warns against people improvising light therapies on their own, since more data is needed to thoroughly establish effects on humans, and getting frequencies wrong could possibly even do damage.

Credit: Georgia Institute of Technology

Army develops big data approach to neuroscience

image: Aggregate distribution of cortical brain-wave activity, organized by standard frequency bands, across a range of depths.

Image: U.S. Army graphic

ABERDEEN PROVING GROUND, Md. (Feb 3, 2020) -- A big data approach to neuroscience promises to significantly improve our understanding of the relationship between brain activity and performance.

To date, there have been relatively few attempts to use a big-data approach within the emerging field of neurotechnology. In this field, the few attempts at meta-analysis (analysis across multiple studies) combine only the results from individual studies rather than the raw data. A new study is one of the first to combine data across a diverse set of experiments to identify patterns of brain activity that are common across tasks and people.

The Army in particular is interested in how the cognitive state of Soldiers can affect their performance during a mission. If you can understand the brain, you can predict and even enhance cognitive performance.

Researchers from the U.S. Army Combat Capabilities Development Command's Army Research Laboratory teamed with the University of Texas at San Antonio and Intheon Labs to develop a first-of-its-kind mega-analysis of brain imaging data--in this case electroencephalography, or EEG.

In the two-part study, they aggregated the raw data from 17 individual studies, collected at six different locations, into a single analytical framework, with the findings published in two papers in the journal NeuroImage. The individual studies included in this analysis encompass a diverse set of tasks, such as simulated driving and visual search.

"The vast majority of human neuroscientific studies use a very small number of participants employed in very specific tasks," said Dr. Jonathan Touryan, an Army scientist and co-author of the paper. "This limits how well the results from any single study can be generalized to a broader population and a larger range of activities."

Mega-analysis of EEG is extremely challenging due to the many types of hardware systems (properties and configuration of the electrodes), the diversity of tasks, how different datasets are annotated, and the intrinsic variability between individuals and within an individual over time, Touryan said.

These sources of variability make it difficult to find robust relationships between brain and behavior. Mega-analysis seeks to address this by aggregating large, heterogeneous datasets to identify universal features that link neural activity, cognitive state and task performance.

Next-generation neurotechnologies will require a thorough understanding of this relationship in order to mitigate deficits or augment performance of human operators. Ultimately, these neurotechnologies will enable autonomous systems to better understand the Soldier and facilitate communications within multi-domain operations, he said.

To combine the raw data from the collection of studies, the researchers developed Hierarchical Event Descriptors (HED tags) - a novel labeling ontology that captures the wide range of experimental events encountered in diverse datasets. This HED tag system was recently adopted into the Brain Imaging Data Structure international standard, one of the most common formats for organizing and analyzing brain data, Touryan said.
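As a rough illustration of the idea (the event codes and tag strings below are invented for illustration and are not drawn from the actual studies or the HED specification), each study-specific event code is mapped to a shared hierarchical descriptor, so that events from heterogeneous datasets become comparable:

```python
# Illustrative sketch in the spirit of HED tags: map each study-specific
# event code to a shared hierarchical descriptor string. Codes and tag
# strings here are invented, not taken from the HED specification.
event_map = {
    "S1": "Sensory-event/Visual",       # e.g. a stimulus onset
    "R1": "Agent-action/Button-press",  # e.g. a participant response
}

def annotate(events, event_map):
    """Attach shared descriptors to (time, study-specific code) events."""
    return [(t, code, event_map.get(code, "unlabeled")) for t, code in events]

annotated = annotate([(0.5, "S1"), (0.9, "R1"), (1.2, "X9")], event_map)
```

With every dataset annotated against the same hierarchy, "all stimulus onsets" or "all participant responses" can be pooled across studies regardless of how each lab originally coded them.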

The research team also developed a fully automated processing pipeline to perform large-scale analysis of their high-dimensional time-series data--amounting to more than 1,000 recording sessions.

Much of this data was collected over the last 10 years through the U.S. Army's Cognition and Neuroergonomics Collaborative Technology Alliance and is now available in an online repository for the scientific community. The U.S. Army continues to use this data to develop human-autonomy adaptive systems for both the Next Generation Combat Vehicle and Soldier Lethality Cross-Functional Teams.

Credit: U.S. Army Research Laboratory

How the development of skulls and beaks made Darwin's finches one of the most diverse bird groups

image: Main coordinated changes in beak and skull shape that the study found to characterize skull evolution in both Darwin's finches and Hawaiian honeycreepers, with drawings of species representing the extremes of skull shape.

Image: Guillermo Navalon

Darwin's finches are among the most celebrated examples of adaptive radiation in the evolution of modern vertebrates and now a new study, led by scientists from the University of Bristol, has provided fresh insights into their rapid development and evolutionary success.

Study of the finches has been relevant since the voyage of HMS Beagle in the 19th century, which catalysed some of the first ideas about natural selection in the mind of a young Charles Darwin.

Despite many years of research which has led to a detailed understanding of the biology of these perching birds, including impressive decades-long studies in natural populations, there are still unanswered questions.

Specifically, the factors explaining why this particular group of birds evolved to be much more diverse in species and shapes than other birds evolving alongside them in Galapagos and Cocos islands have remained largely unknown.

A similar phenomenon is that of the honeycreepers endemic to the Hawaiian archipelago. These true finches (unlike Darwin's finches, which are finch-like birds belonging to a different family) radiated into an order of magnitude more species and shapes than the rest of the birds inhabiting those islands.

An international team of researchers from the UK and Spain tackled the question of why these birds evolved so rapidly from a different perspective.

They showed in their study published today in the journal Nature Ecology & Evolution that one of the key factors related to the evolutionary success of Darwin's finches and Hawaiian honeycreepers might lie in how their beaks and skulls evolved.

Previous studies have demonstrated a tight link between the shapes and sizes of the beak and the feeding habits in both groups, which suggests that adaptation by natural selection to the different feeding resources available at the islands may have been one of the main processes driving their explosive evolution.

Furthermore, changes in beak size and shape have been observed in natural populations of Darwin's finches as a response to variations in feeding resources, strengthening these views.

However, recent studies on other groups of birds, some of which stem from the previous recent research of the team, have suggested that this strong match between beak and cranial morphology and ecology might not be pervasive in all birds.

Professor Emily Rayfield, from the University of Bristol's School of Earth Sciences, co-authored the new study. She said: "Other factors such as constraints on skull shape during development, the use of the beak for many other functions and the fact that the skull and beak develop and function as a coherent unit may have contributed to this mismatch.

"Therefore, the strong connection between beak, cranial morphology and feeding ecology over the evolution of Darwin's finches, Hawaiian honeycreepers, and perhaps other lineages of birds, might have been only possible if this tight coevolution of cranial regions is somehow 'relaxed' and those regions are able to evolve more freely."

Lead author Guillermo Navalón, recently graduated from a PhD at the University of Bristol and now a Postdoctoral Researcher at the University of Oxford, added: "By taking a broad scale, numerical approach at more than 400 species of landbirds (the group that encompasses all perching birds and many other lineages such as parrots, kingfishers, hornbills, eagles, vultures, owls and many others) we found that the beaks of Darwin's finches and Hawaiian honeycreepers evolved in a stronger association with the rest of the skull than in most of the other lineages of landbirds.

"In other words, in these groups the beak is less independent in evolutionary terms than in most other landbirds."

Jesús Marugán-Lobón co-author of the study and Lecturer at the Autonomous University of Madrid, said: "We found that as a result of this stronger cranial integration, these birds could evolve in a more versatile way but mostly constrained along a very specific direction of adaptive change in the shape of their skulls.

"Paradoxically, we hypothesised that this allowed them to evolve many different shapes very rapidly, filling many of the available niches in their archipelagos as a result."

In contrast, the authors noted that the other sympatric bird lineages that occupied the island archipelagos at a similar time to the ancestors of the finches and honeycreepers all belong to the group with the lowest cranial integration in their study, and suggest that this was a limiting factor for rapid evolution in those lineages.

Guillermo Navalón added: "While these results are exciting, this is mainly the work of my PhD and at the minute we are working on solving different unanswered questions that stem from this research.

"For instance, are these evolutionary situations isolated phenomena in these two archipelagos or have those been more common in the evolution of island or continental bird communities? Do these patterns characterise other adaptive radiations in birds?

"Future research will likely solve at least some of these mysteries, bringing us one step closer to understanding better the evolution of the wonderful diversity of shapes in birds."

Credit: University of Bristol

A fundamental discovery about how gene activity is regulated

Researchers at Johns Hopkins Bloomberg School of Public Health have discovered a fundamental mechanism that regulates gene activity in cells. The newly discovered mechanism targets RNA, or ribonucleic acid, a close cousin of DNA that plays an important role in cellular activity.

The discovery, detailed in a paper published February 3 in the journal Molecular Cell, is a significant addition to the foundational understanding of how gene activity is regulated, and may ultimately lead to powerful new medical treatments.

The newly discovered mechanism effectively silences or dials down certain active genes as a basic cellular regulatory or quality-control system. It may even act as a defense against viruses. When genes are active, they are copied out into strands of RNA. These RNA strands perform cellular functions on their own or are translated into proteins. The new mechanism destroys RNA strands that have excessively folded over and stuck to themselves to form knots, hairpins, and other structures. These highly structured RNAs can occur during normal processing but could possibly also be caused by misfolding.

The finding is likely to have implications for medical research because many human disorders, including cancers and neurodegenerative diseases, such as ALS (Amyotrophic Lateral Sclerosis) and Huntington disease-like syndromes, involve failures of normal RNA regulation and/or the accumulation of abnormally folded or tangled RNA in affected cells.

"We know that there are mechanisms to clear misfolded proteins from cells--possibly this newly uncovered mechanism is involved in clearing misfolded RNAs," says principal investigator Anthony K. L. Leung, PhD, associate professor in the Department of Biochemistry and Molecular Biology at the Bloomberg School. "This newly discovered mechanism might also help scientists understand how normal cells keep themselves healthy, since RNA structure forms can play a role in cells maintaining cellular equilibrium."

Most of the regulatory and quality-control mechanisms that modulate the levels of RNAs in cells target RNAs containing specific sequences of nucleotides--the building blocks of RNAs. The newly discovered mechanism is unique in that it recognizes not sequences but a broad variety of structures formed where RNA strands, which are relatively sticky, have folded back onto themselves.

Leung and his team discovered the new mechanism while investigating a protein called UPF1, which is known to work in other RNA regulation pathways. They found that UPF1 and a partner protein called G3BP1 work together in the new mechanism, targeting only RNAs that contain a high level of structures. When the researchers depleted UPF1 or G3BP1 from cells to shut off the new mechanism, levels of highly structured RNAs rose sharply. The team also confirmed that the new mechanism, which they call structure-mediated RNA decay, is distinct from all other known RNA-removal mechanisms and works across different types of RNA throughout the genome.

"Based on further analyses, we predict that this structure-mediated RNA decay pathway could regulate at least one-fourth of human protein-coding genes and one-third of a class of non-coding genes called circular RNA," Leung says.

Leung and his colleagues now are following up to determine how this RNA decay mechanism actually targets and destroys RNAs. They also are investigating why this mechanism exists. Its functions, they speculate, may include the regulation of specific functional variants of protein-coding RNAs as well as the general disposal of RNAs that have acquired excessive loops and other structures.

The new mechanism may even have an antiviral role, the authors say. "Some single-stranded RNA viruses that are highly structured, such as poliovirus, have ways to get rid of the G3BP1 protein when they infect a cell," Leung says. "Possibly that's because this G3BP1-UPF1 RNA-decay pathway is otherwise a major threat to them."

Credit: 
Johns Hopkins Bloomberg School of Public Health

New score measuring multiple chronic illnesses performs better than current method

A new score that measures multiple long-term health conditions performs better than the current Charlson Comorbidity Index and may help in health care planning and delivery, according to new research in CMAJ (Canadian Medical Association Journal): http://www.cmaj.ca/lookup/doi/10.1503/cmaj.190757

"Multimorbidity scores offer a means of identifying those patients in the population who are most likely to benefit from a tailored approach to care, helping clinicians to prioritize their efforts accordingly," writes Dr. Rupert Payne, Centre for Academic Primary Care, University of Bristol, Bristol, United Kingdom, with coauthors.

Researchers from the United Kingdom developed and tested a measure of multiple illnesses, called the Cambridge Multimorbidity Score, using data from general practitioner records in the United Kingdom. They looked at 37 comorbidities and associated outcomes, such as general practitioner visits, unplanned hospital admissions and death.

"The score outperforms the widely used Charlson index across all outcomes. Performance is best for predicting death, particularly after adjusting for age and gender, and least good for predicting consultations with primary care physicians," says Dr. Payne.

The Cambridge Multimorbidity Score can be a useful predictor of future health care use, including emergency department visits and primary care utilization.

"These scores may be of considerable value for policy development and health care priority-setting, providing accurate, easy-to-implement ways of optimizing health care delivery to an aging population with multiple illnesses," says Dr. Payne.

Credit: 
Canadian Medical Association Journal

Invest in social equity to improve health for low-income people

Canada must invest in social spending and recognize that our health care system is not "universal" if Canadians living in low-income neighbourhoods are to have the same chance of good health as other Canadians, argues an editorial in CMAJ (Canadian Medical Association Journal).

People living in poorer neighbourhoods are at higher risk of dying from preventable diseases than people in affluent neighbourhoods. Even when there are no financial barriers, people with low income access health care less frequently. For example, only 54% of women living in the poorest neighbourhoods in Ontario completed screening for cervical cancer, compared with 67% of women in the richest urban neighbourhoods.

"These differences come with the human cost of thousands of avoidable deaths every year and are particularly harrowing for Indigenous Peoples," write Dr. Andrew Boozary, Executive Director of Health and Social Policy, University Health Network, Toronto, Ontario and Dr. Andreas Laupacis, CMAJ's editor-in-chief. "The persistence of these disparities amounts to discrimination against Canada's most disadvantaged populations."

Why is there such disparity in health outcomes? Poverty is associated with many health risks linked to social context, including housing insecurity, isolation, unhealthy food options and substance use disorders. In an international ranking of high-income countries, Canada came last on social program spending in 2017.

As well, publicly funded health care in Canada covers only certain services, leaving out coverage of prescription drugs for many Canadians, mental health counselling, most home care services and physiotherapy.

Within the health care system itself, differences in access and outcomes according to income persist. "It is well past time that we act on the undeniable importance of the social determinants of health and remedy the inequities within the health care system itself."

The authors call for significant public investment, innovative approaches and political will to level the playing field.

Credit: 
Canadian Medical Association Journal

Novel compound is promising drug candidate for Alzheimer's disease

image: Selective gamma-secretase inhibitor blocks substrate on amyloid precursor protein.

Image: 
Rensselaer Polytechnic Institute

TROY, N.Y. -- A newly identified compound is a promising candidate for inhibiting the production of amyloids, the abnormal proteins that form toxic clumps, called fibrils, inside the brains of patients with Alzheimer's disease. As published today in the Royal Society of Chemistry's Chemical Communications, the compound -- known as "C1" -- uses a novel mechanism to efficiently prevent the enzyme gamma-secretase from producing amyloids.

Amyloid fibrils are largely composed of the peptide amyloid beta, which is produced when enzymes, including gamma-secretase, make cuts to the amyloid precursor protein found in high concentrations in the membrane of brain cells. C1 is a covalent gamma-secretase inhibitor that blocks the site on the precursor protein where gamma-secretase would bind to transform it into amyloids, rather than blocking the active site on gamma-secretase itself, as traditional enzyme inhibitors do.

"Historically, drug trials for gamma secretase inhibitors failed because traditional enzyme inhibitors have severe side effects. They stopped all of the normal functions of gamma secretase," said Chunyu Wang, a professor of biological sciences and member of the Center for Biotechnology and Interdisciplinary Studies (CBIS) at Rensselaer Polytechnic Institute. "Our compound binds to the cleavage site of the precursor protein instead of the enzyme itself, which may avoid many problems associated with traditional enzyme inhibitors."

In 2018, with support from the Warren Alpert Foundation, Wang began screening drugs to identify a compound that targets the amyloid precursor protein substrate, which would block the activity of gamma secretase involved in amyloid production while allowing all other functions. He began the search with "in silico screening," using computer modeling to test tens of millions of compounds.

C1 was one of several candidates to emerge from that screening. As described in the paper, C1 blocks amyloid production with high efficiency when present at micromolar concentrations, both in test tubes and in cell culture. The research is patent pending.

C1 is a covalent inhibitor, meaning it forms a chemical bond with its target. Wang said that because of their permanent bond, covalent inhibitors are more durable than their non-covalent counterparts. Covalent inhibitors make up about one-third of the drug market, even though they have traditionally been viewed as carrying a higher risk of triggering immune reactions. In recent years, there has been a surge in the development of covalent inhibitors, as highly specific covalent inhibitors have shown excellent efficacy against challenging drug targets.

"With a new approach to tackling the principal pathology of Alzheimer's disease, Chunyu's work is generating a fresh roster of drug candidates with enormous promise," said Deepak Vashishth, the director of CBIS. "His work speaks to the power of the interdisciplinary culture of research at CBIS, and we are pleased with this early result."

Credit: 
Rensselaer Polytechnic Institute

Meat isn't good for you

Eating red meat, processed meat or poultry raises risk of cardiovascular disease

Eating red or processed meat - but not poultry or fish - raises risk of dying from all causes

New findings contradict a recent controversial study saying people don't need to reduce their consumption of red meat and processed meat

CHICAGO --- Drop the steak knife. After a controversial study last fall recommended that it was not necessary for people to change their consumption of red meat and processed meat, a large, carefully analyzed new study from Northwestern Medicine and Cornell University links red and processed meat consumption with a slightly higher risk of heart disease and death.

Eating two servings of red meat, processed meat or poultry -- but not fish -- per week was linked to a 3 to 7% higher risk of cardiovascular disease, the study found. Eating two servings of red meat or processed meat -- but not poultry or fish -- per week was associated with a 3% higher risk of all causes of death.

"It's a small difference, but it's worth trying to reduce red meat and processed meat like pepperoni, bologna and deli meats," said senior study author Norrina Allen, associate professor of preventive medicine at Northwestern University Feinberg School of Medicine. "Red meat consumption also is consistently linked to other health problems like cancer."

"Modifying intake of these animal protein foods may be an important strategy to help reduce the risk of cardiovascular disease and premature death at a population level," said lead study author Victor Zhong, assistant professor of nutritional sciences at Cornell, who did the research when he was a postdoctoral fellow in Allen's lab.

The paper will be published Feb. 3 in JAMA Internal Medicine.

The new findings come on the heels of a controversial meta-analysis published last November that recommended people not reduce the amount of red meat and processed meat they eat. "Everyone interpreted that it was OK to eat red meat, but I don't think that is what the science supports," Allen said.

"Our study shows the link to cardiovascular disease and mortality was robust," Zhong said.

What should we eat?

"Fish, seafood and plant-based sources of protein such as nuts and legumes, including beans and peas, are excellent alternatives to meat and are under-consumed in the U.S.," said study coauthor Linda Van Horn, professor of preventive medicine at Feinberg who also is a member of the 2020 U.S. Dietary Guidelines Advisory committee.

The study found a positive association between poultry intake and cardiovascular disease, but the evidence so far isn't sufficient to make a clear recommendation about poultry intake, Zhong said. Still, fried chicken is not recommended.

The new study pooled a large, diverse sample from six cohorts, included long-term follow-up data spanning up to three decades, harmonized diet data to reduce heterogeneity, adjusted for a comprehensive set of confounders and conducted multiple sensitivity analyses. The study included 29,682 participants (mean age of 53.7 years at baseline; 44.4% men and 30.7% non-white). Diet data were self-reported by participants, who were asked a long list of questions about what they had eaten over the previous year or month.

Key findings:

A 3 to 7% higher risk of cardiovascular disease and premature death for people who ate two servings of red meat or processed meat a week.

A 4% higher risk of cardiovascular disease for people who ate two servings per week of poultry, but the evidence so far is not sufficient to make a clear recommendation about poultry intake. And the relationship may be related to the method of cooking the chicken and consumption of the skin rather than the chicken meat itself.

No association between eating fish and cardiovascular disease or mortality.

Limitations of the study include that participants' dietary intake was assessed only once, and dietary behaviors may have changed over time. In addition, cooking methods were not considered. Fried chicken, especially deep-fried preparations that contribute trans-fatty acids, and fried fish have been positively linked to chronic diseases, Zhong said.

Credit: 
Northwestern University

Lower protein diet may lessen risk for cardiovascular disease

Hershey, Pa. -- A plant-based diet may be key to lowering risk for heart disease. Penn State researchers determined that diets with reduced sulfur amino acids -- which occur in protein-rich foods, such as meats, dairy, nuts and soy -- were associated with a decreased risk for cardiovascular disease. The team also found that the average American consumes almost two and a half times more sulfur amino acids than the estimated average requirement.

Amino acids are the building blocks of proteins. One subcategory, the sulfur amino acids, including methionine and cysteine, plays various roles in metabolism and health.

"For decades it has been understood that diets restricting sulfur amino acids were beneficial for longevity in animals," said John Richie, a professor of public health sciences at Penn State College of Medicine. "This study provides the first epidemiologic evidence that excessive dietary intake of sulfur amino acids may be related to chronic disease outcomes in humans."

Richie led a team that examined the diets and blood biomarkers of more than 11,000 participants from a national study and found that participants who ate foods containing fewer sulfur amino acids tended to have a decreased risk for cardiometabolic disease based on their bloodwork.

The team evaluated data from the Third National Health and Nutrition Examination Survey (NHANES III). They compiled a composite cardiometabolic disease risk score based on the levels of certain biomarkers in participants' blood after a 10- to 16-hour fast, including cholesterol, triglycerides, glucose and insulin.

"These biomarkers are indicative of an individual's risk for disease, just as high cholesterol levels are a risk factor for cardiovascular disease," Richie said. "Many of these levels can be impacted by a person's longer-term dietary habits leading up to the test."

Participants were excluded from the study if they reported congestive heart failure or a heart attack, or a change in diet due to a heart disease diagnosis. Individuals were also omitted if they reported a dietary intake of sulfur amino acids below the estimated average requirement of 15 mg/kg/day recommended by the Food and Nutrition Board of the National Academy of Medicine.

For a person weighing 132 pounds, food choices for a day that meet the requirement might include a medium slice of bread, a half an avocado, an egg, a half cup of raw cabbage, six cherry tomatoes, two ounces of chicken breast, a cup of brown rice, three quarters of a cup of zucchini, three tablespoons of butter, a cup of spinach, a medium apple, an eight inch diameter pizza and a tablespoon of almonds.
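As a quick sanity check of that example (illustrative arithmetic only, applying the 15 mg/kg/day estimated average requirement stated above to the 132-pound example person):

```python
# Rough check of the estimated average requirement for sulfur amino acids
# (15 mg per kg of body weight per day) for the 132-pound example person.
# This arithmetic is illustrative only; it is not taken from the study.

LB_PER_KG = 2.20462

weight_lb = 132
weight_kg = weight_lb / LB_PER_KG   # about 60 kg
requirement_mg = 15 * weight_kg     # mg of sulfur amino acids per day

print(round(weight_kg, 1))          # about 60 kg
print(round(requirement_mg))        # about 900 mg, i.e. roughly 0.9 g per day
```

So the day's worth of food listed above must together supply on the order of 900 mg of methionine plus cysteine to meet the requirement.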

Nutritionists collected information about participants' diets by doing in-person 24-hour recalls. Nutrient intakes were then calculated using the U.S. Department of Agriculture Survey Nutrient Database.

After accounting for body weight, the researchers found that average sulfur amino acid intake was almost two and a half times higher than the estimated average requirement. Xiang Gao, associate professor and director of the nutritional epidemiology lab at Penn State University and co-author of the study, published today (Feb. 3) in EClinicalMedicine, a Lancet journal, suggested this may be due to trends in the average diet of a person living in the United States.

"Many people in the United States consume a diet rich in meat and dairy products and the estimated average requirement is only expected to meet the needs of half of healthy individuals," Gao said. "Therefore, it is not surprising that many are surpassing the average requirement when considering these foods contain higher amounts of sulfur amino acids."

The researchers found that higher sulfur amino acid intake was associated with a higher composite cardiometabolic risk score after accounting for potential confounders such as age, sex and history of diabetes and hypertension. They also found that higher sulfur amino acid intake was associated with higher consumption of every type of food except grains, vegetables and fruit.

"Meats and other high-protein foods are generally higher in sulfur amino acid content," said Zhen Dong, lead author on the study and College of Medicine graduate. "People who eat lots of plant-based products like fruits and vegetables will consume lower amounts of sulfur amino acids. These results support some of the beneficial health effects observed in those who eat vegan or other plant-based diets."

Dong said that while this study only evaluated dietary intake and cardiometabolic disease risk factors at one point in time, the association between increased sulfur amino acid intake and risk for cardiometabolic disease was strong. She said the data supports the formation of a prospective, longitudinal study evaluating sulfur amino acid intake and health outcomes over time.

"Here we saw an observed association between certain dietary habits and higher levels of blood biomarkers that put a person at risk for cardiometabolic diseases," Richie said. "A longitudinal study would allow us to analyze whether people who eat a certain way do end up developing the diseases these biomarkers indicate a risk for."

Credit: 
Penn State

Examining consumption of processed meat, unprocessed red meat, poultry or fish with risk of CVD, death

What The Study Did: Data for nearly 30,000 adults from six study groups in the U.S. were used to investigate associations between eating processed meat, unprocessed red meat, poultry or fish and the risk of cardiovascular disease and death from any cause.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Victor W. Zhong, Ph.D., of Cornell University in Ithaca, New York, is the corresponding author.

(doi:10.1001/jamainternmed.2019.6969)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Exposing a virus's hiding place reveals new potential vaccine

COLUMBUS, Ohio - By figuring out how a common virus hides from the immune system, scientists have identified a potential vaccine to prevent sometimes deadly respiratory infections in humans.

The research was conducted using human metapneumovirus (HMPV). The virus was discovered in 2001, but follow-up research has shown that it has circulated in humans for at least 50 years and is considered worldwide to be the No. 2 cause of respiratory infections, which can be especially dangerous for infants and the elderly.

HMPV is in the same family as respiratory syncytial virus (RSV), the No. 1 cause of human respiratory infections that can also cause serious illness in infants and the elderly - which means these findings may hold promise for development of a vaccine against RSV.

"This is exciting because RSV was discovered in 1953, but we still don't have a vaccine. The virus inhibits the innate immune response, and can infect the same person again and again," said Jianrong Li, senior author of the study and a professor of virology in The Ohio State University Department of Veterinary Biosciences and member of Ohio State's Infectious Diseases Institute.

"Now we have a mutant strain of HMPV that can trigger a higher immune response. Right now we're working to translate this same concept to see if it can work for RSV, too."

For this study, published today (Feb. 3, 2020) in Nature Microbiology, researchers determined that HMPV capitalizes on a common modification of its RNA to hide from the innate immune response, which is the body's first line of defense against any foreign invader. Avoiding that response enables the virus to use host cells to copy itself and cause infection.

By blocking the RNA modification, researchers found in cell studies that the mutated form of the virus they created unexpectedly activated a stronger-than-normal innate immune response. A strong innate response is key to developing an adaptive immune response, the body's production of antibodies against a specific pathogen.

A test on cotton rats showed the mutant virus effectively functioned as a vaccine against HMPV, preventing infection and triggering both a strong innate immune response and an effective adaptive response.

The RNA modification central to this research is known as N6-methyladenosine (m6A) modification, a common chemical methylation of RNA presumed to be a survival mechanism that evolved over time. This frequently seen change in RNA was discovered in the 1970s, but its biological function has remained largely a mystery since then.

Li's lab is one of a handful in the United States studying HMPV, and he and colleagues set out to determine whether HMPV RNA even has this modification, known as m6A methylation - which they discovered it does - and then to uncover how the modification affects the virus's behavior.

The team used sophisticated technology, called high-throughput sequencing, to identify the viral gene that contains the most m6A methylation. Researchers then developed a mutated form of the virus by knocking out these modifications so they could assess how the virus functioned without them.

When introduced to human cells, the mutant virus triggered production of a protein called type I interferon, an antiviral molecule, which indicated the innate immune response had been activated - and this protein's level was higher than it would be under normal immune response conditions.

"This opened up a big question," Li said. "Why would a virus lacking this methylation produce a much higher innate immune response?"

By tracing the cell signals required to generate the innate immune response, the scientists determined that the RNA modification allowed the virus to hide from the immune system by reducing a host immune protein's ability to recognize the difference between virus RNA (nonself-RNA) and host RNA (self-RNA).

"We know that when a virus infects cells, it produces RNA, and human cells in the innate response try to separate self-RNA and nonself-RNA," Li said. "The virus is smart: It gained this methylation, and now our host innate response is confused. That's how the virus escapes recognition by the innate immune response."

"We are very excited about this finding. This novel function of m6A may also be conserved in many viruses," said Mijia Lu, the first author and a postdoctoral researcher in Li's laboratory.

The production of a higher level of type I interferon was an important finding in cells. But to be truly effective, a vaccine must prompt the body to also generate an adaptive immune response, producing antibodies and T-cell immune responses against a specific virus or bacterium. The researchers conducted an experiment with the mutant virus in cotton rats to see if this would happen.

Compared to cotton rats given a placebo, the rats immunized with the mutant virus and then infected with HMPV produced both innate and adaptive immune responses and were completely protected from virus replication in their lungs and nasal cavities.

"In the case of cotton rats, the mutant virus produced a higher amount of type I interferon, and triggered a higher antibody response and a higher T-cell immune response. That means you've triggered higher protective ability against the virus infection. So mutating the virus enhances vaccine efficacy," Li said. "That is exactly what we want.

"We proved a concept that these mutant viruses are improved vaccine candidates for HMPV."

Li and his collaborators have filed an application to patent the concept of using the mutant virus as a vaccine candidate for RSV and HMPV.

Credit: 
Ohio State University

Tailor-made vaccines could almost halve rates of serious bacterial disease

New research has found that rates of disease caused by the bacterium Streptococcus pneumoniae could be substantially reduced by changing our approach to vaccination. Researchers from the Wellcome Sanger Institute, Simon Fraser University in Canada and Imperial College London combined genomic data, models of bacterial evolution and predictive modelling to identify how vaccines could be optimised for specific age groups, geographic regions and communities of bacteria.

The study, published today (3 February) in Nature Microbiology, simulated the performance of vaccines over time to assess the risk of vaccine-targeted strains being replaced by other potentially dangerous strains. Through this predictive modelling approach, the researchers identified new vaccine designs that could help reduce overall rates of disease.

S. pneumoniae is often found at the back of the nasal cavity, where it is normally harmless. But when it migrates to other parts of the body, it can cause serious bacterial infections such as pneumonia, sepsis and meningitis - known collectively as invasive pneumococcal disease (IPD). IPD was estimated to cause around 1.6 million deaths per year worldwide prior to the introduction of widespread vaccination, with higher rates of disease in many low- or middle-income countries*. Infants and the elderly are most at risk.

Vaccines against pneumococcus have prevented millions of infections. But like many bacteria, S. pneumoniae is difficult to target with vaccines because infection can be caused by different serotypes**. Each part of a vaccine usually protects against a single serotype, with the most complex pneumococcal conjugate vaccine (PCV13) targeting 13 serotypes.

Because there are approximately 100 S. pneumoniae serotypes around the world, vaccine effectiveness varies between countries depending on which serotypes are present. When serotypes are removed from circulation by a particular vaccine, other serotypes of S. pneumoniae rise to take their place.

In this study, researchers at the Wellcome Sanger Institute, Simon Fraser University and Imperial College London optimised a computer model to approximate the effect of vaccines targeting different serotype combinations. Analysis of vaccine effectiveness was then carried out on S. pneumoniae genomic data from Massachusetts, USA and the Maela refugee camp in Thailand.

The complexity of S. pneumoniae vaccines means many designs are possible, each with different effects on disease. In Maela, for example, the presence of 64 S. pneumoniae serotypes means around 100 trillion vaccine designs are possible. But it would take 19,000 years to simulate them all, with most being sub-optimal. The researchers developed a more efficient method that made it feasible to identify the best-performing designs from the trillions of possibilities.
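The scale of that design space can be sketched with a quick combinatorial count. This is an illustration only, under the assumption that a design is any non-empty subset of the 64 circulating serotypes with a component count capped near the size of current conjugate vaccines; the study's own enumeration may differ:

```python
import math

# Count possible vaccine designs if a design is any non-empty subset of
# n_serotypes serotypes, with at most max_components components.
# Illustrative assumption only; the study's actual design space may differ.
def design_count(n_serotypes: int, max_components: int) -> int:
    return sum(math.comb(n_serotypes, k)
               for k in range(1, max_components + 1))

# With 64 serotypes (as in Maela) and caps near the 13 components of PCV13:
for cap in (13, 14, 15):
    print(cap, f"{design_count(64, cap):.2e}")
```

With caps in the mid-teens, the count lands in the tens to hundreds of trillions of candidate designs, consistent with the "around 100 trillion" figure quoted above and far too many to simulate exhaustively.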

The team discovered that rates of infant IPD in Maela could actually be reduced by omitting components from the PCV13 vaccine to keep certain serotypes in place, removing the possibility of their replacement by highly invasive serotypes. In Massachusetts, a vaccine targeting 20 serotypes was found to be more effective than the current PCV13.

The results highlight the need for vaccine programmes to be tailored to specific communities of bacteria and to consider vaccination at different ages.

Dr Nicholas Croucher, of the MRC Centre for Global Infectious Disease Analysis, Imperial College London, said: "Our research shows that the best vaccine designs strongly depend on the bacterial strains present in the population, which vary considerably between countries. The best vaccine designs also depend on the age group being vaccinated. These ideas will be critical for applying lessons learned from introducing vaccines in high-income countries to combatting the disease where the burden is highest."

Vaccination of infants also affects IPD in adults. However, trends in IPD can differ between infants and the elderly in the same country, as seen recently in the UK. In many places, older adults already receive an S. pneumoniae vaccine, which was designed before the infant vaccine. The study also found that adult disease rates could be reduced by almost 50 per cent by redesigning adult vaccines to complement those administered to infants.

Professor Caroline Colijn, of Simon Fraser University and the Wellcome Sanger Institute, said: "This approach to optimising vaccines will help to address several problems, such as invasive disease among infants or adults and minimising antibiotic resistance in the post vaccine population. Such an approach also enables public health policy-makers to assess the likely effectiveness of an existing vaccine for a local population based on genomic surveillance data."

The findings coincide with growing alarm at the threat of antimicrobial resistance (AMR) to common medicines. S. pneumoniae infections are sometimes resistant to multiple antibiotics and have been highlighted as a priority threat by the WHO***. This study highlighted how vaccines can be designed to reduce the chances that a person's S. pneumoniae would be resistant to common treatments.

Professor Jukka Corander, of the University of Oslo, University of Helsinki and the Wellcome Sanger Institute, said: "With the power of the latest DNA sequencing technology we are heading towards a future where large-scale genomic surveillance of major bacterial pathogens is feasible. The approach we describe in this study will play an important role in accelerating future vaccine discovery and design to help reduce rates of disease."

Credit: 
Wellcome Trust Sanger Institute