Probing the genetic basis of Roundup resistance in morning glory, a noxious weed

ANN ARBOR--The herbicide Roundup is the most widely used agricultural chemical in history. But over the past two decades, a growing number of weed species have evolved resistance to Roundup's active ingredient, glyphosate, reducing the product's dominance somewhat.

Research on the genetic basis of glyphosate resistance has focused largely on target-site resistance, which involves mutations to the single gene that encodes the plant protein disrupted by the herbicide.

Much less attention has been paid to nontarget-site glyphosate resistance, which arises when genes other than the target mutate in ways that confer resistance to Roundup.

In a study published online Feb. 3 in the journal PLOS Genetics, a team led by University of Michigan plant ecological geneticist Regina Baucom used genome-wide scans to identify nontarget-site glyphosate resistance in the common morning glory, an annual vine that is a noxious agricultural weed.

The researchers identified five regions of the genome that showed strong signs of selection, indicating rapid evolution of resistance. Within these regions, genes that enable the weed to detoxify the herbicide were enriched.
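To illustrate what such a scan involves, here is a minimal Python sketch (synthetic data, hypothetical window size and cutoff, not the study's actual pipeline) that computes Hudson's FST, a standard allele-frequency differentiation statistic, in windows along the genome and flags outlier windows as candidate regions under selection:

```python
import numpy as np

def hudson_fst(p1, p2, n1, n2):
    """Per-SNP Hudson FST from allele frequencies p1, p2 and sample sizes n1, n2."""
    num = (p1 - p2) ** 2 - p1 * (1 - p1) / (n1 - 1) - p2 * (1 - p2) / (n2 - 1)
    den = p1 * (1 - p2) + p2 * (1 - p1)
    return num / den

# Hypothetical allele frequencies for 10,000 SNPs in resistant vs. susceptible plants
rng = np.random.default_rng(0)
p_res, p_sus = rng.uniform(0.05, 0.95, (2, 10_000))
fst = hudson_fst(p_res, p_sus, n1=40, n2=40)

# Average FST in non-overlapping 100-SNP windows; flag the top 1% as candidate regions
windows = fst.reshape(-1, 100).mean(axis=1)
outliers = np.where(windows > np.quantile(windows, 0.99))[0]
print(f"candidate windows under selection: {outliers}")
```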

"We show that morning glory exhibits nontarget-site resistance and that detoxification of the herbicide by the plant is a likely resistance mechanism in this species," said Baucom, associate professor in the U-M Department of Ecology and Evolutionary Biology. "Common morning glory has always been problematic to farmers, with some populations more resistant than others. But until now we didn't know why it was so problematic.

"With this work, we show that there are multiple ways--including detoxification--that a plant may evolve higher levels of resistance to glyphosate. If morning glory is detoxifying glyphosate or the products of glyphosate, it might make other herbicides less likely to work, since they too may be detoxified. The next major question is whether selection from glyphosate makes morning glory resistant to other herbicides, and if morning glory is detoxifying the herbicide itself or the oxidative stress caused by damage from the herbicide."

Overall, the study found evidence for both parallel and nonparallel genetic changes associated with glyphosate resistance in the morning glory, suggesting there are more genetic avenues underlying the adaptation to herbicide than scientists previously considered. In genetic parallelism, separate species or genetic lineages use the same genetic solution in response to selective pressures.

"These findings suggest that resistance in this species is due to a nontarget genetic mechanism, components of which exhibit signs of both parallel and nonparallel responses to selection among populations," according to the authors.

Credit: 
University of Michigan

Lights out? Fireflies face extinction threats of habitat loss, light pollution, pesticides

image: A female glow-worm (Lampyris noctiluca) will shine for hours to attract her mate, yet brightening skies will dim her prospects.

Image: 
Jason Steel - www.jason-steel.co.uk

MEDFORD/SOMERVILLE, Mass. (February 3, 2020)-- Habitat loss, pesticide use and, surprisingly, artificial light are the three most serious threats endangering fireflies across the globe, raising the spectre of extinction for certain species and related impacts on biodiversity and ecotourism, according to a Tufts University-led team of biologists associated with the International Union for the Conservation of Nature.

Fireflies belong to a widespread and economically important insect group, with more than 2,000 different species spread out across the globe. To better understand the threats fireflies face, the team led by Sara Lewis, professor of biology at Tufts University, surveyed firefly experts around the world to size up the most prominent threats to survival for their local species.

Their perspective piece, published today in Bioscience, sounds a warning bell about the insects' future, highlighting specific threats and the vulnerability of different species across geographical regions.

According to survey respondents, habitat loss is the most critical threat to firefly survival in most geographic regions, followed by light pollution and pesticide use.

"Lots of wildlife species are declining because their habitat is shrinking," said Lewis "so it wasn't a huge surprise that habitat loss was considered the biggest threat. Some fireflies get hit especially hard when their habitat disappears because they need special conditions to complete their life cycle. For instance, one Malaysian firefly [Pteroptyx tener], famous for its synchronized flash displays, is a mangrove specialist." As reported in the article, previous work has revealed drastic declines in this species following conversion of their mangrove habitat to palm oil plantations and aquaculture farms.

One surprising result that emerged from the survey was that, globally, light pollution was regarded as the second most serious threat to fireflies.

Artificial light at night has grown exponentially during the last century. "In addition to disrupting natural biorhythms - including our own - light pollution really messes up firefly mating rituals," explained Avalon Owens, Ph.D. candidate in biology at Tufts and a co-author on the study. Many fireflies rely on bioluminescence to find and attract their mates, and previous work has shown that too much artificial light can interfere with these courtship exchanges. Switching to energy efficient, overly bright LEDs is not helping. "Brighter isn't necessarily better," says Owens.

Firefly experts viewed the widespread agricultural use of pesticides as another key threat to firefly survival.

Most insecticide exposure occurs during larval stages, because juvenile fireflies spend up to two years living below ground or under water. Insecticides such as organophosphates and neonicotinoids are designed to kill pests, yet they also have off-target effects on beneficial insects. While more research is needed, the evidence shows that many commonly used insecticides are harmful to fireflies.

A few studies have quantified firefly population declines, such as those seen in the tourist-attracting synchronous fireflies of Malaysia, and the glowworm Lampyris noctiluca in England. And numerous anecdotal reports suggest that many other firefly species across a wide range of habitats have also suffered recent declines. "However," Lewis points out, "we really need better long-term data about firefly population trends - this is a place where citizen science efforts like Massachusetts Audubon's Firefly Watch project can really help."

The researchers also highlight risk factors that allow them to predict which species will be most vulnerable when faced with threats like habitat loss or light pollution. For instance, females of the Appalachian blue ghost firefly [Phausis reticulata] are flightless. "So when their habitat disappears, they can't just pick up and move somewhere else," explains co-author J. Michael Reed, professor of biology at Tufts. Yet the researchers remain optimistic about fireflies' future. "Here in the U.S., we're fortunate to have some robust species like the Big Dipper fireflies [Photinus pyralis]," notes Lewis. "Those guys can survive pretty much anywhere - and they're beautiful, too."

By illuminating these threats and evaluating the conservation status of firefly species around the world, researchers aim to preserve the magical lights of fireflies for future generations to enjoy. "Our goal is to make this knowledge available for land managers, policy makers, and firefly fans everywhere," says co-author Sonny Wong of the Malaysian Nature Society. "We want to keep fireflies lighting up our nights for a long, long time."

Credit: 
Tufts University

Knowledge Engine is ready to accelerate genomic research

Five years ago, a team of computer scientists, biomedical researchers, and bioinformaticians set out to bring the power of collective knowledge to genomic research. Their new publication in PLOS Biology shares the culmination of that effort, an analytical platform that guides researchers through the process of interpreting complex genomic datasets.

The group was awarded funding by the National Institutes of Health to form a Big Data to Knowledge Center of Excellence. The center, led by Professor of Computer Science and Willett Faculty Scholar Saurabh Sinha at the Carl R. Woese Institute for Genomic Biology (IGB) at the University of Illinois at Urbana-Champaign and including numerous collaborators at Illinois and Mayo Clinic, created a first-of-its-kind analytical platform, the Knowledge Engine for Genomics (KnowEnG). Charles Blatti and Amin Emad, who were both postdoctoral researchers within the center, are co-first authors of the new publication.

"It is exhilarating to see the efforts of such an amazing group of talented and dedicated people--researchers, software engineers, user experience designers, project managers, faculty, postdocs, grad students, undergrads and even high school students--over a period of years, culminate in a single product that we can all be proud of and that can hopefully help genomics researchers world-wide," Sinha said.

To understand KnowEnG's potential to impact genomic research, it's important to know that the initial outcome of many genomic studies is a set of genes of interest: genes that have different activity levels in two different experimental conditions, that carry mutations distinguishing healthy cells from tumor cells, or that exhibit sequence variations in individuals with different health conditions.

Biomedical researchers must find ways to translate a list of not obviously related genes into a comprehensive interpretation: Does the disease state affect the metabolic rate of the affected tissue? Does the experimental therapy seem likely to slow the rate of tumor cell division? A common approach to this challenge is to relate an experimental dataset with existing knowledge of the biological importance of different genes and their relationships with one another. Like a curious internet user with a set of search terms, the researcher hopes to leverage the totality of what is already known.

Unlike an internet user, however, an individual conducting genomic research in recent years hasn't had the flexibility of a search engine like Google to seamlessly pull together myriad sources of information; nor could they easily apply that information in many different types of analyses. Instead, each analysis had to be done piecemeal, jumping from one analytical tool to another, each offering a limited interpretation. This is an obstacle that KnowEnG removes.

"A lot of times you start with one analysis and then you want to follow up with more analyses," Emad, who is now an assistant professor of electrical and computer engineering at McGill University, said. "One thing about KnowEnG that is very useful is that you can pipeline these different analyses one after the other. You may run one [analytical] pipeline in KnowEnG, get the results, and automatically generate data in the format that can be plugged into the next pipeline . . . So you can set up different types of analyses one after the other."

KnowEnG is also uniquely able to draw upon and synthesize diverse sources of existing genomic information, combining them into a vast "knowledge network" that can continue to expand over time as research continues and new forms of data emerge from new genomic technologies.

To highlight the functionality the center has achieved, a team led by Blatti and Emad used previously published data as case studies, re-analyzing the results within the KnowEnG platform and sharing the novel insights revealed.

"I worked very closely with Amin [Emad] to try to design a study that mimics previous studies that don't use prior knowledge, and then try to show where we can go beyond those studies with a knowledge-guided analysis," Blatti said. Blatti is now a research scientist at Illinois' National Center for Supercomputing Applications (NCSA).

Genomics is an incredibly broad field, and KnowEnG's capabilities span that breadth. The platform enables users to upload their data and customize step-by-step analyses for a wide variety of human genomic data forms, or for genomic data from any of 19 model organisms.

"Even though the paper is a paper about human cancer studies . . . we've also tried to target the model organisms that are studied in Illinois," Blatti said. "It's not just a cancer only platform--it's a wider platform."

Experts in biological data management have recommended that datasets and tools should be findable, accessible, interoperable, and reusable (FAIR) within the research community. Center members ensured that KnowEnG would address those goals, making it freely available through a web portal, as well as facilitating a variety of other modes of access. They also worked directly with test groups of users in biomedical research via the partnership with Mayo Clinic.

"We try to thoroughly understand the questions the user is asking of the data," said Colleen Bushell, a coauthor and Associate Director for Healthcare Innovation at NCSA. "Our approach is to then design a way to display answers to those questions clearly, and anticipate the next set of questions. When we simplify views of data, it's really just to answer some of those questions concisely, but then as they dive deeper into trying to understand the data, we provide more and more detail, more and more explanation." Bushell provided oversight of KnowEng implementation and guided the center's visualization team, led by co-authors Lisa Gatzke and Matthew Berry, in developing innovative ways to represent complex data and analytical processes within the platform, and to create user-experiences that makes it easy for biologist to manage their data, set up data science experiments, and execute them in a cloud environment.

The work process relied heavily on feedback from biomedical researchers at Mayo and Illinois at every stage of development.

"For me, the postdoc at the KnowEnG Center and what it involved was a unique opportunity and a unique environment," Emad said. "Talking to people at Mayo Clinic and other researchers that were quite knowledgeable in the biomedical domain allowed me to learn a lot. So every task for me was a learning opportunity."

Although the NIH funding for the center has ended, the Cancer Center at Illinois is providing funding to enable access to KnowEnG this year. Development will continue through the efforts of the NCSA Healthcare Innovation program office. NCSA works in part to support the longevity of software developed at Illinois; the structure of KnowEnG was designed to accommodate the addition of new forms of data, new analytical processes, and new visualization strategies over time.

"NCSA is committed to really continue this platform," Bushell said. "This falls into NCSA's mission to give software a life beyond funding . . . we're focusing on tools related to healthcare and data analysis, and we collaborate closely with IGB researchers. We want people to know that these tools will be around."

Credit: 
Carl R. Woese Institute for Genomic Biology, University of Illinois at Urbana-Champaign

How nature tells us its formulas

image: The atom chip (in gold) at TU Wien

Image: 
TU Wien

Many of the biggest questions in physics can be answered with the help of quantum field theories: They are needed to describe the dynamics of many interacting particles, and thus they are just as important in solid state physics as in cosmology. Often, however, it is extremely complicated to develop a quantum field theoretical model for a specific problem - especially if the system in question consists of many interacting particles.

Now a team from the TU Wien and the University of Heidelberg has developed methods with which these models can be directly obtained from experimental measurements. Instead of comparing the experimental results to theoretical model predictions, it is, in a certain sense, possible to measure the theory itself. This should now shed new light on the complicated field of many-body quantum physics.

Quantum Simulators

In recent years, a new method of studying quantum physical systems has gained importance - the so-called "quantum simulators". "We simply do not have a satisfactory description of some quantum systems, for example high-temperature superconductors. Other systems can just not be observed directly, such as the early universe shortly after the Big Bang. Suppose we still want to learn something about such quantum systems - then we simply choose another system that can be easily controlled in the laboratory and adjust it so that it behaves in a similar way to the system we are actually interested in. For example, we can use experiments on ultracold atoms to learn about systems that we would otherwise not be able to study at all," explains Jörg Schmiedmayer from the Vienna Center of Quantum Science and Technology (VCQ) at TU Wien. This is possible because there are fundamental similarities between different quantum physical descriptions of different systems.

But no matter which quantum system is studied, scientists always come across a fundamental problem: "If there are too many particles involved, the formulas of quantum theory quickly become so complicated that they cannot be solved, not even with the best supercomputers in the world," explains Sebastian Erne. "That's a pity, because systems consisting of many particles are particularly interesting. In everyday life, it is always the case that many particles play a role at the same time."

Getting Rid of the Details

In general, it is not possible to solve the exact quantum theory for a many-particle system, in which every single particle is considered. One has to find a simplified quantum description that contains all the essential properties, but no longer relies on details about the individual particles. "This is similar to describing a gas," explains Jörg Schmiedmayer. "We're not interested in every single atom, but in more general variables such as pressure and temperature."

But how do you arrive at such theories for many-body systems? Deriving them purely mathematically from the laws of nature that apply to individual particles is extremely complicated. But as it now turns out, this is not necessary. "We have found a method of reading the quantum field theoretical description directly from the experiment," says Schmiedmayer. "In a certain sense, nature provides the formulas, with which it must be described, all by itself."

We know that every quantum theory has to obey certain formal rules - we talk for example about correlations, propagators, vertices, Feynman diagrams - the basic building blocks of every quantum physical model. The research team of TU Wien and the University of Heidelberg has found a way to make these individual basic building blocks experimentally accessible. The experimental measurements result in an empirically obtained quantum theory for a many-body system, without having to work with paper and pencil.

"For years, we have suspected that this is theoretically possible, but not everyone believed us that it actually works," says Jörg Schmiedmayer. "Now we have shown that we were right - by looking at a special case where the theory can also be found and (in certain limits) solved mathematically. Our measurement results provide exactly the same theory building blocks."

Ultracold Atomic Clouds

The experiment was done with clouds of thousands of ultracold atoms that are trapped in a magnetic trap on an atomic chip. "From the quantum wave patterns of these atomic clouds, we can determine the correlation functions from which the basic building blocks of the appropriate theory can be derived," explains Schmiedmayer.
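A minimal sketch of this idea, using synthetic samples in place of the interference images: from repeated measurements of a field, one can estimate the two-point correlation function (the propagator) and the connected four-point function, whose deviation from zero signals interaction vertices:

```python
import numpy as np

# Minimal sketch: estimate correlation functions from repeated measurements of a
# field phi(x). Data here are synthetic Gaussian samples; in the experiment they
# would come from matter-wave interference images of the atomic clouds.
rng = np.random.default_rng(1)
n_shots, n_points = 5000, 64
phi = rng.normal(size=(n_shots, n_points))          # stand-in for measured profiles

# Two-point function <phi(x) phi(y)> -- the propagator of the effective theory
G2 = phi.T @ phi / n_shots

# Connected four-point function at points (x, y, z, w); for a free (Gaussian)
# theory it vanishes, so a nonzero value signals interaction vertices
x, y, z, w = 0, 10, 20, 30
G4 = np.mean(phi[:, x] * phi[:, y] * phi[:, z] * phi[:, w])
G4_conn = G4 - (G2[x, y] * G2[z, w] + G2[x, z] * G2[y, w] + G2[x, w] * G2[y, z])
print(f"connected 4-point function: {G4_conn:.4f}  (near zero for Gaussian data)")
```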

The results have now been published in the journal "Physical Review X". The team hopes that this will significantly simplify the study of quantum many-particle systems. Perhaps it will shed some light on some of the big questions in physics.

Credit: 
Vienna University of Technology

Researchers identify link between decreased depressive symptoms, yoga and the neurotransmitter GABA

(Boston)-- The benefits of yoga have been widely documented by scientific research, but previously it was not clear how yoga exerts its physiological effects.

Now a new study from Boston University School of Medicine (BUSM) proposes that yoga can increase levels of gamma-aminobutyric acid (GABA) in the short term and that completing one yoga class per week may maintain elevated GABA levels that could mitigate depressive symptoms.

Depression is a highly prevalent and disabling disease. According to the World Health Organization, depression affects approximately 16 million people in the U.S. every year and is the leading cause of disability worldwide. Given its high morbidity, extensive research has been done on effective treatment modalities for depression. GABA is an amino acid that acts as a neurotransmitter in the central nervous system and has been associated with decreased depressive symptoms.

A group of 30 clinically depressed patients were randomly divided into two groups. Both groups engaged in Iyengar yoga and coherent breathing, with the only difference being the number of 90-minute yoga sessions and home sessions in which each group participated. Over three months, the high-dose group (HDG) was assigned three sessions per week while the low-intensity group (LIG) was assigned two sessions per week. Participants underwent magnetic resonance imaging (MRI) scans of their brain before the first yoga session and after the last yoga session. They also completed a clinical depression scale to monitor their symptoms.

Results showed that both groups had improvement in depressive symptoms after three months. MRI analysis found that GABA levels after three months of yoga were elevated (as compared to prior to starting yoga) for approximately four days after the last yoga session but the increase was no longer observed after approximately eight days. "The study suggests that the associated increase in GABA levels after a yoga session are 'time-limited' similar to that of pharmacologic treatments such that completing one session of yoga per week may maintain elevated levels of GABA," explained corresponding author Chris Streeter, MD, associate professor of psychiatry at BUSM.

According to the researchers, providing evidence-based data will be helpful in getting more individuals to try yoga as a strategy for improving their health and well-being. "A unique strength of this study is that pairing the yoga intervention with brain imaging provides important neurobiological insight as to the 'how' yoga may help to alleviate depression and anxiety. In this study, we found that an important neurochemical, GABA, which is related to mood, anxiety and sleep, is significantly increased in association with a yoga intervention," said collaborator and co-author Marisa Silveri, PhD, neuroscientist at McLean Hospital and associate professor of psychiatry at Harvard Medical School.

Credit: 
Boston University School of Medicine

New method for removing oil from water

image: Oil is adsorbed within seconds by a leaf of the floating fern Salvinia and pulled from the water.

Image: 
(c) W. Barthlott, M. Mail/Uni Bonn

Oil poses a considerable danger to aquatic life. Researchers at the Universities of Bonn and Aachen and the Heimbach-GmbH have developed a new technology for the removal of such contaminations: Textiles with special surface properties passively skim off the oil and move it into a floating container. The scientists used surfaces from the plant kingdom as a model. The study has now been published in the journal "Philosophical Transactions A".

The video clip is as short as it is impressive: The 18-second sequence shows a pipette from which dark-colored crude oil drips into a glass of water. Then a researcher holds a green leaf against the spot. Within a matter of seconds the leaf sucks the oil from the surface of the water, leaving not even a trace behind.

The star of the movie, the small green leaf, comes from the floating fern Salvinia. The special abilities of its leaves make it highly interesting for scientists, because they are extremely hydrophobic: When submerged, they wrap themselves in an air jacket and remain completely dry. Researchers call this behavior "superhydrophobic", which can be translated as "extremely water repellent".

However, the Salvinia surface loves oil which is, in a way, a flip side of superhydrophobia. "This allows the leaves to transport an oil film on their surface", explains Prof. Dr. Wilhelm Barthlott, emeritus of the University of Bonn and former director of its botanic gardens. "And we have also been able to transfer this property to technically producible surfaces, such as textiles."

Functional textiles as "suction tubes"

Such superhydrophobic substances can then for instance be used to remove oil films from water surfaces efficiently and without the use of chemicals. However, unlike other materials that have been used for this purpose so far, they do not absorb the oil. "Instead, it travels along the surface of the fabric, moved forward solely by its adhesive forces," explains Barthlott. "For example, in the laboratory we hung such fabric tapes over the edge of a container floating on the water. Within a short time they had almost completely removed the oil from the water surface and transported it into the container." Gravity provides the power; the bottom of the container must therefore be below the water surface with the oil film. "The oil is then completely skimmed off - as if using an automatic skimming spoon for meat stock."

This also makes super-hydrophobic textiles interesting for environmental technology. After all, they promise a new approach to solving the acute environmental problem of increasing oil spills on water bodies. Oil films floating on water cause a number of problems. They prevent gas exchange through the surface and are also dangerous on contact for many plants and animals. As oil films also spread quickly over large surfaces, they can endanger entire ecosystems.

Cleaning without chemicals

The new process does not require the use of chemicals. Conventional binding agents simply absorb the oil and can then usually only be burned later. The superhydrophobia method is different: "The oil skimmed into the floating container is so clean that it can be reused," explains Prof. Barthlott.

The procedure is not intended for large-scale oil disasters such as those that occur after a tanker accident. But particularly small contaminations, such as engine oil from cars or ships, heating oil or leaks, are a pressing problem. "Even minor quantities become a danger to the ecosystem, especially in stagnant or slow-flowing waters," emphasizes the biologist. This is where he sees the major application potential of the new method, for which a patent has been filed by the University of Bonn.

Generally speaking, many surfaces exhibit superhydrophobic behavior, albeit to varying degrees. The basic prerequisite is first of all that the material itself is water-repellent, for example due to a wax coating. But that alone is not enough: "Superhydrophobia is always based on certain structures on the surface, such as small hairs or warts - often on a nanotechnological scale," explains the botanist from the University of Bonn. It is also thanks to him that science now knows much more about these relationships than it did a few decades ago.

The research work is funded by the Deutsche Bundesstiftung Umwelt DBU. "This now helps us to develop oil-absorbing materials with particularly good transport properties, in cooperation with RWTH Aachen University," says Barthlott.

Credit: 
University of Bonn

Flickering light mobilizes brain chemistry that may fight Alzheimer's

video: The hope of flickering light to treat Alzheimer's takes another step forward in this new study, which reveals stark biochemical mechanisms: 40 Hertz stimulation triggers a marked release of signaling chemicals.

Image: 
Georgia Tech / Evans / Karcz

For over a century, Alzheimer's disease has confounded all attempts to treat it. But in recent years, perplexing experiments using flickering light have shown promise.

Now, researchers have tapped into how the flicker may work. They discovered in the lab that the exposure to light pulsing at 40 hertz - 40 beats per second - causes brains to release a surge of signaling chemicals that may help fight the disease.

Though conducted on healthy mice, this new study is directly connected to human trials, in which Alzheimer's patients are exposed to 40 Hz light and sound. Insights gained in mice at the Georgia Institute of Technology are informing the human trials in collaboration with Emory University.

"I'll be running samples from mice in the lab, and around the same time, a colleague will be doing a strikingly similar analysis on patient fluid samples," said Kristie Garza, the study's first author. Garza is a graduate research assistant in the lab of Annabelle Singer at Georgia Tech and also a member of Emory's neuroscience program.

One of the surging signaling molecules, in particular, is associated with the activation of brain immune cells called microglia, which purge an Alzheimer's hallmark - amyloid beta plaque, junk protein that accumulates between brain cells.

Immune signaling

In 2016, researchers discovered that light flickering at 40 Hz mobilized microglia in mice afflicted with Alzheimer's to clean up that junk. The new study looked for brain chemistry that connects the flicker with microglial and other immune activation in mice, and it uncovered a surge of 20 cytokines - small proteins secreted by cells that signal to other cells. Accompanying the cytokine release, internal cell chemistry - the activation of proteins by phosphate groups - left behind a strong calling card.

"The phosphoproteins showed up first. It looked as though they were leading, and our hypothesis is that they triggered the release of the cytokines," said Singer, who co-led the new study and is an assistant professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory.

"Beyond cytokines that may be signaling to microglia, a number of factors that we identified have the potential to support neural health," said Levi Wood, who co-led the study with Singer and is an assistant professor in Georgia Tech's George W. Woodruff School of Mechanical Engineering.

The team publishes its findings in the Journal of Neuroscience on February 5, 2020. (There is no embargo. Pre-publication appeared in December but did not yet contain all edits and elements.) The research was funded by the National Institute of Neurological Disorders and Stroke at the National Institutes of Health, and by the Packard Foundation.

Singer was co-first author on the original 2016 study at the Massachusetts Institute of Technology, in which the therapeutic effects of 40 Hz were first discovered in mice.

Sci-fi surrealness

Alzheimer's strikes, with few exceptions, late in life. It destroys up to 30% of a brain's mass, carving out ravines and depositing piles of amyloid plaque, which builds up outside of neurons. Inside neurons, phosphorylated tau protein forms similar junk known as neurofibrillary tangles suspected of destroying mental functions and neurons.

After many decades of failed Alzheimer's drug trials costing billions, flickering light as a potentially successful Alzheimer's therapy seems surreal even to the researchers.

"Sometimes it does feel like science fiction," Singer said.

The 40 Hz frequency stems from the observation that brains of Alzheimer's patients suffer early on from a lack of what is called gamma, moments of gentle, constant brain waves acting like a dance beat for neuron activity. Its most common frequency is right around 40 Hz, and exposing mice to light flickering at that frequency restored gamma and also appears to have prevented heavy Alzheimer's brain damage.

Adding to the surrealness, gamma has also been associated with esoteric mind expansion practices, in which practitioners perform light and sound meditation. Then, in 2016, research connected gamma to working memory, a function key to train of thought.

Cytokine bonanza

In the current study, the surging cytokines hinted at a connection with microglial activity, and in particular, the cytokine Macrophage Colony-Stimulating Factor (M-CSF).

"M-CSF was the thing that yelled, 'Microglia activation!'" Singer said.

The researchers will look for a causal connection to microglia activation in an upcoming study, but the overall surge of cytokines was a good sign in general, they said.

"The vast majority of cytokines went up, some anti-inflammatory and some inflammatory, and it was a transient response," Wood said. "Often, a transient inflammatory response can promote pathogen clearance; it can promote repair."

"Generally, you think of an inflammatory response as being bad if it's chronic, and this was rapid and then dropped off, so we think that was probably beneficial," Singer added.

Chemical timing

The 40 Hz stimulation did not need long to trigger the cytokine surge.

"We found an increase in cytokines after an hour of stimulation," Garza said. "We saw phosphoprotein signals after about 15 minutes of flickering."

Perhaps about 15 minutes was enough to start processes inside of cells and about 45 more minutes were needed for the cells to secrete cytokines. It is too early to know.

20 Hz bombshell

As controls, the researchers applied three additional light stimuli, and to their astonishment, all three had some effect on cytokines. But stimulating with 20 Hz stole the show.

"At 20 Hz, cytokine levels were way down. That could be useful, too. There may be circumstances where you want to suppress cytokines," Singer said. "We're thinking different kinds of stimulation could potentially become a platform of tools in a variety of contexts like Parkinson's or schizophrenia. Many neurological disorders are associated with immune response."

The research team warns against people improvising light therapies on their own, since more data is needed to thoroughly establish effects on humans, and getting frequencies wrong could possibly even do damage.

Credit: 
Georgia Institute of Technology

Army develops big data approach to neuroscience

image: Aggregate distribution of cortical brain-wave activity, organized by standard frequency bands, across a range of depths.

Image: 
U.S. Army graphic

ABERDEEN PROVING GROUND, Md. (Feb 3, 2020) -- A big data approach to neuroscience promises to significantly improve our understanding of the relationship between brain activity and performance.

To date, there have been relatively few attempts to use a big-data approach within the emerging field of neurotechnology. In this field, the few attempts at meta-analysis (analysis across multiple studies) combine only the results from individual studies rather than the raw data. A new study is one of the first to combine data across a diverse set of experiments to identify patterns of brain activity that are common across tasks and people.

The Army in particular is interested in how the cognitive state of Soldiers can affect their performance during a mission. If you can understand the brain, you can predict and even enhance cognitive performance.

Researchers from the U.S. Army Combat Capabilities Development Command's Army Research Laboratory teamed with the University of Texas at San Antonio and Intheon Labs to develop a first-of-its-kind mega-analysis of brain imaging data--in this case electroencephalography, or EEG.

In the two-part paper, they aggregate the raw data from 17 individual studies, collected at six different locations, into a single analytical framework, with their findings published in a series of two papers in the journal NeuroImage. The individual studies included in this analysis encompass a diverse set of tasks such as simulated driving and visual search.

"The vast majority of human neuroscientific studies use a very small number of participants employed in very specific tasks," said Dr. Jonathan Touryan, an Army scientist and co-author of the paper. "This limits how well the results from any single study can be generalized to a broader population and a larger range of activities."

Mega-analysis of EEG is extremely challenging due to the many types of hardware systems (properties and configuration of the electrodes), the diversity of tasks, how different datasets are annotated, and the intrinsic variability between individuals and within an individual over time, Touryan said.

These sources of variability make it difficult to find robust relationships between brain and behavior. Mega-analysis seeks to address this by aggregating large, heterogeneous datasets to identify universal features that link neural activity, cognitive state and task performance.

Next-generation neurotechnologies will require a thorough understanding of this relationship in order to mitigate deficits or augment performance of human operators. Ultimately, these neurotechnologies will enable autonomous systems to better understand the Soldier and facilitate communications within multi-domain operations, he said.

To combine the raw data from the collection of studies, the researchers developed Hierarchical Event Descriptors (HED tags) - a novel labeling ontology that captures the wide range of experimental events encountered in diverse datasets. This HED tag system was recently adopted into the Brain Imaging Data Structure international standard, one of the most common formats for organizing and analyzing brain data, Touryan said.

The research team also developed a fully automated processing pipeline to perform large-scale analysis of their high-dimensional time-series data--amounting to more than 1,000 recording sessions.
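As an illustration of one such pipeline stage, the sketch below (synthetic data; not the Army's actual pipeline) computes EEG power in the standard frequency bands for a single recording session, the kind of per-session feature an automated large-scale analysis would extract:

```python
import numpy as np
from scipy.signal import welch

# Standard EEG frequency bands (Hz); exact boundaries vary by convention
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 50)}

def band_powers(eeg: np.ndarray, fs: float) -> dict[str, float]:
    """Average band power across channels for one (channels x samples) recording."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))   # PSD per channel
    return {name: psd[:, (freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

# Synthetic 8-channel, 60-second recording at 256 Hz in place of real session data
rng = np.random.default_rng(2)
eeg = rng.normal(size=(8, 256 * 60))
print(band_powers(eeg, fs=256.0))
```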

Much of this data was collected over the last 10 years through the U.S. Army's Cognition and Neuroergonomics Collaborative Technology Alliance and is now available in an online repository for the scientific community. The U.S. Army continues to use this data to develop human-autonomy adaptive systems for both the Next Generation Combat Vehicle and Soldier Lethality Cross-Functional Teams.

Credit: 
U.S. Army Research Laboratory

How the development of skulls and beaks made Darwin's finches one of the most diverse bird groups

image: The main coordinated changes in beak and skull shape that the study found to have characterized skull evolution in both Darwin's finches and Hawaiian honeycreepers, with drawings of some of the species representing the extremes of skull shape.

Image: 
Guillermo Navalon

Darwin's finches are among the most celebrated examples of adaptive radiation in the evolution of modern vertebrates and now a new study, led by scientists from the University of Bristol, has provided fresh insights into their rapid development and evolutionary success.

Study of the finches has been relevant since the voyage of HMS Beagle in the 19th century, which catalysed some of the first ideas about natural selection in the mind of a young Charles Darwin.

Despite many years of research which has led to a detailed understanding of the biology of these perching birds, including impressive decades-long studies in natural populations, there are still unanswered questions.

Specifically, the factors explaining why this particular group of birds evolved to be much more diverse in species and shapes than other birds evolving alongside them in Galapagos and Cocos islands have remained largely unknown.

A similar phenomenon is that of the honeycreepers endemic to the Hawaiian archipelago. These true finches (unlike Darwin's finches, which are finch-like birds belonging to a different family) radiated into an order of magnitude more species and shapes than the rest of the birds inhabiting those islands.

An international team of researchers from the UK and Spain tackled the question of why these birds evolved so rapidly from a different perspective.

They showed in their study published today in the journal Nature Ecology & Evolution that one of the key factors related to the evolutionary success of Darwin's finches and Hawaiian honeycreepers might lie in how their beaks and skulls evolved.

Previous studies have demonstrated a tight link between the shapes and sizes of the beak and the feeding habits in both groups, which suggests that adaptation by natural selection to the different feeding resources available at the islands may have been one of the main processes driving their explosive evolution.

Furthermore, changes in beak size and shape have been observed in natural populations of Darwin's finches as a response to variations in feeding resources, strengthening these views.

However, recent studies on other groups of birds, some of which stem from the previous recent research of the team, have suggested that this strong match between beak and cranial morphology and ecology might not be pervasive in all birds.

Professor Emily Rayfield, from the University of Bristol's School of Earth Sciences, co-authored the new study. She said: "Other factors such as constraints on skull shape during development, the use of the beak for many other functions and the fact that the skull and beak develop and function as a coherent unit may have contributed to this mismatch.

"Therefore, the strong connection between beak, cranial morphology and feeding ecology over the evolution of Darwin's finches, Hawaiian honeycreepers, and perhaps other lineages of birds, might have been only possible if this tight coevolution of cranial regions is somehow 'relaxed' and those regions are able to evolve more freely."

Lead author Guillermo Navalón, recently graduated from a PhD at the University of Bristol and now a Postdoctoral Researcher at the University of Oxford, added: "By taking a broad scale, numerical approach at more than 400 species of landbirds (the group that encompasses all perching birds and many other lineages such as parrots, kingfishers, hornbills, eagles, vultures, owls and many others) we found that the beaks of Darwin's finches and Hawaiian honeycreepers evolved in a stronger association with the rest of the skull than in most of the other lineages of landbirds.

"In other words, in these groups the beak is less independent in evolutionary terms than in most other landbirds."

Jesús Marugán-Lobón co-author of the study and Lecturer at the Autonomous University of Madrid, said: "We found that as a result of this stronger cranial integration, these birds could evolve in a more versatile way but mostly constrained along a very specific direction of adaptive change in the shape of their skulls.

"Paradoxically, we hypothesised that this allowed them to evolve many different shapes very rapidly, filling many of the available niches in their archipelagos as a result."

In contrast, the authors noted that the other sympatric bird lineages that occupied the island archipelagos at a similar time to the ancestors of finches and honeycreepers all belong to the group with the lowest cranial integration in their study, and they suggest that this was a limiting factor for rapid evolution in other lineages.

Guillermo Navalón added: "While these results are exciting, this is mainly the work of my PhD and at the minute we are working on solving different unanswered questions that stem from this research.

"For instance, are these evolutionary situations isolated phenomena in these two archipelagos or have those been more common in the evolution of island or continental bird communities? Do these patterns characterise other adaptive radiations in birds?

"Future research will likely solve at least some of these mysteries, bringing us one step closer to understanding better the evolution of the wonderful diversity of shapes in birds."

Credit: 
University of Bristol

A fundamental discovery about how gene activity is regulated

Researchers at Johns Hopkins Bloomberg School of Public Health have discovered a fundamental mechanism that regulates gene activity in cells. The newly discovered mechanism targets RNA, or ribonucleic acid, a close cousin of DNA that plays an important role in cellular activity.

The discovery, detailed in a paper published February 3 in the journal Molecular Cell, is a significant addition to the foundational understanding of how gene activity is regulated, and may ultimately lead to powerful new medical treatments.

The newly discovered mechanism effectively silences or dials down certain active genes as a basic cellular regulatory or quality-control system. It may even act as a defense against viruses. When genes are active, they are copied out into strands of RNA. These RNA strands perform cellular functions on their own or are translated into proteins. The new mechanism destroys RNA strands that have excessively folded over and stuck to themselves to form knots, hairpins, and other structures. These highly structured RNAs can occur during normal processing but could possibly also be caused by misfolding.

The finding is likely to have implications for medical research because many human disorders, including cancers and neurodegenerative diseases, such as ALS (Amyotrophic Lateral Sclerosis) and Huntington disease-like syndromes, involve failures of normal RNA regulation and/or the accumulation of abnormally folded or tangled RNA in affected cells.

"We know that there are mechanisms to clear misfolded proteins from cells--possibly this newly uncovered mechanism is involved in clearing misfolded RNAs," says principal investigator Anthony K. L. Leung, PhD, associate professor in the Department of Biochemistry and Molecular Biology at the Bloomberg School. "This newly discovered mechanism might also help scientists understand how normal cells keep themselves healthy, since RNA structure forms can play a role in cells maintaining cellular equilibrium."

Most of the regulatory and quality-control mechanisms that modulate the levels of RNAs in cells target RNAs containing specific sequences of nucleotides--the building blocks of RNAs. The newly discovered mechanism is unique in that it recognizes not sequences but a broad variety of structures formed where RNA strands, which are relatively sticky, have folded back onto themselves.

Leung and his team discovered the new mechanism while investigating a protein called UPF1, which is known to work in other RNA regulation pathways. They found that UPF1 and a partner protein called G3BP1 work together in the new mechanism, targeting only RNAs that contain a high level of structures. When the researchers depleted UPF1 or G3BP1 from cells to shut off the new mechanism, levels of highly structured RNAs rose sharply. The team also confirmed that the new mechanism, which they call structure-mediated RNA decay, is distinct from all other known RNA-removal mechanisms and works across different types of RNA throughout the genome.

"Based on further analyses, we predict that this structure-mediated RNA decay pathway could regulate at least one-fourth of human protein-coding genes and one-third of a class of non-coding genes called circular RNA," Leung says.

Leung and his colleagues now are following up to determine how this RNA decay mechanism actually targets and destroys RNAs. They also are investigating why this mechanism exists. Its functions, they speculate, may include the regulation of specific functional variants of protein-coding RNAs as well as the general disposal of RNAs that have acquired excessive loops and other structures.

The new mechanism may even have an antiviral role, the authors say. "Some single-stranded RNA viruses that are highly structured, such as poliovirus, have ways to get rid of the G3BP1 protein when they infect a cell," Leung says. "Possibly that's because this G3BP1-UPF1 RNA-decay pathway is otherwise a major threat to them."

Credit: 
Johns Hopkins Bloomberg School of Public Health

New score measuring multiple chronic illnesses performs better than current method

A new score that measures multiple long-term health conditions performs better than the current Charlson Comorbidity Index and may help in health care planning and delivery, according to new research in CMAJ (Canadian Medical Association Journal): http://www.cmaj.ca/lookup/doi/10.1503/cmaj.190757

"Multimorbidity scores offer a means of identifying those patients in the population who are most likely to benefit from a tailored approach to care, helping clinicians to prioritize their efforts accordingly," writes Dr. Rupert Payne, Centre for Academic Primary Care, University of Bristol, Bristol, United Kingdom, with coauthors.

Researchers from the United Kingdom developed and tested a measure of multiple illnesses, called the Cambridge Multimorbidity Score, using data from general practitioner records in the United Kingdom. They looked at 37 comorbidities and associated outcomes, such as general practitioner visits, unplanned hospital admissions and death.
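In essence, a score like this sums outcome-specific weights over the chronic conditions recorded for a patient. The sketch below uses hypothetical placeholder weights, not the published Cambridge coefficients:

```python
# Minimal sketch of a weighted multimorbidity score. The weights here are
# hypothetical placeholders, not the published Cambridge coefficients, which
# were derived by regressing each outcome on the 37 conditions.
WEIGHTS = {"diabetes": 0.39, "copd": 0.59, "heart_failure": 1.04,
           "dementia": 1.79, "hypertension": 0.31}  # ... 37 conditions in total

def multimorbidity_score(conditions: set[str]) -> float:
    """Sum the weights of the chronic conditions recorded for one patient."""
    return sum(WEIGHTS.get(c, 0.0) for c in conditions)

print(multimorbidity_score({"diabetes", "heart_failure"}))  # 1.43
```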

"The score outperforms the widely used Charlson index across all outcomes. Performance is best for predicting death, particularly after adjusting for age and gender, and least good for predicting consultations with primary care physicians," says Dr. Payne.

The Cambridge Multimorbidity Score can be a useful predictor of future health care use, including emergency department visits and primary care utilization.

"These scores may be of considerable value for policy development and health care priority-setting, providing accurate, easy-to-implement ways of optimizing health care delivery to an aging population with multiple illnesses," says Dr. Payne.

Credit: 
Canadian Medical Association Journal

Invest in social equity to improve health for low-income people

Canada must invest in social spending and recognize that our health care system is not "universal" if Canadians living in low-income neighbourhoods are to have the same chance of good health as other Canadians, argues an editorial in CMAJ (Canadian Medical Association Journal).

People living in poorer neighbourhoods are at higher risk of dying from preventable diseases than people in affluent neighbourhoods. Even when there are no financial barriers, people with low income access health care less frequently. For example, only 54% of women living in the poorest neighbourhoods in Ontario completed screening for cervical cancer, compared with 67% of women in the richest urban neighbourhoods.

"These differences come with the human cost of thousands of avoidable deaths every year and are particularly harrowing for Indigenous Peoples," write Dr. Andrew Boozary, Executive Director of Health and Social Policy, University Health Network, Toronto, Ontario and Dr. Andreas Laupacis, CMAJ's editor-in-chief. "The persistence of these disparities amounts to discrimination against Canada's most disadvantaged populations."

Why is there such disparity in health outcomes? Poverty is associated with many health risks linked to social context, including housing insecurity, isolation, unhealthy food options and substance use disorders. In an international ranking of high-income countries, Canada came last on social program spending in 2017.

As well, publicly funded health care in Canada covers only certain services, leaving out coverage of prescription drugs for many Canadians, mental health counselling, most home care services and physiotherapy.

Within the health care system itself, differences in access and outcomes according to income persist. "It is well past time that we act on the undeniable importance of the social determinants of health and remedy the inequities within the health care system itself."

The authors call for significant public investment, innovative approaches and political will to level the playing field.

Credit: 
Canadian Medical Association Journal

Novel compound is promising drug candidate for Alzheimer's disease

image: Selective gamma-secretase inhibitor blocks substrate on amyloid precursor protein.

Image: 
Rensselaer Polytechnic Institute

TROY, N.Y. -- A newly identified compound is a promising candidate for inhibiting the production of amyloids, the abnormal proteins that form toxic clumps, called fibrils, inside the brains of patients with Alzheimer's disease. As published today in the Royal Society of Chemistry's Chemical Communications, the compound -- known as "C1" -- uses a novel mechanism to efficiently prevent the enzyme gamma-secretase from producing amyloids.

Amyloid fibrils are largely composed of the peptide amyloid beta, which is produced when enzymes, including gamma secretase, make cuts to the amyloid precursor protein found in high concentrations in the membrane of brain cells. C1 is a covalent gamma-secretase inhibitor that blocks the active site on the precursor protein where gamma-secretase would bind to transform it into amyloids, rather than - as traditional enzyme inhibitors do - blocking the active site on gamma-secretase itself.

"Historically, drug trials for gamma secretase inhibitors failed because traditional enzyme inhibitors have severe side effects. They stopped all of the normal functions of gamma secretase," said Chunyu Wang, a professor of biological sciences and member of the Center for Biotechnology and Interdisciplinary Studies (CBIS) at Rensselaer Polytechnic Institute. "Our compound binds to the cleavage site of the precursor protein instead of the enzyme itself, which may avoid many problems associated with traditional enzyme inhibitors."

In 2018, with support from the Warren Alpert Foundation, Wang began screening drugs to identify a compound that targets the amyloid precursor protein substrate, which would block the activity of gamma secretase involved in amyloid production while allowing all other functions. He began the search with "in silico screening," using computer modeling to test tens of millions of compounds.
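An early stage of such a screen often filters the compound library by simple drug-likeness rules before any expensive structure-based modeling. The sketch below (illustrative only; not the protocol used to find C1) applies Lipinski-style filters with RDKit:

```python
from rdkit import Chem
from rdkit.Chem import Descriptors

# Toy two-compound "library"; a real screen would read millions of SMILES
library = {"aspirin": "CC(=O)Oc1ccccc1C(=O)O",
           "caffeine": "Cn1cnc2c1c(=O)n(C)c(=O)n2C"}

def passes_lipinski(smiles: str) -> bool:
    """Keep only compounds meeting Lipinski's rule-of-five drug-likeness cutoffs."""
    mol = Chem.MolFromSmiles(smiles)
    return (mol is not None
            and Descriptors.MolWt(mol) <= 500
            and Descriptors.MolLogP(mol) <= 5
            and Descriptors.NumHDonors(mol) <= 5
            and Descriptors.NumHAcceptors(mol) <= 10)

hits = [name for name, smi in library.items() if passes_lipinski(smi)]
print(hits)
```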

C1 was one of several candidates to emerge from that screening. As described in the paper, C1 blocks amyloid production with high efficiency when present at micromolar concentrations, both in test tubes and in cell culture. The research is patent pending.

C1 is a covalent inhibitor, meaning it forms a chemical bond with its target. Wang said that because of their permanent bond, covalent inhibitors are more durable than their non-covalent counterparts. Covalent inhibitors make up about one-third of the drug market, even though they have traditionally been viewed as having a higher risk of causing immune reactivity. In recent years, there has been a surge in the development of covalent inhibitors, as more highly specific covalent inhibitors have shown excellent efficacy against challenging drug targets.

"With a new approach to tackling the principal pathology of Alzheimer's disease, Chunyu's work is generating a fresh roster of drug candidates with enormous promise," said Deepak Vashishth, the director of CBIS. "His works speaks to the power of the interdisciplinary culture of research at CBIS, and we are pleased with this early result."

Credit: 
Rensselaer Polytechnic Institute

Meat isn't good for you

Eating red meat, processed meat or poultry raises risk of cardiovascular disease

Eating meat - but not poultry - raises risk of dying from all causes

New findings contradict a recent controversial study saying people don't need to reduce their consumption of red meat and processed meat

CHICAGO --- Drop the steak knife. After a controversial study last fall recommended that it was not necessary for people to change their diet in terms of red meat and processed meat, a large, carefully analyzed new study from Northwestern Medicine and Cornell University links red and processed meat consumption with a slightly higher risk of heart disease and death.

Eating two servings of red meat, processed meat or poultry -- but not fish -- per week was linked to a 3 to 7% higher risk of cardiovascular disease, the study found. Eating two servings of red meat or processed meat -- but not poultry or fish -- per week was associated with a 3% higher risk of all causes of death.

"It's a small difference, but it's worth trying to reduce red meat and processed meat like pepperoni, bologna and deli meats," said senior study author Norrina Allen, associate professor of preventive medicine at Northwestern University Feinberg School of Medicine. "Red meat consumption also is consistently linked to other health problems like cancer."

"Modifying intake of these animal protein foods may be an important strategy to help reduce the risk of cardiovascular disease and premature death at a population level," said lead study author Victor Zhong, assistant professor of nutritional sciences at Cornell, who did the research when he was a postdoctoral fellow in Allen's lab.

The paper will be published Feb. 3 in JAMA Internal Medicine.

The new findings come on the heels of a controversial meta-analysis published last November that recommended people not reduce the amount of red meat and processed meat they eat. "Everyone interpreted that it was OK to eat red meat, but I don't think that is what the science supports," Allen said.

"Our study shows the link to cardiovascular disease and mortality was robust," Zhong said.

What should we eat?

"Fish, seafood and plant-based sources of protein such as nuts and legumes, including beans and peas, are excellent alternatives to meat and are under-consumed in the U.S.," said study coauthor Linda Van Horn, professor of preventive medicine at Feinberg who also is a member of the 2020 U.S. Dietary Guidelines Advisory committee.

The study found a positive association between poultry intake and cardiovascular disease, but the evidence so far isn't sufficient to make a clear recommendation about poultry intake, Zhong said. Still, fried chicken is not recommended.

The new study pooled a large, diverse sample from six cohorts, included follow-up data spanning up to three decades, harmonized diet data to reduce heterogeneity, adjusted for a comprehensive set of confounders and conducted multiple sensitivity analyses. The study included 29,682 participants (mean age of 53.7 years at baseline, 44.4% men and 30.7% non-white). Diet data were self-reported by participants, who were asked about a long list of items they ate during the previous year or month.
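
The analysis code is not part of the release. Pooled cohort studies of this kind typically estimate such associations with a Cox proportional hazards model adjusted for confounders; the sketch below, using the lifelines Python library, shows the general shape of that analysis. The file and column names are hypothetical, and the covariates shown are only a subset of what a real analysis would adjust for.

```python
# Minimal sketch of an adjusted Cox proportional hazards analysis of
# pooled cohort data. Assumptions: lifelines and pandas are installed;
# "pooled_cohorts.csv" and all column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("pooled_cohorts.csv")  # one row per participant

cph = CoxPHFitter()
cph.fit(
    df[["followup_years", "cvd_event", "red_meat_servings_wk",
        "age", "sex", "smoking", "bmi"]],  # exposure plus confounders
    duration_col="followup_years",         # time to event or censoring
    event_col="cvd_event",                 # 1 = incident CVD, 0 = censored
)
cph.print_summary()
# A fitted hazard ratio of roughly 1.03-1.07 for the exposure would
# correspond to the 3 to 7% higher risk reported in the study.
```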

Key findings:

A 3 to 7% higher risk of cardiovascular disease and premature death for people who ate two servings of red meat or processed meat a week.

A 4% higher risk of cardiovascular disease for people who ate two servings per week of poultry, but the evidence so far is not sufficient to make a clear recommendation about poultry intake. The relationship may also be driven by the cooking method and consumption of the skin rather than the chicken meat itself.

No association between eating fish and cardiovascular disease or mortality.

Limitations of the study include that participants' dietary intake was assessed only once, and dietary behaviors may have changed over time. In addition, cooking methods were not considered. Fried chicken, especially deep-fried preparations that contribute trans-fatty acids, and fried fish have been positively linked to chronic diseases, Zhong said.

Credit: 
Northwestern University

Lower protein diet may lessen risk for cardiovascular disease

Hershey, Pa. -- A plant-based diet may be key to lowering risk for heart disease. Penn State researchers determined that diets with reduced sulfur amino acids -- which occur in protein-rich foods, such as meats, dairy, nuts and soy -- were associated with a decreased risk for cardiovascular disease. The team also found that the average American consumes almost two and a half times more sulfur amino acids than the estimated average requirement.

Amino acids are the building blocks of proteins. One subcategory, the sulfur amino acids, which includes methionine and cysteine, plays various roles in metabolism and health.

"For decades it has been understood that diets restricting sulfur amino acids were beneficial for longevity in animals," said John Richie, a professor of public health sciences at Penn State College of Medicine. "This study provides the first epidemiologic evidence that excessive dietary intake of sulfur amino acids may be related to chronic disease outcomes in humans."

Richie led a team that examined the diets and blood biomarkers of more than 11,000 participants from a national study and found that participants who ate foods containing fewer sulfur amino acids tended to have a decreased risk for cardiometabolic disease based on their bloodwork.

The team evaluated data from the Third National Health and Nutrition Examination Survey. They compiled a composite cardiometabolic disease risk score based on the levels of certain biomarkers, including cholesterol, triglycerides, glucose and insulin, measured in participants' blood after a 10- to 16-hour fast.

"These biomarkers are indicative of an individual's risk for disease, just as high cholesterol levels are a risk factor for cardiovascular disease," Richie said. "Many of these levels can be impacted by a person's longer-term dietary habits leading up to the test."

Participants were excluded from the study if they reported congestive heart failure, a heart attack or a change in diet due to a heart disease diagnosis. Individuals were also omitted if they reported a dietary intake of sulfur amino acids below the estimated average requirement of 15 mg/kg/day recommended by the Food and Nutrition Board of the National Academy of Medicine.

For a person weighing 132 pounds, food choices for a day that meet the requirement might include a medium slice of bread, half an avocado, an egg, a half cup of raw cabbage, six cherry tomatoes, two ounces of chicken breast, a cup of brown rice, three quarters of a cup of zucchini, three tablespoons of butter, a cup of spinach, a medium apple, an eight-inch-diameter pizza and a tablespoon of almonds.
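
For concreteness, the arithmetic behind that example works out as follows (a worked sketch; the pound-to-kilogram conversion is standard and the rounding approximate):

```python
# Worked arithmetic for the estimated average requirement example.
pounds = 132
kg = pounds / 2.2046           # 132 lb is about 60 kg
ear_mg_per_kg = 15             # EAR for sulfur amino acids, mg/kg/day
daily_mg = kg * ear_mg_per_kg  # roughly 900 mg per day
print(f"{daily_mg:.0f} mg/day")  # -> 898 mg/day
```
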
Nutritionists collected information about participants' diets by doing in-person 24-hour recalls. Nutrient intakes were then calculated using the U.S. Department of Agriculture Survey Nutrient Database.

After accounting for body weight, the researchers found that average sulfur amino acid intake was almost two and a half times higher than the estimated average requirement. Xiang Gao, associate professor and director of the nutritional epidemiology lab at Penn State and co-author of the study, published today (Feb. 3) in the Lancet journal EClinicalMedicine, suggested this may be due to trends in the typical American diet.

"Many people in the United States consume a diet rich in meat and dairy products and the estimated average requirement is only expected to meet the needs of half of healthy individuals," Gao said. "Therefore, it is not surprising that many are surpassing the average requirement when considering these foods contain higher amounts of sulfur amino acids."

The researchers found that higher sulfur amino acid intake was associated with a higher composite cardiometabolic risk score after accounting for potential confounders like age, sex and history of diabetes and hypertension. They also found that high sulfur amino acid intake was associated with intake of every type of food except grains, vegetables and fruit.

"Meats and other high-protein foods are generally higher in sulfur amino acid content," said Zhen Dong, lead author on the study and College of Medicine graduate. "People who eat lots of plant-based products like fruits and vegetables will consume lower amounts of sulfur amino acids. These results support some of the beneficial health effects observed in those who eat vegan or other plant-based diets."

Dong said that while this study only evaluated dietary intake and cardiometabolic disease risk factors at one point in time, the association between increased sulfur amino acid intake and risk for cardiometabolic disease was strong. She said the data support the design of a prospective, longitudinal study evaluating sulfur amino acid intake and health outcomes over time.

"Here we saw an observed association between certain dietary habits and higher levels of blood biomarkers that put a person at risk for cardiometabolic diseases," Richie said. "A longitudinal study would allow us to analyze whether people who eat a certain way do end up developing the diseases these biomarkers indicate a risk for."

Credit: 
Penn State