
Genetic changes in tumours could help predict if patients will respond to immunotherapy

Researchers at the Francis Crick Institute, the UCL Cancer Institute, and the Cancer Research UK Lung Cancer Centre of Excellence have identified genetic changes in tumours which could be used to predict if immunotherapy drugs would be effective in individual patients.

Immunotherapies have led to huge progress in treating certain types of cancer, but only a subset of patients respond. A central challenge for doctors and researchers is therefore to understand why these drugs work in some people and not others, and to predict who will respond well to treatment.

In their paper, published in Cell today (27 January), the scientists looked for genetic and gene expression changes in tumours in over 1,000 patients being treated with checkpoint inhibitors, a type of immunotherapy that stops cancer cells from switching off the body's immune response.

They found that the total number of genetic mutations present in every cancer cell in a patient was the best predictor of tumour response to immunotherapy: the more mutations shared by every tumour cell, the more likely the drugs were to work. In addition, expression of the gene CXCL9 was found to be a critical driver of an effective anti-tumour immune response.
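
The notion of a "clonal" mutation load can be made concrete with a small sketch. The following Python snippet is purely illustrative and is not the study's analysis pipeline; it assumes each mutation comes with an estimated cancer cell fraction (CCF) and simply counts those present in essentially every tumour cell.

```python
# Illustrative sketch only: counting "clonal" mutations (those estimated to be
# present in every tumour cell) from hypothetical per-mutation cancer cell
# fraction (CCF) estimates. The study's actual pipeline is more involved.

def clonal_tmb(mutations, ccf_threshold=0.95):
    """Return the number of mutations whose estimated cancer cell fraction
    suggests they are present in (essentially) every tumour cell."""
    return sum(1 for m in mutations if m["ccf"] >= ccf_threshold)

# Toy example: three clonal mutations, two subclonal ones.
tumour_mutations = [
    {"gene": "TP53",  "ccf": 1.00},
    {"gene": "KRAS",  "ccf": 0.98},
    {"gene": "STK11", "ccf": 0.97},
    {"gene": "EGFR",  "ccf": 0.40},   # subclonal: only some tumour cells
    {"gene": "NF1",   "ccf": 0.15},   # subclonal
]

print("Clonal mutation count:", clonal_tmb(tumour_mutations))  # -> 3
```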

The researchers also looked at the cases where checkpoint inhibitors had not been effective. For example, having more copies of a gene called CCND1 was linked to tumours being resistant to checkpoint inhibitors. More research is needed, but the scientists suggest that patients with this alteration in their tumours may benefit more from alternative drug treatment options.

Kevin Litchfield, co-lead author, visiting scientist at the Crick and group leader of the Tumour Immunogenomics and Immunosurveillance lab at UCL says: "This is the largest study of its kind, analysing genetic and gene expression data from across seven types of cancer and over a thousand people.

"It has enabled us to pinpoint the specific genetic factors which determine tumour response to immunotherapy, and combine them into a predictive test to identify which patients are most likely to benefit from therapy. Furthermore, it has improved our biological understanding of how immunotherapy works, which is vital for the design and development of new improved immunotherapeutic drugs."

The researchers are now working with clinical partners in Denmark to see if their test correctly identifies the patients who will or will not respond to checkpoint inhibitors and if this is more accurate than tests currently available.

Charles Swanton, chief clinician at Cancer Research UK, group leader at the Crick and UCL, and a lead author of the study, says: "Checkpoint inhibitors are really valuable in treating a number of cancers, including skin and lung cancers. But sadly, they do not always work and they can also sometimes cause severe side effects.

"If doctors have an accurate test, that tells them whether these drugs are likely to be effective in each individual patient, they will be able to make more informed treatment decisions. Crucially, they will be able to more quickly look for other options for patients who these drugs are unlikely to help."

Michelle Mitchell, chief executive of Cancer Research UK, says: "One of the main roadblocks preventing us from unleashing the full potential of immunotherapies is that we don't fully understand how these drugs work, or why this type of treatment doesn't benefit everyone. And we still can't fully predict who will respond to these expensive treatments.

"This new research has furthered our understanding around these issues, revealing new drug development tactics and approaches to treatment. It's fantastic to think of a future where we give patients a simple test before they start their immunotherapy to find out if this is the right course of treatment for them. This not only will spare patients from taking needless treatment and enduring serious side effects that might come with it, but it could also save the NHS treatment costs."

The work was partly funded by Cancer Research UK, the Royal Society, the Wellcome Trust, the Medical Research Council and Rosetrees Trust, among others.

Credit: 
The Francis Crick Institute

Researchers use car collisions with deer to study mysterious animal-population phenomena

image: Dan Reuman and colleagues at the University of Kansas have written a new study in the peer-reviewed journal Ecology Letters. By parsing data on weather, deer populations and deer-vehicle collisions in Wisconsin, the investigators show spatial synchrony could be driving population cycles, rather than the reverse.

Image: 
U.S. Fish and Wildlife Service

LAWRENCE -- For at least a century, ecologists have wondered at the tendency for populations of different species to cycle up and down in steady, rhythmic patterns.

"These cycles can be really exaggerated -- really huge booms and huge busts -- and quite regular," said Daniel Reuman, professor of ecology & evolutionary biology at the University of Kansas and senior scientist at the Kansas Biological Survey. "It attracted people's attention because it was kind of mysterious. Why would such a big thing be happening?"

A second observation in animal populations might be even harder to fathom: Far-flung communities of species, sometimes separated by hundreds of miles, often fluctuate in synchrony with one another -- an effect known as "spatial synchrony."

Now, Reuman and colleagues have written a new study in the peer-reviewed journal Ecology Letters showing these two effects to be linked, but not in the way that could be expected. By parsing data on weather, deer populations and deer-vehicle collisions in Wisconsin, the investigators show spatial synchrony could be driving population cycles, rather than the reverse.

Reuman compared the linked population phenomena to a famous physics experiment where two grandfather clocks are placed next to each other against a wall.

"Over time, the pendulums become synchronized," he said. "The reason is because both produce tiny vibrations in the wall. And the vibrations from one of them in the wall influences the other one just a little bit -- enough to get the pendulums to eventually become synchronous. One reason people think these cycling populations are easy to synchronize is if a few individuals can get from one to the other, like vibrations that go through the wall for the grandfather clocks. It's enough to bring these cycling populations into synchrony. That's how people thought about things before we started our work with this paper."

But Reuman and his co-authors describe how this process can actually go the other way around. The researchers found that weather patterns driven by El Nino influenced predictable fluctuations in deer populations across the state, as well as synchrony between different deer populations.

Looking at datasets on local temperature and snowfall variations across the state, the team averaged them out, finding "buried underneath all of that randomness a hardly noticeable, but synchronous fluctuation," Reuman said.

The three-to-seven-year weather fluctuation directly influenced synchronous population cycles in the state's deer.

"All that local variation would cancel out because it might be a little bit warmer in one place, a little bit colder and another place -- but that overall synchronous component, which is related to El Nino in this case, reinforces all the local variation," Reuman said. "And it's the same years with deer. So, the reason why the synchrony is causing the cycling is because the synchrony is occurring only on the relevant timescales of the fluctuation. It's only that component of the three-to-seven-year oscillations that synchronize. All the faster and slower oscillations are all local variation that cancels out when you average across the whole state."

Moreover, the researchers found these deer population fluctuations predicted the numbers of car collisions with deer statewide more than traffic volume or other factors.

"It was a surprise to us when we figured out that that's what was going on," Reuman said. "What it amounts to is a new mechanism for these major population cycles and a new way that they can come about. That's fundamentally different from the old way that people were thinking about it."

Lead author Tom Anderson, assistant professor at Southern Illinois University Edwardsville, said the work shows it's "still possible to discover new information about well-studied scientific phenomena."

"Researchers have been examining population cycles for more than 100 years, yet our study still uncovered new information," Anderson said. "That is partly what makes science, and this project in particular, exciting, to be able to uncover new ways of thinking about something that others have thought about extensively. Our work also has important implications in a variety of other areas, including how fluctuations in populations of plants or animals will respond to climate change and that organisms that are economically and socially important to humans, like white-tailed deer, can undergo periods of high and low abundance due to naturally occurring processes across large spatial scales, which might have implications for their subsequent management."

According to co-author Lawrence Sheppard, postdoctoral researcher with the KU Department of Ecology & Evolutionary Biology and the Kansas Biological Survey, the unexpected relationship between spatial synchrony and population cycles was revealed by "new methods to study the different timescales of change in an ecosystem."

"We trace how particular timescales of change arise in the data and are communicated from one part of the system to another using 'wavelets,' which I first learned to apply to biomedical data during my Ph.D.," Sheppard said. "In particular, here we find that spatial synchrony on a particular timescale arises from an association with winter climate on that timescale, and the spatial synchrony in the deer population has a substantial statewide impact on human interactions with the deer."

Additional authors were Jonathan Walter of KU and the University of Virginia and Robert Rolley of the Wisconsin Department of Natural Resources.

Reuman said the findings could transfer to a wide range of other species and ecological systems, with ramifications for agriculture, fisheries, transportation managers and the insurance industry.

"We started out trying to understand the nature of synchrony in these things and trying to figure out what was causing it, and what its consequences are," Reuman said. "It's turned out to be related to these overall climatic indices. Now for deer, basically it's bad winter weather that we're talking about that synchronizes things. For another particular species, the nature of their relationship with the weather in a location is going to make the difference."

Credit: 
University of Kansas

In a tight spot

image: The interaction of two brain regions helps zebrafish decide which predator to flee from.

Image: 
MPI of Neurobiology/ Kuhl

Being constantly flooded by a mass of stimuli, it is impossible for us to react to all of them. The same holds true for a little fish. Which stimuli should it pay attention to and which not? Scientists at the Max Planck Institute of Neurobiology have now deciphered the neuronal circuit that zebrafish use to prioritize visual stimuli. Surrounded by predators, a fish can thus choose its escape route from this predicament.

Even though we are not exposed to predators, we still have to decide which stimuli we pay attention to - for example, when crossing a street. Which cars should we avoid, which ones can we ignore?

"The processes in the brain and the circuits that lead to this so-called selective attention are largely unexplored," explains Miguel Fernandes, a postdoctoral researcher in Herwig Baier's department. "But if we understand this in a simple animal model like the zebrafish, it can give us fundamental insights into decision-making mechanisms in humans."

For this reason, Miguel Fernandes and his colleagues studied the behavior of zebrafish in the predicament described above: Using virtual reality, the team simulated two predators approaching a fish from the left and right at the same speed. In most cases, the fish focused on one of the two predators and fled in the opposite direction. They thus integrated only one, the so-called "winner stimulus", into their escape route (winner-take-all strategy).

However, in some cases, the fish evaluated both stimuli and swam through the middle (averaging strategy). This showed that fish are in principle able to incorporate both threats into their escape route. Usually, however, they pay attention to only one stimulus.
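
As a rough illustration of the two strategies (not the authors' analysis code), the sketch below computes an escape heading under a winner-take-all rule and under an averaging rule, given two hypothetical predator approach angles.

```python
# Schematic sketch (not the authors' code) of the two escape strategies:
# "winner-take-all" flees directly away from a single chosen stimulus, while
# "averaging" flees away from the mean direction of both threats.
import math

def escape_winner_take_all(threat_angles_deg, winner_index=0):
    """Flee 180 degrees away from one selected (winner) threat."""
    return (threat_angles_deg[winner_index] + 180) % 360

def escape_averaging(threat_angles_deg):
    """Flee away from the circular mean of all threat directions."""
    x = sum(math.cos(math.radians(a)) for a in threat_angles_deg)
    y = sum(math.sin(math.radians(a)) for a in threat_angles_deg)
    mean_angle = math.degrees(math.atan2(y, x)) % 360
    return (mean_angle + 180) % 360

# Two simulated predators approaching from the fish's front-left (135 deg) and front-right (45 deg).
threats = [135, 45]
print("winner-take-all escape heading:", escape_winner_take_all(threats, 0))  # 315.0
print("averaging escape heading:", escape_averaging(threats))                 # 270.0 (through the middle)
```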

Two brain regions involved

With the knowledge gained from this behavioral analysis, the researchers investigated which brain regions are active during stimulus selection. In the nearly transparent zebrafish, they identified two brain regions under the microscope: the tectum, the processing hub for visual stimuli, and an appendage of it, the so-called nucleus isthmi (NI).

To determine the role of the NI more precisely, the researchers inactivated neurons in this brain region. Interestingly, in virtual reality experiments, the fish now used predominantly the averaging strategy instead of the winner-take-all strategy - a sign that the NI plays a crucial role in determining a winner stimulus.

By tracking down the cell extensions of the neurons, the scientists decoded the circuit between the two brain regions: Tectal neurons extend to the NI, whose cells, in turn, innervate the tectum. This creates a feedback loop that enhances the signals of winner-stimuli in the brain. All other stimuli classified as unimportant, on the other hand, are suppressed.

With this newly discovered circuit, the brain assigns a specific level of importance to all visual perceptions. As a basis for decision-making, this allows fish to react to important stimuli and ignore unimportant ones. Researchers can now continue to investigate, for example, how experience or stress influences the fish's reaction.

Credit: 
Max-Planck-Gesellschaft

Parkinson's disease risk and severity are tied to a channel in cells' 'recycling centers'

Many genetic mutations have been found to be associated with a person's risk of developing Parkinson's disease. Yet for most of these variants, the mechanism through which they act remains unclear.

Now a new study in Nature led by a team from the University of Pennsylvania has revealed how two different variations--one that increases disease risk and leads to more severe disease in people who develop Parkinson's and another that reduces risk--manifest in the body.

The work, led by Dejian Ren, a professor in the School of Arts & Sciences' Department of Biology, showed that the variation that raises disease risk, which about 17% of people possess, causes a reduction in function of an ion channel in cellular organelles called lysosomes, also known as cells' waste removal and recycling centers. Meanwhile, a different variation that reduces Parkinson's disease risk by about 20% and is present in 7% of the general population enhances the activity of the same ion channel.

"We started with the basic biology, wanting to understand how these lysosomal channels are controlled," says Ren. "But here we found this clear connection with Parkinson's disease. To see that you can have a variation in an ion channel gene that can change the odds of developing Parkinson's both ways--increasing and decreasing it--is highly novel."

The fact that the channel seems to play a crucial role in Parkinson's also makes it an appealing potential target for a drug that could slow the disease's progression, the researchers note.

Scientists have understood since the 1930s that cells use carefully regulated ion channels embedded in their plasma membrane to control crucial aspects of their physiology, such as shuttling electrical impulses between neurons and from neurons to muscles.

But it wasn't until the past decade that researchers began to appreciate that the membrane-bound organelles within cells, including endosomes and lysosomes, also rely on ion channels to communicate.

"One reason is it's hard to look at them because organelles are really small," Ren says. During the last several years, his lab overcame this technical challenge and began studying these membrane channels and measuring the current of ions that crosses through them.

These ions pass through channel proteins that open and close in response to specific factors. About five years ago, Ren's group identified one membrane protein, TMEM175, that forms a channel allowing potassium ions to move in and out.

Around the same time, other teams doing genome-wide association studies found two variations in TMEM175 that influenced Parkinson's disease risk, turning it up or down.

"One variation is associated with a 20-25% increase in the odds of getting Parkinson's in the general population," Ren says. "And if you look only at people who have been diagnosed with Parkinson's, the frequency of that variation is even higher."

Intrigued by the connection, Ren reached out to Penn physician-scientist Alice Chen-Plotkin, who works with patients who have Parkinson's, to collaborate. In data from Parkinson's disease patients, she and colleagues found that motor and cognitive impairments progressed more rapidly in those patients who carried one of the TMEM175 genetic variations Ren was studying.

To find out what this variation was actually doing in cells, Ren's lab turned a close eye to lysosomes. Working with lysosomes in isolation, they found that the potassium current through TMEM175 was activated by growth factors, proteins like insulin that respond to the presence of nutrients in the body. And they confirmed that TMEM175 appeared to be the only active potassium channel in mouse lysosomes.

"When you starve a cell, this protein is not functional anymore," Ren says. "That was exciting to us because that tells us this is a major mechanism that can be used by the organelle to receive communications from the outside of the cell and maybe send communication back out."

They found that a kinase enzyme called AKT, which is typically thought to achieve its ends by adding a small molecule called a phosphate group to whatever protein it is acting upon, joined with TMEM175 to open the protein channel. But AKT opened it without introducing a phosphate group. "The textbook definition of a kinase is that it phosphorylates proteins," Ren says. "To find this kinase acting without doing that was very surprising."

They next turned to mice genetically engineered to carry the same variations that had been found in the human population to see how the genetic changes affected the animals' ion channel activity. Mice with the disease-risk-increasing mutation had a potassium current of just about 50% of that of normal mice, and that current was extinguished in the absence of growth factors. In contrast, the ion channels in mice with the disease-risk-reducing mutation continued operating for several hours in the absence of growth factors, even longer than they did in normal mice.

"This tells you this mutation is somehow helping the mice resist the effects of nutrient depletion," Ren says.

To measure effects on neurons, they turned to cell culture and observed that neurons carrying the variation associated with more severe Parkinson's were more susceptible to damage from toxins and nutrient depletion. "If the same is true in human neurons, that means 17% of the population carries a variation that may make their neurons more damaged when subjected to stressors," says Ren.

Collaborating with Penn researcher Kelvin Luk, the investigators looked at levels of misfolded protein in neurons in cell culture. Known in humans as Lewy bodies and a defining characteristic of Parkinson's, these inclusions increased "strikingly" within neurons when TMEM175 function declined, Ren says. This is likely due to an impairment in the function of lysosomes, which normally help digest and recycle waste generated by the cell.

And, also associated with human Parkinson's, mice lacking TMEM175 lost a portion of the neurons that produce the neurotransmitter dopamine and performed worse on tests of coordination than normal mice.

Together with the findings in humans, the researchers believe their work points to a significant contributor to the pathology of Parkinson's disease. Moving forward, Ren's group hopes to delve deeper into the mechanism through which this ion channel is regulated. Their research may shed light not only on the molecular impairments involved in Parkinson's but also on those in other neurodegenerative diseases, particularly those related to lysosomes, which include a number of rare but very severe conditions.

They'd also like to know, since this predisposing variation is carried by so many people, if it also influences how other genetic mutations contribute to the likelihood someone develops Parkinson's.

Credit: 
University of Pennsylvania

Global analysis suggests COVID-19 is seasonal

image: University of Illinois researchers, including Gustavo Caetano-Anolles, say COVID-19 is likely to become seasonal after the initial pandemic is brought under control.

Image: 
L. Brian Stauffer, University of Illinois

URBANA, Ill. - With cities around the globe locking down yet again amid soaring COVID-19 numbers, could seasonality be partially to blame? New research from the University of Illinois says yes.

In a paper published in Evolutionary Bioinformatics, Illinois researchers show COVID-19 cases and mortality rates, among other epidemiological metrics, are significantly correlated with temperature and latitude across 221 countries.

"One conclusion is that the disease may be seasonal, like the flu. This is very relevant to what we should expect from now on after the vaccine controls these first waves of COVID-19," says Gustavo Caetano-Anollés, professor in the Department of Crop Sciences, affiliate of the Carl R. Woese Institute for Genomic Biology at Illinois, and senior author on the paper.

The seasonal nature of viral diseases is so widespread that it has become part of the English vernacular. For example, we often speak of the "flu season" to describe the higher incidence of influenza during the cold winter months. Early in the pandemic, researchers and public health officials suggested SARS-CoV-2 may behave like other coronaviruses, many of which rear their heads in fall and winter. But data was lacking, especially on the global scale. The work of Caetano-Anollés and his students fills that specific knowledge gap.

First, the researchers downloaded relevant epidemiological data (disease incidence, mortality, recovery cases, active cases, testing rate, and hospitalization) from 221 countries, along with their latitude, longitude, and average temperature. They pulled the data from April 15, 2020, because that date represents the moment in a given year in which seasonal temperature variation is at its maximum across the globe. That date also coincided with a time during the early pandemic when COVID-19 infections were peaking everywhere.

The research team then used statistical methods to test if epidemiological variables were correlated with temperature, latitude, and longitude. The expectation was that warmer countries closer to the equator would be the least affected by the disease.

"Indeed, our worldwide epidemiological analysis showed a statistically significant correlation between temperature and incidence, mortality, recovery cases, and active cases. The same tendency was found with latitude, but not with longitude, as we expected," Caetano-Anollés says.

While temperature and latitude were unmistakably correlated with COVID-19 cases, the researchers are quick to point out climate is only one factor driving seasonal COVID-19 incidence worldwide.

They accounted for other factors by standardizing raw epidemiological data into disease rates per capita and by assigning each country a risk index reflecting public health preparedness and incidence of co-morbidities in the population. The idea was that if the disease was surging in countries with inadequate resources or higher-than-average rates of diabetes, obesity, or old age, the risk index would appear more important in the analysis than temperature. But that wasn't the case. The index did not correlate with the disease metrics at all.
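
The kind of test described above can be sketched in a few lines. The example below is illustrative only, with made-up numbers for a handful of countries rather than the published 221-country dataset; it computes Spearman rank correlations between per-capita case counts and temperature or latitude.

```python
# Hedged sketch of the kind of test described above: rank correlation between
# per-capita epidemiological metrics and temperature or latitude across
# countries. The values below are invented for illustration; the published
# analysis used data from 221 countries and several disease metrics.
from scipy.stats import spearmanr

countries = [
    # (mean_temp_C, abs_latitude_deg, cases_per_million)
    (27.0,  2.0,   350.0),
    (24.0, 15.0,   900.0),
    (15.0, 40.0,  2500.0),
    (9.0,  52.0,  4100.0),
    (4.0,  61.0,  3800.0),
]

temps = [c[0] for c in countries]
lats  = [c[1] for c in countries]
cases = [c[2] for c in countries]

rho_t, p_t = spearmanr(temps, cases)
rho_l, p_l = spearmanr(lats, cases)
print(f"temperature vs cases: rho={rho_t:.2f}, p={p_t:.3f}")
print(f"|latitude|  vs cases: rho={rho_l:.2f}, p={p_l:.3f}")
```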

Earlier work from Caetano-Anollés and his coworkers identified areas in the SARS-CoV-2 virus genome undergoing rapid mutation, some represented in the new virus variant out of Britain, and other genomic regions becoming more stable. Since similar viruses show seasonal upticks in mutation rates, the research team looked for connections between mutational changes in the virus and temperature, latitude, and longitude of the sites from which genomes were sampled worldwide.

"Our results suggest the virus is changing at its own pace, and mutations are affected by factors other than temperature or latitude. We don't know exactly what those factors are, but we can now say seasonal effects are independent of the genetic makeup of the virus," Caetano-Anollés says.

Caetano-Anollés notes more research is needed to explain the role of climate and seasonality in COVID-19 incidences, but he suggests the impact of policy, such as mask mandates, and cultural factors, such as the expectation to look out for others, are key players as well. However, he doesn't discount the importance of understanding seasonality in battling the virus.

The researchers say our own immune systems could be partially responsible for the pattern of seasonality. For example, our immune response to the flu can be influenced by temperature and nutritional status, including vitamin D, a crucial player in our immune defenses. With lower sun exposure during the winter, we don't make enough of that vitamin. But it's too soon to say how seasonality and our immune systems interact in the case of COVID-19.

"We know the flu is seasonal, and that we get a break during the summer. That gives us a chance to build the flu vaccine for the following fall," Caetano-Anollés says. "When we are still in the midst of a raging pandemic, that break is nonexistent. Perhaps learning how to boost our immune system could help combat the disease as we struggle to catch up with the ever-changing coronavirus."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Diving into Devonian seas: Ancient marine faunas unlock secrets of warming oceans

image: Prof. Newton's scholarly work involves studies of modern and ancient biodiversity, including the quantitative dynamics of ancient and modern mass extinction.

Image: 
Syracuse University

Members of Syracuse University's College of Arts and Sciences are shining new light on an enduring mystery--one that is millions of years in the making.

A team of paleontologists led by Professor Cathryn Newton has increased scientists' understanding of whether Devonian marine faunas, whose fossils are lodged in a unit of bedrock in Central New York known as the Hamilton Group, were stable for millions of years before succumbing to waves of extinctions.

Drawing on 15 years of quantitative analysis with fellow professor Jim Brower (who died in 2018), Newton has continued to probe the structure of these ancient fossil communities, among the most renowned on Earth.

The group's findings, reported by the Geological Society of America (GSA), provide critical new evidence for the unusual, long-term stability of these Devonian period communities.

Such persistence, Newton says, is a longstanding scientific enigma. She and her colleagues tested the hypothesis that these ancient communities displayed coordinated stasis--a theory that attempts to explain the emergence and disappearance of species across geologic time.

Newton and Brower, along with their student Willis Newman G'93, found that Devonian marine communities vary more in species composition than the theory predicts. Newton points out that they sought not to disprove coordinated stasis but rather to gain a more sophisticated understanding of when it is applicable. "Discovering more about the dynamics of these apparently stable Devonian communities is critical," she says. "Such knowledge has immediate significance for marine community changes in our rapidly warming seas."

Since geologist James Hall Jr. first published a series of volumes on the region's Devonian fossils and strata in the 1840s, the Hamilton Group has become a magnet for research scientists and amateur collectors alike. Today, Central New York is frequently used to test new ideas about large-scale changes in Earth's organisms and environments.

During Middle Devonian time (approximately 380-390 million years ago), the faunal composition of the region changed little over 4-6 million years. "It's a significant amount for marine invertebrate communities to remain stable, or 'locked,'" explains Newton, a professor in the Department of Earth and Environmental Sciences.

She, Brower and student researchers spent years examining eight communities of animals that once dwelled in a warm, shallow sea on the northern rim of the Appalachian Basin (which, eons ago, lay south of the equator). When the organisms died, sediment from the seafloor began covering their shells and exoskeletons. Minerals from the sediment gradually seeped into their remains, causing them to fossilize. The process also preserved many of them in living position, conserving original shell materials at some sites.

These fossils currently populate exposed bedrock throughout Central New York, ranging from soft, dark, deep-water shale to hard, species-rich, shelf siltstone. "Communities near the top of the bedrock exhibit more taxonomic and ecological diversity than those at the bottom," Newton says. "We can compare the community types and composition through time. They are remarkable sites."

Coordinated stasis has been a source of contention since 1995, when it was introduced. At the center of the dispute are two model-based explanations: environmental tracking and ecological locking.

Environmental tracking suggests that faunas follow their environment. "Here, periods of relative stasis are flanked by coordinated extinctions or regional disappearances. When the environment changes, so do marine faunas," says Newton, also Professor of Interdisciplinary Sciences and Dean Emerita of Arts and Sciences.

Ecological locking, in contrast, views marine faunas as tightly structured communities, resistant to large-scale taxonomic change. Traditionally, this model has been used to describe the stability of lower Hamilton faunas.

Newton and her colleagues analyzed more than 80 sample sites, each containing some 300 specimens. Special emphasis was placed on the Cardiff and Pecksport Members, two rock formations in the Finger Lakes region that are part of the ancient Marcellus subgroup, famed for its natural gas reserves.

"We found that lower Hamilton faunas, with two exceptions, do not have clear counterparts among upper ones. Therefore, our quantitative tests do not support the ecological locking model as an explanation for community stability in these faunas," she continues.

Newton considers this project a final tribute to Newman, a professor of biology at the State University of New York at Cortland, who died in 2014, and Brower, who fell seriously ill while the manuscript was being finalized. "Jim knew that he likely would not live to see its publication," says Newton, adding that Brower died as the paper was submitted to GSA.

She says this new work extends and, in some ways, completes the team's earlier research by further analyzing community structures in the Marcellus subgroup. "It has the potential to change how scientists view long-term stability in ecological communities."

Credit: 
Syracuse University

Ultra-absorptive nanofiber swabs could improve SARS-CoV-2 test sensitivity

image: A new type of nanofiber swab could improve sample collection and test sensitivity for SARS-CoV-2 and other biological specimens; ruler at left shows centimeters.

Image: 
Adapted from Nano Letters 2021, DOI: 10.1021/acs.nanolett.0c04956

Rapid, sensitive diagnosis of COVID-19 is essential for early treatment, contact tracing and reducing viral spread. However, some people infected with SARS-CoV-2 receive false-negative test results, which might put their and others' health at risk. Now, researchers reporting in ACS' Nano Letters have developed ultra-absorptive nanofiber swabs that could reduce the number of false-negative tests by improving sample collection and test sensitivity.

Currently, the most sensitive test for COVID-19 involves using a long swab to collect a specimen from deep inside a patient's nose, and then using a method called reverse transcriptase-polymerase chain reaction (RT-PCR) to detect SARS-CoV-2 RNA. But if the viral load is low, which can occur early in the course of infection, the swab might not pick up enough virus to be detectable. Jingwei Xie and colleagues wanted to develop a nanofiber swab that could absorb and then release more viruses and other biological specimens, improving the sensitivity of diagnostic tests.

The researchers used an electrospinning technique to make 1-cm-long cylinders composed of aligned nanofiber layers, which they coated with a thin layer of gelatin and bonded to plastic swab sticks. In lab tests, the porous nanofiber cylinders absorbed and released more proteins, cells, bacteria, DNA and viruses from liquids and surfaces than the cotton or flocked swabs commonly used for COVID-19 testing. The team made dilutions of SARS-CoV-2 virus, swabbed the liquid samples and tested for viral RNA with RT-PCR. Compared with the two other types of swabs, the nanofiber ones reduced the false-negative rate and detected SARS-CoV-2 at a 10-times lower concentration. In addition to allowing more accurate and sensitive COVID-19 testing, the nanofiber swabs have far-reaching potential in diagnosing other diseases, testing for foodborne illnesses and helping forensic teams identify crime suspects from minuscule biological specimens, the researchers say.
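
As a minimal illustration of how such a comparison might be tabulated (the numbers below are invented, not the paper's data), one can compute a false-negative rate per swab type from replicate test results on samples known to contain virus.

```python
# Small sketch with invented numbers (not the paper's data): tabulating the
# false-negative rate for each swab type from replicate RT-PCR results on
# samples known to contain virus (True = virus detected).

results = {
    "cotton":    [True, False, True, False, False, True],
    "flocked":   [True, True, False, True, False, True],
    "nanofiber": [True, True, True, True, True, False],
}

for swab, detected in results.items():
    false_negatives = detected.count(False)
    rate = false_negatives / len(detected)
    print(f"{swab:>9}: false-negative rate = {rate:.0%}")
```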

Credit: 
American Chemical Society

Ancient proteins help track early milk drinking in Africa

image: Cattle grazing in Entesekara in Kenya near the Tanzanian border

Image: 
A. Janzen

Tracking milk drinking in the ancient past is not straightforward. For decades, archaeologists have tried to reconstruct the practice by various indirect methods. They have looked at ancient rock art to identify scenes of animals being milked and at animal bones to reconstruct kill-off patterns that might reflect the use of animals for dairying. More recently, they even used scientific methods to detect traces of dairy fats on ancient pots. But none of these methods can say if a specific individual consumed milk.

Now archaeological scientists are increasingly using proteomics to study ancient dairying. By extracting tiny bits of preserved proteins from ancient materials, researchers can detect proteins specific to milk, and even specific to the milk of particular species.

Where are these proteins preserved? One critical reservoir is dental calculus - dental plaque that has mineralized and hardened over time. Without toothbrushes, many ancient people couldn't remove plaque from their teeth, and so developed a lot of calculus. This may have led to tooth decay and pain for our ancestors but it also produced a goldmine of information about ancient diets, with plaque often trapping food proteins and preserving them for thousands of years.

Now, an international team led by researchers at the Max Planck Institute for the Science of Human History in Jena, Germany and the National Museums of Kenya (NMK) in Nairobi, Kenya have analyzed some of the most challenging ancient dental calculus to date. Their new study, published in Nature Communications, examines calculus from human remains in Africa, where high temperatures and humidity were thought to interfere with protein preservation.

The team analyzed dental calculus from 41 adult individuals from 13 ancient pastoralist sites excavated in Sudan and Kenya and, remarkably, retrieved milk proteins from 8 of the individuals.

The positive results were greeted with enthusiasm by the team. As lead author Madeleine Bleasdale observes, "some of the proteins were so well preserved, it was possible to determine what species of animal the milk had come from. And some of the dairy proteins were many thousands of years old, pointing to a long history of milk drinking in the continent."

The earliest milk proteins reported in the study were identified at Kadruka 21, a cemetery site in Sudan dating to roughly 6,000 years ago. In the calculus of another individual from the adjacent cemetery of Kadruka 1, dated to roughly 4,000 years ago, researchers were able to identify species-specific proteins and found that the source of the dairy had been goat's milk.

"This the earliest direct evidence to date for the consumption of goat's milk in Africa," says Bleasdale. "It's likely goats and sheep were important sources of milk for early herding communities in more arid environments."

The team also discovered milk proteins in dental calculus from an individual from Lukenya Hill, an early herder site in southern Kenya dated to between 3,600 and 3,200 years ago.

"It seems that animal milk consumption was potentially a key part of what enabled the success and long-term resilience of African pastoralists," observes coauthor Steven Goldstein.

As research on ancient dairying intensifies around the world, Africa remains an exciting place to examine the origins of milk drinking. The unique evolution of lactase persistence in Africa, combined with the fact that animal milk consumption remains critical to many communities across the continent, makes it vital for understanding how genes and culture can evolve together.

Normally, lactase - an enzyme critical for enabling the body to fully digest milk - disappears after childhood, making it much more difficult for adults to drink milk without discomfort. But in some people, lactase production persists into adulthood - in other words these individuals have 'lactase persistence.'

In Europeans, there is one main mutation linked to lactase persistence, but in different populations across Africa, there are as many as four. How did this come to be? The question has fascinated researchers for decades, yet how dairying and human biology co-evolved has remained largely mysterious.

By combining their findings about which ancient individuals drank milk with genetic data obtained from some of the ancient African individuals, the researchers were also able to determine whether early milk drinkers on the continent were lactase persistent. The answer was no. People were consuming dairy products without the genetic adaptation that supports milk drinking into adulthood.

This suggests that drinking milk actually created the conditions that favoured the emergence and spread of lactase persistence in African populations. As senior author and Max Planck Director Nicole Boivin notes, "This is a wonderful example of how human culture has - over thousands of years - reshaped human biology."

But how did people in Africa drink milk without the enzyme needed to digest it? The answer may lie in fermentation. Dairy products like yogurt have a lower lactose content than fresh milk, and so early herders may have processed milk into dairy products that were easier to digest.

Critical to the success of the research was the Max Planck scientists' close partnership with African colleagues, including those at the National Corporation of Antiquities and Museums (NCAM), Sudan, and long-term collaborators at the National Museums of Kenya (NMK). "It's great to get a glimpse of Africa's important place in the history of dairying," observes coauthor Emmanuel Ndiema of the NMK. "And it was wonderful to tap the rich potential of archaeological material excavated decades ago, before these new methods were even invented. It demonstrates the ongoing value and importance of museum collections around the world, including in Africa."

Credit: 
Max Planck Institute of Geoanthropology

How blood stem cells maintain their lifelong potential for self-renewal

A characteristic feature of all stem cells is their ability to self-renew. But how is this potential maintained throughout life? Scientists at the German Cancer Research Center (DKFZ) and the Heidelberg Institute for Stem Cell Technology and Experimental Medicine (HI-STEM) have now discovered in mice that cells in the so-called "stem cell niche" are responsible for this: Blood vessel cells of the niche produce a factor that stimulates blood stem cells and thus maintains their self-renewal capacity. During aging, production of this factor ceases and blood stem cells begin to age.

Throughout life, blood stem cells in the bone marrow ensure that our body is adequately supplied with mature blood cells. If there is no current need for cell replenishment, the blood stem cells remain in a deep sleep to protect themselves from damage to the genome, which can lead to cancer. Blood loss, infections and inflammations act like an alarm clock: immediately, the blood stem cells begin to divide and produce new cells - for example, to provide immune cells to fight viruses or to compensate for a loss of red blood cells or platelets. With each cell division, the stem cells always regenerate themselves as well, so that the stem cell pool is maintained. This is what scientists call self-renewal.

"The dormancy is the prerequisite for this unique ability of stem cells," explains Andreas Trumpp, a stem cell expert at DKFZ and HI-STEM. The almost unlimited self-renewal capacity is considered a key property of the very rare stem cells, which play a central role in the maintenance and repair of tissues and organs. However, cancer cells also possess this ability. They either derive directly from stem cells or acquire this ability through genetic modification. "Without self-renewal, there is no cancer," Trump sums it up.

A team of researchers led by Andreas Trumpp now wanted to find out which molecular signals control the self-renewal ability. In their current analyses, they discovered in mice that dormant blood stem cells carry large amounts of the receptor protein neogenin-1 (Neo-1) on their surface. In contrast, other blood cells do not produce this receptor. Further investigations revealed Neo-1 to be a key molecule for self-renewal: when the researchers genetically switched off the receptor in mice, the stem cells no longer slept, thus losing their ability to self-renew, and the animals' hematopoietic system was exhausted prematurely.

Neo-1 is a receptor that enables the stem cell to receive external signals. But where do these important signals, which are essential for the self-renewal ability, come from? The researchers identified the signal molecule netrin-1 as the binding partner and activator of the Neo-1 receptor. Netrin-1 is produced by the endothelial cells that line the fine blood vessels in bone marrow. "We genetically knocked out netrin-1 in the stem cell niche of mouse bone marrow. The blood stem cells then lost the ability to self-renew. In contrast, when netrin-1 production was experimentally increased, they slept all the more deeply," said Simon Renders, first author of the study.

Scientists refer to the structures in the immediate vicinity of stem cells as a stem cell niche. The niche can consist of cellular and non-cellular components and exerts a major influence on the functions and fate of blood stem cells. Netrin-1-bearing cells of blood capillaries are also part of the niche. "Our results reconfirm the central role of the stem cell niche for stem cell function and thus for the regenerative capacity and health of our body," Trumpp explains.

The age-related depletion of the hematopoietic system could also be traced in the animals: With age, the bone marrow changes its structure, and the tiny blood vessels degenerate. Using older mice, the scientists were able to show that this is accompanied by a loss of netrin-1. The blood stem cells initially try to compensate for this lack of their important signal generator by increasing the formation of Neo-1. However, with increasing age, this compensation is no longer sufficient, and the hematopoietic system increasingly loses its self-renewal capacity. The result of these changes is an increasingly weaker immune system in old age.

Credit: 
German Cancer Research Center (Deutsches Krebsforschungszentrum, DKFZ)

Healthy lifespan analysis using nematodes

image: C-HAS can distinguish between nematodes that are alive, dead, or in an inactive state of survival by using superimposed periodic (before & after) images.

Image: 
Associate Professor Tsuyoshi Shuto

A research group from Kumamoto University (Japan) has developed an automated measurement system to assess healthy lifespans using nematodes (C. elegans). Based on qualitative differences in lifespans, this system can classify populations of nematodes that are, on average, healthy and long-lived, healthy and die prematurely, and living with long periods of poor health. Since there are many similarities between the mechanisms that determine the lifespan of C. elegans and humans, the researchers believe that this system will make it easier to develop drugs and find foods that extend the healthy lifespan of humans.

The concept of "healthy life expectancy" was proposed in the year 2000 by the World Health Organization (WHO) and is an important indicator for the health of a population. It refers to the average life expectancy minus the period of living dependent on continuous medical or nursing care. However, there is no clear scientific understanding of what constitutes a healthy lifespan in experimental animals or cells. Furthermore, the technology to objectively and rapidly analyze the factors that affect healthy life expectancy has not yet been established.

Despite being an extremely simple animal, C. elegans has differentiated organs such as nerves, skeletal muscles, and a digestive tract, and many genes related to those of mammals are conserved. It is very useful for cutting-edge research in fields like genetics and molecular biology. However, while lifespan analysis of this nematode provides a great deal of useful information, previous lifespan studies had many limitations, including 1) sensitivity to various stimuli at room temperature, 2) the long experimental time required for daily measurements, 3) a lack of objectivity due to a tendency for results to be dependent on experimental technique, and 4) the small number of samples that can be processed at one time, making it unsuitable for simultaneous measurement of many samples.

The researchers attempted to resolve these issues by developing a new healthy lifespan assessment system that maintained the advantages provided by nematodes. They focused on determining the optimal conditions for a live cell imaging system that automatically measures nematode survival, such as the number of nematodes in a sample, incubation temperature, medium thickness, feeding conditions, imaging interval, and survival determination method. This became the C. elegans Lifespan Auto-monitoring System (C-LAS), a fully automated lifespan measurement system that can non-invasively measure a large number of samples (currently up to 36 samples). C-LAS uses overlapping images of nematodes to identify those that are moving, meaning they are alive, and those that are not moving, meaning they are dead.

Next, by using C-LAS to observe C. elegans, the researchers found that nematodes can be classified as being in one of three possible behavioral states: an active (alive) state, an inactive survival state, or an inactive (dead) state. They defined the period of active behavior as the "healthy lifespan" and established a new system that they called the C. elegans Healthspan Auto-monitoring System (C-HAS). Similar to C-LAS, C-HAS is an automated health and longevity measurement system that can distinguish between live and dead nematodes by overlapping periodic images. It is also possible to detect when nematodes are in an inactive survival state (alive but unhealthy) when they only partially overlap between images. Using C-HAS, researchers can use these parameters for mini-population analyses. This type of analysis makes it possible to divide nematodes with the same genetic background into four groups: those with an average lifespan, those that are healthy and long-lived, those that are unhealthy and die prematurely, and those that have a long period of frailty.
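
The overlap logic described for C-LAS and C-HAS can be sketched in simplified form. The snippet below is a schematic stand-in, assuming each worm is reduced to a set of pixel coordinates in the "before" and "after" frames; the real system works on microscope images and is considerably more sophisticated.

```python
# Simplified sketch of the overlap logic described for C-HAS (assumption:
# each worm is represented by a set of pixel coordinates in the "before" and
# "after" frames). A worm that has moved entirely is scored active; one that
# has not moved at all is scored dead; partial overlap suggests an inactive
# but surviving (alive but unhealthy) state.

def classify_worm(pixels_before, pixels_after):
    before, after = set(pixels_before), set(pixels_after)
    overlap = len(before & after) / len(before | after)
    if overlap < 0.2:
        return "active (alive)"
    if overlap > 0.9:
        return "inactive (dead)"
    return "inactive survival (alive but unhealthy)"

print(classify_worm({(1, 1), (1, 2), (1, 3)}, {(5, 5), (5, 6), (5, 7)}))  # moved -> active
print(classify_worm({(1, 1), (1, 2), (1, 3)}, {(1, 1), (1, 2), (1, 3)}))  # identical -> dead
print(classify_worm({(1, 1), (1, 2), (1, 3)}, {(1, 2), (1, 3), (1, 4)}))  # partial -> inactive survival
```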

The researchers performed a mini-population analysis of nematode healthy lifespan using a combination of C-HAS and statistical analysis on common nematodes with the same genetic background. They found that about 28% of the population had average lifespans, about 30% had long and healthy lifespans, about 35% had healthy lifespans but died prematurely, and about 7% had a long period of frailty. They also found that activating—either genetically or through administration of the drug metformin—AMP-activated protein kinase (AMPK), which is closely associated with healthy life expectancy, dramatically increased the population with healthy longevity and reduced the population with long periods of frailty. Metformin is thought to increase healthy life expectancy in humans, and the present study supports this idea. Currently, clinical trials are underway to ascertain its association with healthy longevity.

"It might be a little unexpected to see nematodes being used to measure healthy lifespans, but we have already used C-HAS to identify new genes related to healthy lifespan that were previously unknown," said study leader, Associate Professor Tsuyoshi Shuto. "This technology makes it possible to easily search for genes, drugs, or foods that are related to, or even extend, human healthy lifespans with a speed and accuracy that could not be obtained when using laboratory animals. We expect that C-HAS can be used for drug discovery research and in the search for healthy foods in the future. We are currently working on the development of C-HAS-AI, which will incorporate deep learning into C-HAS to boost automated analysis."

Credit: 
Kumamoto University

'You say tomato, I say genomics': Genome sequences for two wild tomato ancestors

image: Two wild ancestors of tomato (SP: Solanum pimpinellifolium, SLC: Solanum lycopersicum var. cerasiforme)

Image: 
University of Tsukuba

Tsukuba, Japan - Tomatoes are one of the most popular types of fresh produce consumed worldwide, as well as being an important ingredient in many manufactured foods.

As with other cultivated crops, some potentially useful genes that were present in its South American ancestors were lost during domestication and breeding of the modern tomato, Solanum lycopersicum var. lycopersicum.

Because of its importance as a crop, the tomato genome sequence was completed and published as long ago as 2012, with later additions and improvements. Now, the team at the University of Tsukuba, in collaboration with TOKITA Seed Co. Ltd, has produced high-quality genome sequences of two wild ancestors of tomato from Peru, Solanum pimpinellifolium and Solanum lycopersicum var. cerasiforme. They recently published the work in DNA Research.

The team used modern DNA sequencing technologies, which can read longer sequences than was previously possible, coupled with advanced bioinformatics tools to analyze the hundreds of gigabytes of data generated and to confirm the high quality of the data. They assembled the many sequence fragments, showed where sequences matched the known genes in the 12 chromosomes found in cultivated tomatoes, and also identified thousands of sequences of new genes that are not found in modern types. Many of these novel DNA sequences are present in only one or other of the ancestral species.

The researchers went on to analyze the transcriptome in the two ancestral tomatoes--those genes where the DNA is transcribed into RNA messages, which supply the instructions for the cell to make proteins--examining 17 different parts of the plants to show which genes were active. Together with comparisons with known genes, this information points to possible functions of the novel genes, for example in fruit development, or conferring disease resistance in the leaves, or salt tolerance in the roots.

"The new genome sequences for these ancestral tomatoes will be valuable for studying the evolution of this group of species and how the genetics changed during domestication", says corresponding author Professor Tohru Ariizumi. "In addition, the wild relatives contain thousands of genes not found in modern cultivated tomatoes. With this new information, researchers will be able to locate novel and useful genes that can be bred into tomatoes, and potentially other crops too. This will help plant breeders develop improved future types of tomato with features like better resistance to diseases, increased tolerance for the changing climate, and improved taste and shelf-life."

Credit: 
University of Tsukuba

Pace of prehistoric human innovation could be revealed by 'linguistic thermometer'

image: Global 'hot' and 'cold' spots

Image: 
Image from Martin Sanchez on Unsplash https://unsplash.com/

Multi-disciplinary researchers at The University of Manchester have helped develop a powerful physics-based tool to map the pace of language development and human innovation over thousands of years - even stretching into pre-history before records were kept.

Tobias Galla, a professor in theoretical physics, and Dr Ricardo Bermúdez-Otero, a specialist in historical linguistics, from The University of Manchester, have come together as part of an international team to share their diverse expertise to develop the new model, revealed in a paper entitled 'Geospatial distributions reflect temperatures of linguistic features' authored by Henri Kauhanen, Deepthi Gopal, Tobias Galla and Ricardo Bermúdez-Otero, and published by the journal Science Advances.

Professor Galla has applied statistical physics - usually used to map atoms or nanoparticles - to help build a mathematically-based model that responds to the evolutionary dynamics of language. Essentially, the forces that drive language change can operate across thousands of years and leave a measurable "geospatial signature", determining how languages of different types are distributed over the surface of the Earth.

Dr Bermúdez-Otero explained: "In our model each language has a collection of properties or features and some of those features are what we describe as 'hot' or 'cold'.

"So, if a language puts the object before the verb, then it is relatively likely to get stuck with that order for a long period of time - so that's a 'cold' feature. In contrast, markers like the English article 'the' come and go a lot faster: they may be here in one historical period, and be gone in the next. In that sense, definite articles are 'hot' features.

"The striking thing is that languages with 'cold' properties tend to form big clumps, whereas languages with 'hot' properties tend to be more scattered geographically."

This method therefore works like a thermometer, enabling researchers to retrospectively tell whether one linguistic property is more prone to change in historical time than another. This modelling could also provide a similar benchmark for the pace of change in other social behaviours or practices over time and space.
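
The intuition that "cold" features clump while "hot" features scatter can be reproduced with a toy simulation. The sketch below is not the published model: languages sit on a ring, occasionally copy a binary feature from a neighbour, and occasionally flip it at random; a low flip rate leaves large same-valued clumps, while a high flip rate leaves a scattered pattern.

```python
# Toy simulation (not the published model) of the intuition above: languages
# sit on a ring, occasionally copy a binary feature value from a neighbour,
# and occasionally flip it at random. A low flip rate ("cold" feature) yields
# large same-valued clumps; a high flip rate ("hot" feature) stays scattered.
import random

def simulate(flip_rate, n=200, steps=20000, seed=1):
    rng = random.Random(seed)
    values = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        i = rng.randrange(n)
        if rng.random() < flip_rate:
            values[i] = 1 - values[i]                            # spontaneous change
        else:
            values[i] = values[(i + rng.choice([-1, 1])) % n]    # copy a neighbour
    # Count boundaries between unlike neighbours: fewer boundaries = bigger clumps.
    return sum(values[i] != values[(i + 1) % n] for i in range(n))

print("boundaries, cold feature (flip rate 0.01):", simulate(0.01))
print("boundaries, hot feature  (flip rate 0.50):", simulate(0.50))
```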

"For example, suppose that you have a map showing the spatial distribution of some variable cultural practice for which you don't have any historical records - this could be be anything, like different rules on marriage or on the inheritance of possessions," added Dr Bermúdez-Otero.

"Our method could, in principle, be used to ascertain whether one practice changes in the course of historical time faster than another, ie whether people are more innovative in one area than in another, just by looking at how the present-day variation is distributed in space."

The source data for the linguistic modelling comes from present-day languages, and the team relied on The World Atlas of Language Structures (WALS), which records information on 2,676 contemporary languages.

Professor Galla explained: "We were interested in emergent phenomena, such as how large-scale effects, for example patterns in the distribution of language features, arise from relatively simple interactions. This is a common theme in complex systems research.

"I was able to help with my expertise in the mathematical tools we used to analyse the language model and in simulation techniques. I also contributed to setting up the model in the first place, and by asking questions that a linguist would perhaps not ask in the same way."

Credit: 
University of Manchester

Secrets of traumatic stress hidden in the brain are exposed

Image: Maps showing the location and extent of the regions of interest used in this study on the effects of an allostatic closed-loop neurotechnology (HIRREM) on brain functional connectivity laterality in military-related traumatic stress. Montreal Neurological Institute coordinates are listed below each slice. (© Cereset and Wake Forest School of Medicine)

(Scottsdale, Ariz. - January 27, 2021) HIRREM (the legacy research technology of Cereset, a Brain State Company) was used by the Wake Forest School of Medicine to study symptoms of traumatic stress in military personnel before and after the Cereset (legacy) intervention.

Whole-brain, resting-state magnetic resonance imaging (MRI) was performed before and after the Cereset intervention. Significant effects on brain network connectivity have been reported previously.

For the current study, lateralization of brain connectivity was analyzed. Lateralization here refers to how brain connections are distributed: within the right side, within the left side, or across to the opposite side. This is important because corresponding lobes in the right and left hemispheres of the brain execute different functions. Of note for these results, the parasympathetic division of the autonomic nervous system (which we think of as the "brake to rest and digest") is managed primarily on the left, and the sympathetic division (which we think of as the "gas to go") is driven primarily on the right. Thus a dominant left is an indicator of a "freeze or numbing" response, whereas a dominant right is an indicator of a "fight or flight" response.
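
The release does not specify the exact connectivity measure used, but a common way to summarise this kind of left-right balance is a laterality index, (L - R) / (L + R), computed from a connectivity strength on each side; the sketch below uses hypothetical numbers purely for illustration.

```python
# Illustrative only: the exact metric used in the study is not given here.
# A common summary of left-right balance is the laterality index
# LI = (L - R) / (L + R), where L and R are connectivity strengths on each side.
# LI ranges from -1 (fully right-dominant) to +1 (fully left-dominant).
def laterality_index(left_strength: float, right_strength: float) -> float:
    """Return (L - R) / (L + R), or 0.0 if both strengths are zero."""
    total = left_strength + right_strength
    return 0.0 if total == 0 else (left_strength - right_strength) / total

# Hypothetical values, not data from the study:
print(laterality_index(0.62, 0.48))  # ~ +0.13 -> mildly left-dominant ("brake")
print(laterality_index(0.40, 0.71))  # ~ -0.28 -> right-dominant ("gas to go")
```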

Asymmetries were identified at baseline, along with changes in lateralization patterns after Cereset. Significant correlations were seen between whole-brain lateralization and symptoms of posttraumatic stress. These findings help to explain the effects of traumatic stress and how best to treat it.

The brain is the organ of central command which activates and manages autonomic responses to trauma and stress. Results of this study support the model of bihemispheric autonomic management of traumatic stress (BHAM), proposed by Brain State and Wake Forest 6 years ago. Regulated connections of the brain appear to be the upstream driver of health and wellbeing. When connections between hemispheres are more balanced, individuals are healthier.

Of note is the basis for this shift of the brain towards a healthier pattern. Cereset merely reflects the brain's own electrical patterns, using engineered tones to represent dominant brain frequencies. The client receiving Cereset is comfortably seated in a reclining chair to listen to the brain-generated tones, or BrainEcho®. This supports the brain in relaxing, changing and resetting itself to improve balance and symmetry, which is associated with reduced symptoms of traumatic stress.

Many prior clinical studies have demonstrated benefits associated with the use of Cereset for symptoms of insomnia, depression, stress, anxiety, concussion, hot flashes, postural orthostatic tachycardia, and migraine. The current and previous clinical trials also included objective outcome measures such as heart rate variability to demonstrate significant improvement in autonomic function. Symptom benefits were durable for up to 6 months in the current study, and autonomic improvement was durable through the final follow-up visit at 4 months in a placebo-controlled trial for insomnia.

The addition of the MRI lateralization findings reported in this study and the correlation with symptoms not only support the BHAM model but also point to a mechanism of effect for the clinical benefits previously reported with Cereset. Symptoms associated with trauma, stress, and allostatic load now affect a majority of the population. Cereset noninvasively supports the brain to balance and regulate itself with resulting benefits for psychological symptoms of traumatic stress and improved autonomic function.

Credit: 
Cereset

Soil health is as environmentally important as air and water quality, say microbiologists

There are an estimated 40,000 to 50,000 species of micro-organism per gram of soil. Addition of certain microbes can tailor soil characteristics: removing contaminants, improving fertility and even making barren land available for farming.

The Microbiology Society's report calls for increased access to research into soil health, promoting outreach activities in agricultural colleges and schools and showcasing work in non-academic outlets. This, say microbiologists, is the best way to collaborate with farmers to improve soil health and agricultural productivity.

Tilling and excessive use of fertilisers have major effects on soil health. Microbiology can be used to help understand the impact of intensive farming and design feasible mitigation practices.

The report highlights collaboration with farmers as key for improving soil health, and sustainable soil management practices should be designed with agricultural requirements and practices in mind. Sustainable soil management should be incentivised, the report says, and research outcomes should be affordable and ready for use on farms.

The UK is estimated to be 30 to 40 years away from "fundamental eradication of soil fertility", and the UN has warned that if current degradation rates are not reversed there may be fewer than 60 harvests left in the world's soil.

The EU has raised soil health as one of its top five priorities, and many global initiatives are emerging in the area of soil protection. The UK should take advantage of this increased profile to consolidate active communities working together to improve the uptake and development of new sustainable land management practices.

Credit: 
Microbiology Society

Children cannot ignore what they hear when detecting emotions

Children determine emotion by what they hear, rather than what they see, according to new research.

The first-of-its-kind study, by Durham University's Department of Psychology, looked at how children pick up on the emotions of a situation.

They found that whilst adults prioritised what they see, young children showed an auditory dominance and overwhelmingly prioritised what they could hear.

The researchers say their findings could benefit parents currently managing home learning and professional educators by increasing their understanding of how young children pick up on what is going on around them.

The research may also provide new avenues to understanding emotional recognition in children with developmental challenges such as autism.

The findings are published in the Journal of Experimental Child Psychology.

Lead author Dr Paddy Ross, in Durham University's Department of Psychology, said: "Our study found that young children over-rely on what they hear to make judgements about the emotions of a situation. With so many children spending much more time at home currently, there is huge value in considering what they may hear and pick up on.

"There could also be applications for how to make online learning more effective as well as our understanding of how children with challenges such as autism may detect and understand emotions."

The research was designed to test whether the previously identified 'Colavita effect', which showed that from around the age of eight humans tend to respond more to visual than to auditory stimuli, held true for more complex situations such as emotional recognition in young children.

The team undertook two experiments with volunteers in three age categories (seven and under, eight to 11, and 18+).

The volunteers were shown pictures of humans with blurred faces as the visual stimuli, and heard human voices as the auditory stimuli, conveying happy, fearful, sad and angry emotions.

The stimuli were presented both on their own and in corresponding and contrasting combinations, and participants were asked what the overriding emotion was in each case.

The team found that when the visual and auditory stimuli were combined, adults based their emotional assessment on what they could see whereas young children overwhelmingly gave precedence to what they could hear.

All age groups scored over 90 per cent when presented with visual and auditory stimuli in isolation. A similar score was recorded when the stimuli were combined, and participants were asked to ignore the visual stimuli and identify the emotion from the voice.

However, when younger and older children were asked to ignore the voice and base their judgement on the body stimuli, the team found that they performed significantly worse than adults when presented with a combination where the emotions displayed in the visual and auditory stimuli did not match.

Children also scored significantly below chance level, indicating that they were not merely guessing but were selecting the spoken emotion rather than the visual one, despite being told to ignore it.
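
The release does not report trial counts or the exact chance level, but the sketch below illustrates, with hypothetical numbers, the kind of one-sided binomial test that separates genuinely below-chance responding (systematically choosing the heard emotion) from mere guessing.

```python
# Hypothetical numbers for illustration - the trial count, chance level and score
# below are NOT figures from the Durham study.
from scipy.stats import binomtest

n_trials = 40       # assumed number of incongruent (mismatching) trials
chance_p = 0.25     # assumed chance level with four emotion options
n_correct = 4       # assumed score, well below the 10 expected by guessing

result = binomtest(n_correct, n_trials, p=chance_p, alternative="less")
print(f"p = {result.pvalue:.4f}")
# A small p-value means responses were systematically wrong - consistent with
# children reporting the emotion they heard rather than the visual one they
# were asked to judge.
```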

Dr Ross now plans to undertake further research to investigate whether young children still rely on what they can hear when human facial expressions are present, and when human voices are replaced with music conveying similar emotions.

Credit: 
Durham University