Culture

People can see beauty in complex mathematics, study shows

Ordinary people can see beauty in complex mathematical arguments in the same way they appreciate a beautiful landscape painting or a piano sonata - no mathematical training required, a new study by Yale University and the University of Bath has revealed.

The study, published in the journal Cognition, showed that people even agreed on what made such abstract mathematical arguments beautiful. The findings may have implications for teaching schoolchildren, who may not be entirely convinced that there is beauty in mathematics.

The similarities between mathematics and music have long been noted, but the study's co-authors, Yale mathematician Stefan Steinerberger and University of Bath psychologist Dr. Samuel G.B. Johnson, wanted to add art to the mix to see if there was something universal at play in the way people judge aesthetics and beauty - be it in art, music or abstract mathematics.

The research was sparked when Steinerberger, while teaching his students, likened a mathematical proof to a 'really good Schubert sonata' - but couldn't put his finger on why. He approached Johnson, assistant professor of marketing at the University of Bath School of Management, who was completing his Ph.D. in psychology at Yale.

Johnson designed an experiment to test the question of whether people share the same aesthetic sensibilities about maths as they do about art or music - and whether this would hold true for an average person, not just a career mathematician.

For the study, they chose four mathematical proofs, four landscape paintings, and four classical piano pieces. None of the participants was a mathematician.

The mathematical proofs used were: the sum of an infinite geometric series, Gauss's summation trick for positive integers, the Pigeonhole principle, and a geometric proof of a Faulhaber formula. A mathematical proof is an argument which convinces people something is true.
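Two of these arguments are compact enough to state here (standard results, quoted for illustration, not taken from the study materials). Gauss's trick pairs the first and last terms of the sum $1 + 2 + \cdots + n$, giving $n/2$ pairs that each add up to $n + 1$:

$$1 + 2 + \cdots + n = \frac{n(n+1)}{2}.$$

Similarly, an infinite geometric series with first term $a$ and ratio $|r| < 1$ sums to

$$a + ar + ar^2 + \cdots = \frac{a}{1-r}.$$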

The piano pieces were Schubert's Moment Musical No. 4, D 780 (Op. 94), Bach's Fugue from Toccata in E Minor (BWV 914), Beethoven's Diabelli Variations (Op. 120) and Shostakovich's Prelude in D-flat major (Op.87 No. 15).

The landscape paintings were Looking Down Yosemite Valley, California by Albert Bierstadt; A Storm in the Rocky Mountains, Mt. Rosalie by Albert Bierstadt; The Hay Wain by John Constable; and The Heart of the Andes by Frederic Edwin Church.

Johnson divided the study into three parts.

The first task required a sample of individuals to match the four maths proofs to the four landscape paintings based on how aesthetically similar they found them. The second task required a different group of people to compare the four maths proofs to the four piano pieces.

Finally, the third task asked another sample group to rate each of the works of art and mathematical arguments on nine criteria - seriousness, universality, profundity, novelty, clarity, simplicity, elegance, intricacy, and sophistication.

Participants in the third group agreed with each other about how elegant, profound, clear, etc., each of the mathematical arguments and paintings was.

But Steinerberger and Johnson were most impressed that these ratings could be used to predict how similar participants in the first group judged each argument and painting to be. This finding suggests that perceived correspondences between maths and art really do stem from their underlying beauty.

Overall, the results showed there was considerable consensus in comparing mathematical arguments to artworks. And there was some consensus in judging the similarity of classical piano music and mathematics.

"Laypeople not only had similar intuitions about the beauty of math as they did about the beauty of art but also had similar intuitions about beauty as each other. In other words, there was consensus about what makes something beautiful, regardless of modality," Johnson said.

However, it was not clear whether the results would be the same with different music.

"I'd like to see our study done again but with different pieces of music, different proofs, different artwork," said Steinerberger. "We demonstrated this phenomenon, but we don't know the limits of it. Where does it stop existing? Does it have to be classical music? Do the paintings have to be of the natural world, which is highly aesthetic?"

Both Steinerberger and Johnson believe the research may have implications for maths education, especially at the secondary-school level.

"There might be opportunities to make the more abstract, more formal aspects of mathematics more accessible and more exciting to students at that age," said Johnson, "And that might be useful in terms of encouraging more people to enter the field of mathematics."

Credit: 
University of Bath

Study shows how serotonin and a popular anti-depressant affect the gut's microbiota

image: Senior author Elaine Hsiao says researchers hope to build on their current study to learn whether microbial interactions with antidepressants have consequences for health and disease.

Image: 
Reed Hutchinson/UCLA

A new study in mice led by UCLA biologists strongly suggests that serotonin and drugs that target serotonin, such as anti-depressants, can have a major effect on the gut's microbiota -- the 100 trillion or so bacteria and other microbes that live in the human body's intestines.

Serotonin -- a neurotransmitter, or chemical messenger that sends messages among cells -- serves many functions in the human body, including playing a role in emotions and happiness. An estimated 90% of the body's serotonin is produced in the gut, where it influences gut immunity.

The team -- led by senior author Elaine Hsiao and lead author Thomas Fung, a postdoctoral fellow -- identified a specific gut bacterium, Turicibacter sanguinis, that can detect and transport serotonin into bacterial cells; little else is known about the microbe. When mice were given the antidepressant fluoxetine, or Prozac, the biologists found that this transport of serotonin into the bacterial cells was reduced. The study is published this week in the journal Nature Microbiology.

"Our previous work showed that particular gut bacteria help the gut produce serotonin. In this study, we were interested in finding out why they might do so," said Hsiao, UCLA assistant professor of integrative biology and physiology, and of microbiology, immunology and molecular genetics in the UCLA College; and of digestive diseases in the David Geffen School of Medicine at UCLA.

Hsiao and her research group reported in the journal Cell in 2015 that in mice, a specific mixture of bacteria, consisting mainly of Turicibacter sanguinis and Clostridia, produces molecules that signal to gut cells to increase production of serotonin. When Hsiao's team raised mice without the bacteria, more than 50% of their gut serotonin was missing. The researchers then added the bacteria mixture of mainly Turicibacter and Clostridia, and their serotonin increased to a normal level.

That study got the team wondering why bacteria signal to our gut cells to make serotonin. Do microbes use serotonin, and if so, for what?

In this new study, the researchers added serotonin to the drinking water of some mice and raised others with a mutation (created by altering a specific serotonin transporter gene) that increased the levels of serotonin in their guts. After studying the microbiota of the mice, the researchers discovered that the bacteria Turicibacter and Clostridia increased significantly when there was more serotonin in the gut.

If these bacteria increase in the presence of serotonin, perhaps they have some cellular machinery to detect serotonin, the researchers speculated. Together with study co-author Lucy Forrest and her team at the National Institutes of Health's National Institute of Neurological Disorders and Stroke, the researchers found a protein in multiple species of Turicibacter that has some structural similarity to a protein that transports serotonin in mammals. When they grew Turicibacter sanguinis in the lab, they found that the bacterium imports serotonin into the cell.

In another experiment, the researchers added the antidepressant fluoxetine, which normally blocks the mammalian serotonin transporter, to a tube containing Turicibacter sanguinis. They found the bacterium transported significantly less serotonin.

The team found that exposing Turicibacter sanguinis to serotonin or fluoxetine influenced how well the bacterium could thrive in the gastrointestinal tract. In the presence of serotonin, the bacterium grew to high levels in mice, but when exposed to fluoxetine, the bacterium grew to only low levels in mice.

"Previous studies from our lab and others showed that specific bacteria promote serotonin levels in the gut," Fung said. "Our new study tells us that certain gut bacteria can respond to serotonin and drugs that influence serotonin, like anti-depressants. This is a unique form of communication between bacteria and our own cells through molecules traditionally recognized as neurotransmitters."

The team's research on Turicibacter aligns with a growing number of studies reporting that anti-depressants can alter the gut microbiota. "For the future," Hsiao said, "we want to learn whether microbial interactions with antidepressants have consequences for health and disease." Hsiao wrote a blog post for the journal about the new research.

Credit: 
University of California - Los Angeles

Mathematical model provides new support for environmental taxes

A new mathematical model provides support for environmental taxation, such as carbon taxes, as an effective strategy to promote environmentally friendly practices without slowing economic growth. Xinghua Fan and colleagues at Jiangsu University, China, publish their model and findings in the open-access journal PLOS ONE on September 4, 2019.

A worldwide "green development" movement calls for reducing pollution and increasing resource utilization efficiency without hindering economic expansion. Many governments have proposed or imposed environmental taxes, such as taxes on carbon emissions, to promote environmentally friendly economic practices. However, few studies have rigorously quantified the effects of environmental taxes on the interconnected factors involved in green development.

To help clarify the impact of environmental taxation, Fan and colleagues developed and validated a mathematical model that reflects the closely integrated relationships between environmental taxes, economic growth, pollution emissions, and utilization of resources, such as water and fossil fuels. Then they applied the model to real-world data in order to analyze the effects of environmental taxes on green development in China.

The analysis suggests that environmental taxes can indeed help to stimulate economic growth, decrease emissions, and improve resource utilization. The researchers explored several different scenarios, finding that the beneficial effects of an environmental tax are enhanced by advanced technology, elevated consumer awareness, and--especially--firm government control.

The authors suggest that their model could be applied to explore the effects of environmental taxes in other countries beyond China. Researchers may also seek to modify the model for application to different industries or economic sectors, as opposed to countries or regions. The model could potentially be improved by identification and incorporation of more sophisticated mathematical relationships between the various green development factors.

Credit: 
PLOS

Ancient animal species: Fossils dating back 550 million years among first animal trails

image: A fossilized trail of the animal Yilingia spiciformis, dating back 550 million years. The trail was found in China by a team of scientists including Shuhai Xiao of the Virginia Tech College of Science.

Image: 
Virginia Tech College of Science

In a remarkable evolutionary discovery, a team of scientists co-led by a Virginia Tech geoscientist has discovered what could be among the first trails made by animals on the surface of the Earth roughly a half-billion years ago.

Shuhai Xiao, a professor of geosciences with the Virginia Tech College of Science, calls the unearthed fossils, including the bodies and trails left by an ancient animal species, the most convincing sign of ancient animal mobility, dating back about 550 million years. Named Yilingia spiciformis - which translates to spiky Yiling bug, Yiling being the Chinese city near the discovery site - the animal was found in multiple layers of rock by Xiao and Zhe Chen, Chuanming Zhou, and Xunlai Yuan from the Chinese Academy of Sciences' Nanjing Institute of Geology and Palaeontology.

The findings are published in the latest issue of Nature. The trails are from the same rock unit and are roughly the same age as bug-like footprints found by Xiao and his team in a series of digs from 2013 to 2018 in the Yangtze Gorges area of southern China, and date back to the Ediacaran Period, well before the age of dinosaurs or even the Pangea supercontinent. What sets this find apart: the preserved fossil of the animal that made the trail, rather than the guesswork left when the body has not been preserved.

"This discovery shows that segmented and mobile animals evolved by 550 million years ago," Xiao said. "Mobility made it possible for animals to make an unmistakable footprint on Earth, both literally and metaphorically. Those are the kind of features you find in a group of animals called bilaterans. This group includes us humans and most animals. Animals and particularly humans are movers and shakers on Earth. Their ability to shape the face of the planet is ultimately tied to the origin of animal motility."

The animal was a millipede-like creature a quarter-inch to an inch wide and up to 4 inches long that alternately dragged its body across the muddy ocean floor and rested along the way, leaving trails as long as 23 inches. The animal was an elongated, narrow creature, with 50 or so body segments, a left and right side, a back and belly, and a head and a tail.

The origin of bilaterally symmetric animals - known as bilaterians - with segmented bodies and directional mobility is a monumental event in early animal evolution, and is estimated to have occurred during the Ediacaran Period, between 635 and 539 million years ago. But until this finding by Xiao and his team, there was no convincing fossil evidence to substantiate those estimates. One of the recovered specimens is particularly vital because the animal and the trail it produced just before its death are preserved together.

Remarkably, the find also marks what may be the first sign of decision making among animals - the trails suggest an effort to move toward or away from something, perhaps under the direction of a sophisticated central nervous system, Xiao said. The mobility of animals led to environmental and ecological impacts on the Earth surface system and ultimately led to the Cambrian substrate and agronomic revolutions, he said.

"We are the most impactful animal on Earth," added Xiao, also an affiliated member of the Global Change Center at Virginia Tech. "We make a huge footprint, not only from locomotion, but in many other and more impactful activities related to our ability to move. When and how animal locomotion evolved defines an important geological and evolutionary context of anthropogenic impact on the surface of the Earth."

Rachel Wood, a professor in the School of GeoSciences at University of Edinburgh in Scotland, who was not involved with the study, said, "This is a remarkable finding of highly significant fossils. We now have evidence that segmented animals were present and had gained an ability to move across the sea floor before the Cambrian, and more notably we can tie the actual trace-maker to the trace. Such preservation is unusual and provides considerable insight into a major step in the evolution of animals."

Credit: 
Virginia Tech

Space dragons: Researchers observe energy consumption in quasars

image: Like a starving dragon, the supermassive black hole at the center of a quasar gobbles up material with an endless appetite. This material glows brightly as it gathers into an accretion disk before finally sliding down into the black hole. Outside the accretion disk, material is pumped in from all directions toward the center to feed the black hole. This material is described as inflows.

Image: 
Image by CUI Jie, University of Science and Technology of China

Quasars are the Universe's brightest beacons, shining with magnitudes more luminosity than entire galaxies and the stars they contain. At the center of this light, at the heart of the quasar, researchers think, is an all-consuming black hole.

Researchers, for the first time, have observed the accelerated rate at which eight quasars consume interstellar fuel to feed their black holes.

They published their results on Sept. 4 in Nature.

"As the most luminous steady beacons in the Universe, quasars are believed to be powered by an accretion disk around the central black hole," said Hongyan Zhou, paper author and faculty member at the University of Science and Technology of China. Zhou is also affiliated with the SOA Key Laboratory for Polar Science in the Polar Research Institute of China.

Zhou compared the black hole to a starved dragon.

"The supermassive black hole in the center of the quasar gobbles up an enormous amount of nearby materials, which glare and shine when they constitute an accretion disk before finally sliding down in the black hole," Zhou said. "Outside the accretion disk, materials are continuously pumped from all directions to the center by gravity to feed the black hole with an endless appetite."

An accretion disk is a spiraling mass of material centered on a monumental source of gravity that consumes interstellar material--what researchers have theorized is a black hole. Much as with water emptying out of a bathtub, the material spins faster the closer it gets to the drain.

"We think this paradigm of black holes at the center of quasars is accurate, but fundamental questions remain unanswered: Is the accretion disk fueled with external mass? If so, how?" Zhou said.

The interstellar gas cannot be observed directly, as its radiation signature is overwhelmed by the accretion disk's brightness. Instead, researchers watch for gas falling into the accretion disk that happens to pass through their line of sight. The gas forms a kind of eclipse between Earth and the accretion disk, imprinting absorption lines on the disk's radiation spectrum.

The researchers used the Doppler effect to measure these lines and determine the velocity of gas feeding into the disk, toward the black hole. A classic Doppler effect example is how the pitch of a police siren drops once it passes. Astronomers call the analogous shift toward longer wavelengths a "redshift"; it reveals how quickly gas is receding from Earth as it falls toward the black hole.
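In the non-relativistic limit, the fractional wavelength shift of an absorption line translates directly into a line-of-sight velocity (a textbook relation, not specific to this study):

$$\frac{\Delta\lambda}{\lambda} = \frac{v}{c},$$

so a shift of $\Delta\lambda/\lambda \approx 0.017$ corresponds to $v \approx 5{,}000$ km/s, the infall speed reported below.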

Zhou and his team measured velocities of 5,000 kilometers per second. For comparison, a passenger jet travels at less than a thousand kilometers per hour.

"Such a high velocity can only be accelerated by the strong gravity of the central black hole," Zhou said. "It's comparable to how, in a meteor shower, the closer the meteors get to the ground, the faster they fall."

In the quasars Zhou observed, the accretion disks were supplied with fast-falling external mass from surrounding space. The disks themselves then create inflows to the black hole.

Next, Zhou and his team plan to investigate exactly how these quasar "dragons" organize and differentiate the external mass from accretion disks to fuel inflows. According to Zhou, elucidation of this process could better inform the understanding of how quasars form, how long they last and when and how they end.

Credit: 
University of Science and Technology of China

Death march of segmented animal unravels critical evolutionary puzzle

video: This is a reconstruction video of Y. spiciformis and its traces.

Image: 
NIGPAS

The death march of a segmented bilaterian animal unearthed from ~550-million-year-old rocks in China shows that the oldest mobile and segmented animals evolved by the Ediacaran Period (635-539 million years ago). The research was published in Nature on Sept. 4 by an international research team from China and the U.S.

The origin of bilaterally symmetric animals (or bilaterians) with a segmented body plan is a monumental event in early animal evolution. Although scientists have estimated, on the basis of molecular clock analyses, that mobile and segmented bilaterians evolved in the Ediacaran Period, no convincing fossil evidence had been found to substantiate these estimates.

Recently, however, an international research team from the Nanjing Institute of Geology and Palaeontology of the Chinese Academy of Sciences and Virginia Tech in the United States reported the discovery of a segmented bilaterian fossil preserved in ~550-million-year-old rocks in China.

What makes this fossil unusual is that it is directly connected with a trail it made. The new fossil indicates that mobile and segmented animals evolved by the terminal Ediacaran Period. It also helps scientists understand the producers of Ediacaran trace fossils, since such fossils are often not preserved with their trace makers.

The new fossil species, found in the terminal Ediacaran Shibantan Member of the Dengying Formation in the Yangtze Gorges area, is Yilingia spiciformis. It is an elongate and segmented bilaterian with repetitive and trilobate body units, which show anteroposterior and dorsoventral differentiation.

One specimen is directly connected with the trace it produced immediately before death, thus allowing the authors to interpret other similar trace fossils that are preserved in the same unit but are not directly connected with the animals that produced them.

The direct association with trace fossils suggests that Yilingia spiciformis was a mobile animal capable of locomotion. Evidence of body segmentation, polarity, and directional locomotion indicates that Yilingia spiciformis was a bilaterian animal. However, it is difficult to determine its exact phylogenetic position within the bilaterian family tree. The authors surmise that Yilingia spiciformis could be related to arthropods or annelids.

As one of the few Ediacaran animals demonstrably capable of producing long and continuous trails, Yilingia spiciformis sheds new light on the origin of segmentation and its possible relationship with animal motility. The emergence of motile animals had a profound environmental and ecological impact on Earth surface systems and ultimately led to the Cambrian substrate and agronomic revolutions.

Credit: 
Chinese Academy of Sciences Headquarters

'Information gerrymandering' poses a threat to democratic decision making, both online and off

image: Information gerrymandering can change the way we think about political decisions, as depicted in this image of a gerrymandered mind. People must integrate disparate sources of information when deciding how to vote. But information does not always flow freely -- it can be constrained by social networks and distorted by zealots and automated bots. Researchers showed that certain structures in a social network can sway the voting outcome toward one party, even when both parties are of equal size and each player has the same influence -- a phenomenon they called 'information gerrymandering'.

Image: 
Alexander Stewart

Electoral gerrymandering, in which political districts are drawn to favor one party, has attracted renewed attention of late. The centuries-old practice operates to bias the outcome of elections.

Now researchers led by Penn biologist Joshua B. Plotkin and the University of Houston's Alexander J. Stewart have identified another impediment to democratic decision making, one that may be particularly relevant in online communities.

In what the scientists have termed "information gerrymandering," it's not geographical boundaries that confer a bias but the structure of social networks, such as social media connections.

Reporting in the journal Nature, the researchers first predicted the phenomenon from a mathematical model of collective decision making, and then confirmed its effects by conducting social network experiments with thousands of human subjects. Finally, they analyzed a variety of real-world networks and found examples of information gerrymandering present on Twitter, in the blogosphere, and in U.S. and European legislatures.

"People come to form opinions, or decide how to vote, based on what they read and who they interact with," says Plotkin. "And in today's world we do a lot of sharing and reading online. What we found is that the information gerrymandering can induce a strong bias in the outcome of collective decisions, even in the absence of 'fake news.'

"This tells us that we need to be cautious about relying on social media for communication because the network structure is not under our control and yet it can distort our collective decisions."

The researchers' analysis revealed that information gerrymandering could easily produce biases of 20%. In other words, a group that was evenly split into two parties could nonetheless arrive at a 60-40 decision due solely to information gerrymandering.

"The idea is akin to electoral gerrymandering, where one party can gain an advantage not by sheer number but deciding who votes in which district," Plotkin says.

The question of whether this influence could lead to biased outcomes was one that felt particularly salient to Plotkin, given concerns about how the flow of information has been changed by social media.

"Right now, we need research about the effects of social media on the health of liberal democracies," he says.

To begin, the researchers built a simple game in which players were assigned to competing groups, or parties. Placed on a network that determined whose voting intentions each person could see, players were incentivized so that the best outcome would be for their party to "win" the election. The second best outcome would be for the other party to win, and the worst result would be deadlock.

"What we found in a nutshell," says Plotkin, "is that, even when two parties have an equal number of members and everything seems fair--everyone in the network is equally influential--the structure of the social network can still bias the outcome toward one party or another."

The reason has to do with the way that the two parties interact with each other. When members of a single party are talking almost exclusively to one another and not across party lines, it can lead to what is known online as a filter bubble, where someone's views are reinforced by those around them. Put two such groups together, each on the opposite side of an issue, and deadlock ensues.

When information is gerrymandered, however, a few members of one party end up in a conversation dominated by members of the other party. There, they have the opportunity to persuade the other side, or to be persuaded.

"The party at a disadvantage," Plotkin explains, "is the one that has divided its influence--with most its members talking only to their own party, while a few of its members interact in bubbles dominated by the other party, where they are likely to be flipped."

Working with coauthor David Rand at the Massachusetts Institute of Technology and colleagues, the team conducted more than 100 online experiments with more than 2,500 human subjects to test the effects of information gerrymandering. The games entailed the same scenario as the mathematical model: Teams of 12 players each were assigned to "vote" for either the yellow party or the purple party and incentivized to favor their assigned party with consensus as a second-best outcome. The experiments varied the structure of the social network and confirmed the predicted effects of information gerrymandering on vote outcomes.

"We can swing the final vote in these experimental games by 20% or more just by the structure of the social network," Plotkin says. "Even if one party has a 2-to-1 size advantage, we predict the minority party can win a majority of votes through information gerrymandering."

Curious whether they could induce information gerrymandering using automated bots, the researchers also inserted "zealot bots" that refuse to compromise. Sure enough, the appropriate placement of only a few zealots could also induce information gerrymandering and undemocratic outcomes.

To assess real-world networks for the presence of information gerrymandering, the researchers analyzed data on bill co-sponsorship in the U.S. Congress as well as European legislatures and networks of social media users participating in political discussion.

They found that information gerrymandering was extremely common in these real-world networks.

The researchers see this as the beginning of a new avenue of study focused on how social networks impact collective decision making.

"There has been a lot of attention on fake news and online trolls, which are certainly disruptive," says Plotkin. "What we're studying is something different, which depends on the overall network structure--a more subtle but possibly more pernicious problem for democratic decision making."

Credit: 
University of Pennsylvania

How 'information gerrymandering' influences voters

Many voters today seem to live in partisan bubbles, where they receive only partial information about how others feel regarding political issues. Now, an experiment developed in part by MIT researchers sheds light on how this phenomenon influences people when they vote.

The experiment, which placed participants in simulated elections, found not only that communication networks (such as social media) can distort voters' perceptions of how others plan to vote, but also that this distortion can increase the chance of electoral deadlock or bias overall election outcomes in favor of one party.

"The structure of information networks can really fundamentally influence the outcomes of elections," says David Rand, an associate professor at the MIT Sloan School of Management and a co-author of a new paper detailing the study. "It can make a big difference and is an issue people should be taking seriously."

More specifically, the study found that "information gerrymandering" can bias the outcome of a vote, such that one party wins up to 60 percent of the time in simulated elections of two-party situations where the opposing groups are equally popular. In a follow-up empirical study of the U.S. federal government and eight European legislative bodies, the researchers also identified actual information networks that show similar patterns, with structures that could skew over 10 percent of the vote in the study's experiments.

The paper, "Information gerrymandering and undemocratic decisions," is being published today in Nature.

The authors are Alexander J. Stewart of the University of Houston; Mohsen Mosleh, a research scientist at MIT Sloan; Marina Diakonova of the Environmental Change Institute at Oxford University; Antonio Arechar, an associate research scientist at MIT Sloan and a researcher at the Center for Research and Teaching in Economics (CIDE) in Aguascalientes, Mexico; Rand, who is also the principal investigator for MIT Sloan's Human Cooperation Lab; and Joshua B. Plotkin of the University of Pennsylvania. Stewart is the lead author.

Formal knowledge

While there is a burgeoning academic literature on media preferences, political ideology, and voter choices, the current study is an effort to create general models of the fundamental influence that information networks can have. Through abstract mathematical models and experiments, the researchers can analyze how strongly networks can influence voter behavior, even when long-established layers of voter identity and ideology are removed from the political arena.

"Part of the contribution here is to try to formalize how information about politics flows through social networks, and how that can influence voters' decisions," says Stewart.

The study used experiments involving 2,520 participants, who played a "voter game" in one of a variety of conditions. (The participants were recruited via Amazon's Mechanical Turk platform and took part in the simulated elections via Breadboard, a platform generating multiplayer network interactions.) The players were divided into two teams, a "yellow" team and a "purple" team, usually with 24 people on each side, and were allowed to change their voting intentions in response to continuously updated polling data.

The participants also had incentives to try to produce certain vote outcomes reflective of what the authors call a "compromise worldview." For instance, players would receive a (modest) payoff if their team received a super-majority vote share; a smaller payoff if the other team earned a super-majority; and zero payoff if neither team reached that threshold. The election games usually lasted four minutes, during which time each voter had to decide how to vote.

In general, voters almost always voted for their own party when the polling data showed it had a chance of reaching a super-majority share. They also voted for their own side when the polling data showed a deadlock was likely. But when the opposing party was likely to achieve a super-majority, half the players would vote for it, and half would continue to vote for their own side.

During a baseline series of election games where all the players had unbiased, random polling information, each side won roughly a quarter of the time, and a deadlock without a super-majority resulted about half the time. But the researchers also varied the game in multiple ways. In one iteration of the game, they added information gerrymandering to the polls, such that some members of one team were placed inside the other team's echo chamber. In another iteration, the research team deployed online bots, comprising about 20 percent of voters, to behave like "zealots," as the scholars called them; the bots would strongly support one side only.

After months of iterations of the game, the researchers concluded that election outcomes could be heavily biased by the ways in which the polling information was distributed over the networks, and by the actions of the zealot bots. When members of one party were led to believe that most others were voting for the other party, they often switched their votes to avoid deadlock.

"The network experiments are important, because they allow us to test the predictions of the mathematical models," says Mosleh, who led the experimental portion of the research. "When we added echo chambers, we saw that deadlock happened much more often -- and, more importantly, we saw that information gerrymandering biased the election results in favor of one party over the other."

The empirical case

As part of the larger project, the team also sought out some empirical information about similar scenarios among elected governments. There are many instances where elected officials might either support their first-choice legislation, settle for a cross-partisan compromise, or remain in deadlock. In those cases, having unbiased information about the voting intentions of other legislators would seem to be very important.

Looking at the co-sponsorship of bills in the U.S. Congress from 1973 to 2007, the researchers found that the Democratic Party had greater "influence assortment" -- more exposure to the voting intentions of people in their own party -- than the Republican Party over the same period. However, after Republicans gained control of Congress in 1994, their own influence assortment became equivalent to that of the Democrats, as part of a highly polarized pair of legislative influence networks. The researchers found similar levels of polarization in the influence networks of six of the eight European parliaments they evaluated, generally during the last decade.

Rand says he hopes the current study will help generate additional research by other scholars who want to keep exploring these dynamics empirically.

"Our hope is that laying out this information gerrymandering theory, and introducing this voter game, we will spur new research around these topics to understand how these effects play out in real-world networks," Rand says.

Credit: 
Massachusetts Institute of Technology

Denisovan finger bone more closely resembles modern human digits than Neanderthals'

Scientists have identified the missing part of a finger bone fragment from the Denisova Cave in southern Siberia, revealing that Denisovans--an early human population discovered when the original fragment was genetically sequenced in 2010--had fingers indistinguishable from those of modern humans despite being more closely related to Neanderthals. This finding adds an important piece of evidence to the puzzle surrounding Denisovan skeletal morphology and suggests that the finger bone characteristics unique to Neanderthals evolved after their evolutionary split from the Denisovans.

The Denisovans lived in Asia for hundreds of thousands of years, sometimes interbreeding with Neanderthals and perhaps archaic Eurasian humans, and some present-day human populations still carry Denisovan DNA. However, only five Denisovan skeletal remains have been found--mostly molars--and the finger bone fragment previously recovered and used to generate the genome was too incomplete to reveal much about the whole appendage.

E. Andrew Bennett et al. matched the missing fragment to the original by using DNA extraction and sequencing techniques to capture its entire mitochondrial DNA sequence. They then reanalyzed scans and photographs of the fragments and compared them with finger bones from Neanderthals, as well as from Pleistocene and recent modern humans at various stages of development. The researchers found that the digit was a fifth finger bone from the right hand of an adolescent female Denisovan who likely died at about 13.5 years old. The authors say researchers should take caution when identifying potential Denisovan skeletal remains, since they may appear more similar to modern humans than to Neanderthals.

Credit: 
American Association for the Advancement of Science (AAAS)

Protein tangles linked with dementia seen in patients after single head injury

image: These scans show the amount of tau protein in the brains of patients who have suffered a traumatic brain injury. Red represents increased tau accumulation.

Image: 
Imperial College London

Scientists have visualised for the first time protein 'tangles' associated with dementia in the brains of patients who have suffered a single head injury.

This is the finding of a new study led by scientists from Imperial College London, published in the journal Science Translational Medicine.

In the early-stage study, researchers studied 21 patients who had suffered a moderate to severe head injury at least 18 years earlier (mostly from traffic accidents), as well as 11 healthy individuals who had not experienced a head injury.

The research, from scientists at Imperial's Dementia Research Institute as well as the University of Glasgow, showed some of these patients had clumps of protein in their brain called tau tangles.

The team, who recruited patients from the Institute of Health and Wellbeing at the University of Glasgow and from Imperial College Healthcare NHS Trust, say the research may accelerate the development of treatments that break down tau tangles, by enabling medics to monitor the amount of the protein.

Tau normally helps provide structural support to nerve cells in the brain, acting as a type of scaffolding. But when brain cells become damaged - for instance during a head injury - the protein may form clumps, or tangles.

Tau tangles are found in Alzheimer's disease and other forms of dementia, and associated with progressive nerve damage.

Scientists have known for some time that repeated head injuries - such as those sustained in boxing, rugby and American football - can lead to neurodegeneration and dementia in later life, with particularly strong links to a type of brain condition called chronic traumatic encephalopathy.

However, this is the first time scientists have seen the protein tangles in living patients who have suffered a single, severe head injury, explains Dr Nikos Gorgoraptis, author of the paper from Imperial's Department of Brain Sciences.

"Scientists increasingly realise that head injuries have a lasting legacy in the brain - and can continue to cause damage decades after the initial injury. However, up until now most of the research has focussed on the people who have sustained multiple head injuries, such as boxers and American Football players. This is the first time we have seen in these protein tangles in patients who have sustained a single head injury."

Dr Gorgoraptis adds that although these tangles have been detected in the brains of patients in post-mortem examination - where findings suggest around one in three patients with a single head injury develop protein tangles - they have not before been seen in the brains of living patients.

The study used a type of brain scan, called a PET scan, combined with a substance that binds to tau protein, called flortaucipir, to study the amount of tau protein in the brains of head injury patients.

The results revealed that, collectively, patients with head injury were more likely to have tau tangles. The paper also showed that patients with tau tangles had higher levels of nerve damage, particularly in the white matter of the brain. None of the healthy individuals had tau tangles.

Interestingly, the results revealed patients with higher levels of tau tangles did not necessarily have any reduction in brain function, such as memory problems, compared to patients with fewer tangles.

However, Dr Gorgoraptis adds these tangles can develop years before a person starts to develop symptoms such as memory loss. He explained there are still many questions to answer about the tau tangles and brain damage.

"This research adds a further piece in the puzzle of head injury and the risk of neurodegeneration. Not all patients with head injury develop these protein tangles, and some patients can have them for many years without developing symptoms. While we know tau tangles are associated with Alzheimer's and other forms of dementia, we are only beginning to understand how brain trauma might lead to their formation. What is exciting about this study is this is the first step towards a scan that can give a clear indication of how much tau is in the brain, and where it is located. As treatments develop over the coming years that might target tau tangles, these scans will help doctors select the patients who may benefit and monitor the effectiveness of these treatments."

Credit: 
Imperial College London

Kids in neighbourhoods with larger households less likely to be killed in house fires

There is safety in numbers. That's one of the key findings of a study published today in CMAJ Open: a child's risk of death or injury in a residential fire was greatly reduced in neighbourhoods with larger-than-average households.

Led by researchers with the BC Injury Research and Prevention Unit (BCIRPU) at BC Children's Hospital and the University of British Columbia (UBC), the study is the first to broadly investigate the socioeconomic factors that affect fire incidence and fire-related injuries and death at the neighbourhood level in Canada.

"Many studies have looked at how individual factors, like smoking or drinking, increase a person's risk of being injured in a fire, or those studies overlooked children altogether," said senior author Dr. Ian Pike, director of the BCIRPU and professor with the UBC department of pediatrics. "By shedding light on the neighbourhood-level factors, this research could help municipalities and emergency responders identify areas at greater risk and adapt campaigns to target those most vulnerable."

Researchers examined data from BC, Alberta, Manitoba and Ontario on fire incidence and injury along with neighbourhood-level data on income, education level, unemployment rate and number of people per home. They found that the single factor with the greatest impact on safety was household size; in neighbourhoods with larger households, each additional person reduced a child's risk of getting injured by 60 per cent and an adult's risk by 25 per cent. In this study, average household size ranged from 1.8 to 4.4 people per home.
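To put that estimate in perspective - and assuming the reported reductions compound multiplicatively with each additional person, which the study does not spell out - a child in a neighbourhood averaging four people per home would face roughly $0.4 \times 0.4 = 0.16$ times the risk of a child in a neighbourhood averaging two, an 84 per cent reduction.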

"We don't know why bigger households are safer, but it does make sense," said lead author Dr. Emilie Beaulieu, a pediatrician and UBC postdoctoral fellow at BC Children's. "Older siblings may be getting the younger siblings out and, in lone-parent families, parents may face more barriers in getting their kids out, like having to travel back into the home to rescue a young child."

Despite progress in fire safety, residential fires are still responsible for many deaths and injuries in Canada every year; from 2005 to 2015, 145,252 residential fires and 9,952 casualties were reported. These traumatic events can result in long-term physical and emotional suffering.

"This research shows we should be rethinking our strategies around fire safety," said Beaulieu. "People living in neighbourhood with small families and households are at greater risk of being injured or killed when fires occur. We should ensure families in these areas have escape plans in place and that children and youth know how to get out and get safe."

Researchers used data from 2005-2015 from the National Fire Information Database along with 2011 census subdivision social domain data from Statistics Canada. The National Fire Information Database is a demonstration project carried out by the Canadian Association of Fire Chiefs and a division of Statistics Canada, in collaboration with the Fire Marshals and Fire Commissioners, through a grant from the federal government.

Credit: 
University of British Columbia

80% cut in antibiotics entering Thames is needed to avoid surge in superbugs

image: The River Thames catchment was the subject of the CEH-led modelling study.

Image: 
Mike Hutchins

The amount of antibiotics entering the River Thames would need to be cut by as much as 80 per cent to avoid the development and spread of antibiotic-resistant 'superbugs', a new study has shown.

Scientists from the Centre for Ecology & Hydrology (CEH) modelled the effects of antibiotic prescriptions on the development of antibiotic-resistant bacteria in a river. The model showed that across three-quarters of the River Thames catchment, antibiotics discharged in effluent were likely present at levels high enough for antibiotic-resistant bacteria to develop.

The study comes after England's chief medical officer Professor Dame Sally Davies warned last week that bugs resistant to antibiotics could pose a more immediate risk to humanity than climate change, and may kill at least 10 million people a year across the world.

Dr Andrew Singer of the Centre for Ecology & Hydrology, who led the study, says: "Rivers are a 'reservoir' for antibiotic-resistant bacteria which can quickly spread to people via water, soil, air, food and animals. Our beaches offer a similar risk. It has been shown that surfers are four times more likely to carry drug-resistant bacteria than non-surfers."

How antibiotics get into our rivers

Up to 90% of prescribed antibiotics taken by people pass through the body and into the sewerage system, where about half end up in rivers when effluent is discharged.
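Taken together, those two figures imply that as much as $0.9 \times 0.5 = 45\%$ of a prescribed dose can ultimately reach a river - simple arithmetic on the figures above, not a number reported in the study.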

Dr Singer explains: "The release of drugs and bugs into our rivers increases the likelihood of antibiotic-resistant genes being shared, either through mutation or 'bacterial sex'. This is the first step towards the development of superbugs as the drugs used to fight them will no longer work.

"Environmental pollution from drugs and bugs is a serious problem that we need to find solutions to."

The CEH-led research was based on prescription data from clinical commissioning groups for two classes of antibiotics - macrolides, such as erythromycin and azithromycin, and fluoroquinolones such as ciprofloxacin, levofloxacin and moxifloxacin.

Macrolides treat a range of respiratory and sexually transmitted infections such as pneumonia, whooping cough and chlamydia, while fluoroquinolones treat respiratory and urinary tract infections. These were chosen for the study because they biodegrade slowly while other antibiotics such as penicillin biodegrade before they get to the river.

Possible solutions

There are a number of different ways we could reduce the amount of antibiotics entering rivers, including:

reducing inappropriate prescriptions, either because the antibiotics will not reduce the infection or because the course of treatment is longer than is medically necessary;

preventative action so fewer medicines are needed in the first place, such as more rapid diagnosis of medical conditions, greater uptake of vaccinations for illnesses and better hygiene controls in hospitals;

increased investment in research and development of new wastewater treatment processes that would remove the drugs and bugs from sewage.

The Government and health officials recognise that reducing antibiotic prescribing is key to tackling antibiotic resistance, and there was a 6.1 per cent reduction in total antibiotic consumption in primary and secondary care in England between 2014 and 2018.

However, antibiotic prescribing per person in the UK is still higher than in several European countries and double that of the Netherlands, where controls on prescribing antibiotics and effective hygiene measures in the healthcare system have resulted in relatively low rates of antibiotic resistance.

Thames Water recognises that antibiotic resistance in water is an emerging risk and that more work needs to be done on how best to manage it.

The study - Translating antibiotic prescribing into antibiotic resistance in the environment: a hazard characterisation case study - which also involved Royal Holloway, University of London, and received funding from UKRI, has been published in the journal PLOS ONE.

Credit: 
UK Centre for Ecology & Hydrology

Racial disparity in Houston's pretrial population

There has been no shortage of discourse surrounding racial and ethnic disparities in the criminal justice system. In fact, the need to address these inequities has emerged as a central tenet of most viable criminal justice reform efforts. However, missing from the ongoing dialogue concerning race, crime, and justice are attempts to move beyond merely documenting disparity toward action through empirically informed policy recommendations, program development and intervention designs. The following report represents one such localized movement toward action in Houston, Texas, the third largest criminal justice system in the nation, in which we examine the behavioral characteristics and systemic responses that underlie the state of racial/ethnic disparities in the jail system.

Credit: 
Center for Justice Research at Texas Southern University

NIH, Cincinnati Children's scientists develop possible strategy for cancer drug resistance

image: The chemical structure of a prospective drug sitting inside the protein kinase IRAK4.

Image: 
Cincinnati Children's Hospital Medical Center

Scientists from the National Institutes of Health and Cincinnati Children's Hospital Medical Center have devised a potential treatment against a common type of leukemia that could have implications for many other types of cancer. The new approach takes aim at a way that cancer cells evade the effects of drugs, a process called adaptive resistance.

The researchers, in a range of studies, identified a cellular pathway that allows a form of acute myeloid leukemia (AML), a deadly blood and bone marrow cancer, to elude the activity of a promising class of drugs. They then engineered a compound that appears to launch a two-pronged attack against the cancer. In several experiments, the compound blocked a mutant protein that causes the AML. At the same time, it halted the cancer cells' ability to sidestep the compound's effects. The results, reported Sept. 4 in Science Translational Medicine, could lead to the development of new therapies against AML and cancers that act in similar ways.

Co-corresponding authors Daniel Starczynowski, Ph.D., at Cincinnati Children's, Craig Thomas, Ph.D., at NIH's National Center for Advancing Translational Sciences (NCATS) and their colleagues wanted to better understand drug resistance in a form of AML caused by a mutant protein called FLT3. This form of AML accounts for roughly 25% of all newly diagnosed AML cases, and patients often have a poor prognosis. A more thorough understanding of the drug resistance process could help them find ways to improve therapy options.

FLT3 belongs to a class of enzymes called kinases. Kinases are proteins that play a role in cell growth and proliferation. When kinases work overtime, they can cause some cancers. Drugs that block kinase activity have been effective in treating cancers. While many of these drugs work initially, often in combination with other therapies, cancer cells frequently find ways to bypass the drugs' effects and begin growing again. For many patients, this drug resistance can be deadly. FLT3 is always turned on in these cancer cells, sending chemical signals for the cells to grow and divide. Scientists have designed drugs to block FLT3 activity, but the AML cells eventually find ways to get the growth signals elsewhere.

To find out how this happens, Starczynowski's team treated FLT3-mutant cancer cells grown in the laboratory with an inhibitor drug. The drug initially killed many cancer cells. The scientists closely examined the AML cells that survived, including genes that were turned on and off in the resistant cells. They found that the leukemia cells were now getting chemical growth signals from different protein kinases through a system normally used by the immune system to respond to viruses and other invaders. Experiments on AML cells from patients treated with a FLT3 inhibitor showed similar results.

"We think that these leukemia cells were under stress from the FLT3 inhibitor drug because it was blocking the growth signals they badly wanted," Starczynowski said. "The cells found another way to get these signals through another route that compensated for this, a complex of the kinases IRAK1 and IRAK4, which are involved in the immune system."

To prove their theory, the scientists, in a series of experiments on leukemia cell models in the lab and cells from leukemia patients, showed that inhibiting FLT3 and IRAK1-IRAK4 activity at the same time was more toxic to the leukemia cells than inhibiting FLT3 alone.

To prevent this drug resistance, the NCATS team worked to block the activity of FLT3 and the IRAK kinases with a single compound. Starting with an IRAK4 inhibitor, they developed a compound that was effective against all three protein kinases. These efforts led to a new compound, NCGC1481, that could effectively block the activity of the kinases in both human cancer cells in the lab and mouse models with cancer cells from patients. (NCGC is an acronym for the NCATS Chemical Genomics Center.)

"NCGC1481 not only stops the cancer cells' growth caused by the mutant protein, but also blocks the cancer's escape route," said first author Katelyn Melgar, Ph.D. "The result is a compound with superior activity in the models we tested."

The findings may be broadly applicable to many other cancers as well. "There is already evidence that this may be a general cancer resistance mechanism to protein kinase inhibitor drugs," Starczynowski said. "In theory, the use of drugs that affect many kinases at once, along with other kinds of cancer therapies, could be effective treatments."

For now, NCGC1481 is still a work in progress.

"Kinase inhibitor development aims to thread a needle between targets that are important, such as FLT3 and IRAK1-IRAK4, and targets that aren't," said Thomas. "We're still working on a final product that we hope can advance to clinical testing."

Credit: 
NIH/National Center for Advancing Translational Sciences (NCATS)

Laser-based ultrasound approach provides new direction for nondestructive testing

image: The fabrication process of a patterned candle soot (CS) nanoparticle (NP)-polydimethylsiloxane (PDMS) patch, used to amplify the signal from a photoacoustic laser source for nondestructive testing.

Image: 
Taeyang Kim

WASHINGTON, D.C., September 3, 2019 -- Many industrial buildings, including nuclear power plants and chemical plants, rely on ultrasound instruments that continually monitor the structural integrity of their systems without damaging or altering their features. One new technique draws on laser technology and candle soot to generate effective ultrasonic waves for nondestructive testing and evaluation.

A team of researchers is using ultrasonic nondestructive testing (NDT) that involves amplifying the signal from a photoacoustic laser source using a laser-absorbing patch made from an array of candle soot nanoparticles and polydimethylsiloxane. They discuss their work in this week's Applied Physics Letters, from AIP Publishing.

The approach marks one of the first NDT systems that combines elements of contact and noncontact ultrasound testing. The results of generating such ultrasonic waves with the photoacoustic patch demonstrate the promise of the broad range of noncontact applications for NDT.

"Laser-based NDT method has advantages of temperature-independent measurement and wide range of monitoring area by easily changing the position of devices," said Taeyang Kim, an author on the paper. "This technique provides a very flexible and simple method for noncontact and remote generation of ultrasonic surface waves."

Ultrasound waves can be made when a high-powered laser strikes a surface. The heat produced by the pulses induces a pattern of expansion and compression in the illuminated area, yielding an ultrasonic signal. The waves produced, called Lamb waves, then travel through the material as elastic waves.

The group used candle soot nanoparticles paired with polydimethylsiloxane to absorb the laser light. They turned to candle soot because it is readily available, absorbs laser light efficiently, and can produce the elastic expansion needed for the photoacoustic conversion that generates the Lamb wave.

By placing the particles in the patch in a line array, they were able to narrow the bandwidth of the waves, filtering out unwanted wave signals and increasing analytical accuracy. The researchers opted for an aluminum sensing system for the receiving transducer.

The patch more than doubled the signal amplitude compared with conditions without it, and it produced a narrower bandwidth than the other conditions tested.

Kim said questions remain about the approach's durability in an industrial setting, as well as how well the patches perform on curved and rough surfaces.

"New NDT systems will attract more attention to explore the optimal materials for the patch or various applications for NDT industries," he said.

Next, the team looks to test the system in high-temperature nondestructive testing scenarios.

Credit: 
American Institute of Physics