
First-of-its-kind mapping model tracks how hate spreads and adapts online

image: This is a complex web of global hate highways built from strongly interconnected clusters.

Image: 
Neil Johnson/GWU

WASHINGTON (Aug. 21, 2019)--Online hate thrives globally through self-organized, scalable clusters that interconnect to form resilient networks spread across multiple social media platforms, countries and languages, according to new research published today in the journal Nature. Researchers at the George Washington University developed a mapping model, the first of its kind, to track how these online hate clusters thrive. They believe it could help social media platforms and law enforcement in the battle against hate online.

With the explosion of social media, individuals are able to connect with other like-minded people in a matter of a few clicks. Clusters of those with common interests form readily and easily. Recently, online hate ideologies and extremist narratives have been linked to a surge in crimes around the world. To thwart this, researchers led by Neil Johnson, a professor of physics at GW, set out to better understand how online hate evolves and whether it can be stopped.

"Hate destroys lives, not only as we've seen in El Paso, Orlando and New Zealand, but psychologically through online bullying and rhetoric," Dr. Johnson said. "We set out to get to the bottom of online hate by looking at why it is so resilient and how it can be better tackled. Instead of love being in the air, we found hate is in the ether."

To understand how hate evolves online, the team began by mapping how clusters interconnect to spread their narratives and attract new recruits. Focusing on social media platforms Facebook and its central European counterpart, VKontakte, the researchers started with a given hate cluster and looked outward to find a second one that was strongly connected to the original. They discovered that hate crosses boundaries of specific internet platforms, including Instagram, Snapchat and WhatsApp; geographic location, including the United States, South Africa and parts of Europe; and languages, including English and Russian.

The researchers saw clusters creating new adaptation strategies in order to regroup on other platforms and/or reenter a platform after being banned. For example, clusters can migrate and reconstitute on other platforms or use different languages to avoid detection. This allows a cluster to quickly bring back thousands of supporters to a platform on which it has been banned and highlights the need for cross-platform cooperation to limit online hate groups.

"The analogy is no matter how much weed killer you place in a yard, the problem will come back, potentially more aggressively. In the online world, all yards in the neighborhood are interconnected in a highly complex way--almost like wormholes. This is why individual social media platforms like Facebook need new analysis such as ours to figure out new approaches to push them ahead of the curve," Dr. Johnson said.

The team, which included researchers at the University of Miami, used insights from its online hate mapping to develop four intervention strategies that social media platforms could immediately implement based on situational circumstances:

Reduce the power and number of large clusters by banning the smaller clusters that feed into them.

Attack the Achilles' heel of online hate groups by randomly banning a small fraction of individual users in order to make the global cluster network fall apart.

Pit large clusters against each other by helping anti-hate clusters find and engage directly with hate clusters.

Set up intermediary clusters that engage hate groups to help bring out the differences in ideologies between them and make them begin to question their stance.
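The second strategy above can be illustrated with a toy simulation. This is a hypothetical sketch, not the authors' published model: the network of 100 users, the random seed and the 10% removal fraction are all invented for illustration. It builds a small random user network, bans a random tenth of users, and compares the size of the largest connected component before and after.

```python
import random

def largest_component(adj):
    """Size of the largest connected component, found by BFS over an adjacency dict."""
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        stack, comp = [start], 0
        seen.add(start)
        while stack:
            node = stack.pop()
            comp += 1
            for nb in adj[node]:
                if nb not in seen:
                    seen.add(nb)
                    stack.append(nb)
        best = max(best, comp)
    return best

def remove_random_fraction(adj, fraction, rng):
    """Ban a random fraction of users, returning the reduced network."""
    banned = set(rng.sample(sorted(adj), int(len(adj) * fraction)))
    return {u: {v for v in nbs if v not in banned}
            for u, nbs in adj.items() if u not in banned}

rng = random.Random(42)
# Toy network: 100 users joined by sparse random links.
adj = {i: set() for i in range(100)}
for _ in range(150):
    a, b = rng.sample(range(100), 2)
    adj[a].add(b)
    adj[b].add(a)

before = largest_component(adj)
reduced = remove_random_fraction(adj, 0.10, rng)
after = largest_component(reduced)
print(before, after)  # the giant component can only shrink or stay the same
```

Repeating the experiment over many seeds and removal fractions would show how quickly the giant component fragments, which is the intuition behind targeting the network's "Achilles' heel" with random bans.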

The researchers noted each of their strategies can be adopted on a global scale and simultaneously across all platforms without having to share the sensitive information of individual users or commercial secrets, which has been a stumbling block before.

Using this map and its mathematical modeling as a foundation, Dr. Johnson and his team are developing software that could help regulators and enforcement agencies implement new interventions.

Credit: 
George Washington University

Extreme wildfires threaten to turn boreal forests from carbon sinks to carbon sources

image: Wildfires in Alaska

Image: 
Merritt Turetsky


Carbon stored in the soil of boreal forests is being released by more frequent and larger wildfires, according to a new study involving a University of Guelph researcher.

As wildfires continue to ravage northern areas across the globe, a research team investigated the impact of these extreme fires on previously intact carbon stores by studying the soil and vegetation of the boreal forest and how they changed after a record-setting fire season.

"Northern fires are happening more often, and their impacts are changing," said U of G Prof. Merritt Turetsky, who holds the Canada Research Chair in Integrative Biology. She worked on the study with lead authors Xanthe Walker and Michelle Mack from the Center for Ecosystem Science and Society at Northern Arizona University (NAU), as well as a team of Canadian scientists including Wilfrid Laurier University professor Jennifer Baltzer.

The work was supported by NASA, the Natural Sciences and Engineering Research Council of Canada and the Government of the Northwest Territories.

In 2014, the Northwest Territories suffered its largest fire season in recorded history. This series of mega-fires created the ideal environment to study whether carbon stores are being combusted by these types of fires.

"Between fires, boreal soils accumulate carbon, and in most cases only some of this carbon is released when the forests experience the next fire," said Baltzer. "Over time, this explains why the boreal forest is a globally significant carbon sink. We wanted to see whether the extreme 2014 fires tapped into these old-legacy carbon layers or whether they were still preserved in the ground."

For the study, published in the journal Nature, the research team collected soil samples from more than 200 forest and wetland plots across the territory. They applied a novel radiocarbon dating approach to estimate the age of the carbon in the samples.

"Carbon accumulates in these soils like tree rings, with the newest carbon at the surface and the oldest carbon at the bottom," said Mack. "We thought we could use this layering to see how far back in time, in the history of the forest, fires were burning."

The researchers found combustion of legacy carbon in nearly half of the samples taken from young forests (less than 60 years old). This carbon had escaped burning during the previous fire cycle but not during the record-setting fire season of 2014.

"In older stands that burn, this carbon is protected by thick organic soils," said NAU's Walker. "But in younger stands that burn, the soil does not have time to re-accumulate after the previous fire, making legacy carbon vulnerable to burning. This pattern could shift boreal forests into a new domain of carbon cycling, where they become a carbon source instead of a sink."

As wildfires are expected to occur more frequently and burn more intensely, old carbon may be released to the atmosphere more often.

"Understanding the fate of this stockpile of boreal carbon is really important in the context of atmosphere greenhouse gases and the Earth's climate," Turetsky said. "This is carbon the atmosphere lost hundreds or sometimes even thousands of years ago. Fire is one mechanism that can release that old carbon back to the atmosphere quickly where it can contribute to the greenhouse gas effect."

She said the potential switch of the boreal forest from carbon storage to carbon source directly impacts global climate and is not well represented in global models.

"In the context of territorial and pan-Canadian planning for climate change adaptation and mitigation, the Government of Northwest Territories recognizes the critical need to understand the role of our boreal forests in carbon storage, sequestration and release, and how our forest management practices can affect these processes," said Erin Kelly, the territory's assistant deputy minister of environment and natural resources.

Turetsky said this research is important both because of its scientific findings and because it involved stakeholders in tracking the effects of climate change in Canada.

Credit: 
University of Guelph

Studying quantum phenomena in magnetic systems to understand exotic states of matter

image: We have probed the magnetic excitations of Ba2CoSi2O6Cl2 directly via inelastic neutron scattering measurements. The five observed types of magnetic excitation are dispersionless within the resolution limits, and hence triplet excitations are verified to be localized.

Image: 
Tokyo Tech

Apart from the states of matter that we are all aware of and accustomed to, which correspond to solids, liquids, and gases, more exotic states can be generated in specific materials under special conditions. Such states are of great interest to physicists because they help them gain a deeper understanding of quantum phenomena, which is key for scientists and engineers to innovate state-of-the-art technology.

Bose-Einstein condensate is one such state of matter that occurs at very low temperatures. In this state, most of the constituent particles of the condensate are in the so-called "ground state", which is the state with the lowest energy, and microscopic quantum phenomena can be easily observed. Interestingly, this state can also be exhibited by quasiparticles, which are not actual particles but represent collective microscopic excitations in a system and can be thus used to describe the system in a simplified, yet very useful manner. Magnons, a type of quasiparticle that manifests in magnetic materials, are collective excitations originating from electrons in a crystal. Magnons can normally hop between different locations in the crystal; however, in some compounds and under the effect of a magnetic field, they can be trapped in a kind of catch-22 situation, which results in them exhibiting rigid crystallinity. This is a very interesting quantum phenomenon called "magnon crystallization", where the magnons are said to be in a 'frustrated' state.

To explore this peculiar effect, a team of scientists led by Prof. Hidekazu Tanaka from Tokyo Tech, worked on characterizing the magnetic excitations occurring in a magnetic insulator bearing the chemical formula Ba2CoSi2O6Cl2. They performed neutron scattering experiments, in which neutron beams were fired onto Ba2CoSi2O6Cl2 crystals at different energies and angles to determine the properties of the crystals. Based on the results of these experiments, the team demonstrated that magnon crystallization occurs in Ba2CoSi2O6Cl2 and attributed the origin of this ordered state to the fundamental electronic interactions in the material, from a quantum-mechanical perspective. "Until recently, experimental studies on magnon crystallization have been limited to the Shastry-Sutherland compound, SrCu2(BO3)2, and this study is an attempt to investigate this fascinating quantum phenomenon in a different material," remarks Prof. Tanaka.

Understanding the ordering of magnons and their effects on the micro- and macroscopic magnetic properties of crystals could provide researchers valuable insight to correlate condensed matter physics with the principles of quantum mechanics. "This work shows that highly frustrated quantum magnets provide playgrounds for interacting quantum particles," concludes Prof. Tanaka. The scientists note that additional studies will be needed to further understand the Ba2CoSi2O6Cl2 system and gain a firmer foothold in quantum mechanics and its potential applications.

Credit: 
Tokyo Institute of Technology

'Key player' identified in genetic link to psychiatric conditions

Scientists have identified a specific gene they believe could be a key player in the changes in brain structure seen in several psychiatric conditions, such as schizophrenia and autism.

The team from Cardiff University's Neuroscience and Mental Health Research Institute has found that the deletion of the gene CYFIP1 leads to thinning of the insulation that covers nerve cells and is vital for the smooth and rapid communications between different parts of the brain.

The new findings, published in the journal Nature Communications and highlighted in the journal Nature Reviews Neuroscience, throw new light on the potential cause of psychiatric conditions and could ultimately point to new and more effective therapies.

Though there are a number of genetic changes that can alter the risk of psychiatric disorders, one prominent type is the copy number variant (CNV), which involves the deletion of bits of DNA.

Specifically, a CNV is where DNA is deleted from one of the chromosome pairs.

Work done in the world-leading Medical Research Council Centre for Neuropsychiatric Genetics and Genomics at Cardiff has shown that people who have these deletions of DNA have a much higher chance of psychiatric disorder, but as the deletions often contain many genes, it has so far been a mystery exactly which genes contribute to the increased risk.

In their study the team focussed on the deletion of one specific gene, CYFIP1, located in a precise location of chromosome 15, known as 15q11.2, which had already been identified by the same team as an area with links to the biological abnormalities associated with psychiatric disorder.

Using cutting-edge methods and growing brain cells where one copy of CYFIP1 was missing, the team were able to show that this was linked to abnormalities in myelin - an insulating layer or sheath that forms around nerves in the brain.

Moreover, the team were able to trace these abnormalities back to specific brain cells called oligodendrocytes which are responsible for producing myelin sheaths.

First author of the study Ana Silva, who carried out the work with colleagues as part of her PhD studies, supported by the Wellcome Trust, said: "What surprised us most was how much of the 15q11.2 deletion effects could be explained by a single gene effect.

"We know that the risk of suffering from a psychiatric condition is influenced by a whole host of factors related to both the physical and social environment and our genetic make-up.

"We believe that CYFIP1 is a key player in the damaging effects of the 15q11.2 deletion and because we know what sort of brain functions this gene is involved in, we can use this knowledge to increase our understanding of psychiatric disorder and potentially find new and more effective therapies."

Lead author Professor Lawrence Wilkinson, Scientific Director of the Neuroscience and Mental Health Institute at Cardiff University, said: "Cardiff has been at the forefront of identifying genetic risk factors for psychiatric conditions and the challenge now is to make biological sense of the genetics to help us understand the disease pathology and design better treatments.

"Our work with CYFIP1 is an example of how genetic insights can guide research into biological mechanisms underlying dysfunction."

Professor Jeremy Hall, co-Senior author and Director of the Neuroscience and Mental Health Research Institute, said: "The combination of clinical and scientific expertise in Cardiff, together with the generous support from our funders the Wellcome Trust, Medical Research Council, Waterloo Foundation and Hodge Foundation, means we are in a strong position to exploit the rapid advances in psychiatric genetics for the benefit of patients."

Following on from this research, the team are looking for myelin abnormalities in people with the 15q11.2 deletion using state-of-the-art facilities at Cardiff University's Brain Research Imaging Centre as well as working out the precise mechanism that causes the CYFIP1 myelin abnormalities in order to fix it.

Credit: 
Cardiff University

A serious mental disorder in one's youth can have a lasting impact on employment prospects

Mental disorders experienced in adolescence and early adulthood that require hospital care are connected with low income, poor education and unemployment over the life span of individuals.

A recently completed Finnish study indicates that those who have been hospitalised for a mental disorder before turning 25 have considerably poorer prospects on the labour market than the rest of the population. They face a high risk of being absent from the labour market and of not completing an upper secondary-level qualification or a higher degree.

The employment rate was the lowest among individuals who were hospitalised for schizophrenia. Of them, less than 10% were employed during the follow-up period of the study.

Less than half of the individuals hospitalised for mood disorders worked after the age of 25.

The earnings of people with serious mental disorders in their youth were quite low and did not improve later. More than half had no earnings over the follow-up period.

The extensive register-based study involved more than 2 million individuals living in Finland between 1988 and 2015, who were monitored between 25 and 52 years of age.

"People suffering from mental disorders drop out from the labour market for a wide range of reasons. However, opportunities for contributing to professional life and acquiring an education should already be taken into consideration at the early stages of treating serious mental disorders, provided the patient's condition allows it," states Christian Hakulinen, a postdoctoral researcher from the University of Helsinki.

Credit: 
University of Helsinki

New evidence highlights growing urban water crisis

New research has found that in 15 major cities in the global south, almost half of all households lack access to piped utility water, affecting more than 50 million people. Access is lowest in the cities of sub-Saharan Africa, where only 22% of households receive piped water.

The research also found that of those households that did have access, the majority received intermittent service. In Karachi, Pakistan, the city's population of 15 million received piped water an average of only three days a week, for less than three hours.

These new findings add to data from the World Resources Institute's (WRI) Aqueduct tool, which recently found that by 2030, 45 cities with populations over 3 million could experience high water stress. The research, detailed in the Unaffordable and Undrinkable: Rethinking Urban Water Access in the Global South report shows that even in some places where water sources are available, water is not reaching many residents. Some cities, like Dar es Salaam, have relatively abundant supplies, yet daily access to clean, reliable and affordable water continues to be problematic for many residents.

"Decades of increasing the private sector's role in water provision has not adequately improved access, especially for the urban under-served," said Diana Mitlin, lead author, professor of global urbanism at The Global Development Institute at The University of Manchester. "Water is a human right and a social good, and cities need to prioritize it as such."

Analysis in the report showed that alternatives to piped water, like buying from private providers that truck water in from elsewhere, can cost up to 25% of monthly household income and can be 52 times more expensive than public tap water.

Global indicators used for the Millennium Development Goals and Sustainable Development Goals have largely underestimated this urban water crisis because they do not take into account affordability, intermittency or quality of water.

UNICEF and the World Health Organization reported in 2015 that more than 90% of the world's population used improved drinking water sources - but "improved" encompasses such a wide variety of sources, from public taps to boreholes and wells, that it fails to reflect the reality for individuals and families in today's rapidly growing cities.

The question of whether water is affordable is not measured, and while efforts have been made to increase water coverage, public authorities have paid little attention to affordability.

"Cities need to rethink how they view equitable access to water," said Victoria A. Beard, co-author, fellow at WRI Ross Center for Sustainable Cities, and professor of city and regional planning at Cornell University. "In many developing countries where urban residents lack access to safe, reliable and affordable water on a daily basis, these are the same countries that have made huge strides in guaranteeing universal access to primary education. Equitable access to water requires similar levels of political commitment. The solutions are not high tech. We know what needs to be done."

The World Health Organization reports that investing in universal drinking water coverage in urban areas would cost $141 billion over five years. But total global economic losses from unsafe water and sanitation systems are estimated to be at least 10 times greater, at $260 billion per year over the same period.

Researchers have identified four specific actions that can improve water access in urban areas, detailed in the report.

"Without changes, the number of people receiving intermittent or poor-quality water will increase in the years ahead, due to rapid urbanisation, increased water scarcity resulting from climate change, and a general underinvestment in water infrastructure," said Ani Dasgupta, global director of WRI Ross Center for Sustainable Cities.

"This will have huge costs for people and the economy. Cities must take actions now to guarantee all urban residents' access to safe, reliable water in the future."

ENDS

Credit: 
University of Manchester

More frequent wildfires in the boreal forest threaten previously protected soil carbon

As major wildfires increase in Canada's North, boreal forests that have acted as carbon sinks for millennia are becoming sources of atmospheric carbon, potentially contributing to the greenhouse effect.

That's the conclusion of an international research team that involves University of Saskatchewan (USask) adjunct researcher Jill Johnstone and recent USask PhD graduate Xanthe Walker who is now a post-doctoral researcher at the Center for Ecosystem Science and Society at Northern Arizona University (NAU). The findings were published Aug. 21 in the prestigious journal Nature.

"This study underscores why more frequent burning in the boreal forest due to wildfires is bad from a climate perspective," said Walker, lead author on the paper.

The research was launched in the aftermath of the severe 2014 fires in the Northwest Territories (N.W.T.)--the largest fire season in the region's recorded history. Funded by the N.W.T. government and agencies such as NSERC and NASA, the project aims to better understand what happened to boreal forest soils during these fires, knowledge that could improve forest and fire management and help northerners plan and adapt.

"We know that there is really old carbon in these soils--carbon that is hundreds to thousands of years old, carbon that is irreplaceable," said Michelle Mack of NAU, senior author on the paper who worked with Walker and other NAU collaborators on the study.

Carbon is critical to soil function and productivity. Between forest fires, boreal soils accumulate carbon and are a globally significant carbon sink: boreal forests store about one-third of the world's terrestrial carbon, primarily in soils. These pools of old carbon in the soil have been historically safe from combustion, since only some of this carbon is released when the forests undergo a fire.

But with warming of the forest climate and larger and more frequent wildfires, more of this sequestered carbon is being combusted and released--what the researchers describe as "mining" the carbon from the soil.

"The combustion of this "legacy carbon" in the soil has the potential to shift the global carbon cycle, as boreal forests that have acted as carbon sinks for millennia become sources of atmospheric carbon," said Johnstone, who began this work with former PhD student Walker as a professor at USask and is now an adjunct USask biology professor living in the Yukon. "This could potentially accelerate climate warming."

The research team, led by Northern Arizona University scientists, included the Northwest Territories (N.W.T.) government, four Canadian universities, University of Alaska-Fairbanks, and Woods Hole Research Center.

Walker and a large field group of researchers and students from Canadian universities hiked into the N.W.T. burn areas to sample the soil at more than 200 burned areas identified by Laval University researchers on maps derived from remote sensing.

"These were large and severe fires, and we thought: this is when and where it would burn," said Walker, referring to old pools of legacy carbon.

The team found that in older stands of trees, legacy carbon remained protected from combustion, but in stands younger than 60 years old, legacy carbon burned.

"In older stands that burn, this carbon is protected by thick organic soils," said Walker. "But in younger stands that burn, the soil does not have time to re-accumulate after the previous fire, making legacy carbon vulnerable to burning. This pattern could shift boreal forests into a new domain of carbon cycling, where they become a carbon source instead of a sink."

Walker said the good news is that the landscape in the Northwest Territories is comprised primarily of old forests, and legacy carbon stayed protected there.

"Young stands, lacking accumulated fuel, acted act as a kind of fire break. But as the climate continues to warm, young stands may begin to carry fire and threaten legacy carbon," she said.

To estimate the age of the carbon in the soil, the team used radiocarbon dating, which measures the abundance of the carbon-14 (14C) isotope in a soil sample. Carbon dating enables researchers to gather clues about how long certain carbon stores have been in the soil.
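The dating step rests on simple decay arithmetic: a conventional radiocarbon age follows from the measured fraction of 14C remaining relative to the modern standard, using the Libby mean-life of 8,033 years. A minimal sketch of that calculation is below; the depth and fraction-modern values are invented, not the study's measurements.

```python
import math

LIBBY_MEAN_LIFE = 8033  # years; conventional 14C ages use the Libby half-life convention

def radiocarbon_age(fraction_modern):
    """Conventional 14C age (years before present) from the measured
    14C abundance, expressed as a fraction of the modern standard."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# Deeper soil layers hold less 14C and hence older carbon (values invented):
for depth_cm, f_modern in [(5, 0.99), (15, 0.95), (30, 0.80)]:
    print(f"{depth_cm} cm: ~{radiocarbon_age(f_modern):.0f} years old")
```

A fraction-modern of 0.95, for example, corresponds to an age of roughly 400 years, which is the scale of the "hundreds to thousands of years old" legacy carbon described above.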

"Carbon accumulates in these soils like tree rings, with the newest carbon at the surface and the oldest carbon at the bottom," said Mack. "We thought we could use this layering to see how far back in time, in the history of the forest, fires were burning."

In nearly half (45 per cent) of the young stands the researchers sampled, legacy carbon burned. And while the amount of legacy carbon did not alter total carbon emitted from these fires, the pattern the researchers identified has global implications for future climate scenarios.

"The frequency of boreal forest fires is projected to increase even more with expected climate warming and drying, and as a result total burned area is expected to increase 130 to 350 per cent by mid-century," the authors write. This increase in the burn area would expand the proportion of young forests vulnerable to burning and loss of legacy carbon.

"By defining and analyzing 'legacy carbon,' this paper offers a new way to think about long-sequestered carbon stocks in boreal forests and how vulnerable they are to being burned during increasingly frequent and severe wildfires," said Brendan Rogers, a scientist at Woods Hole Research Center who co-authored the study. "This carbon-dating tool helps us understand when burning goes 'outside the norm' from a historical perspective and begins to combust carbon stocks that survived past fires."

Research published last year by this group confirmed that the 2014 N.W.T. fire season, which burned more than 2.8 million hectares, released an estimated 94.5 teragrams of carbon--equivalent to half the annual carbon uptake from vegetation across all of Canada that year (Global Change Biology 2018, https://onlinelibrary-wiley-com.cyber.usask.ca/doi/full/10.1111/gcb.14287).

Jennifer Baltzer, a Wilfrid Laurier University researcher and co-author of the study, said, "This research, with the help and partnership of the N.W.T. government, has really advanced our understanding of these fires and the tremendous impact extreme wildfire years have on globally critical stores of carbon."

Credit: 
University of Saskatchewan

Urban stormwater could release contaminants to ground, surface waters

A good rainstorm can make a city feel clean and revitalized. However, the substances that wash off of buildings, streets and sidewalks and down storm drains might not be so refreshing. Now, researchers reporting in ACS' Environmental Science & Technology have analyzed untreated urban stormwater from 50 rainstorms across the U.S., finding a wide variety of contaminants that could potentially harm aquatic organisms in surface waters and infiltrate ground water.

Previous studies of urban stormwater runoff have revealed a mixture of industrial chemicals, pesticides, pharmaceuticals and other substances that, at certain levels, can be toxic to aquatic life. As a result, many cities and water-management agencies are trying to develop stormwater control measures to minimize the transport of these contaminants to other water bodies, such as rivers or aquifers. However, data from a wide variety of locations across the U.S. are lacking. To help fill this research gap, Jason Masoner and colleagues wanted to catalog and quantify the contaminants in urban stormwater from 50 storm events at 21 sites across the nation.

The researchers analyzed 500 chemicals in urban stormwater collected during rainstorms. Samples contained a median of 73 organic chemicals, with pesticides being the most frequently detected chemical group. Eleven contaminants, including the insect repellent DEET, nicotine, caffeine and bisphenol A, were found in more than 90% of samples. The researchers also frequently detected prescription and non-prescription pharmaceuticals, indicating that the stormwater was contaminated with human waste, possibly from sewage leaks or other urban sources. Some of the contaminants were present at levels known to be toxic to aquatic life, but those present at lower concentrations could also have effects when combined with all of the other substances in the water. This study highlights the need for more research about the long-term effects of these contaminants on aquatic organisms exposed to the stormwater, the researchers say.
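The kind of tally reported above - detection frequency per contaminant and the median number of detections per sample - takes only a few lines to compute. The sketch below uses invented sample data, not the study's measurements; only the chemical names echo those mentioned in the text.

```python
from statistics import median

# Invented detections per stormwater sample (set of chemicals found),
# standing in for the study's real analyte lists.
samples = [
    {"DEET", "caffeine", "nicotine", "bisphenol A", "atrazine"},
    {"DEET", "caffeine", "nicotine", "bisphenol A"},
    {"DEET", "caffeine", "nicotine", "bisphenol A", "carbamazepine"},
    {"DEET", "caffeine", "nicotine", "atrazine"},
]

n = len(samples)
all_chems = set().union(*samples)
# Fraction of samples in which each chemical was detected:
freq = {c: sum(c in s for s in samples) / n for c in all_chems}

# Contaminants detected in more than 90% of samples:
ubiquitous = sorted(c for c, f in freq.items() if f > 0.90)
per_sample_counts = [len(s) for s in samples]
print(ubiquitous)
print(median(per_sample_counts))  # median detections per sample
```

Scaled up to 500 target chemicals and 50 storm events, the same frequency-and-median bookkeeping yields figures like the study's "median of 73 organic chemicals" and "found in more than 90% of samples."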

Credit: 
American Chemical Society

Link between brain immune cells and Alzheimer's disease development identified

image: Microglia, shown in red, surround and react to the amyloid-beta plaques, shown in green, in the Alzheimer's disease brain.

Image: 
Kim Green lab / UCI

Irvine, Calif. -- Scientists from the University of California, Irvine School of Biological Sciences have discovered how to forestall Alzheimer's disease in a laboratory setting, a finding that could one day help in devising targeted drugs that prevent it.

The researchers found that by removing brain immune cells known as microglia from rodent models of Alzheimer's disease, beta-amyloid plaques - the hallmark pathology of AD - never formed. Their study will appear Aug. 21 in the journal Nature Communications.

Previous research has shown most Alzheimer's risk genes are turned on in microglia, suggesting these cells play a role in the disease. "However, we hadn't understood exactly what the microglia are doing and whether they are significant in the initial Alzheimer's process," said Kim Green, associate professor of neurobiology & behavior. "We decided to examine this issue by looking at what would happen in their absence."

The researchers used a drug that blocks microglia signaling that is necessary for their survival. Green and his lab have previously shown that blocking this signaling effectively eliminates these immune cells from the brain. "What was striking about these studies is we found that in areas without microglia, plaques didn't form," Green said. "However, in places where microglia survived, plaques did develop. You don't have Alzheimer's without plaques, and we now know microglia are a necessary component in the development of Alzheimer's."

The scientists also discovered that when plaques are present, microglia perceive them as harmful and attack them. However, the attack also switches off genes in neurons needed for normal brain functioning. "This finding underlines the crucial role of these brain immune cells in the development and progression of Alzheimer's," said Green.

Professor Green and colleagues say their discovery holds promise for creating future drugs that prevent the disease. "We are not proposing to remove all microglia from the brain," Professor Green said, noting the importance of microglia in regulating other brain functions. "What could be possible is devising therapeutics that affect microglia in targeted ways."

He also believes the project's research approach offers an avenue for better understanding other brain disorders.

"These immune cells are involved in every neurological disease and even in brain injury," Professor Green said. "Removing microglia could enable researchers working in those areas to determine the cells' role and whether targeting microglia could be a potential treatment."

Credit: 
University of California - Irvine

Poo transplants to help save koalas

image: Poo transplants are helping expand koala microbiomes, allowing the marsupials to eat a wider range of eucalypts and possibly survive habitat loss.

Image: 
The University of Queensland

Poo transplants are helping expand koala microbiomes, allowing the marsupials to eat a wider range of eucalypts and possibly survive habitat loss.

A study featuring University of Queensland researchers has analysed and altered microbes in koalas' guts, finding that a faecal transplant may influence what species of eucalypt koalas can feed on.

Dr Michaela Blyton, of the UQ School of Chemistry and Molecular Biosciences, was inspired to conduct the research after a devastating drop in the koala population on Cape Otway in Victoria.

"In 2013 the koala population reached very high densities, leading them to defoliate their preferred food tree species, manna gum," Dr Blyton said.

"This led to 70 per cent mortality due to starvation, which was very distressing.

"What was interesting was that even though the koalas were starving, they generally didn't start feeding on a less preferred tree species, messmate, despite the fact that some koalas feed exclusively on messmate.

"This led me and colleague Dr Ben Moore at Western Sydney University to wonder if the microbes present in koalas' guts - their microbiomes - were limiting which species they could eat, and if we could allow them to expand their diet with faecal inoculations."

The team caught wild koalas that only ate manna gum and kept them in temporary captivity at the Cape Otway Conservation Ecology Centre.

"We collected poo from radio-collared wild koalas that ate messmate, concentrated the microorganisms in the poo, packaged it into acid-resistant capsules and gave them to the captive koalas," Dr Blyton said.

"We then monitored how much messmate the koalas were willing to eat over an 18-day period and assessed how the microbiomes changed after the inoculations, comparing their diets to those of control koalas that received manna gum microbes."

The researchers found that the faecal inoculations changed the koalas' microbiomes, allowing them to eat messmate.

"This could affect all aspects of their ecology including nutrition, habitat selection and resource use," Dr Blyton said.

"Koalas may naturally have trouble adapting to new diets when their usual food trees become over browsed or after being moved to a new location.

"This study provides a proof of concept for the use of encapsulated faecal material to successfully introduce and establish new microbes in koalas' guts.

"In future, capsules could be used to adjust koalas' microbiomes prior to moving them to safer or more abundant environments, and as probiotics during and after antibiotic treatment."

Credit: 
University of Queensland

Antibiotic use linked to heightened bowel cancer risk

Antibiotic use (pills/capsules) is linked to a heightened risk of bowel (colon) cancer, but a lower risk of rectal cancer, and depends, to some extent, on the type and class of drug prescribed, suggests research published online in the journal Gut.

The findings suggest a pattern of risk that may be linked to differences in gut microbiome (bacteria) activity along the length of the bowel and reiterate the importance of judicious prescribing, say the researchers.

In 2010, patients around the world took an estimated 70 billion doses of antibiotics--equivalent to around 10 doses for every person on the planet. Antibiotics have a strong and long-lasting impact on the gut microbiome, altering the balance of helpful and harmful bacteria.
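As a quick sanity check on that figure (assuming, as the "10 doses each" phrasing implies, a 2010 world population of roughly 7 billion; the population figure is our assumption, not stated in the article):

```python
# Rough check of the global antibiotic-consumption figure.
# Assumes a 2010 world population of about 7 billion (our assumption).
total_doses = 70e9   # estimated doses taken worldwide in 2010
population = 7e9     # assumed world population

doses_per_person = total_doses / population
print(doses_per_person)  # 10.0
```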

The researchers wanted to find out if this might affect bowel and rectal cancer risk, and how.

They drew on data submitted to the nationally representative Clinical Practice Research Datalink (CPRD) between 1989 and 2012.

This contains the anonymised medical records of around 11.3 million people from 674 general practices--around 7% of the UK population.

The researchers collected prescribing information for 28,930 patients diagnosed with bowel (19,726) and rectal (9,254) cancers during an average monitoring period of 8 years, and for 137,077 patients, matched for age and sex, who didn't develop these cancers.

Antibiotics had been prescribed to 70% (20,278) of patients with bowel and rectal cancers and to 68.5% (93,862) of those without. Nearly six out of 10 study participants had been prescribed more than one class of antibiotic.
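The exposure percentages reported above can be reproduced directly from the stated case and control counts (a quick arithmetic check on the article's figures, not part of the study's analysis):

```python
# Recompute the antibiotic-exposure rates from the counts in the text.
cases_exposed, cases_total = 20_278, 28_930        # patients with bowel/rectal cancer
controls_exposed, controls_total = 93_862, 137_077 # matched cancer-free patients

case_rate = 100 * cases_exposed / cases_total
control_rate = 100 * controls_exposed / controls_total
print(f"{case_rate:.1f}%")     # 70.1%
print(f"{control_rate:.1f}%")  # 68.5%
```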

Those with bowel cancer were more likely to have been prescribed antibiotics: 71.5% vs 69%. Exposure levels were comparable among those who developed rectal cancer (67%).

The association between bowel cancer and antibiotic use was evident among patients who had taken these drugs more than 10 years before their cancer was diagnosed.

Patients who developed bowel cancer were more likely than patients without cancer to have been prescribed antibiotics targeting anaerobes (bacteria that don't need oxygen) as well as those targeting aerobes (bacteria that do).

But patients with rectal cancer were less likely to have been prescribed antibiotics targeting aerobic bacteria.

Cancer site was also associated with antibiotic use. Cancer of the proximal colon--the first and middle parts of the bowel--was associated with the use of antibiotics targeting anaerobes, when compared to people without cancer.

But antibiotic use was not associated with cancer of the distal colon--the last part of the bowel.

After taking account of potentially influential factors, such as excess weight, smoking, and moderate to heavy drinking, cumulative use of antibiotics for a relatively short period (16+ days) was associated with a heightened risk of bowel cancer, with the impact strongest for cancers of the proximal colon.

The reverse was true for rectal cancers, where antibiotic use exceeding 60 days was associated with a 15% lower risk compared with no use.

When the analysis was restricted to patients who had been prescribed only one class of antibiotic, as opposed to none, penicillins were consistently associated with a heightened risk of bowel cancer of the proximal colon. Ampicillin/amoxicillin was the penicillin most commonly prescribed to these patients.

By contrast, the lower risk of rectal cancer was associated with prescriptions of tetracyclines.

This is an observational study, and as such, can't establish cause, and the researchers weren't able to capture potentially influential lifestyle factors for all the participants, nor hospital treatment, which may have affected overall cancer risk.

Nevertheless, their findings suggest "substantial" variation in the size and pattern of antibiotic effects along the length of the bowel, they say, concluding: "Whether antibiotic exposure is causal or contributory to colon cancer risk, our results highlight the importance of judicious antibiotic use by clinicians."

Credit: 
BMJ Group

Laser printing technology: Creating the perfect bioprinter

Scientists from Russia, China, and the US have drawn the attention of the scientific community to one of the newest and most promising areas in bioprinting - laser-induced forward transfer (LIFT). The researchers have compared laser printing parameters, bioink composition, donor ribbons, and collector substrates for LIFT bioprinters, as well as post-printing treatments of fabricated materials - all of this may affect the properties of printed tissues and organs. The study will help scientists select the most appropriate techniques and materials, avoid many pitfalls in the process of bioprinting, and set the priorities for the development of this technology in the coming years. The details of the analysis were published in Bioprinting.

Tissue-engineering materials are increasingly used in medicine, mainly because they are created through mimicking the natural environment for cell development. The use of cell carriers (scaffolds) is a step forward compared to traditional cell therapy, which employs stem cells on their own. Bioprinting technologies make it possible to recreate tissues or organ models ("organs-on-chips") through layer-by-layer deposition of cells and biomolecules such as drugs or growth factors (compounds regulating cell growth and development) on a three-dimensional support structure.

LIFT technology transfers cells and biomolecules using laser pulse energy. The laser beam of a LIFT bioprinter focuses on the donor ribbon - a glass slide coated with an energy absorbing material (e.g. metal) and a layer of bioink (hydrogel with cells and biomolecules). Where the laser beam hits the surface, it heats and evaporates the energy absorbing layer, generating a gas bubble that propels a jet from the hydrogel layer. The resulting jet lands on another glass slide, the collector substrate, depositing a droplet.

LIFT technology provides a high print speed and cell survival rate, precise transfer of cells or molecules, and can handle various objects, including microorganisms and whole cell structures such as spheroids. However, each hydrogel-cell combination requires calculation of specific laser transfer parameters.

The authors of the paper analyzed 33 studies of bioprinting using LIFT. They systematically analyzed the descriptions of laser sources, energy absorbing materials, donor ribbons, and collector substrates, and compared the objectives and outcomes of the studies.

The most commonly used laser wavelengths were 193 and 1064 nanometers (short ultraviolet and near infrared ranges, respectively), although much longer and shorter wavelengths were also used successfully. Gold, titanium, gelatin, and gelatin-containing mixtures served as energy absorbing materials, while researchers in five studies did not use this layer at all.

Most of the studies used murine fibroblasts (connective tissue cells that synthesize extracellular matrix proteins) or mesenchymal stromal cells (cells that can differentiate into various connective tissue cells). The choice depended on cell availability.

The bioink used by many research teams contained glycerol and methylcellulose to help the bioink retain moisture, or blood plasma to support cell growth. Another common component was hyaluronic acid, which improved bioink viscosity as well as promoting cell growth. One of the best bioink materials was collagen, the main component of connective tissue. In some studies, the bioink also formed a "functional pair" with the collector substrate: for instance, if the donor ribbon was alginate-based, then the collector substrate contained calcium ions, while fibrinogen-containing donor ribbons were used with collector substrates containing thrombin. Such "functional pairs" help maintain the shape of the printed constructs, because the substances in the collector substrate act as bioink fixatives.

The studies also used different types of printing: 2D, whereby the cells were arranged in a single layer (the researchers printed lines, shapes, letters, numbers, or the Olympic flag), or 3D, which can recreate complex cellular structures such as stem cell niches. Three-dimensional structures were created by depositing the bioink layer by layer.

The authors of the studies used various techniques to assess the impact of the bioprinting process on cells. Most researchers note that cell viability was fairly high, and there was no damage to the DNA despite the mechanical impact and the spike in temperature. There were no changes in either the proliferation rate of cells or the ability of stem cells to differentiate (transform into more specialized cells). In some of the studies, printed tissues were implanted into laboratory animals. The authors of the review believe that with the improvement of this technology in the next few years, there will be more studies involving animals.

"LIFT technology is quite new, and is only beginning to 'conquer' the world of biomedicine. Naturally, it will be improved and further used in tissue engineering, possibly even in clinical practice. In my opinion, however, its most promising application is in combination with other technologies, which will allow to create tissues and organs for transplantation", says Peter Timashev, one of the paper's authors, Director of the Institute for Regenerative Medicine, Sechenov University.

Credit: 
Sechenov University

Painting a bigger biosociological picture of chronic pain

An integrated approach that unifies psychosocial factors with neurobiology sheds light on chronic pain traits and their underlying brain networks, according to a study published August 20 in the open-access journal PLOS Biology by A. Vania Apkarian of Northwestern University Feinberg School of Medicine, and colleagues.

Unraveling the mechanisms of chronic pain remains a major scientific challenge. Psychological and personality factors, socioeconomic status, and brain properties all contribute to chronic pain but have mainly been studied independently. As a result, the relative influence of these factors on each other, as well as their independent contribution to the state of chronic pain, remain unknown.

To address this gap in knowledge, Apkarian and colleagues analyzed psychological factors, personality, and socioeconomic status, and carried out functional magnetic resonance imaging (fMRI) brain scans, to begin to define a unified perspective of chronic pain. The authors administered a broad battery of questionnaires to patients with chronic back pain and collected repeated sessions of resting-state fMRI scans.

The questionnaire data revealed four independent dimensions that defined chronic pain traits, and two of these traits - "Pain-trait" and "Emote-trait" - were associated with back pain characteristics. While Pain-trait reflected pain catastrophizing and anxiety, Emote-trait reflected higher optimism, mindfulness capacities, lower neuroticism, and lower sensitivity to loss. These two traits were related to neurotraits - patterns of resting-state activity in distinct, distributed brain networks - which were stable across four fMRI sessions acquired over five weeks. Moreover, socioeconomic status was associated with chronic pain traits and their related brain networks, with higher income offering more protection. According to the authors, this integrated approach is a first step in providing metrics aimed at unifying the psychology and neurophysiology of chronic pain across diverse clinical conditions.

Credit: 
PLOS

Novel combination of drugs may overcome drug-resistant cancer cells

Cancer cells can adapt and develop resistance to chemotherapy drugs, making it difficult to eradicate tumors. A new study led by investigators from Brigham and Women's Hospital suggests that a combination of three drugs, including a new class of glucose-6-phosphate dehydrogenase inhibitors, could overcome cross-therapy resistance. The results of the study are published today in Science Signaling.

"We have only recently begun unravelling the full complexities of chemotherapy failure," said Aaron Goldman, PhD, an instructor of Medicine in the Brigham's Division of Bioengineering. "The drugs themselves are part of the problem in terms of where resistance is coming from. Resistance is not just intrinsic to cells."

The investigators used computational modeling, in vitro experiments, in vivo animal models, and ex vivo clinical explant models of human tumors to probe the metabolic processes underlying chemotherapy drug tolerance.

In accordance with the Warburg Effect -- a widely accepted paradigm for drug resistance -- the investigators observed that the cancer cells took up extra glucose, putting glycolytic pathways into overdrive. But counter to the Warburg Effect, the researchers saw an increase in mitochondrial activity, indicating high levels of cellular oxygen consumption.

Using mathematical modeling, Goldman and his team found that a three-drug combination administered in a time-sensitive progression sensitized the cancer cells. Aside from this new class of drugs, combinations of clinically available drugs could also be used to combat resistance, Goldman said.

The researchers acknowledge that they do not yet have a clear understanding of the cancer cell plasticity that allows cells to gain new metabolic phenotypes and become drug resistant. In the future, the investigators hope to use mathematical modeling and machine learning to develop increasingly precise drug regimens to inform new cancer therapies.

"We're mathematically modeling biological frameworks that will allow us to predict drug sequences," Goldman said. "We're not just putting drugs together -- we're developing combinations that rationally address resistance."

Credit: 
Brigham and Women's Hospital

OHIO professor Hla develops robust molecular propeller for unidirectional rotations

image: A look at how the propeller is made up and the unidirectional rotations.

Image: 
Saw-Wai Hla

ATHENS, Ohio (Aug. 20, 2019) - A team of scientists from Ohio University, Argonne National Laboratory, Université de Toulouse in France and Nara Institute of Science and Technology in Japan led by OHIO Professor of Physics Saw-Wai Hla and Prof. Gwenael Rapenne from Toulouse developed a molecular propeller that enables unidirectional rotations on a material surface when energized.

In nature, molecular propellers are vital in many biological processes, from swimming bacteria to intracellular transport, but synthetic molecular propellers, like the one developed here, can operate in harsher environments and under precise control. This new development is a multi-component molecular propeller specially designed to operate on solid surfaces. The tiny propeller is composed of three components: a ratchet-shaped molecular gear as a base, a tri-blade propeller, and a ruthenium atom acting as an atomic ball bearing that connects the two. The propeller is only about 2 nanometers (nm) wide and 1 nm tall.

"What is special about our propeller is its multi-component design that becomes chiral on the gold crystal surface, i.e. it forms right- or left-tilted gears," said Hla. "This chirality dictates the rotational direction when energized."

Hla and his team have also been able to mechanically manipulate and record the molecule's stepwise rotations. This enables them to understand the detailed motions at the single-molecule level, allowing direct visualization of the rotation of individual molecular propellers from images acquired at each rotation step.

The rotation can be driven by an applied electric field, by transfer of electron energy, or by mechanical force from a scanning tunneling microscope tip. Through this supply of power, scientists can control the rotation and switch the propeller off by denying it any energy.

Although the molecular propeller was developed here to investigate the fundamentals of its operation, such molecular propellers may find potential applications ranging from catalysis to medicine.

Molecular machines have recently become a trending topic in nanotechnology, with interest in this area of research rising when the 2016 Nobel Prize in Chemistry was awarded for the "design and synthesis of molecular machines."

Credit: 
Ohio University