Cells carrying Parkinson's mutation could lead to new model for studying disease

MADISON -- Parkinson's disease researchers have used gene-editing tools to introduce the disorder's most common genetic mutation into marmoset monkey stem cells and to successfully tamp down cellular chemistry that often goes awry in Parkinson's patients.

The edited cells are a step toward studying the degenerative neurological disorder in a primate model, which has proven elusive. Parkinson's, which affects more than 10 million people worldwide, progressively degrades the nervous system, causing characteristic tremors, dangerous loss of muscle control, cardiac and gastrointestinal dysfunction and other issues.

"We know now how to insert a single mutation, a point mutation, into the marmoset stem cell," says Marina Emborg, professor of medical physics and leader of University of Wisconsin-Madison scientists who published their findings Feb. 26 in the journal Scientific Reports. "This is an exquisite model of Parkinson's. For testing therapies, this is the perfect platform."

The researchers used a version of the gene-editing technology CRISPR to change a single nucleotide -- one molecule among more than 2.8 billion pairs of them found in a common marmoset's DNA -- in the cells' genetic code and give them a mutation called G2019S.

In human Parkinson's patients, the mutation causes abnormal over-activity of an enzyme, a kinase called LRRK2, involved in a cell's metabolism. Other gene-editing studies have employed methods in which the cells produced both normal and mutated enzymes at the same time. The new study is the first to result in cells that make only enzymes with the G2019S mutation, which makes it easier to study what role this mutation plays in the disease.

"The metabolism inside our stem cells with the mutation was not as efficient as a normal cell, just as we see in Parkinson's," says Emborg, whose work is supported by the National Institutes of Health. "Our cells had a shorter life in a dish. And when they were exposed to oxidative stress, they were less resilient to that."

The mutated cells shared another shortcoming of Parkinson's: lackluster connections to other cells. Stem cells are an especially powerful research tool because they can develop into many different types of cells found throughout the body. When the researchers spurred their mutated stem cells to differentiate into neurons, they developed fewer branches to connect and communicate with neighboring neurons.

"We can see the impact of these mutations on the cells in the dish, and that gives us a glimpse of what we could see if we used the same genetic principles to introduce the mutation into a marmoset," says Jenna Kropp Schmidt, a Wisconsin National Primate Research Center scientist and co-author of the study. "A precisely genetically-modified monkey would allow us to monitor disease progression and test new therapeutics to affect the course of the disease."

The concept has applications in research beyond Parkinson's.

"We can use some of the same genetic techniques and apply it to create other primate models of human diseases," Schmidt says.

The researchers also used marmoset stem cells to test a genetic treatment for Parkinson's. They shortened part of a gene to block LRRK2 production, which made positive changes in cellular metabolism.

"We found no differences in viability between the cells with the truncated kinase and normal cells, which is a big thing. And when we made neurons from these cells, we actually found an increased number of branches," Emborg says. "This kinase gene target is a good candidate to explore as a potential Parkinson's therapy."

Credit: 
University of Wisconsin-Madison

Newly identified cellular trash removal program helps create new neurons

image: An immunofluorescence image of a region of the brain called the dentate gyrus, within the hippocampus, one of the two places in a rodent brain where neural stem cells reside. Many neural stem cells are visible, labeled by vimentin (red) and by Sox2, a marker for self-renewing cells (cyan).

Image: 
Image by Christopher Morrow

MADISON - New research by University of Wisconsin-Madison scientists reveals how a cellular filament helps neural stem cells clear damaged and clumped proteins, an important step in eventually producing new neurons.

The work provides a new cellular target for interventions that could boost neuron production when it's needed most, such as after brain injuries. And because clumping proteins are a hallmark of many neurodegenerative diseases, like Alzheimer's, the new study could provide insight into how these toxic proteins can be cleared away.

Assistant Professor of Neuroscience Darcie Moore led the work with her graduate student Christopher Morrow. Their study is available online in the journal Cell Stem Cell.

"As a long-term goal, we would love to be able to induce endogenous neural stem cells to help regenerate the tissue, especially after a stroke or some type of neurodegeneration," says Morrow.

In a mouse model, the team identified a cellular filament known as vimentin as a key component of neural stem cells' protein-management system. They found that vimentin brings proteasomes -- molecular garbage disposals that can digest targeted proteins -- to clumps of damaged proteins that must be removed for cells to function properly. Neural stem cells accumulate damaged proteins during the aging process, or when they are dormant or exposed to toxic chemicals.

When neural stem cells lacked vimentin, they were worse at clearing away targeted proteins, came out of dormancy more slowly and were less able to recover from being exposed to protein-damaging toxins. Mice unable to produce vimentin showed a reduced ability to produce new neurons from stem cells at a younger age than normal mice, suggesting that vimentin is important for keeping neural stem cells spry and productive during aging.

Textbooks used to teach that adult mammals didn't produce new neurons. Not so, says Morrow.

"Recent evidence suggests that neural stem cells are present in adult mammals, they're just not entering the cell cycle and dividing. And we also now know that a critical component of a neural stem cell entering into the cell cycle is clearing away proteins," says Morrow. "We're describing a program that neural stem cells have for clearing protein rapidly and efficiently and entering the cell cycle to undergo neurogenesis."

That program involves tagging damaged proteins, concentrating them in one spot in the cell, and then transporting digesters to that spot to break down the damaged proteins. To study what role vimentin plays in this program, he tagged the filament protein with a fluorescent marker and also studied mice unable to produce vimentin.

He saw that while neural stem cells could still tag and concentrate damaged proteins without vimentin, they needed this filament protein to bring proteasomes to the right place to clear all the old proteins away. With a reduced ability to dispose of accumulated proteins, neural stem cells were worse at coming out of dormancy and producing new neurons in mice.

It's a surprising revival for the role of vimentin, which scientists long assumed to be largely limited to helping cells move around and providing structural support for the cell. Twenty years ago, researchers developed mice unable to make vimentin -- and they seemed fine. But now it's becoming clear that vimentin is important for responding to challenging situations, such as aging or toxins, that threaten to gum up cells with clumped proteins.

Mutations in vimentin have been linked to diseases in humans, including cataracts and, in some cases, accelerated aging. And cancer cells rely on vimentin when they start metastasizing. More research is required to determine how vimentin affects cellular health, aging and disease in humans and other animals.

"In addition to focusing on neural stem cells as a path toward regenerative therapies, an obvious next step is to investigate how vimentin plays in a role in other diseases like cancer," says Moore. "This study gives us a lot to follow up on."

Credit: 
University of Wisconsin-Madison

Could new discovery play a role in diagnosing Alzheimer's earlier?

Scientists have detected that a previously overlooked gene behavior could potentially lead to a new way to diagnose Alzheimer's earlier.

Published in the journal Epigenetics, an international research team's findings - discovered in mice and confirmed in human samples - suggest that the gene Presenilin1 (PSEN1) should be monitored as a 'biomarker': to see how environmental triggers, such as lifestyle and nutrition, influence brain function and neurodegeneration, and/or to see how well the body responds to a treatment for the disease.

Led by Professor Andrea Fuso at the Sapienza University of Rome, the study is the first to observe that altered methylation (a chemical modification that can change a gene's activity without changing the underlying DNA sequence) of the gene PSEN1 is a common feature of Alzheimer's.

The results of the study appear to show that PSEN1, which was already known to behave differently in people with Alzheimer's, may have been dismissed in previous studies because of the methods used to investigate DNA methylation.

Comparing results from mouse models and humans has limitations; for example, mouse stages of development and neurodegeneration do not correspond precisely to those of human aging. The team also notes that in this study the blood and brain samples were obtained from different subjects. They suggest that future studies should analyse DNA from the same individuals, and in a larger cohort, in order to validate this potential biomarker.

However, Professor Fuso, from the Department of Experimental Medicine, at Sapienza University of Rome, states that the new results do offer "an exciting new area of investigation".

"We've detected an early sign of the disease in a DNA modification, or epigenetic marker, that was previously overlooked, and that could even provide a starting point for developing new therapies, as well as earlier diagnosis" he added.

Worldwide, nearly 50 million people have Alzheimer's or related dementia. Yet only one in four people with Alzheimer's disease have been diagnosed.

The earlier Alzheimer's can be detected, the better the chance of using treatment to delay the onset of severe dementia. Epigenetic alterations to genes, induced by environmental triggers such as lifestyle and nutrition, can influence brain function and neurodegeneration. Evidence from animal models has shown that changes to the regulation of the PSEN1 gene are associated with Alzheimer's-like pathology, but only a handful of studies have investigated DNA modification of the gene in humans.

For the current study, the authors analysed patterns of DNA modification that affect the expression of the PSEN1 gene during brain development and during the progression of Alzheimer's in mice. They checked the results in humans by analysing post-mortem human brain tissue from Alzheimer's patients and from prenatal and postnatal babies and adolescents. To see whether changes to DNA methylation could be detected in human blood, they analysed blood samples from 20 patients with late-onset Alzheimer's disease, comparing the results to 20 healthy controls.

In Alzheimer's-prone mice of both sexes, they found that the PSEN1 gene was overexpressed. In adult female mice only, this overexpression was associated with lower DNA methylation. The results from post-mortem human brain tissue found upregulation of the PSEN1 gene in Alzheimer's patients. In both sexes, there was a significant inverse relationship between the extent of gene expression and DNA methylation. The fact that sex-specific differences were not found in human tissue could be due to the relatively small sample size.
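The inverse relationship between gene expression and DNA methylation described above is the kind of pattern a correlation test makes concrete. A minimal sketch, using invented numbers purely for illustration (the study's actual data are not reproduced here):

```python
# Toy illustration (hypothetical values, not the study's data):
# an inverse relationship between PSEN1 methylation and expression
# shows up as a strongly negative Pearson correlation.

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-sample values: % methylation vs. relative expression.
methylation = [72, 65, 58, 50, 44, 38, 30]
expression  = [1.0, 1.2, 1.5, 1.8, 2.1, 2.5, 3.0]

r = pearson_r(methylation, expression)
print(round(r, 2))  # close to -1: lower methylation tracks higher expression
```

A real analysis would also report a p-value and control for covariates such as age and sex, as the study's sex-specific findings suggest.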

"Differences between the sexes in DNA modifications would be extremely interesting to researchers working to better understand Alzheimer's disease and to develop new therapies," says Professor Fuso.

The analysis of blood samples was able to detect lower PSEN1-related DNA methylation in Alzheimer's patients compared to controls. The difference was significant, although not as large as in brain samples. Because lower methylation is detectable in blood and is associated with higher expression of PSEN1, it could offer an earlier and less invasive route to diagnosing Alzheimer's than sampling brain tissue.

Professor Fuso concludes: "Our results offer an exciting new area of investigation, deploying the methods we used to study DNA methylation so that modifications won't be missed. If found to be causal, our findings would provide a starting point for developing epigenetic therapies."

Credit: 
Taylor & Francis Group

How door-to-door canvassing slowed an epidemic

Liberia was the epicenter of a high-profile Ebola outbreak in 2014-15, which led to more than 10,000 deaths in West Africa. But for all the devastation the illness caused, it could have been worse without an innovative, volunteer-based outreach program Liberia's government deployed in late 2014.

Now, a study co-authored by an MIT professor shows how much that program, consisting of door-to-door canvassing by community volunteers, spread valuable information and changed public practices during the epidemic. The findings also demonstrate how countries with minimal resources can both fight back against epidemics and gain public trust in difficult circumstances.

"Mediated [volunteer-based] government outreach had a positive impact on all of the [health] outcomes we measured," says Lily Tsai, a professor of political science at MIT and co-author of a new paper detailing the study's findings. "People knew more [about Ebola], had a more factual understanding of the epidemic, and were more willing to comply with government control measures. And downstream, they're more likely to trust government institutions."

Indeed, after talking to canvassers, residents of Monrovia, Liberia's capital, were 15 percentage points more supportive of disease control policies, 10 percentage points less likely to violate a ban on public gatherings (to limit the spread of Ebola), 26 percentage points more likely to support victims' burials by government workers, and 9 percentage points more likely to trust Liberia's Ministry of Health, among other outcomes. They were also 10 percentage points more likely to use hand sanitizer.
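A note on the unit used above: a "percentage point" is the arithmetic difference between two groups' rates, not a relative (percent) change. A toy calculation with invented survey counts, not the study's data:

```python
# Illustrative only: computing a percentage-point difference between
# a canvassed group and a comparison group (made-up counts).

def pct_point_diff(treated_yes, treated_n, control_yes, control_n):
    """Difference between two groups' rates, in percentage points."""
    return 100 * (treated_yes / treated_n - control_yes / control_n)

# 65% support among canvassed residents vs. 50% among others
# is a +15 percentage-point effect (a 30% relative increase).
print(round(pct_point_diff(130, 200, 100, 200), 1))  # 15.0
```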

Intriguingly, the volunteer-based outreach program succeeded after an earlier 2014 campaign, using Ministry of Health staff, was abandoned, having been "met with disbelief and outright violence," as the new paper states.

"There's often an assumption that government outreach doesn't work," says Tsai, the Ford Professor of Political Science at MIT. "What we find is that it does work, but it really matters how that government outreach is conducted and structured."

The research shows that, crucially, 30 percent of the people who spoke with canvassers already knew those volunteers, adding a layer of social trust to the program. And all volunteers canvassed in communities where they lived.

"They were building interpersonal trust and enabling people to hold them accountable for any misinformation," Tsai says. "They were like guarantors for a loan. It's a way of saying, 'You can trust me. I'm going to co-sign for the government. I'm going to guarantee it.'"

The paper, "Building Credibility and Cooperation in Low-Trust Settings: Persuasion and Source Accountability in Liberia During the 2014-2015 Ebola Crisis," appears in advance online form in the journal Comparative Political Studies.

In addition to Tsai, the authors are Benjamin S. Morse PhD '19, a senior training manager and researcher at MIT's Abdul Latif Jameel Poverty Action Lab (J-PAL), and Robert A. Blair, an assistant professor of political science and international and public affairs at Brown University.

When "costly signals" build confidence

Liberia faced many challenges while responding to the Ebola crisis. The nation's brutal civil wars, from 1989 to 2003, stripped away much of the government's functionality, and while the country has since taken major steps toward stability, there is still deep and widespread suspicion about government.

"In Liberia, you have a postconflict setting where citizens already mistrusted the government strongly," Tsai explains. "When citizens say they don't trust the government, they sometimes think the government is actually out to hurt them, physically."

To conduct the study, the researchers ran multiple public-opinion surveys in Liberia in 2014 and 2015, and added 80 in-depth interviews with government leaders and residents in 40 randomly sampled communities in Monrovia.

To be sure, Ebola was a substantial problem in Liberia. Overall, there were 10,678 reported cases of Ebola and 4,810 deaths attributed to the illness. In June 2014, the surveys showed, 38 percent of Monrovia residents thought the government's statements about Ebola constituted a "lie" designed to generate more funding from outside aid groups.

However, the study found, once the volunteer-based program got underway, canvassers were able not only to reach large numbers of residents but also to persuade them to believe what they were saying.

While knocking on doors in their own communities, the canvassers were equipped with bibs and badges to identify themselves as program volunteers. They distributed information and had conversations with other residents, and even offered their own contact information to people -- a significant (and potentially risky) gesture providing a form of accountability to other citizens.

"A large part of what worked was that the outreach workers made it possible for the people that they were canvassing to track them down," Tsai says. "That's a pretty big commitment, what we call a 'costly signal.' Costly signals help build trust, because it's not cheap talk."

Ultimately, while Ebola took a significant toll in Liberia, the volunteer campaign was "remarkably (and surprisingly) effective" in changing both behavior and attitudes, the paper concludes.

A case study in rebuilding trust?

Tsai believes that beyond the specific contours of Liberia's Ebola response, there are larger issues that can be applied to the study of other countries. For one, while Liberia received significant aid in combatting Ebola from the World Health Organization and other nongovernmental organizations, she thinks the need for short-term aid should not preclude the long-term building of government capacity.

"In the short term, it can make sense for external actors to substitute for the government," Tsai says. "In the medium and long term we need to think about what that substitution might do to the trust and confidence that people have in their government." For many people, she adds, "the assumption is the government either isn't capable of doing it, or shouldn't be doing it," when in fact even under resourced governments can make progress on serious issues.

Another point is that the Liberia case shows some ways governments can build confidence among their citizens.

"In so many countries these days, trust in institutions, trust in authorities, trust in sources of information is so low, and in the past there's been very little research on how to rebuild trust," Tsai notes. "There's a lot of research on what lowers trust."

However, she adds, "That's what I think is special about this case. Trust was successfully built and constructed under a pretty unlikely set of circumstances."

Credit: 
Massachusetts Institute of Technology

Study: Corporate tax incentives do more harm than good to states

A study of tax incentives aimed at attracting and retaining businesses finds that the vast majority of these incentives ultimately leave states worse off than if they had done nothing.

For the study, researchers at North Carolina State University examined data from 32 states from 1990 to 2015. The researchers evaluated all of the state and local tax incentives available in the 32 states, as well as an array of economic, political, governmental and demographic data. A computational model assessed the extent to which the effects of attracting or retaining businesses in a state offset the state's related tax incentives.

"We found that, in almost all instances, these corporate tax incentives cost states millions of dollars - if not more - and the returns were minimal," says Bruce McDonald, an associate professor of public administration at NC State and corresponding author of the study. "In fact, the combination of costly tax incentives and limited returns ultimately left states in worse financial condition than they were to begin with."

The two exceptions to the finding were job creation tax credits and job training grants.

"In both cases, the cost of the incentives was more than offset by tax revenue created by new jobs or by previously underemployed people finding higher-paying work," McDonald says.

"The takeaway message here is that maybe states shouldn't be offering these tax incentives. Or, at the very least, states need to examine their assumptions about the impact these incentives actually have, with the exception of incentives explicitly tied to job creation and training."

The 32 states included in the study account for about 90% of state and local tax incentives nationally. The states were Alabama, Arizona, California, Colorado, Connecticut, Florida, Georgia, Illinois, Indiana, Iowa, Kentucky, Louisiana, Maryland, Massachusetts, Michigan, Minnesota, Missouri, Nebraska, Nevada, New Jersey, New Mexico, New York, North Carolina, Ohio, Oregon, Pennsylvania, South Carolina, Tennessee, Texas, Virginia, Washington and Wisconsin.

Credit: 
North Carolina State University

Researchers combine advanced spectroscopy technique with video-rate imaging

image: The researchers used their new direct hyperspectral dual-comb imaging approach to acquire hyperspectral images of ammonia gas escaping from a bottle. The left image shows a photograph of the scene while the right image shows a map of ammonia transmittance extracted from a single interferogram. The inset shows the spectral response measured by the system at a particular pixel.

Image: 
Pedro Martín-Mateos, Universidad Carlos III de Madrid

WASHINGTON -- For the first time, researchers have used an advanced analytical technique known as dual-comb spectroscopy to rapidly acquire extremely detailed hyperspectral images. By acquiring a full spectrum of information for each pixel in a scene with high sensitivity and speed, the new approach could greatly advance a wide range of scientific and industrial applications such as chemical analysis and biomedical sensing.

"Dual-comb spectroscopy has revolutionized optical spectroscopy by providing unmatched spectral resolution and accuracy as well as short acquisition times without moving parts," said research team leader Pedro Martín-Mateos from Universidad Carlos III de Madrid, in Spain. "Our new direct hyperspectral dual-comb imaging approach will make it possible to expand most of the point-detection capabilities of current dual-comb systems to create a spectral image of an entire scene."

Dual-comb spectroscopy uses two optical sources, known as optical frequency combs, that emit a spectrum of colors - or frequencies - that are perfectly spaced like the teeth on a comb. As reported in Optica, The Optical Society's journal for high impact research, this is the first time that a dual-comb spectrum has been directly detected using a video camera.

"We demonstrate spectral interrogation of a 2D object in just one second, more than three orders of magnitude faster than previous demonstrations," said Martín-Mateos. "This fast acquisition time enables dual-comb hyperspectral imaging of fast or dynamic processes, which wasn't possible before."

Although the work was performed using near-infrared wavelengths, the researchers say that the concept can be easily transferred to a variety of spectral regions, widening the number of possible applications.

In particular, expanding the approach to the terahertz and millimeter wave spectral regions would open many new opportunities for nondestructive testing and product inspection in the food, agricultural and pharmaceutical industries. In the mid-infrared and the near-infrared regions it could also enhance the performance of chemical imaging, 3D mapping and surface topography technologies.

Video-rate detection

Dual-comb spectrometers work by interfering light from two closely matched optical frequency combs. This mixing process generates a signal known as an interferogram at rates that are typically in the tens of megahertz (tens of millions of times per second), too fast to capture with even the fastest high-speed video cameras.

"We stretched the interferograms generated by our system up to a second to make it possible to detect the dual-comb interference signal using a video camera," explained Martín-Mateos. "This allows the spectral analysis of an entire scene, instead of just a point."

To do this the researchers built a system based on a very simple electro-optic dual-comb source made mostly of optical fiber components. The use of two acousto-optic modulators let them offset the optical combs by an arbitrarily low frequency, to create ultra-slow interferograms.
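The arithmetic behind that "ultra-slow" interferogram can be sketched in a few lines. The frequencies below are placeholders for illustration, not the paper's actual comb parameters:

```python
# Illustrative sketch: interfering two optical tones offset by a small
# frequency delta produces a beat envelope that oscillates at delta,
# so a smaller offset gives a slower ("stretched") interferogram.

def beat_period(f1_hz, f2_hz):
    """Period of the beat envelope produced by interfering two tones."""
    return 1.0 / abs(f2_hz - f1_hz)

# A 10 MHz offset beats every 100 ns -- far too fast for a camera.
print(beat_period(200e12, 200e12 + 10e6))  # prints 1e-07 (100 ns)

# Offsetting the combs by just 1 Hz (e.g. via acousto-optic modulators)
# slows the interferogram to one cycle per second -- camera-friendly.
print(beat_period(200e12, 200e12 + 1.0))   # prints 1.0 (one second)
```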

The researchers used the new method to acquire hyperspectral images of ammonia gas escaping from a bottle. They achieved an optical resolution of 1 GHz (about 0.033 cm-1) at video rates of 25 frames per second, with each frame containing 327,680 individual spectral measurements.

"This enables us, for example, to easily identify and distinguish between different gases. The resolution demonstrated in this first experimental demonstration is two orders of magnitude better than that of current commercial equipment.

"Simplicity is one of the main strengths of the system," said Martín-Mateos. "It worked flawlessly and could be implemented in any optics laboratory."

The work is part of a larger project funded by the ATTRACT initiative (Horizon 2020), which aims to develop a fast hyperspectral imaging system that uses the terahertz region of the electromagnetic spectrum for inspection, quality control and classification of agricultural and food products. The researchers are now working to develop a terahertz dual-comb source to demonstrate the method in this spectral region.

Credit: 
Optica

Learning difficulties due to poor connectivity, not specific brain regions

image: Brain map showing examples of networks and hubs.

Image: 
Roma Siugzdaite

Different learning difficulties do not correspond to specific regions of the brain, as previously thought, say researchers at the University of Cambridge. Instead, poor connectivity between 'hubs' within the brain is much more strongly related to children's difficulties.

Between 14% and 30% of children and adolescents worldwide have learning difficulties severe enough to require additional support. These difficulties are often associated with cognitive and/or behavioural problems. In some cases, children who are struggling at school receive a formal diagnosis of a specific learning difficulty or disability, such as dyslexia, dyscalculia or developmental language disorder, or of a developmental disorder such as attention deficit and hyperactivity disorder (ADHD), dyspraxia, or autism spectrum disorder.

Scientists have struggled to identify specific areas of the brain that might give rise to these difficulties, with studies implicating myriad brain regions. ADHD, for example, has been linked to the anterior cingulate cortex, caudate nucleus, pallidum, striatum, cerebellum, prefrontal cortex, the premotor cortex and most parts of the parietal lobe.

One potential explanation is that each diagnosis differs so much between one individual and the next, that each involves different combinations of brain regions. However, a more provocative explanation has been proposed by a team of scientists at the MRC Cognition and Brain Sciences Unit, University of Cambridge: there are, in fact, no specific brain areas that cause these difficulties.

To test their hypothesis, the researchers used machine learning to map the brain differences across a group of 479 children, 337 of whom had been referred with learning-related cognitive problems and 142 of whom came from a comparison sample. The algorithm interpreted data taken from a large battery of cognitive, learning, and behavioural measures, as well as from brain scans taken using magnetic resonance imaging (MRI). The results are published today in Current Biology.

The researchers found that the brain differences did not map onto any labels the children had been given - in other words, there were no brain regions that predicted having ASD or ADHD, for example. More surprisingly, they found that the different brain regions did not even predict specific cognitive difficulties - there was no specific brain deficit for language problems or memory difficulties, for example.

Instead, the team found that the children's brains were organised around hubs, like an efficient traffic system or social network. Children who had well-connected brain hubs had either very specific cognitive difficulties, such as poor listening skills, or had no cognitive difficulties at all. By contrast, children with poorly connected hubs - like a train station with few or poor connections - had widespread and severe cognitive problems.
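The "train station" analogy can be made concrete with a toy network. The graph below is invented for illustration and is unrelated to the study's imaging data; it simply shows that removing a well-connected hub fragments the network, while the hub's presence keeps every node reachable:

```python
# Toy sketch of hub connectivity (invented graph, not the study's data).
from collections import deque

graph = {
    "hub": {"a", "b", "c", "d"},
    "a": {"hub", "b"},
    "b": {"hub", "a"},
    "c": {"hub", "d"},
    "d": {"hub", "c"},
}

def reachable_from(g, start, removed=frozenset()):
    """Nodes reachable from start by breadth-first search,
    ignoring any nodes in the removed set."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nbr in g[node]:
            if nbr not in seen and nbr not in removed:
                seen.add(nbr)
                queue.append(nbr)
    return seen

# With the hub intact, every node can reach every other node.
print(len(reachable_from(graph, "a")))                   # 5
# Remove the hub and the {a,b} and {c,d} groups disconnect.
print(len(reachable_from(graph, "a", removed={"hub"})))  # 2
```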

"Scientists have argued for decades that there are specific brain regions that predict having a particular learning disorder or difficulty, but we've shown that this isn't the case," said Dr Duncan Astle, senior author on the study. "In fact, it's much more important to consider how these brain areas are connected - specifically, whether they are connected via hubs. The severity of learning difficulties was strongly associated with the connectedness of these hubs, we think because these hubs play a key role in sharing information between brain areas."

Dr Astle said that one implication of their work is that it suggests that interventions should be less reliant on diagnostic labels.

"Receiving a diagnosis is important for families. It can give professional recognition for a child's difficulties and open the door to specialist support. But in terms of specific interventions, for example from the child's teachers, they can be a distraction.

"It's better to look at their areas of cognitive difficulties and how these can be supported, for example using specific interventions to improve listening skills or language competencies, or at interventions that would be good for the whole class, like how to how to reduce working memory demands during learning."

The findings may explain why drug treatments have not proven effective for developmental disorders. Methylphenidate (Ritalin), for example, which is used to treat ADHD, appears to reduce hyperactivity, but does not remediate cognitive difficulties or improve educational progress. Drugs tend to target specific types of nerve cells, but would have little impact on a 'hub-based' organisation that has emerged over many years.

While this is the first time that hubs and their connections have been shown to play a key role in learning difficulties and developmental disorders, their importance in brain disorders has become increasingly clear in recent years. Cambridge researchers have previously shown that they also play an important role in mental health disorders that begin to emerge during adolescence, such as schizophrenia.

Credit: 
University of Cambridge

A common gut microbe secretes a carcinogen

video: Cayetano Pleguezuelos-Manzano, Jens Puschhof and Axel Rosendahl Huber explain their research on genotoxic E. coli bacteria

Image: 
DEMCON | nymus 3D and Melanie Fremery, ©Hubrecht Institute

Cancer mutations can be caused by common gut bacteria carried by many people. This was demonstrated by researchers from the Hubrecht Institute (KNAW) and Princess Máxima Center in Utrecht, the Netherlands. By exposing cultured human mini-guts to a particular strain of Escherichia coli bacteria, they uncovered that these bacteria induce a unique pattern of mutations in the DNA of human cells. This mutation pattern was also found in the DNA of patients with colon cancer, implying that these mutations were induced by the 'bad' bacteria. It is the first time that researchers have established a direct link between the microbes inhabiting our bodies and the genetic alterations that drive cancer development. This finding may pave the way to prevention of colorectal cancer by pursuing the eradication of harmful bacteria. The results of this research were published in Nature on the 27th of February.

Our body contains at least as many bacterial as human cells. Most of these microbes contribute to a healthy life, while others may cause diseases. Among the bacteria with potentially harmful consequences is a strain of the best-known gut bacterium: Escherichia coli (E. coli). This particular E. coli strain is "genotoxic": it produces a small chemical, called "colibactin", which can damage the DNA of human cells. It has therefore long been suspected that the genotoxic E. coli, which live in the intestines of 1 out of 5 adults, could be harmful to their human hosts. "There are probiotics currently on the market that contain genotoxic strains of E. coli. Some of these probiotics are also used in clinical trials as we speak" explains Hans Clevers (Hubrecht Institute). "These E. coli strains should be critically re-evaluated in the lab. Though they may provide relief for some bodily discomfort in the short term, these probiotics could lead to cancer decades after the treatment".

Damage in the dish

Cancer cells are driven by specific DNA mutations, which allow these cells to grow into a tumor. Exposure to UV light or smoking can directly cause DNA damage, which induces mutations and thus increases the chance that normal cells transform into cancers. But until now, it was unknown that the bacteria in our gut can similarly induce cancer mutations in cells through their DNA damaging effects.

A team of three PhD students from the groups of Hans Clevers (Hubrecht Institute) and Ruben van Boxtel (Princess Máxima Center for pediatric oncology) set out to identify the damaging effects of colibactin on human DNA. For this, they used tiny lab-grown human intestines, so-called organoids, a model system that was previously developed in the group of Hans Clevers. The team developed a method to expose healthy human intestinal organoids to the genotoxic E. coli bacteria. After five months of bacterial exposure, they sequenced the DNA of the human cells and studied the number and types of mutations caused by the bacteria.

A tell-tale footprint

Each process that can cause DNA damage leaves behind a specific mutation pattern, which is called a mutational footprint or signature. Such specific signatures have already been identified for various cancer-causing agents, including tobacco smoke and UV light. Presence of these specific footprints in the DNA of cancers can tell us about past exposures, which may underlie disease initiation. "These signatures can have great value in determining causes of cancer and may even direct treatment strategies", explains Van Boxtel. "We can identify such mutational footprints in several forms of cancer, also in pediatric cancer. This time we wondered if the genotoxic bacteria also leave their unique distinguishing mark in the DNA."

"I remember the excitement when the first signatures appeared on the computer screen" says Axel Rosendahl Huber, "we had hoped for some indication of a signature that we could follow up on in other experiments, but the patterns were more striking than any signature we had analyzed before."

A puzzle falling into place

The genotoxic bacteria caused two co-occurring mutational patterns in the DNA of the organoids: the change of an A into any of the other three letters of the DNA code, and the loss of a single A in long stretches of A's. In both cases, another A was present on the opposite strand of the DNA double helix, 3 to 4 bases away from the mutated site.
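As a toy illustration of this motif (not the authors' actual analysis pipeline; the function name and example sequence are hypothetical), a short script can check whether a substitution site fits the colibactin-like pattern of a mutated A with another A on the opposite strand 3 to 4 bases away. Since an A on the opposite strand pairs with a T on the same strand, the check looks for a nearby T:

```python
# Toy sketch: test whether a single-base substitution site matches the
# colibactin-like motif described above. Coordinates are 0-based.

def matches_colibactin_motif(reference: str, pos: int) -> bool:
    """True if the base at `pos` is an A with a T 3-4 bases away
    (i.e. an A on the opposite strand at that distance)."""
    if reference[pos] != "A":
        return False
    for offset in (3, 4, -3, -4):
        j = pos + offset
        if 0 <= j < len(reference) and reference[j] == "T":
            return True
    return False

ref = "AGGTCGA"
print(matches_colibactin_motif(ref, 0))  # True: A with a T 3 bases downstream
print(matches_colibactin_motif(ref, 6))  # True: A with a T 3 bases upstream
print(matches_colibactin_motif(ref, 1))  # False: site is a G, not an A
```

In a real signature analysis the surrounding sequence context of every mutation in the genome would be tallied in this way and compared against the expected background.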

The team was wondering if they could learn something about the mechanism of colibactin-induced DNA damage from their data. "While we were at the final stage of the project, different research teams identified the structure of colibactin and how it interacts with the DNA", says Cayetano Pleguezuelos-Manzano. Their research revealed colibactin's ability to bind two A's at the same time and cross-link these. "It was like a puzzle falling into place. The mutational patterns that we saw in our experiments could very well be explained by colibactin's chemical structure".

From organoid to patient

Once they established the footprint of genotoxic E. coli, the researchers set out to find traces of it in the DNA of cancer patients. They analyzed mutations in more than 5,000 tumors, covering dozens of different cancer types. Among these, one type stood out: "More than 5% of colorectal cancers had high levels of the footprint, while we only saw it in less than 0.1% of all other cancers," recalls Jens Puschhof. "Imagine studying a gut bacterium's footprint for months in a dish, and then finding back the same footprint in the DNA of patients." Only a few other cancers known to be exposed to the bacteria, such as cancers in the oral cavity and the bladder, also had the footprint. "It is known that E. coli can infect these organs, and we are keen to explore if its genotoxicity may act in other organs beyond the colon. The signature we defined experimentally helps us with this".

An early warning

This study may have direct implications for human health. Individuals may be screened for the presence of these genotoxic bacteria; it is reported that 10-20 percent of people can harbor the 'bad' version of E. coli in their intestines. Antibiotic treatment could eradicate these bacteria early on. In the future it may be possible to catch colorectal cancer development very early or to even prevent tumors from developing.

Credit: 
Hubrecht Institute

A molecular atlas of skin cells

image: Maria Kasper, researcher at the Department of Biosciences and Nutrition, Karolinska Institutet.

Image: 
Anders Lindholm

Our skin protects us from physical injury, radiation and microbes, and at the same time produces hair and facilitates perspiration. Details of how skin cells manage such disparate tasks have so far remained elusive. Now, researchers at Karolinska Institutet in Sweden have systematically mapped skin cells and their genetic programs, creating a detailed molecular atlas of the skin in its complexity. The study is published today in the scientific journal Cell Stem Cell.

Mammalian skin has several important tasks to perform. It provides a waterproof protective barrier against the outside world, produces hair and harbours sweat glands. This tissue complexity requires many types of cells, such as fibroblasts, immune cells, nerve cells and pigment cells. To systematically study the skin, researchers at Karolinska Institutet have mapped the genes that are active in thousands of individual cells using a technique called single-cell RNA sequencing. Examining tissue from the skin and its hair-producing hair follicles at different stages of hair growth, the researchers uncovered how cells are coordinated during the phases of hair growth and rest.

"We found over 50 different kinds of cells in the skin, including new variations of cell types that have not been described before," says Maria Kasper, research group leader at the Department of Biosciences and Nutrition, Karolinska Institutet. "We've also seen that most types of skin cells are affected by different phases of hair growth".

As part of the study, the researchers described exactly where in the skin these cells are located and which genes they express. The authors have made this information available in an open-access online atlas, which helps others interested in specific genes to quickly find out if and where they are expressed. Conversely, researchers interested in specific cells can find out how gene expression changes during their task specification. The researchers behind this atlas believe that this information will be useful to other scientists studying for example skin diseases, wound healing or skin cancer.

By using their own atlas the authors have made several discoveries. For example, they have found that the outermost layer of the hair follicle consists of several types of cells organised in a specific way. They could also see how hair progenitors, stem cells that have begun specialising towards specific parts of the hair follicle, go through different molecular stages.

"This gives us vital knowledge on the flexibility of the skin, what the skin does to maintain its function and structure in different situations," says Simon Joost, first author and recent graduate from Maria Kasper's research group. "This knowledge may help us understand the flexibility of other organs, how they renew themselves and respond to different needs."

Credit: 
Karolinska Institutet

Astronomers detect biggest explosion in the history of the Universe

image: This extremely powerful eruption occurred in the Ophiuchus galaxy cluster, which is located about 390 million light-years from Earth. Galaxy clusters are the largest structures in the Universe held together by gravity, containing thousands of individual galaxies, dark matter, and hot gas.

Image: 
X-ray: NASA/CXC/Naval Research Lab/Giacintucci, S.; XMM:ESA/XMM; Radio: NCRA/TIFR/GMRTN; Infrared: 2MASS/UMass/IPAC-Caltech/NASA/NSF

Scientists studying a distant galaxy cluster have discovered the biggest explosion seen in the Universe since the Big Bang.

The blast came from a supermassive black hole at the centre of a galaxy hundreds of millions of light-years away.

It released five times more energy than the previous record holder.

Professor Melanie Johnston-Hollitt, from the Curtin University node of the International Centre for Radio Astronomy Research, said the event was extraordinarily energetic.

"We've seen outbursts in the centres of galaxies before but this one is really, really massive," she said.

"And we don't know why it's so big.

"But it happened very slowly--like an explosion in slow motion that took place over hundreds of millions of years."

The explosion occurred in the Ophiuchus galaxy cluster, about 390 million light-years from Earth.

It was so powerful it punched a cavity in the cluster plasma--the super-hot gas surrounding the black hole.

Lead author of the study Dr Simona Giacintucci, from the Naval Research Laboratory in the United States, said the blast was similar to the 1980 eruption of Mount St. Helens, which ripped the top off the mountain.

"The difference is that you could fit 15 Milky Way galaxies in a row into the crater this eruption punched into the cluster's hot gas," she said.

Professor Johnston-Hollitt said the cavity in the cluster plasma had been seen previously with X-ray telescopes.

But scientists initially dismissed the idea that it could have been caused by an energetic outburst, because it would have been too big.

"People were sceptical because of the size of the outburst," she said. "But it really is that big. The Universe is a weird place."

The researchers only realised what they had discovered when they looked at the Ophiuchus galaxy cluster with radio telescopes.

"The radio data fit inside the X-rays like a hand in a glove," said co-author Dr Maxim Markevitch, from NASA's Goddard Space Flight Center.

"This is the clincher that tells us an eruption of unprecedented size occurred here."

The discovery was made using four telescopes: NASA's Chandra X-ray Observatory, ESA's XMM-Newton, the Murchison Widefield Array (MWA) in Western Australia and the Giant Metrewave Radio Telescope (GMRT) in India.

Professor Johnston-Hollitt, who is the director of the MWA and an expert in galaxy clusters, likened the finding to discovering the first dinosaur bones.

"It's a bit like archaeology," she said.

"We've been given the tools to dig deeper with low frequency radio telescopes so we should be able to find more outbursts like this now."

The finding underscores the importance of studying the Universe at different wavelengths, Professor Johnston-Hollitt said.

"Going back and doing a multi-wavelength study has really made the difference here," she said.

Professor Johnston-Hollitt said the finding is likely to be the first of many.

"We made this discovery with Phase 1 of the MWA, when the telescope had 2048 antennas pointed towards the sky," she said.

"We're soon going to be gathering observations with 4096 antennas, which should be ten times more sensitive."

"I think that's pretty exciting."

Credit: 
International Centre for Radio Astronomy Research

Baldness gene discovery reveals origin of hairy alpine plants

image: This is a species of Alpine snapdragon with hairy leaves.

Image: 
Ying Tan

Scientists have solved a puzzle that has long baffled botanists - why some plants on high mountainsides are hairy while their low-lying cousins are bald.

Alpine species of snapdragon have evolved to disable a gene that prevents those living at low altitudes from growing hairs on their stalks and leaves, researchers say.

The small hairs may act like UV sunscreen to protect alpine plants growing in full sun on lofty, exposed cliffs, the team says. Low-lying plants might not need to make the hairs because of the relative abundance of shade in valleys.

These insights could aid the production of useful chemicals secreted by the hairs of some plants, scientists say, including the antimalarial drug artemisinin, and the chemicals that give herbs and hops their flavours.

Researchers from the University of Edinburgh identified the gene that controls hair production - which they named the Hairy gene - in snapdragons by breeding alpine and lowland species with each other.

They found that the gene is switched off in alpine plants. It is switched on in low-lying species, which causes baldness by blocking the activation of sections of DNA involved in hair production, the team says.

Their findings show that the first snapdragons - which grew around 12 million years ago - were bald, and that newer, alpine species evolved as a result of mutations that deactivated the gene.

Credit: 
University of Edinburgh

How cardiorespiratory function is related to genetics

image: Participants were also taken in a cable car to a high altitude laboratory at the top of Aiguille du Midi mountain in Chamonix in France

Image: 
Lancaster University

How high altitude affects people's breathing, and its coordination with the heartbeat, comes down to genetic differences, say researchers.

Clear physiological differences have already been demonstrated between people living in the Himalayas and Andes compared with people living at sea level, revealing an evolutionary adaptation in the control of blood flow and oxygen delivery to the brain and the rest of the body.

Now an international team led by Professor Aneta Stefanovska of Lancaster University has identified genes that are related to cardiorespiratory function during so-called acute periodic breathing. Their report is published in the Journal of Physiology.

Periodic breathing (PB) occurs in most humans at high altitudes and is characterised by periodic alternations between hyperventilation (too-fast breathing) and apnoea (no breathing). The altered respiratory pattern due to PB is accompanied by changes in heart rate and blood flow.

Breathing, ECG of the heart and microvascular blood flow were simultaneously monitored for 30 minutes in 22 healthy male subjects, with the same measurements repeated under normal and low oxygen levels, both at real and simulated altitudes of up to 3800m.

As part of the experiment, the participants were also taken in a cable car to a high altitude laboratory at the top of Aiguille du Midi mountain in Chamonix in France and tested immediately on arrival and after six hours at this altitude of 3842m.

The researchers found that orchestration between the participants' hearts and lungs, as measured by phase coherence, responded differently to periodic breathing depending on whether they had one of two specific genetic variants affecting the cardiorespiratory response to insufficient oxygen.
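A minimal sketch of how phase coherence between two rhythms can be quantified, the so-called phase-locking value, is shown below. This is an illustration under simplifying assumptions (synthetic phases rather than phases extracted from real ECG and respiration signals), not the study's actual analysis method:

```python
# Phase coherence sketch: the phase-locking value is the magnitude of the
# average phase-difference vector. It is 1 when the phase difference between
# two rhythms is constant, and near 0 when the difference drifts randomly.
import cmath
import math
import random

def phase_coherence(phases_a, phases_b):
    """Phase-locking value |mean(exp(i*(a - b)))| for two phase series."""
    n = len(phases_a)
    s = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))
    return abs(s) / n

t = [k * 0.01 for k in range(2000)]
heart = [2 * math.pi * 1.0 * tk for tk in t]        # ~1 Hz "cardiac" phase
locked = [p - 0.8 for p in heart]                   # constant phase lag
rng = random.Random(0)
drifting = [2 * math.pi * rng.random() for _ in t]  # uncorrelated phases

print(round(phase_coherence(heart, locked), 3))  # -> 1.0 (constant lag)
print(phase_coherence(heart, drifting) < 0.2)    # low value: no coherence
```

In practice the instantaneous phases would first be extracted from the measured signals (for example via the Hilbert transform) before the coherence is computed, often within sliding time windows to track changes over the 30-minute recordings.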

Chronic periodic breathing is generally seen as an unfavourable state, being associated with increased mortality in chronic heart failure, but in healthy people it may be an indication of better alertness to oxygen insufficiency at high altitudes.

Hypoxia, as well as occurring during rapid ascents to high altitude, can also be a significant problem at sea level, being a contributory factor in many health conditions including cancer, strokes and heart attacks.

Professor Stefanovska said: "The similarities between hypoxia-induced PB at altitude, and the breathing characteristics observed in certain pathological states, provide an opportunity to further our understanding of the physiological processes involved in chronic hypoxic states that occur even when oxygen is abundant.

"Considering living systems as collections of interacting oscillators whose dynamics is governed by multiple underlying open systems enables the observation of functional changes over time, and investigation of how they are altered in health and disease."

Credit: 
Lancaster University

How sound and visual effects on slot machines increase the allure of gambling

The sights and sounds of winning on a slot machine may increase your desire to play--and your memories of winning big, according to new research by University of Alberta scientists.

The study, led by Professor Marcia Spetch in the Department of Psychology, shows that people prefer to play on virtual slot machines that provide casino-related cues, such as the sound of coins dropping or symbols of dollar signs.

"These results show how cues associated with money or winning can make slot machines more attractive and can even make bigger wins more memorable," said Spetch. "Such cues are prevalent in casinos and likely increase the allure of slot machine gambling."

The researchers also found that people preferred to play on machines with these cues no matter how risky the machine was, and regardless of when the sound or visual effects appeared. "Attraction to slot machines and memory for winning can be influenced by factors other than the amount of money won on a slot machine," explained Christopher Madan, co-author from University of Nottingham in the United Kingdom and former PhD student of Spetch. "People should be aware that their attraction and sense of winning may be biased."

According to the Canadian Gaming Association, 98 per cent of Canadians gamble for fun and entertainment. Alberta is home to 28 casinos and more than 14,000 slot machines. In 2019, revenue generated by the gaming industry in Alberta was $2.7 billion.

This research was conducted in collaboration with Elliot Ludvig from Warwick University in the United Kingdom and with Yang Liu, a postdoctoral fellow in the Faculty of Medicine & Dentistry Department of Psychiatry at the University of Alberta. Funding for this research is provided by the Natural Sciences and Engineering Research Council of Canada (NSERC) and the Alberta Gambling Research Institute (AGRI).

The paper, "Effects of Winning Cues and Relative Payout on Choice between Simulated Slot Machines," was published in Addiction (doi: 10.1111/add.15010).

Credit: 
University of Alberta

Handheld 3D printers developed to treat musculoskeletal injuries

Biomedical engineers at the UConn School of Dental Medicine recently developed a handheld 3D bioprinter that could revolutionize the way musculoskeletal surgical procedures are performed.

The bioprinter, developed by Dr. Ali Tamayol, associate professor in the School of Dental Medicine biomedical engineering department, enables surgeons to deposit scaffolds--or materials to help support cellular and tissue growth--directly into the defect sites within weakened skeletal muscles.

Tamayol's research was recently published in a journal of the American Chemical Society.

"The printer is robust and allows proper filling of the cavity with fibrillar scaffolds in which fibers resemble the architecture of the native tissue," says Tamayol.

The scaffolds from the bioprinter adhere precisely to the surrounding tissues of the injury and mimic the properties of the existing tissue--eliminating the need for any suturing.

Current methods for reconstructive surgery have been largely inadequate in treating volumetric muscle loss. As a result, 3D printing technology has emerged as an up-and-coming solution to help reconstruct muscle.

Dr. Indranil Sinha, a plastic surgeon at Brigham and Women's Hospital at Harvard, joined Tamayol in this research study. With expertise in the treatment of muscle injuries, Sinha says that a "good solution currently does not exist for patients who suffer volumetric muscle loss. A customizable, printed gel establishes the foundation for a new treatment paradigm that can improve the care of our trauma patients."

Existing 3D bioprinting technology is not without its problems. Implanting the hydrogel-based scaffolds successfully requires a very specific biomaterial to be printed that will adhere to the defect site. While 3D bioprinted scaffolds mimicking skeletal muscles have been created in vitro, they have not been successfully used on an actual subject.

Tamayol's solution fixes the problem. Tamayol's bioprinter prints gelatin-based hydrogels - known as "bioink"--that have been proven to be effective in adhering to defect sites of mice with volumetric muscle loss injury. The mice showed a significant increase in muscle hypertrophy following Tamayol's therapy.

"This is a new generation of 3D printers that enables clinicians to directly print the scaffold within the patient's body," said Tamayol. "Best of all, this system does not require the presence of sophisticated imaging and printing systems."

Credit: 
University of Connecticut

Novel photocatalytic method converts biopolyols and sugars into methanol and syngas

image: Methanol and syngas act as the platform chemical connecting the biorefinery and petrochemical industry.

Image: 
WANG Min

A research group led by Prof. WANG Feng from the Dalian Institute of Chemical Physics (DICP) of the Chinese Academy of Sciences recently developed a photocatalytic method for the conversion of biopolyols and sugars to methanol and syngas. The results were published in Nature Communications.

Methanol is considered to be the most promising clean liquid fuel for the future and one that can be deployed on a large scale. In addition, it's a fundamental chemical material used for industrial production of ethylene and propylene. Currently, methanol is industrially produced from natural gas and coal. 

Production of methanol from renewable and abundant carbon resources rather than fossils is a promising route. Bio-derived syngas for biomethanol is traditionally produced via gasification at high temperatures (700-1000 °C). The process usually generates a mixture of CO, CO2 and hydrocarbons with insufficient H2, as well as coke, char and tar.

In the current study, the researchers converted biomass-derived polyols and sugars into methanol and syngas (CO+H2) via UV light irradiation at room temperature. The bio-syngas could be further used for the synthesis of methanol.

Cellulose and even raw wood sawdust can be converted into methanol or syngas after hydrogenolysis or hydrolysis pretreatment.

The researchers also found that Cu dispersed on defect-rich titanium oxide nanorods (TNR) effectively promoted the selective C-C bond cleavage that produces methanol. Using this process, methanol was obtained from glycerol with co-production of H2. By controlling the energy band structure of Cu/TNR, the gas product could be readily tuned from CO2 to CO, yielding a syngas with CO selectivity of up to 90% in the gas phase.

Credit: 
Chinese Academy of Sciences Headquarters