Culture

Cutting off kidney cancer at its roots

image: Human ccRCC organoids under the microscope, labeled with fluorescent markers. The scientists extracted cancer stem cells from patients and used them to grow these miniature versions of kidney tumors in the lab. The structure of the organoids resembles patients' tumors and contains the same types of cells.

Image: 
Birchmeier Lab, MDC

Scientists at the MDC have discovered stem cells responsible for the most common form of kidney cancer. The team of Walter Birchmeier has found a way to block the growth of these tumors in three models of the disease.

Not all cancer cells are equal. Tumors contain potent cancer stem cells which produce metastases and can regenerate the disease if they escape treatment. This makes them vital targets for therapies - if scientists can isolate them and probe their weaknesses. But the cells are often so rare that for many types of cancer, they have yet to be found.

Professor Walter Birchmeier's lab at the Max Delbrück Center for Molecular Medicine in the Helmholtz Association (MDC), in a collaboration with the Urology Department of the Charité, has now discovered cancer stem cells responsible for the most common form of kidney cancer: clear cell renal cell carcinoma, or ccRCC. In a Berlin-wide collaboration, the scientists found a weakness: the cells depend on two critical biochemical signals. Blocking both hinders the growth of tumors in several laboratory models of the disease, suggesting a promising new approach to treating human patients. The work also emphasizes the continued importance of mice in medical research. The study appears in the current issue of Nature Communications and includes authors from the MDC, the Urology Department of the Charité Berlin, the Berlin Institute of Health (BIH), the Screening Unit of the Leibniz Institute FMP, the company EPO, and other partners.

Two biochemical weaknesses

Identifying ccRCC cancer stem cells was crucial to the project. Dr. Annika Fendler, a postdoc in the Birchmeier group and a member of the Charité Urology Department, was first author on the paper. She identified three proteins on the surfaces of the cells that enabled them to be tagged; this permitted Dr. Hans-Peter Rahn to isolate the cells using fluorescence-activated cell sorting (FACS). The scientists found that cancer stem cells accounted for only about two percent of the cells in the human tumors.

"Our analysis of these cells shows that they depend on signals passed along two biochemical networks called WNT and NOTCH," Fendler says. Because these networks were known to play roles in other types of cancer, the lab had learned to disrupt them. They had already developed a potent inhibitor of WNT signals with the FMP, their partner institute on campus.

Previously a role for WNT and NOTCH had not been suspected in kidney tumors; mutations in these networks are rarely found in the disease. Both signals are, however, linked to a tumor suppressor gene called VHL, which is strongly associated with ccRCC. The new findings suggested that blocking WNT, NOTCH or both signals might target the cancer stem cells and interfere with the most aggressive components of the tumors.

In the clinic, inhibitors against various biochemical pathways are increasingly replacing chemotherapy in treatments for cancer patients. "But you have to know what pathways to target," Fendler says, "and not enough was known about the biology of ccRCC."

The promise of multiple model systems

Initial tests of the new inhibitors were promising. "Remarkably, three quarters of cell cultures derived from the patients responded to at least one type of inhibitor, and half of the remaining cultures were inhibited when the two inhibitors were combined," Birchmeier says.

But here the lab confronted one of the main challenges of cancer research. "What we learn in the lab is usually very difficult to translate into the real context of a patient," Birchmeier says. "Regular cell line cultures and animal models obtained from other labs don't reflect the complexity of a disease in a person's body." A solution is to develop more types of models which are closer to the human disease.

Birchmeier and his colleagues were already proficient at extracting cancer stem cells from patients, growing them in cultures and challenging them with a huge palette of drugs. In collaboration with the company EPO on the Berlin-Buch campus, they have also transplanted patients' cancer stem cells into mice, which develop tumors virtually identical to those of their human counterparts. These animals are essential in the search for therapies: what cures a human tumor in mice might also work in a patient. In the current project, EPO injected WNT and NOTCH inhibitors, singly and in combination, into tumor-bearing mice and observed what happened. Blocking both signals turned out to be the most effective strategy. But would it work equally well in humans?

A new type of model

Very recently scientists have learned to use patient cells to generate organoids: miniature versions of organs, containing many types of cells. They are composed of human tissue, but can be used without the ethical problems of testing drugs on human patients. Organoids had already been created for healthy kidneys, various organs, and tumors such as colon cancer.

"Other groups had tried with ccRCC, but had been less successful," Fendler says. "The tissue didn't grow very well or did not produce organoids. Both of these factors are important in developing models for drug testing and treatments. A patient with the disease needs fast and reliable models on which treatment responses can be tested."

Different models, similar results

"The most crucial finding from the study," Birchmeier says, "is to have identified the essential roles of WNT and NOTCH signaling systems in ccRCC, and to show that inhibiting them has an impact on the tumors." There remain subtle differences between the model systems that still need to be explored; at the moment, studies of mice are still needed.

In the meantime, the work provides important new experimental systems for scientists working on the disease. Annika Fendler has moved on to the Francis Crick Institute in London, where she continues to work on models of kidney cancer. Ultimately, the scientists hope, the strategy developed in the models will make the jump to the clinic, in custom-designed therapies that target the most dangerous cells in the tumors.

Credit: 
Max Delbrück Center for Molecular Medicine in the Helmholtz Association

Combination drug therapy for childhood brain tumors shows promise in laboratory models

In experiments with human cells and mice, researchers at the Johns Hopkins Kimmel Cancer Center report evidence that combining the experimental cancer medication TAK228 (also called sapanisertib) with an existing anti-cancer drug called trametinib may be more effective than either drug alone in decreasing the growth of pediatric low-grade gliomas. These cancers are the most common childhood brain cancer, accounting for up to one-third of all cases. Low-grade pediatric gliomas arise in brain cells (glia) that support and nourish neurons, and current standard chemotherapies with decades-old drugs, while generally effective in lengthening life, often carry side effects or are not tolerated. Approximately 50% of children treated with traditional therapy have their tumors regrow, underscoring the need for better, targeted treatments.

The combination therapy, when tested in tumor cell lines derived from children's gliomas, stopped the tumor cells from growing. In mice, these drugs reduced tumor volume and allowed mice to live longer, the researchers say. Mice treated with the combination of drugs also had greatly decreased blood supply to their tumors, suggesting that treatment can starve tumors of the blood they need to grow. The research, described online in the journal Neuro-Oncology in December 2019, suggests that a clinical trial combining these agents in children would be beneficial, the investigators add.

"We thought one plus one might well equal three in the case of these drugs, and that's what we found," says senior study author and pediatric oncologist Eric Raabe, M.D., Ph.D., of Johns Hopkins Kimmel Cancer Center, and associate professor of oncology at the Johns Hopkins University School of Medicine.

Previous research showed that pediatric low-grade gliomas contain gene mutations that increase the activity of two cell signaling pathways: mammalian target of rapamycin complexes 1 and 2 (mTORC1/2) and Ras/mitogen-activated protein kinase (MAPK), says Raabe. Both enable proteins that promote cell growth. TAK228/sapanisertib, which is in clinical trials for adult patients with cancer, inhibits the mTOR pathway; trametinib, which is approved for treatment of melanoma, inhibits the MAPK pathway. When Raabe and team treated tumors or cells with just one of the drugs targeting one of the pathways, the cancer cells were able to use the other pathway to survive, Raabe says.

In the new study, Raabe and colleagues tested TAK228 and trametinib in patient-derived pediatric low-grade glioma cell lines grown in the laboratory. Using the two drugs together led to a 50% reduction in tumor cell growth. The combination therapy also suppressed activity by more than 50% in both the mTOR and MAPK signaling pathways, and reduced cell proliferation by more than 90%. The combination killed some pediatric low grade glioma cells -- increasing the cells killed by nearly threefold over cells treated with each agent alone.

The investigators then gave mice implanted with human low-grade glioma tumors TAK228, trametinib, the combination of the two drugs, or a placebo. Survival was three times longer in the animals receiving the combination therapy than in those receiving single treatments: 36 days compared with 12 days. Combination therapy-treated tumors were 50% smaller on average over two weeks' treatment time compared with single-drug therapy. Combination therapy in the animal models suppressed the mTOR and MAPK pathways by more than 80%. The number of growing cells in these tumors decreased by more than 60%. The blood supply to the tumors was decreased by 50%-95%.

Raabe cautions that more preclinical research must be done to determine the best and safest potential dosing regimen, in part because trametinib stays in the body for four to five days, and the MAPK pathway it targets is needed by healthy cells for normal growth in children. In addition, mice receiving the combination therapy didn't grow as well as those receiving single drug therapy, so the dosing schedule needs to be customized for children, he says. Currently, TAK228 is in clinical trials in adults, and early phase clinical trials of TAK228 are being considered for pediatric brain tumors, Raabe says.

Credit: 
Johns Hopkins Medicine

Controlling the messenger with blue light

image: Optogenetic inhibition of mRNA translocation and translation in living cells. Blue light inactivates translation of the mRNA, reducing the level of protein production with spatiotemporal precision.

Image: 
IBS

Researchers at the Center for Cognition and Sociality, within the Institute for Basic Science (IBS, South Korea), have developed a new optogenetic tool to visualize and control the position of specific messenger RNA (mRNA) molecules inside living cells. Using this approach, published in Nature Cell Biology and the research highlights section of Nature Reviews Genetics, the authors revealed something new about cell migration that could not have been discovered with previously available methods.

Cells are so well organized that everything seems to happen at the right time, and at the right place. Protein synthesis, for example, is the result of proper mRNA localization and translation: mRNA carries the information to produce specific proteins and is transported to where the cell needs those proteins the most. Then, ribosomes help with the translation of the genetic message into proteins. Numerous studies have demonstrated that mRNA translation is tightly related to mRNA localization. However, conventional chemical-based methods are insufficient to fully address which mRNA molecules are responsible for which cellular behavior.

To directly investigate the causal relationship between the translation of specific mRNAs and specific biological processes, the IBS team developed an optogenetic method, called mRNA-LARIAT, that controls mRNA position and translation in living cells.

The mRNA-LARIAT uses blue light to trap specific mRNAs in large clusters. As a result, the clustered mRNAs are unable to interact with ribosomes, and translation of the corresponding proteins is reduced. While mRNA-blocking chemicals cannot be controlled in space and time, light has a clear advantage: it can be switched on and off exactly where and when it is needed.

This optogenetic technique was developed by combining the LARIAT system (Light-Activated Reversible Inhibition by Assembled Trap), which was previously developed by the same group (Lee et al., 2014) to trap proteins within cells, with components that bind to the mRNA.

Thanks to this new method, IBS researchers studied the role of mRNA carrying the genetic information to produce the protein β-actin, a key element for cell movement and contraction. Despite the presence of a large amount of pre-existing, long-lived β-actin protein in the cells, the mRNA-LARIAT targeted to β-actin slowed the production of new β-actin proteins and attenuated cell motility effectively and reversibly within 20 minutes. These results suggest that β-actin translation is constantly required for cell migration and even a minute fraction of newly synthesized β-actin can have a profound effect on this process. The team also demonstrated a way to distinguish between newly synthesized β-actin protein and pre-existing β-actin.

"The mRNA-LARIAT is generally adaptable and able to rapidly and reversibly manipulate translation of target transcripts, and can further be used to provide insights into how the functions of different mRNAs are coordinated in space and time," says Won Do Heo, KAIST professor and leading author of this research.

Credit: 
Institute for Basic Science

Light-sheet fluorescence imaging goes more parallelized

image: (Left) CLAM illumination profiles in multiple views. (Top right) 3D rendered image with three orthogonal standard-deviation-intensity projections of the tubular epithelial structure in the mouse kidney. (Bottom right) Sectional images of the mouse glomeruli captured by the CLAM microscope.

Image: 
Y.-X. Ren, J. Wu, Q. T. K. Lai, H. M. Lai, D. M. D. Siu, W. Wu, K. K. Y. Wong, and K. K. Tsia

An arsenal of advanced microscopy tools is now available to provide high-quality visualization of cells and organisms in 3D, and has thus deepened our understanding of complex biological systems and functions.

In a new paper published in Light: Science & Applications, a research team led by the University of Hong Kong (HKU) developed a new imaging modality, coined coded light-sheet array microscopy (CLAM), that allows full 3D parallelized fluorescence imaging without any scanning mechanism - a capability that is otherwise challenging to achieve with existing techniques.

Established 3D biological microscopy techniques, notably confocal microscopy, multiphoton microscopy, and light-sheet fluorescence microscopy (LSFM), predominantly rely on laser scanning for image capture. Yet this comes at the expense of imaging speed, because the entire volume has to be scanned sequentially, point by point, line by line or plane by plane, at a speed limited by the mechanical motion of the imaging components.

Even worse, many serial scanning approaches repeatedly excite out-of-focus fluorescence, and thus accelerate photobleaching and photodamage. They are thus not favorable for long-term, large-scale volumetric imaging critically required in applications as diverse as anatomical science, developmental biology and neuroscience.

3D parallelization in CLAM means that much gentler illumination suffices to achieve a similar level of image sensitivity at the same volumetric frame rate. Hence, it further reduces the photobleaching rate and thus the risk of photodamage. This is a critical attribute for preserving specimen viability in long-term monitoring studies.

The heart of CLAM is the concept of the "infinity mirror" (i.e., a pair of parallel mirrors), which is common in visual art and decoration, and has previously been adopted by the same team to enable ultrafast optofluidic single-cell imaging. Here the team employed the "infinity mirror" together with simple beam shaping to transform a single laser beam into a high-density array of a few tens of light-sheets for 3D parallelized fluorescence excitation.

"One distinct feature of CLAM is its ability to flexibly reconfigure the spatial density and temporal coherence of the light-sheet array, simply by tuning the mirror geometry, such as mirror separation and tilt angle," explained Dr. Yuxuan Ren, a postdoctoral researcher and first author of the work.

"This capability has been challenging for existing coherent wavefront shaping methods, yet it could allow efficient parallelized 3D LSFM in scattering tissue with minimal speckle artifacts," Ren added.

CLAM also adopts code division multiplexing (CDM) (e.g., orthogonal frequency division multiplexing demonstrated in this work), a technique widely used in telecommunication, to imprint the fluorescence signal from each image plane with a unique code. As a result, it allows parallelized 3D image capture with optical sectioning by using a 2D image sensor.
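The demultiplexing principle behind this step can be illustrated with a toy example. The sketch below is illustrative only, not the CLAM implementation: three hypothetical image planes are each modulated with an orthogonal code, the coded signals are summed into a single multiplexed signal (as on one 2D sensor), and each plane is recovered by correlating against its code.

```python
import numpy as np

# Toy illustration of code division multiplexing (not the CLAM system):
# each "image plane" is modulated with an orthogonal code, the coded
# signals are summed, and each plane is recovered by correlation.

# Orthogonal Walsh-Hadamard codes of length 4 (one per image plane)
codes = np.array([
    [1,  1,  1,  1],
    [1, -1,  1, -1],
    [1,  1, -1, -1],
])

planes = np.array([3.0, 5.0, 2.0])   # fluorescence intensity of each plane

# Modulate each plane with its code and sum -> one multiplexed time series
multiplexed = (planes[:, None] * codes).sum(axis=0)

# Demultiplex: correlate with each code and normalize by the code length
recovered = multiplexed @ codes.T / codes.shape[1]
print(recovered)   # -> [3. 5. 2.]
```

Because the codes are mutually orthogonal, the cross-terms cancel exactly and each plane's intensity is recovered from the single summed signal, which is the essence of capturing many optical sections with one 2D sensor.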

"CLAM has no fundamental limitation in scaling to higher volume rates as camera technology continually advances," pointed out Dr. Kevin Tsia, Associate Professor in the Department of Electrical and Electronic Engineering at HKU and the leading researcher of the team.

"Also, CLAM can be adapted to any existing LSFM systems with minimal hardware or software modification. Therefore, it is readily available for dissemination to the wider community of LSFM and related 3D imaging techniques." added Tsia.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

New catalyst recycles greenhouse gases into fuel and hydrogen gas

image: Newly developed catalyst that recycles greenhouse gases into ingredients that can be used in fuel, hydrogen gas and other chemicals.

Image: 
Cafer T. Yavuz, KAIST

Scientists have taken a major step toward a circular carbon economy by developing a long-lasting, economical catalyst that recycles greenhouse gases into ingredients that can be used in fuel, hydrogen gas, and other chemicals. The results could be revolutionary in the effort to reverse global warming, according to the researchers. The study was published on February 14 in Science.

"We set out to develop an effective catalyst that can convert large amounts of the greenhouse gases carbon dioxide and methane without failure," said Cafer T. Yavuz, paper author and associate professor of chemical and biomolecular engineering and of chemistry at KAIST.

The catalyst, made from inexpensive and abundant nickel, magnesium, and molybdenum, initiates and speeds up the rate of reaction that converts carbon dioxide and methane into hydrogen gas. It can work efficiently for more than a month.

This conversion is called 'dry reforming', where harmful gases, such as carbon dioxide, are processed to produce more useful chemicals that could be refined for use in fuel, plastics, or even pharmaceuticals. It is an effective process, but it previously required rare and expensive metals such as platinum and rhodium to induce a brief and inefficient chemical reaction.

Other researchers had previously proposed nickel as a more economical solution, but carbon byproducts would build up and the surface nanoparticles would bind together on the cheaper metal, fundamentally changing the composition and geometry of the catalyst and rendering it useless.

"The difficulty arises from the lack of control over the scores of active sites on the bulky catalyst surfaces, because any refinement procedure attempted also changes the nature of the catalyst itself," Yavuz said.

The researchers produced nickel-molybdenum nanoparticles under a reductive environment in the presence of a single crystalline magnesium oxide. As the ingredients were heated under reactive gas, the nanoparticles moved on the pristine crystal surface seeking anchoring points. The resulting activated catalyst sealed its own high-energy active sites and permanently fixed the location of the nanoparticles -- meaning that the nickel-based catalyst will not have a carbon build up, nor will the surface particles bind to one another.

"It took us almost a year to understand the underlying mechanism," said first author Youngdong Song, a graduate student in the Department of Chemical and Biomolecular Engineering at KAIST. "Once we studied all the chemical events in detail, we were shocked."

The researchers dubbed the catalyst Nanocatalysts on Single Crystal Edges (NOSCE). The magnesium-oxide nanopowder comes from a finely structured form of magnesium oxide, where the molecules bind continuously to the edge. There are no breaks or defects in the surface, allowing for uniform and predictable reactions.

"Our study solves a number of challenges the catalyst community faces," Yavuz said. "We believe the NOSCE mechanism will improve other inefficient catalytic reactions and provide even further savings of greenhouse gas emissions."

Credit: 
The Korea Advanced Institute of Science and Technology (KAIST)

The (un)usual suspect -- novel coronavirus identified

video: This pioneering study undertakes the sequencing of the CoV's genome for the first time.

Image: 
Chinese Medical Journal

In early December, a few people in the city of Wuhan in the Hubei province of China began falling sick after going to a local seafood market. They experienced symptoms like cough, fever, and shortness of breath, and even complications related to acute respiratory distress syndrome (ARDS). The immediate diagnosis was pneumonia, but the exact cause was unexplained. What caused this new outbreak? Is it the severe acute respiratory syndrome (SARS)-CoV? Is it the Middle East respiratory syndrome (MERS)-CoV? As it turns out, scientists had undertaken a study to identify this virus in December after analyzing the first few cases. This study is now published in Chinese Medical Journal and the identity of the virus has been established--it is a completely new virus, closely related to the bat SARS-like CoV. Dr. Jianwei Wang (Chinese Academy of Medical Sciences, Institute of Pathogen Biology), lead researcher on the study, states, "Our paper has established the identity of the bat-origin CoV that was unknown until now."

In this study, scientists from renowned research institutes in China, such as the Chinese Academy of Medical Sciences, Institute of Pathogen Biology, China-Japan Friendship Hospital, and Peking Union Medical College, jointly discovered and identified the new CoV--the main culprit of the Wuhan outbreak--by next generation sequencing (NGS). They focused on five patients admitted to Jin Yin-tan Hospital in Wuhan, most of whom were workers in the Huanan Seafood Market in Wuhan. These patients had high fever, cough, and other symptoms, and were initially diagnosed to have pneumonia, but of an unknown cause. Some patients' condition rapidly worsened to ARDS; one even died. Dr Wang says, "Chest x-rays of the patients showed some hazy opacities and consolidations, which are typical of pneumonia. However, we wanted to find out what caused the pneumonia, and our subsequent experiments revealed the exact cause--a new CoV that was not known before."

For the study, the scientists used bronchoalveolar lavage (BAL) fluid samples taken from the patients (BAL is a procedure in which sterile fluid is transferred to the lungs through a bronchoscope and then collected for analysis).

First, the scientists attempted to identify the virus by genome sequencing, using NGS technology. NGS is the preferred screening method for identifying unknown pathogens because it quickly detects and rules out all known pathogenic microorganisms in the sample. Based on sequencing of the DNA/RNA from the BAL fluid samples, the scientists found that most of the viral reads belonged to the CoV family. The scientists then assembled the different "reads" that belonged to CoVs and constructed a whole genomic sequence for the new virus; these sequences were 99.8-99.9% similar among all the patients' samples, confirming that this virus was the common pathogen in all the patients. Further, using homology analysis, where a genome sequence is compared against other known genome sequences (with a preset threshold of 90% for it to be considered a "new" sequence), they confirmed that the genome sequence of this new virus is 79.0% similar to the SARS-CoV, about 51.8% similar to the MERS-CoV, and about 87.6-87.7% similar to other SARS-like CoVs from Chinese horseshoe bats (called ZC45 and ZXC21). Phylogenetic analysis showed that the sequences of the five CoV strains obtained were closest to those of bat-derived strains, but formed separate evolutionary branches. These findings clearly suggest that the virus originated from bats. Dr Wang states, "Because the similarities of the viral replicase gene with all other known "similar" viruses are still less than 90%, and also taking into account the phylogenetic analysis results, we consider that this is indeed a new, previously unknown CoV. This new virus is temporarily called the 2019-nCoV."
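The homology comparison at the heart of this step reduces to computing percent identity between aligned sequences and comparing it with a threshold. A minimal sketch with made-up toy sequences (not the study's data or its actual alignment pipeline):

```python
def percent_identity(seq_a: str, seq_b: str) -> float:
    """Percent identity between two pre-aligned sequences of equal length."""
    assert len(seq_a) == len(seq_b)
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

# Toy aligned fragments (hypothetical, for illustration only)
query = "ATGGCGTACGTTAGCCTAGA"
ref   = "ATGGCGTTCGTTAGACTTGA"

identity = percent_identity(query, ref)
print(f"{identity:.1f}% identity")   # -> 85.0% identity

# Applying the study's criterion: below the ~90% threshold, the sequence
# is treated as "new" rather than a close match to the reference.
is_new = identity < 90.0
```

Real pipelines first align the sequences (allowing gaps) before computing identity, but the thresholding logic is the same: 79.0% identity to SARS-CoV falls well below the 90% cutoff, which is why the virus was classified as new.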

Lastly, the scientists moved to "isolating" the virus from the BAL fluid samples by checking whether the fluid samples produced a cytopathic effect in cell lines in the laboratory. The cells exposed to the fluid samples were observed under an electron microscope, and the scientists found characteristic CoV-like structures. They also used immunofluorescence--a technique that uses specific antibodies tagged with fluorescent dyes. For this, they used serum from the recovering patients (which contained antibodies), which reacted with the viral particles inside the cells; this confirmed that this virus was indeed the cause of the infection.

This study paves the way for future studies to understand the virus and its sources better, especially given its rapid spread, its ability to cause fatal ARDS, and the panic caused by the outbreak. Although 4 of the 5 patients from whom this virus was identified were from a seafood market in Wuhan, the exact origin of infection is unknown. The CoV could have been transmitted to humans through an "intermediate" carrier, such as in the case of SARS-CoV (palm civet meat) or MERS-CoV (camel). Dr Wang concludes, "All human CoVs are zoonotic, and several human CoVs have originated from bats, including the SARS- and MERS-CoVs. Our study clearly shows the urgent need for regular monitoring of the transmission of bat-origin CoVs to humans. The emergence of this virus is a massive threat to public health, and therefore, it is of critical importance to understand the source of this virus and decide the next steps before we witness a larger scale outbreak."

Credit: 
Cactus Communications

Genes from scratch -- far more common and important than we thought

Scientists from Trinity College Dublin and the University of Pittsburgh have discovered that de novo genes - genes that have evolved from scratch - are both more common and more important than previously believed.

Their findings appear in two studies, one which will appear in eLife tomorrow [Tuesday 18th February 2020], and one which was published earlier this month in Nature Communications.

DNA, genes, and de novo orphans

Over time, genes change via random mutations. Some of these changes result in serious defects and are rarely passed on to the next generation; others have little impact; and others confer significant advantages, which become favoured by natural selection and end up being passed on to future generations. This is the main source of genetic novelty, and it explains how organisms differ from each other. However, genetic novelty can also be generated by totally new genes evolving from scratch.

In the eLife study, the scientists devised a way of assessing just how frequently genes seem to evolve from scratch. Their results were surprising.

Explaining de novo genes, first author on the paper, Nikolaos Vakirlis, Trinity, said:

"Most of the genes in a genome have 'cousins' in the genomes of other species; genes made up of similar DNA sequences that, once translated into proteins, perform similar functions. However, some genes are unique and can only be found in a single, or small number of closely related species. We call these 'orphan genes' because they appear to have no relatives and are often responsible for unique characteristics and abilities of organisms. For example, a gene that is unique to cod living in the Arctic allows them to survive in sub-zero temperatures."

Orphan genes pose a tough evolutionary problem though. They don't look like other genes, so where do they come from? One idea is that they can originate seemingly from nothing: over long, evolutionary timescales, a completely novel gene can emerge de novo out of a region in the genome that is made up of junk DNA. Alternatively, with enough time, two 'cousin' genes can diverge so much that we can no longer identify the relationship between them. Thus, a gene may at a glance appear to be an orphan without having really emerged de novo.

A new approach to assessing de novo gene frequency

For a long time, scientists thought the majority of orphan genes were simply cases of 'missing relatives', which could be explained by the divergence of the sequences through mutations during evolution. The new research suggests this is not the case.
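Why divergence eventually erases detectable homology can be shown with a deliberately crude simulation. The sketch below is purely illustrative, with made-up parameters and point substitutions only (no indels, no selection, no real homology search): a copy of a gene accumulates random mutations, and its identity to the ancestral sequence decays toward the point where the relationship would no longer be recognized.

```python
import random

random.seed(0)
BASES = "ACGT"

def mutate(seq: str, n_substitutions: int) -> str:
    """Apply random point substitutions (a crude model of neutral drift)."""
    seq = list(seq)
    for _ in range(n_substitutions):
        i = random.randrange(len(seq))
        seq[i] = random.choice([b for b in BASES if b != seq[i]])
    return "".join(seq)

def identity(a: str, b: str) -> float:
    """Fraction of matching positions between two equal-length sequences."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

ancestor = "".join(random.choice(BASES) for _ in range(300))

# Let a copy of the gene drift and record how recognizable it stays.
descendant = ancestor
trajectory = []
for epoch in range(1, 11):
    descendant = mutate(descendant, 30)   # 30 substitutions per epoch
    trajectory.append(identity(ancestor, descendant))
    print(f"epoch {epoch}: {trajectory[-1]:.0%} identity to ancestor")
```

The identity falls quickly at first and then flattens (repeated hits and back-mutations), so after enough time the two sequences look unrelated even though they share an ancestor, which is exactly the "missing relatives" scenario the researchers set out to quantify.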

Aoife McLysaght, professor in genetics at Trinity College Dublin, said:

"To our surprise, at most, around one third of orphan genes result from divergence. So, in turn, this suggests that most unique genes in the species we looked at are the result of other processes, including de novo emergence, which is therefore much more frequent than scientists initially thought."

Are de novo emerging genes important?

In the second piece of research, published recently in leading journal Nature Communications, the scientists sought an answer to the obvious question: Are de novo emerging genes important?

This may seem a paradoxical question because something that has not yet emerged fully in the world of evolution wouldn't be expected to be overly important. After all, how can a gene that was never used before suddenly appear and play a major role?

This paradox can be resolved if emerging genes have high potential to be beneficial for the organism. So, while they are expected to play no particular role in their current form, random changes that affect their sequences or increase the amount of protein they produce when translated should lead to beneficial effects.

The scientists tested whether this hypothesis may be true by doing a series of biological and computational experiments using baker's yeast as a model organism. And when they artificially allowed emerging sequences to be expressed at higher levels than they are naturally, the cells tended to grow faster.

Importantly, growth was not enhanced by overexpressing established genes. So, emerging sequences do indeed carry the potential to be important to the cells.

Anne-Ruxandra Carvunis, Ph.D., assistant professor of computational systems biology at the University of Pittsburgh, said:

"Order seems like something that's hard to achieve, but our results go completely opposite to that. We found that simple order is rampant everywhere in the genome. The propensity to make simple shapes that are stable is already there, waiting to be exposed. De novo gene birth is thus becoming less and less mysterious as we better understand molecular innovation."

Credit: 
Trinity College Dublin

Enigmatic small primate finally caught on film in Taita, Kenya

image: Taita mountain dwarf galago, Paragalago sp.

Image: 
Hanna Rosti, hanna.z.rosti@helsinki.fi

Good news from the Kenyan Taita Hills: the Taita mountain dwarf galago still survives. This was confirmed by researchers working at the University of Helsinki Taita Research Station.

The tiny nocturnal prosimian, weighing only 100-180 grams, was first reported in 2002, but no sightings had been made since.

The dwarf galagos in the Taita Hills live in relatively cool montane forests at altitudes of 1,400-1,950 metres. Like all dwarf galago species, they live in small family groups and communicate using several types of calls. Because all dwarf galago species look similar, they are most conveniently identified by their distinctive calls.

Finding the small nocturnal animal is challenging, as the forest canopy is in places up to 50 metres high. The animals are spotted with a red light that is not visible to them.

"The tropical forest is magically beautiful at night, but one is lucky to catch even a glimpse of the tiny creatures," says University of Helsinki PhD student Hanna Rosti who has spent hours observing and recording the animals.

"Dwarf galagos make agile jumps from tree to tree and feed on moths, cicadas and other insects. I have seen them hunting above ground-dwelling safari ants, where they obviously take advantage of insects fleeing from the voracious ants."

Unfortunately, the tiny mammal seems to be on the verge of extinction.

"The future of Taita mountain dwarf galagos and other endemic animal and plant species depends on the future of native montane forests of the Taita Hills. The conservation status of the forests must be strengthened and their area should be expanded by planting native trees in areas destroyed by cutting and fire. This will protect galago habitats and will ensure that the montane forests continue to provide many vital ecosystem services," says Professor Jouko Rikkinen from the University of Helsinki. He has been studying the biota of the Taita Hills since 2009.

Many animals and plants of the local montane forests have evolved in isolation and the number of endemic species is remarkably high. The Taita Hills belong to the Eastern Arc Mountains, which represent a global biodiversity hotspot.

The diversity of the Taita Hills will never stop surprising Professor Petri Pellikka, the director of the Taita Research Station. "The mountains represent a living laboratory, with great possibilities for ground-breaking research and fascinating new findings."

Credit: 
University of Helsinki

Areas near concentration camps give more electoral support to the far right

image: Spatial distribution of support for FRPs (Gemeinde level) in the 2013 (left) and 2017 federal elections.

Image: 
UPF

The Holocaust, the biggest systematic act of mass violence perpetrated by a state, has recently caught the interest of political scientists studying its long-term effects on political attitudes and behaviour. To date, analyses of proximity to the camps had focused on its impact on the redistribution of wealth and property, not on voting behaviour.

As part of this year's commemoration of the 75th anniversary of the liberation of the Nazi concentration camps, a study has been published that analyses the relationship between proximity to former concentration camps and voting for far right-wing parties in the last two federal elections held in Germany.

Toni Rodon, a researcher with the Department of Political and Social Sciences at UPF and member of the Research Group on Institutions and Political Actors, together with Julian M. Hoerner and Alexander Jaax from the London School of Economics and Political Science (LSE) are the authors of the research, which was published at the end of 2019 in the journal Research & Politics. "In our work we find that proximity to a former concentration camp is associated with a greater proportion of votes for far-right parties", the authors claim.

In their study, the researchers argue that while there has been a consensus on Germany's responsibility and the centrality of the Holocaust in the country's history, recently this view has been challenged. "We believe that the case of Germany can tell us a lot about the dynamics of the long-term impact of mass violence and its interaction with competition among political parties in shaping collective memory", they state.

Relationship between the proximity of concentration camps and support for the far right

The study analysed the percentage of votes obtained in the German federal elections of 2013 and 2017 by the following far-right parties: the Nationaldemokratische Partei Deutschlands (NPD), Die Rechte, Die Republikaner and Pro Deutschland. The party Alternative für Deutschland (AfD) was also studied. Since its foundation in 2013 as a liberal-conservative party, it has evolved towards the right, and its most radical wing has come out against the culture of memory in Germany.

Thus, two separate study models were run, one without the AfD and the other which took its share of the vote into account in order to assess whether the emergence and transformation of the party in the German system affected the results.

The electoral data were aggregated at the "Gemeinde" level, the smallest administrative division of local government in Germany with corporate status and powers of self-government. The article then analyses the relationship between the electoral strength of the far right and the locations of all former concentration camps built between 1933 and 1945, which were spread throughout the country but concentrated especially in East Germany.

For both the 2013 and 2017 elections, the paper finds that being near a concentration camp is associated with stronger support for the radical right. However, in 2017, when the study did not include the votes for the AfD, the relationship ceased to be significant. According to the researchers, this pattern can be explained by the party's capacity to concentrate the majority of far-right voters' support, especially given the shift in its rhetoric, which explicitly called Germany's memory culture into question.

According to the study model, for example, in places in West Germany located near a concentration camp (within 200 metres), the radical right received between 0.4 and 0.7 more points than in areas one kilometre away.
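The reported contrast can be pictured with a toy model. The sketch below is purely illustrative, not the study's actual analysis (which is a Gemeinde-level regression with controls); the `baseline` and `proximity_effect` parameters and the linear taper are hypothetical, chosen only to echo the reported 0.4-0.7 point range.

```python
# Hypothetical toy model of far-right vote share as a function of
# distance to a former concentration camp. All numbers are made up
# for illustration; they are not estimates from the study.

def far_right_share(distance_km, baseline=5.0, proximity_effect=0.55):
    """Vote share in points; the proximity effect tapers off linearly
    up to 1 km and then vanishes (a deliberate simplification)."""
    taper = max(0.0, 1.0 - distance_km)
    return baseline + proximity_effect * taper

near = far_right_share(0.2)   # within 200 metres of a former camp
far = far_right_share(1.0)    # one kilometre away
print(round(near - far, 2))   # difference in vote-share points
```

With these invented parameters, the near/far gap lands inside the 0.4-0.7 point band the study reports for West Germany; in a real analysis the effect would be estimated from the data, not assumed.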

"Resilience effect" versus "satiation effect"

One of the researchers' initial hypotheses, which was ultimately not confirmed, was that spatial proximity to places of historical memory, such as former concentration camps, would make voters living nearby less likely to vote for a party of this kind, due to the so-called "resilience hypothesis".

Constantly remembering the consequences and extent of the crimes committed by the Germans may cause voters to resist any attempt to minimize or normalize Nazi crimes. Past experience could have become a shared memory passed down through generations, leading to an aversion to far-right politics and any attempts to minimize the crimes.

However, according to the authors, revelations about in-group transgressions might also prompt defensive responses and minimization of complicity in what is called the "satiation hypothesis", a psychological concept that refers to the phenomenon that repeated exposure to a semantic stimulus - in this context embodied by former concentration camps as places of memory - weakens the reaction and receptiveness of a subject. "Individuals repeatedly confronted with in-group transgressions become more receptive to alternative narratives in a process of cognitive dissonance", the authors affirm.

In this sense, the researchers only found this "memory satiation" effect in West Germany, where the contrition frame was much more dominant in the political discourse compared with East Germany. "This finding shows that the magnitude of the effect of proximity to a former camp is dependent on the strength of the political leaders challenging the prevalent culture of memory", the researchers conclude.

Credit: 
Universitat Pompeu Fabra - Barcelona

Stop or go: The cell maintains its fine motility balance with the help of tropomodulin

image: Protein called tropomodulin is a key player that maintains the balance between the protrusive and contractile actin-filament machineries within a cell.

Image: 
Reena Kumari, reena.kumari@helsinki.fi

In a healthy cell, there is a fine balance between the protrusive structures that make the cell more migratory and the contractile structures that maintain the cell's shape and its association with the environment. A disturbance in this balance leads to several diseases, such as invasive cancers.

The most important component of both protrusive and contractile machineries is a protein called actin. This means that the proper distribution of actin between these structures is essential for the normal function of the cell. Nevertheless, the mechanisms that ensure that actin is distributed correctly between the protrusive and contractile machineries have remained elusive.

Researchers at the University of Helsinki, Finland, and the University of Pennsylvania, Philadelphia, USA, have now identified a protein called tropomodulin as a key player that maintains the balance between the protrusive and contractile actin-filament machineries within a cell.

The function of tropomodulin has previously been studied mainly in the context of muscles, where it maintains the architecture of actin filaments within the contractile fibers of muscle cells.

"We have now revealed that tropomodulins stabilise the actin filaments of the contractile structures in non-muscle cells through interacting with specific proteins within these actin filament bundles. The depletion of tropomodulins led to a loss of contractile structures, accompanied by an excess of protrusive structures, and thus to severe problems in a cell's shape and force production," says Academy Professor Pekka Lappalainen from the HiLiFE - Institute of Biotechnology, University of Helsinki.

Researchers were surprised to see that the depletion of one protein can have such drastic effects on the balance of the actin machinery.

"Another exciting and unexpected finding of this study was the notion that the same protein can have a different function depending on the tissue or cell type. Our study also sheds light on why abnormal levels of tropomodulin are linked to the progression of various cancers," says PhD student Reena Kumari.

Credit: 
University of Helsinki

MRI method provides unprecedented insight into the brain's wiring network

The wiring network of the brain is made up of billions of nerve fibers called axons. The thickness of axons - together with other properties - significantly impacts the way in which they conduct neural signals, and therefore the overall processing speed of connected neurons and brain areas. In addition, many neurodegenerative conditions, such as multiple sclerosis and Alzheimer's disease, as well as cancer, brain injury, and stroke, are known to exhibit axonal damage.

It is clear that axons are crucially important for the functioning of the brain, but they are also rather mysterious. They are so thin (mere micrometers in diameter) that probing them non-invasively inside a living brain has been, until now, impossible.

A new collaborative study by researchers working at the NYU Grossman School of Medicine in the USA, Champalimaud Centre for the Unknown in Portugal and the Cardiff University Brain Research Imaging Centre (CUBRIC) in the UK, established a way to measure these microscopic wires using MRI (Magnetic Resonance Imaging). Their results were published in the scientific journal eLife.

Diving into white matter

What was the breakthrough that led the team to overcome this long-standing challenge? Jelle Veraart, the first author of the study, explains that the key was to find a way to tease apart two types of signals: those originating from inside of the axons, and those arising from the surrounding tissue.

"MRI primarily works by detecting signals of water molecules, which constitute about 80% of the human brain. Axons have water inside of them, but so does the tissue that surrounds them. So we, in the MRI Biophysics Group at the NYU Grossman School of Medicine, developed a method that preferentially suppresses the signal everywhere but in the axons. Specifically, we modelled how the water signal behaves in different parts of tissue. Then, we measured the properties of the axons in the sample using the residual signal, which corresponded only to them", Veraart explains.

But a model remains theoretical until it is validated. The team therefore tested the model using preclinical ultra-high MRI scanners and microscopy at the Champalimaud Centre for the Unknown. Once the model was verified using this state-of-the-art technology and methodology, the researchers moved on to the next step - applying it to humans. "We performed this phase in collaboration with Derek Jones at the Cardiff University Brain Research Imaging Centre, where they have special MRI equipment that is powerful enough to evoke the robust signals needed for this kind of measurement", says Veraart.

The results of the method were outstanding - whereas previous studies yielded measurements that were an order of magnitude larger than known axonal sizes, the team's measurements were unprecedentedly accurate - only a 10-15% error - in their estimate of the same features.

What was the source of the error? "We conduct our measurements in areas of 'white matter'. It's called that because it primarily consists of axons, which have a whitish color. Within one cubic millimeter of tissue, there are tens of thousands of axons. Our measurements estimate an averaged metric of all axons, and tend to be dominated by the larger axons in the tissue sample", explains Dmitry Novikov, of the NYU Grossman School of Medicine. "Furthermore, there are other features which might 'masquerade' as axons in our signals, such as the 'arms' of other types of cells in the brain called astrocytes."

Now that the method has been established, the researchers are thinking about the next steps. "The in-vivo quantification of axon diameters with MRI paves the way to a new line of research", says Noam Shemesh, of the Champalimaud Centre for the Unknown. "MRI is non-invasive, and so it can be used to safely perform studies that follow changes in axons over extended periods. The more we learn about these structures, the closer we are to understanding how the brain works, in health and disease".

"This study highlights the importance of collaboration in research", Veraart adds. "As a result of this prolific international teamwork, we have a complete story - from the theoretical development of a novel technique, the experimental validation in rodents, and finally to human translation."

Credit: 
Champalimaud Centre for the Unknown

First glimpse of body's 'steering wheel' joint sparks hope

For the first time, scientists have found a way to reveal the mechanics of the human body's 'steering wheel' - the subtalar joint.

The bones of the foot are unique in that they need to be extremely flexible in some positions, allowing the foot to point, twist and flex, but absolutely rigid in others, such as when pushing off or jumping, so the person doesn't sprain their ankle.

The key to this ability is the subtalar joint, below the ankle, which until now doctors could not see rotating while a person was standing.

Ankle sprains are one of the commonest reasons for people to attend Accident and Emergency departments. More often than not, the subtalar joint is also injured but, because the joint is hidden, doctors find it difficult to diagnose sprains, which often leads to long-term ankle instability.

If left untreated, an injury to the subtalar joint can lead to flat feet and even arthritis.

It is hoped that being able to see the joint in action might give doctors the ability to tailor treatments to the many thousands of people with foot joint problems, in the same way it's possible to tailor treatments of the hip and knee joints.

The study, published in Nature's Scientific Reports, was led by Dr Gianluca Tozzi, Reader in Bioengineering at the University of Portsmouth, in collaboration with Mr Andrew Goldberg, Consultant Orthopaedic Surgeon at UCL and the Wellington Hospital in London.

Dr Tozzi said: "This is the first time this technique has been used in humans. It is non-invasive and gives clinicians a perfect view of a patient's subtalar joint motion under full weight-bearing, making it possible for the first time to determine the joint's centre of rotation which, in turn, opens the possibility of much-improved design of joint replacements.

"Being able to see the subtalar joint in action is made possible by a combination of 3D imaging (computed tomography) and digital volume correlation. The technology has a huge potential to be expanded, allowing doctors to see any strain in the bone, greatly improving clinical diagnosis.

"I've always hoped for this. Everyone working in healthcare research hopes to their work will be transferable from the lab to real life, making a difference to patients."

Mr Goldberg said: "Currently, surgery for arthritis usually involves joining the bones together making them stiff in a procedure known as joint fusion. While this is a successful procedure to treat pain, it does remove a mobile joint which can lead to stiffness and long-term wear of other joints that have to pick up the slack.

"No one has ever been able to replace this complex joint. This new research helps us to better understand the complex biomechanics of the foot and could pave the way for new treatments that just aren't currently available."

The study used standing CT scans and sophisticated image analysis to better understand how the subtalar joint works in eight men and women in three different positions.

Credit: 
University of Portsmouth

Bacteria get free lunch with butterflies and dragonflies

image: A caterpillar of the Plain Tiger butterfly (Danaus chrysippus) crawls on a leaf of its host plant Calotropis gigantea, teeming with bacteria and fungi. The leaf was incubated with nutrient-rich media to unravel the magnificent microbial diversity in larval diet.

Image: 
Kruttika Phalnikar and Shoot for Science

For humans, trade is second nature and civilizations have flourished and fallen with the fate of their trade. In fact, the mutual scratching of backs is a cornerstone of many animal societies. On the other hand, deep and sustained mutualisms across species were long thought to be quirks of evolution, where radically different players managed to stick together and trade for mutual benefit. Famous examples include mitochondria (ex-bacterial cells), which are embedded in and power animal and plant cells. These ancient mutualisms are incredibly fascinating; for how could such delicate relationships survive the travails of time and evolution?

In the past two decades, new genetic tools to find and identify microbes have upended the notion of rare mutualisms. It turns out that most animals and plants house complex and structured microbial communities, providing them food, safety, and even passage to new hosts. What's more, the microbes pay rent! Some manufacture enzymes or vitamins for their hosts, while others take care of toxins and enemies. The currency is varied and rich, with hundreds of examples of fascinating mutualisms. Especially in insects, such associations are so common that they are proposed to have driven the incredible diversification of insects across the earth.

Against this backdrop of rampant mutualism, recent work from Deepa Agashe's group at NCBS presents a jarring contrast. Her team found that unlike other insects, neither butterflies nor dragonflies seem to have evolved strong mutualisms with their bacterial guests. Instead, bacteria seem to be transient acquaintances, sampled randomly from species encountered in the diet or environment.

The case of dragonflies is interesting, because they are thought to be generalist predators of aquatic ecosystems. Their protein-rich diet could perhaps be more easily digested with the help of bacterial enzymes. Postdoctoral fellow Rittik Deb and project assistant Ashwin Nair dissected the guts of many dragonflies, and used genetic tools to identify the bacterial residents and insect prey. Strong host-bacterial mutualism should lead to consistent and similar bacterial communities across individual hosts. Instead, the team found that dragonflies with more diverse diets also housed varied bacterial communities. The work also provided some of the first evidence that dragonflies are not generalist predators. Different dragonfly species - even those living by the same pond on the NCBS campus - specialize on different insect prey, acquiring different bacteria in the process.

Most butterfly caterpillars also only eat specific plants. PhD student Kruttika Phalnikar thus expected that different butterflies should have tailored mutualisms with different bacteria. But as they mature into adults, leaf-eating caterpillars transition to sipping nectar, which should entail a dramatic shift in the bacterial community. Collaborating with butterfly expert Krushnamegh Kunte, Phalnikar analysed bacterial communities from several wild-caught butterflies. Surprisingly, she found similar bacteria on plant leaves; in caterpillars that ate the leaves; and in mature adult butterflies. Caterpillars of different species also housed more similar bacteria than expected. Parallel results from an independent study in the neotropics indicated that butterflies may generally lack a stable microbiome.

Could we go a step further with this idea? Unlike dragonflies, butterflies can be reared in a greenhouse, and the team could directly test whether losing bacterial communities was costly. Using antibiotics, Phalnikar killed bacteria found in the caterpillars of two butterfly species. Remarkably, the caterpillars developed just as well as control (unmanipulated) larvae. Even when she added fecal microbes back into the diet, caterpillar growth and survival were unaffected. Thus, butterflies do not seem to rely on bacteria to digest toxins in their food plants, or to acquire essential nutrients.

Together, these studies suggest a remarkable independence from bacterial mutualists in two very different insect groups. On the one hand this is puzzling, because establishing alliances is a very powerful (and oft-used) way to get ahead in life. The spectacular diversification of butterflies (India alone has ~1400 species) is also associated with the ability to eat a wide range of plants; many of which are toxic, difficult to digest, or offer poor nutrition. It seems incredible that butterflies managed to colonize all these niches on their own. On the other hand, it is not easy to find good partners, and even harder to maintain long-term relationships. Co-dependence is fraught with danger: partners may drift apart, go extinct, or turn on each other. We thus circle back to the idea that mutualisms should be rare.

These results open up new and exciting questions. How do butterflies and dragonflies manage without bacterial help? How did other insects successfully negotiate the pitfalls of co-dependence? More generally, can we predict when symbiosis will succeed? The wide spectrum of insect-bacterial mutualisms offers a unique opportunity to understand how trade partnerships establish, evolve, and dissolve over time.

Credit: 
National Centre for Biological Sciences

B cells may travel to remote areas of the brain to improve stroke recovery

LEXINGTON, Ky. (Feb. 17, 2020) -- New University of Kentucky research shows that the immune system may target other remote areas of the brain to improve recovery after a stroke.

The study in mice, published in PNAS by researchers from UK's College of Medicine, University of Texas Southwestern Medical Center and University of Pennsylvania reveals that after a stroke, B cells migrate to remote regions of the brain that are known to generate new neuronal cells as well as regulate cognitive and motor functions.

B cells are a type of white blood cell that makes antibodies. Less known and studied, however, is that B cells can produce neurotrophins that regulate the development and growth of neurons in the brain.

Ischemic stroke, the most common type of stroke, happens when an artery in the brain becomes clogged, typically by a blood clot. After an ischemic stroke, it is well-known that B cells travel to the site of the stroke as part of the immune response. But this new study shows that B cells may also move into multiple areas of the brain - both injured and uninjured.

"This is rather unique because it broadens our idea that we need to look at other areas of the brain when studying stroke," says Ann Stowe, UK associate professor in the Department of Neurology and senior author of the study. "These areas are really critical for functional recovery so they could potentially be targets for drug development or therapies."

Researchers studied the post-stroke recovery of mice and through whole-brain imaging saw that B cells not only migrated to the infarction, or site of the stroke, but to other areas supporting motor and cognitive recovery. Mice with depleted B cells experienced reduced recovery in these areas, confirming these findings.

The results could lead to new therapeutic avenues for stroke patients. The Centers for Disease Control and Prevention reports that stroke is a leading cause of adult disability and the fifth leading cause of death in the U.S. Currently, there are only two FDA-approved treatments for acute stroke and no effective therapeutics to promote long-term repair in the brain after stroke damage.

"This study suggests that B cells might have a more neurotrophic role," Stowe says. "Hopefully from this, we can better understand the inflammatory processes after stroke - and long term, possibly identify what subsets of immune cells can support stroke recovery."

Credit: 
University of Kentucky

Insufficient evidence backing herbal medicines for weight loss

image: Pill bottle with measuring tape.

Image: 
Image by Vidmir Raic from Pixabay

Researchers from the University of Sydney have conducted the first global review of herbal medicines for weight loss in 19 years, finding insufficient evidence to recommend any current treatments.

Senior author Dr Nick Fuller said with overweight and obesity rates reaching epidemic proportions worldwide, many people are turning to herbal supplements as an alternative approach to maintain or lose weight.

"The problem with supplements is that unlike pharmaceutical drugs, clinical evidence is not required before they are made available to the public in supermarkets or chemists," said Dr Fuller from the University of Sydney's Boden Collaboration for Obesity, Nutrition, Exercise and Eating Disorders based at its Charles Perkins Centre.

The systematic review and meta-analysis, published in Diabetes, Obesity & Metabolism, analysed the latest international research in this area finding 54 randomised controlled trials comparing the effect of herbal medicines to placebo for weight loss in over 4000 participants.

Results of the review and meta-analysis

The research team found that despite some of the herbal medicines showing statistically greater weight loss than placebo, weight loss was less than 2.5kg and therefore not of clinical significance.
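The kind of pooling behind such a review can be sketched as an inverse-variance weighted average of per-trial mean differences. The trial values below are hypothetical, not drawn from the review; the point is only the final comparison of the pooled estimate against a ~2.5 kg clinical-significance threshold.

```python
# Illustrative fixed-effect meta-analysis of mean weight-loss differences
# (kg) versus placebo. Trial data are invented for this sketch.

def pooled_mean_difference(trials):
    """trials: list of (mean_difference_kg, variance) pairs.
    Each trial is weighted by the inverse of its variance."""
    weights = [1.0 / var for _, var in trials]
    pooled = sum(w * md for (md, _), w in zip(trials, weights)) / sum(weights)
    return pooled

hypothetical_trials = [(-1.2, 0.4), (-0.8, 0.9), (-2.1, 1.5), (-0.3, 0.6)]
pooled = pooled_mean_difference(hypothetical_trials)
clinically_meaningful = abs(pooled) >= 2.5  # the threshold cited in the review
print(round(pooled, 2), clinically_meaningful)
```

Here the pooled difference favours the supplement but falls well short of 2.5 kg, mirroring the review's distinction between statistical and clinical significance. A full meta-analysis would typically use a random-effects model to account for between-trial heterogeneity.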

"This finding suggests there is insufficient evidence to recommend any of these herbal medicines for the treatment of weight loss. Furthermore, many studies had poor research methods or reporting and even though most supplements appear safe for short-term consumption, they are expensive and are not going to provide a weight loss that is clinically meaningful," said Dr Fuller.

About herbal medicines for weight loss

The most recent data on the use of weight loss supplements, from a US study, showed that among people trying to lose weight, 16 percent (12 percent of men and 19 percent of women) reported past-year use.

Herbal medicines, or 'herbal supplements' as they are commonly known, are products containing a plant or combinations of plants as the active ingredient. They come in various forms including pills, powders or liquids.

Common herbal supplements used for weight loss include green tea, garcinia cambogia, white kidney bean and African mango.

The authors write that between 1996 and 2006, 1000 dietary supplements for weight loss were listed on the Australian Register of Therapeutic Goods without evaluation of efficacy.

These substances can be sold and marketed to the public with sponsors (those who import, export or manufacture goods) only required to hold, but not necessarily produce, evidence substantiating their claims. The authors note that only 20 percent of new listings are audited annually to ensure they meet this requirement.

In some countries, the only requirement is that the supplement contains acceptable levels of non-medicinal substances.

"The growth in the industry and popularity of these products highlights the importance of conducting more robust studies on the effectiveness and safety of these supplements for weight loss," said Dr Fuller.

The review excluded studies where the herbal medicine did not include the whole plant, was comprised of plant oils or combined with other dietary supplements such as fibres and proteins. This analysis will be reported in a future paper.

Credit: 
University of Sydney