Culture

Bizarre new species discovered... on Twitter

image: The photo of the millipede Cambala shared on Twitter by Derek Hennen. The two red circles indicate the presence of the fungus.

Image: 
Derek Hennen

While many of us use social media to be tickled silly by cat videos or wowed by delectable cakes, others use them to discover new species. Included in the latter group are researchers from the University of Copenhagen's Natural History Museum of Denmark. Indeed, they just found a new type of parasitic fungus via Twitter.

It all began as biologist and associate professor Ana Sofia Reboleira of the Natural History Museum of Denmark was scrolling through Twitter. There, she stumbled upon a photo of a North American millipede shared by her US colleague Derek Hennen of Virginia Tech. She spotted a few tiny dots that caught her well-trained eye.

"I could see something looking like fungi on the surface of the millipede. Until then, these fungi had never been found on American millipedes. So, I went to my colleague and showed him the image. That's when we ran down to the museum's collections and began digging", explains Ana Sofia Reboleira.

Together with colleague Henrik Enghoff, she discovered several specimens of the same fungus on a few of the American millipedes in the Natural History Museum's enormous collection -- fungi that had never before been documented. This confirmed the existence of a previously unknown species of Laboulbeniales - an order of tiny, bizarre and largely unknown fungal parasites that attack insects and millipedes.

The newly discovered parasitic fungus has now been given its official Latin name, Troglomyces twitteri.

Social media meets museum

Ana Sofia Reboleira points out that the discovery is an example of how sharing information on social media can result in completely unexpected results:

"As far as we know, this is the first time that a new species has been discovered on Twitter. It highlights the importance of these platforms for sharing research - and thereby being able to achieve new results. I hope that it will motivate professional and amateur researchers to share more data via social media. This is something that has been increasingly obvious during the coronavirus crisis, a time when so many are prevented from getting into the field or laboratories."

Reboleira believes that social media is generally playing a larger and larger role in research.

She stresses that the result was possible because of her access to one of the world's largest biological collections.

"Because of our vast museum collection, it was relatively easy to confirm that we were indeed looking at an entirely new species for science. This demonstrates how valuable museum collections are. There is much more hiding in these collections than we know," says Ana Sofia Reboleira.

Underappreciated parasitic fungus

Laboulbeniales fungi look like tiny larvae. The fungi are in a class of their own because they live on the outside of host organisms, and even on specific parts of their bodies -- in this case, on the reproductive organs of millipedes. The fungus sucks nutrition from its host animal by piercing the host's outer shell using a special suction structure, while the rest of the fungus protrudes from the host's surface.

Approximately 30 different species of parasitic Laboulbeniales fungi are known to attack millipedes. The vast majority of these were only discovered after 2014, and according to Reboleira, a great number most likely remain to be discovered. Research on Laboulbeniales remains extremely scarce.

Nor is much known about the fungi's own biology, says Reboleira, who researches them on a daily basis. She believes that these fungi can teach us not only about the insects and millipedes they live on, but also about the mechanisms behind parasitism itself - that is, the relationship between parasites and their hosts. She hopes that the research will also provide useful knowledge about parasites that can be harmful to human health.

FACTS:

The new species Troglomyces twitteri belongs to the order of microscopic parasitic fungi known as Laboulbeniales. These fungi live on insects, arachnids and millipedes, and rely on their host organisms to survive.

The research results are published in the scientific journal MycoKeys: https://mycokeys.pensoft.net/article/51811

The research was conducted by: Sergi Santamaria of the Departament de Biologia Animal, de Biologia Vegetal i d'Ecologia, Universitat Autònoma de Barcelona, Spain; and Henrik Enghoff & Ana Sofia Reboleira from the Natural History Museum of Denmark at the University of Copenhagen.

Follow Ana Sofia Reboleira on Twitter: https://twitter.com/SReboleira

Millipede specimens from the Muséum national d'Histoire naturelle (MNHN) in Paris helped confirm the discovery of the new species of fungus.

The Natural History Museum of Denmark's entomological collection is one of the world's largest, housing more than 3.5 million pin-mounted insects and at least as many alcohol-preserved insect and land animal specimens. About 100,000 known species are represented (out of a total number of over one million species).

Credit: 
University of Copenhagen

Impacts of different social distancing interventions detectable two weeks later, shows German modeling study

In Germany, growth of COVID-19 cases declined after a series of three social distancing interventions, with the effect of each detectable at a two-week delay, but only after the third - a far-reaching contact ban - did case numbers decline significantly. These results, from a modeling study designed to better estimate the impact of various levels of social distancing on virus spread, indicate that the full extent of social distancing interventions was necessary to stop exponential growth in Germany, the authors say. Further, the two-week delay between an intervention and its detectable impact warrants caution in lifting restrictions: lifting too much too early could leave policymakers and others "effectively blind" to a worsened situation for nearly two weeks.

In the early days of a pandemic, reliable short-term forecasts of the impacts of interventions like social distancing are key for decision-makers. When case numbers are still low, the reliability of such forecasts is limited; Bayesian modeling can help. Here, Jonas Dehning and colleagues combined a Susceptible-Infected-Recovered (SIR) transmission model with Bayesian parameter inference to better estimate the impact of social distancing on the spreading rate of SARS-CoV-2 in Germany, where three interventions - starting with the cancelation of large public events on 7 March and ending with a far-reaching contact ban late in the month - were implemented over three weeks. Using data on deaths through 21 April, the authors report evidence of three change points, each detectable two weeks after an intervention and each reflecting slowed spread of the virus. Only at the third change point, initiated by the contact ban, did they see a crucial decline in new daily cases. Further simulations suggest that delaying social distancing by as little as five days can have severe consequences, the authors report.

They say that their findings of a two-week delay indicate it is important to consider lifting restrictions only when the number of active cases is so low that a two-week increase will not pose a serious threat to healthcare infrastructure. While applied to Germany, the approach can be adapted to other countries or regions.
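The kind of model described above can be illustrated with a minimal sketch: a discrete-time SIR simulation whose spreading rate drops at a "change point," standing in for an intervention. All parameter values below (rates, population, dates) are illustrative assumptions, not the study's fitted values, and the Bayesian inference of the change points themselves is omitted here.

```python
# Toy discrete-time SIR model with a change point in the spreading rate,
# illustrating (not reproducing) the kind of model the study fits.
# All parameter values are illustrative assumptions.

def simulate_sir(n_days, population, i0, rates, change_days, mu=0.125):
    """rates[0] applies from day 0; rates[k+1] applies from change_days[k]."""
    susceptible, infected = population - i0, float(i0)
    daily_new_cases = []
    for day in range(n_days):
        lam = rates[0]
        for start, rate in zip(change_days, rates[1:]):
            if day >= start:
                lam = rate  # intervention in force: lower spreading rate
        new_infections = lam * susceptible * infected / population
        recoveries = mu * infected
        susceptible -= new_infections
        infected += new_infections - recoveries
        daily_new_cases.append(new_infections)
    return daily_new_cases

# One intervention on day 30 cuts the spreading rate below the recovery
# rate, so new cases grow before the change point and shrink after it.
cases = simulate_sir(60, 83_000_000, 1000, rates=[0.4, 0.1], change_days=[30])
print(cases[10] > cases[5], cases[50] < cases[35])  # True True
```

In the study itself, the timing and size of each rate drop are not fixed in advance as they are here, but inferred from the case data with Bayesian methods.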

Credit: 
American Association for the Advancement of Science (AAAS)

Like thunder without lightning

Mergers between black holes and neutron stars in dense star clusters are quite unlike those that form in isolated regions where stars are few. Their associated features could be crucial to the study of gravitational waves and their source. Dr Manuel Arca Sedda of the Institute for Astronomical Computing at Heidelberg University came to this conclusion in a study that used computer simulations. The research may offer critical insights into the fusion of two massive stellar objects that astronomers observed in 2019. The findings were published in the journal "Communications Physics".

Stars much more massive than our sun usually end their lives as a neutron star or black hole. Neutron stars emit regular pulses of radiation that allow their detection. In August 2017, for example, when the first double neutron star merger was observed, scientists all around the globe detected light from the explosion with their telescopes. Black holes, on the other hand, usually remain hidden because their gravitational attraction is so strong that even light cannot escape, making them invisible to electromagnetic detectors.

If two black holes merge, the event may be invisible but is nonetheless detectable from ripples in space-time in the form of so-called gravitational waves. Certain detectors, like the "Laser Interferometer Gravitational-Wave Observatory" (LIGO) in the USA, are able to detect these waves. The first successful direct observation was made in 2015; the signal was generated by the fusion of two black holes. But such events are not the only source of gravitational waves, which could also come from the merger of two neutron stars or of a black hole with a neutron star. Telling these sources apart is one of the major challenges in observing these events, according to Dr Arca Sedda.

In his study, the Heidelberg researcher analysed the fusion of pairs of black holes and neutron stars. He used detailed computer simulations to study the interactions between a system made up of a star and a compact object, such as a black hole, and a third massive roaming object that is required for a fusion. The results indicate that such three-body interactions can in fact contribute to black hole-neutron star mergers in dense stellar regions like globular star clusters. "A special family of dynamic mergers that is distinctly different from mergers in isolated areas can be defined", explains Manuel Arca Sedda.

The fusion of a black hole with a neutron star was first observed by gravitational wave observatories in August 2019. Yet optical observatories around the world were unable to locate an electromagnetic counterpart in the region from which the gravitational wave signal originated, suggesting that the black hole had completely devoured the neutron star without first destroying it. If confirmed, this could be the first observed black hole-neutron star merger detected in a dense stellar environment, as described by Dr Arca Sedda.

Credit: 
Heidelberg University

New technology will show how RNA regulates gene activity

image: An artist's illustration of DNA, provided by the MIPT Press Office.

Image: 
MIPT Press Office

The discovery of a huge number of long non-protein coding RNAs, aka lncRNAs, in the mammalian genome was a major surprise of the recent large-scale genomics projects. An international team including a bioinformatician from the Research Center of Biotechnology of the Russian Academy of Sciences, and the Moscow Institute of Physics and Technology has developed a reliable method for assessing the role of such RNAs. The new technique and the data obtained with it allow generating important hypotheses on how chromatin is composed and regulated, as well as identifying the specific functions of lncRNAs.

Presented in Nature Communications, the technology is called RADICL-seq. It enables comprehensive mapping of each RNA to all the genomic regions it interacts with; many of these RNAs are likely to be important for genome regulation and the maintenance of genome structure.

RNA and gene regulation

It was previously believed that RNA functions mostly as an intermediary in building proteins based on a DNA template, with very rare exceptions such as ribosomal RNAs. However, with the development of genomic analysis, it turned out that not all DNA regions encode RNA, and not all transcribed RNA encodes proteins.

Although noncoding RNAs are roughly as numerous as those that encode proteins, the function of most noncoding RNA is still not entirely clear.

Every type of cell has its own set of active genes, resulting in the production of specific proteins. This makes a brain cell different from a blood cell of the same organism -- despite both sharing the same DNA. Scientists are now coming to the conclusion that RNA is one of the factors that determine which genes are expressed, or active.

Long noncoding RNAs are known to interact with chromatin -- DNA tightly packaged with proteins. Chromatin has the ability to change its conformation, or "shape," so that certain genes are either exposed for transcription or concealed. Long noncoding RNAs contribute to this conformation change and the resulting change in gene activity by interacting with certain chromatin regions. To understand the regulatory potential of RNA -- in addition to it being a template for protein synthesis -- it is important to know which chromatin region any given RNA interacts with.

How it works

RNAs interact with chromatin inside the cell nucleus by binding to chromatin-associated proteins that fold a DNA molecule. There are several technologies that can map such RNA-chromatin interactions. However, all of them have significant limitations. They tend to miss interactions, or require a lot of input material, or disrupt the nuclear structure.

To address these shortcomings, a RIKEN-led team has presented a new method: RNA and DNA Interacting Complexes Ligated and Sequenced, or RADICL-seq for short. The technique produces more accurate results and keeps the cells intact up until the RNA-chromatin contacts are ligated.

The main idea of the RADICL-seq method is as follows. First, the RNA is crosslinked with formaldehyde to proteins located close to it in the cell nucleus. Then, the DNA is cut into pieces by digesting it with an enzyme. After that, the technology employs RNase H treatment to reduce ribosomal RNA content, thus increasing the accuracy of the result. Then, using a bridge adapter (a molecule with single-stranded and double-stranded ends), the proximal DNA and RNA are ligated (fig. 2a). After the reversal of crosslinks, the RNA-adapter-DNA chimera is converted to double-stranded DNA for sequencing (fig. 2b), revealing the sequence of the ligated RNA and DNA.
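To make the resulting read structure concrete, here is a hedged sketch (not the published RADICL-seq pipeline) of how a sequenced RNA-adapter-DNA chimera could be split into its RNA and DNA tags. The adapter sequence below is a made-up placeholder, not the actual bridge adapter used in the protocol.

```python
# Sketch: split a chimeric read at the bridge-adapter sequence to recover
# the RNA tag and the genomic DNA tag. ADAPTER is a placeholder value.
ADAPTER = "CTGCTGCTCC"  # hypothetical; real protocols define a specific adapter

def split_chimera(read, adapter=ADAPTER):
    """Return (rna_tag, dna_tag), or None if the adapter is absent.

    In RADICL-seq the RNA end is ligated to one side of the bridge
    adapter and the proximal genomic DNA to the other, so the adapter
    position marks where the RNA tag ends and the DNA tag begins.
    """
    pos = read.find(adapter)
    if pos == -1:
        return None  # no adapter found: read cannot be interpreted
    rna_tag = read[:pos]
    dna_tag = read[pos + len(adapter):]
    return rna_tag, dna_tag

read = "AAGGTT" + ADAPTER + "CCGGAA"
print(split_chimera(read))  # ('AAGGTT', 'CCGGAA')
```

Each recovered (RNA tag, DNA tag) pair can then be aligned to the transcriptome and genome respectively, yielding one RNA-chromatin contact per read.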

Decoding the noncoding

In comparison with other existing methods, RADICL-seq mapped RNA-chromatin interactions with a higher accuracy. Moreover, the superior resolution of the technology allowed the team to detect chromatin interactions not only with the noncoding but also with the coding RNAs, including those found far from their transcription locus. The research confirmed that long noncoding RNAs play an important role in the regulation of gene expression occurring at a considerable distance from the regulated gene.

This technology can also be used to study cell type-specific RNA-chromatin interactions. The scientists proved it by looking at two noncoding RNAs in a mouse cell, one of them possibly associated with schizophrenia. They found that an interaction pattern between chromatin and those RNAs in two different cells -- the embryonic stem cell and the oligodendrocyte progenitor cell -- correlated with preferential gene expression in those cell types (fig. 3).

The new method's flexibility means scientists can gather additional biological information by modifying the experiment. In particular, this technology can make it possible to identify direct RNA-DNA interactions not mediated by chromatin proteins. The analysis performed by bioinformaticians from the Research Center of Biotechnology and MIPT showed that not only the standard double helix interactions between DNA and RNA but also those involving RNA-DNA triplexes could participate in gene regulation. Also, such interactions highlight the significance of noncoding RNA in protein targeting to particular gene loci.

"We are planning to conduct further research on the role of RNA in the regulation of gene expression, chromatin remodeling, and ultimately, cell identity. Hopefully, we will be able to regulate genes by using these noncoding RNAs in the near future. This can be especially helpful for treating diseases," says Yulia Medvedeva, who leads the Regulatory Transcriptomics and Epigenomics group at the Research Center of Biotechnology, RAS, and heads the Lab of Bioinformatics for Cell Technologies at MIPT. She also manages the grant project supported by the Russian Science Foundation, which co-funded the study.

Credit: 
Moscow Institute of Physics and Technology

Global cooling event 4,200 years ago spurred rice's evolution, spread across Asia

image: A simplified map shows the spread of rice into both northern and southern Asia following a global cooling event approximately 4,200 years before present (yBP).

Image: 
Rafal Gutaker, New York University

A major global cooling event that occurred 4,200 years ago may have led to the evolution of new rice varieties and the spread of rice into both northern and southern Asia, an international team of researchers has found.

Their study, published in Nature Plants and led by the NYU Center for Genomics and Systems Biology, uses a multidisciplinary approach to reconstruct the history of rice and trace its migration throughout Asia.

Rice is one of the most important crops worldwide, a staple for more than half of the global population. It was first cultivated 9,000 years ago in the Yangtze Valley in China and later spread across East, Southeast, and South Asia, followed by the Middle East, Africa, Europe, and the Americas. In the process, rice evolved and adapted to different environments, but little is known about the routes, timing, and environmental forces involved in this spread.

In their study, the researchers reconstructed the historical movement of rice across Asia using whole-genome sequences of more than 1,400 varieties of rice--including varieties of japonica and indica, two main subspecies of Asian rice--coupled with geography, archaeology, and historical climate data.

For the first 4,000 years of its history, farming rice was largely confined to China, and japonica was the subspecies grown. Then, a global cooling event 4,200 years ago--also known as the 4.2k event, which is thought to have had widespread consequences, including the collapse of civilizations from Mesopotamia to China--coincided with japonica rice diversifying into temperate and tropical varieties. The newly evolved temperate varieties spread in northern China, Korea, and Japan, while the tropical varieties spread to Southeast Asia.

"This abrupt climate change forced plants, including crops, to adapt," said Rafal M. Gutaker, a postdoctoral associate at the NYU Center for Genomics and Systems Biology and the study's lead author. "Our genomic data, as well as paleoclimate modeling by our collaborators, show that the cooling event occurred at the same time as the rise of temperate japonica, which grows in milder regions. This cooling event also may have led to the migration of rice agriculture and farmer communities into Southeast Asia."

"These findings were then backed up by data from archaeological rice remains excavated in Asia, which also showed that after the 4.2k event, tropical rice migrated south while rice also adapted to northern latitudes as temperate varieties," said Michael D. Purugganan, the Silver Professor of Biology at NYU, who led the study.

After the global cooling event, tropical japonica rice continued to diversify. It reached islands in Southeast Asia about 2,500 years ago, likely due to extensive trade networks and the movement of goods and peoples in the region--a finding also supported by archeological data.

The spread of indica rice was more recent and more complicated; after originating in India's lower Ganges Valley roughly 4,000 years ago, the researchers traced its migration from India into China approximately 2,000 years ago.

While the researchers had thought that rainfall and water would be the most limiting environmental factor in rice diversity, they found temperature to be the key factor instead. Their analyses revealed that heat accumulation and temperature were very strongly associated with the genomic differences between tropical and temperate japonica rice varieties.

"This study illustrates the value of multidisciplinary research. Our genomic data gave us a model for where and when rice spread to different parts of Asia, archaeology told us when and where rice showed up at various places, and the environmental and climate modeling gave us the ecological context," said Purugganan. "Together, this approach allows us to write a first draft of the story of how rice dispersed across Asia."

Understanding the spread of rice and the related environmental pressures could also help scientists develop new varieties that meet future environmental challenges, such as climate change and drought--which could help address looming food security issues.

"Armed with knowledge of the pattern of rice dispersal and environmental factors that influenced its migration, we can examine the evolutionary adaptations of rice as it spread to new environments, which could allow us to identify traits and genes to help future breeding efforts," said Gutaker.

Credit: 
New York University

Viral infection: Early indicators of vaccine efficacy

Researchers at Ludwig-Maximilians-Universität (LMU) in Munich have shown that a specific class of immune cells in the blood induced by vaccination is an earlier indicator of vaccine efficacy than conventional tests for neutralizing antibodies.

The current coronavirus pandemic, together with episodic outbreaks of infections caused by other pathogenic viruses, represent a growing threat to societies around the world, especially when effective vaccines are either lacking altogether or are in short supply. This underlines the importance of novel approaches to the prevention and treatment of viral infections. However, this task will require a better understanding of the complexities of the cellular immune response to viral challenges and vaccines. An interdisciplinary team of scientists working on the vaccine virus that prevents yellow fever now reports an important advance towards this goal. Led by Dirk Baumjohann (who until recently was based in LMU's Biomedical Center and is now a professor at Bonn University) and Simon Rothenfusser (Department of Clinical Pharmacology, LMU Medical Center and Helmholtz Zentrum München), the group has shown that the quality of the immune response induced by vaccination can be rapidly ascertained by measuring the frequency of a specific subclass of white blood cells in the circulation. The new findings appear in the journal Clinical & Translational Immunology.

The current vaccine against yellow fever is one of the most effective vaccines on the market. It belongs to the type known as 'live vaccines', as it contains whole virus particles of a vaccine strain of the yellow fever virus with greatly reduced virulence. Following vaccination, the vaccine virus replicates in the body, thus inducing the immune system to produce antibodies against it. These antibodies bind specifically to the virus, thereby preventing its entry into cells, and mark it for destruction by other components of the body's adaptive immune system. "In our study, we monitored the development of the immune response in the blood of healthy subjects," says Baumjohann. "The individuals involved belong to a cohort of 250 people who had been vaccinated against yellow fever. This cohort was recruited by the Departments of Clinical Pharmacology and Tropical Medicine at the LMU Medical Center, in collaboration with the Helmholtz Zentrum München."

The remarkable efficacy of the yellow fever vaccine is attributable to the fact that it triggers the production of very high levels of specific antibodies, which are detectable for decades after immunization. How the vaccine induces such an extraordinarily long-lived immune response is not fully understood. The induction of antibody formation and secretion is a complex process, in which cells called T follicular helper (Tfh) cells play an essential role. This particular subgroup of white blood cells is found primarily in secondary lymphoid organs, including the lymph nodes and the spleen. "However, these organs are difficult to analyze in humans. But what are referred to as 'Tfh-like cells' are found in the circulation, and their frequencies correlate quite well with the relative concentrations of classical Tfh cells in secondary lymphoid organs," explains Johanna Huber, a PhD student in Baumjohann's group and lead author of the new paper. "Therefore, analysis of circulating Tfh-like cells is relevant and can be performed directly in blood samples." Despite this ease of access, the role of these circulating T cells in the immune response to the yellow-fever vaccine had not previously been examined in detail.

Baumjohann and his colleagues have now shown that measurements of the properties and dynamics of Tfh-like cells allow one to predict the efficacy of the immune response induced by the yellow fever vaccine - even before detectable amounts of neutralizing antibodies that inhibit viral replication appear in the circulation. The formation of highly potent neutralizing antibodies in response to natural infection or vaccination can take up to several weeks. "In contrast, vaccine-induced T follicular helper cells can be quantified in the blood by about day 7 after immunization. In addition, we found that the frequency of a subpopulation of Tfh cells in the blood at two weeks after vaccination correlates with the quality of the antibody response that we detect a fortnight later," Baumjohann says.

The authors of the new study believe that these results are of potential significance for the development of effective vaccines against other viral diseases such as COVID-19. "How SARS-CoV-2 triggers an immune response is essentially unknown at the moment. The vaccine against yellow fever is a paradigmatic example of the induction of life-long immunity against a specific virus. Our findings on the immune cell phenotypes in the yellow fever vaccine cohort will therefore provide a useful source of data for comparative studies on the immunogenicity of SARS-CoV-2 and other emerging classes of viral pathogens and vaccines for which it is not yet known whether they induce a long-lasting protective immune response," adds Simon Rothenfusser.

Credit: 
Ludwig-Maximilians-Universität München

Early humans thrived in this drowned South African landscape

Early humans lived in South African river valleys with deep, fertile soils filled with grasslands, floodplains, woodlands, and wetlands that abounded with hippos, zebras, antelopes, and many other animals, some extinct for millennia.

In contrast to ice age environments elsewhere on Earth, it was a lush environment with a mild climate that disappeared under rising sea levels around 11,500 years ago.

An interdisciplinary, international team of scientists has now brought this pleasant cradle of humankind back to life in a special collection of articles that reconstruct the paleoecology of the Paleo-Agulhas Plain, a now-drowned landscape on the southern tip of Africa that was high and dry during glacial phases of the last 2 million years.

"These Pleistocene glacial periods would have presented a very different resource landscape for early modern human hunter-gatherers than the landscape found in modern Cape coastal lowlands, and may have been instrumental in shaping the evolution of early modern humans," said Janet Franklin, a distinguished professor of biogeography in the department of Botany and Plant Sciences at UC Riverside, an associate member of the African Centre for Coastal Palaeoscience at Nelson Mandela University in South Africa, and co-author of several of the papers.

Some of the oldest anatomically modern human bones and artifacts have been found in cliff caves along the coast of South Africa. For many years, the lack of shellfish in some layers at these sites puzzled archaeologists. In spite of apparently living near the ocean, the inhabitants hunted mostly big game -- the sort of animals that typically live farther inland.

Scientists knew a submerged landscape existed on the continental shelf just offshore, but it wasn't until recently -- perhaps inspired by the rising sea levels of our current human-caused global warming -- that they realized these caves might have made up the westernmost edge of a long-lost plain.

During most of the Pleistocene, the geological epoch before the one we live in now, these caves were not located on the coast. With so much of the Earth's water locked up in continent-sized glaciers, sea level was much lower, and humans could have thrived between the cliffs and a gentler coastline miles and miles to the east.

A special issue of Quaternary Science Reviews presents papers using a wide range of techniques to reconstruct the environment and ecology of the Paleo-Agulhas Plain. They reveal a verdant world rich with game, plant, and coastal resources, periodically cut off from the mainland during warm spells between glacial periods when sea level rose to levels similar to those of today, which would have played an important role in human evolution.

Franklin and her colleagues used modern vegetation patterns along the Cape south coast to develop models of the expected vegetation for the various soil types, as well as the climate (especially rainfall) and fire regimes of the past glacial periods that framed most of the timeframe in which modern humans emerged.

Joining her in the research were Richard M. Cowling and Alastair J. Potts of Nelson Mandela University; Guy F. Midgley at Stellenbosch University; Francois Engelbrecht of the University of Witwatersrand; and Curtis W. Marean of Arizona State University.

Vegetation was reconstructed based on a model of the ancient climate and fire patterns of these glacial phases that define human evolution. The group developed the vegetation model based on present-day patterns and environmental conditions, compared their model to an independently derived vegetation map to validate it, then applied it to the climate, landforms, and soils reconstructed for the peak of the last ice age on the Palaeo-Agulhas Plain.
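The workflow described above -- build a model from present-day patterns, validate it against an independent vegetation map, then apply it to reconstructed paleo-conditions -- can be sketched in miniature. The rules, classes, and thresholds below are invented for illustration and are not the authors' actual model.

```python
# Hypothetical sketch of the validate-then-apply workflow: predict a
# vegetation class from environmental variables with simple rules, then
# check agreement against an independently derived reference map.
# Classes, soils, and thresholds are made up for illustration.

def predict_vegetation(soil, rainfall_mm):
    """Toy rule-based model mapping soil type and rainfall to a class."""
    if soil == "limestone":
        return "limestone fynbos"
    if soil in ("shale", "alluvium") and rainfall_mm < 450:
        return "grassland"
    return "woodland"

# (soil, rainfall, observed class) cells from a hypothetical reference map
cells = [
    ("limestone", 400, "limestone fynbos"),
    ("shale", 350, "grassland"),
    ("alluvium", 600, "woodland"),
    ("shale", 500, "woodland"),
]

# Validation: fraction of map cells where the model matches the reference.
agreement = sum(predict_vegetation(s, r) == obs for s, r, obs in cells) / len(cells)
print(agreement)  # 1.0
```

Once the agreement on modern data is acceptable, the same `predict_vegetation` function would be applied to reconstructed ice-age soils and rainfall to map the paleo-landscape.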

Reconstruction, mapping, and modeling of the paleo-climate, geology, and soils by their collaborators are featured in other articles in the special issue.

The model found the paleo-landscape exposed during glacial low-sea levels added a land area the size of Ireland to the southern tip of Africa. Near the coast, it was dominated by "limestone fynbos," a low-stature, but species-rich shrubland typical of contemporary South Africa's Cape Floristic Province, a global plant diversity hotspot. The northern plains were mostly grasslands in shallow floodplains and on shale bedrock.

This savanna-like vegetation is rare in the modern landscape and would have supported the megafauna typical of glacial periods. These game animals, found in the archaeological record, include a great diversity of grazers, among them the now-extinct giant Cape buffalo and others that no longer occur naturally in this part of Africa, such as giraffes.

The Paleo-Agulhas Plain had extremely high plant species diversity, as well as a greater variety of ecosystems and plant communities than is currently found in this region, including shale grassland with a dune fynbos-thicket mosaic on the uplands, and broad, shallow floodplains supporting a mosaic of woodland and grassland on fertile alluvial soils.

Credit: 
University of California - Riverside

The dreaming brain tunes out the outside world

Scientists from the CNRS and the ENS-PSL in France and Monash University in Australia have shown that the brain suppresses information from the outside world, such as the sound of a conversation, during the sleep phase linked to dreaming. This ability could be one of the protective mechanisms of dreams. The study, carried out in collaboration with the Centre du Sommeil et de la Vigilance, Hôtel-Dieu, AP-HP - Université de Paris, is published in Current Biology on 14 May 2020.

While we dream, we invent worlds that bear no relation to the quietness of our bedroom. In fact, it is rather unusual for elements of our immediate environment to be incorporated into our dreams. To better understand how the brain protects itself from outside influences, researchers invited 18 participants to take a morning nap in the lab. Morning sleep is rich in dreams, which mostly occur during what is known as REM (rapid eye movement) sleep. During this phase, the brain is, in a sense, in a waking state, showing activity similar to that of a person who is awake. The body, on the other hand, is paralysed, although not entirely: during certain phases of REM sleep, the eyes continue to move, and research has shown that such movements are related to dreaming.

To study how the dreaming brain interacts with external sounds, the scientists got volunteer sleepers to listen to stories in French mixed with meaningless language. By combining the electroencephalogram with a machine learning technique, they confirmed that, even when the brain is asleep, it continues to record everything that goes on around it (2). They also showed that, during light sleep, the brain prioritises meaningful speech, just as it does when in the waking state. However, such speech is actively filtered out during eye movement phases in REM sleep. In other words, our sleeping brain can select information from the outside world and flexibly amplify or suppress it, depending on whether or not it is immersed in a dream!

The team believe that this mechanism enables the brain to protect the dreaming phase, which is necessary for emotional balance and consolidation of the day's learning. Although dreams are predominant during periods of eye movement, they can also occur during other phases of sleep. Are they then accompanied by a similar suppression of sensations from the outside world?

Credit: 
CNRS

How do plants forget?

The study, now published in Nature Cell Biology, sheds light on a capacity of plants known as 'epigenetic memory,' which allows them to record important information, for example to remember prolonged winter cold and so ensure they flower at the right time in the spring.

As soon as plants produce seeds, this information is "erased" from memory so that the seeds do not bloom too early the following winter.

Although they do it differently than humans, plants also have memories. This so-called "epigenetic memory" occurs by modifying specialized proteins called histones, which are important for packaging and indexing DNA in the cell. One such histone modification, called H3K27me3, tends to mark genes that are turned off. In the case of flowering, cold conditions cause H3K27me3 to accumulate at genes that control flowering. Previous work from the Berger lab has shown how H3K27me3 is faithfully transmitted from cell to cell so that in the spring, plants will remember that it was cold and that winter is over, allowing them to flower at the right time. But just as importantly, once they have flowered and made seeds, the seeds need to forget this 'memory' of the cold so that they do not flower too soon once winter comes around again. Since H3K27me3 is faithfully copied from cell to cell, how do plants go about forgetting this memory in seeds?

Jörg Becker, principal investigator at the Instituto Gulbenkian de Ciência, involved in the international team led by researcher Frédéric Berger, of the Gregor Mendel Institute of the Austrian Academy of Sciences, says that researchers set out to analyse histones in pollen, hypothesizing that the process of forgetting would most likely occur in the embedded sperm. According to Jörg Becker, "the study led us to identify a phenomenon, the so-called "epigenetic resetting", akin to erasing and reformatting data on a hard drive".

The researchers were surprised to find that H3K27me3 completely disappeared in sperm. They found that sperm accumulate a special histone that is unable to carry H3K27me3. This ensures that this modification is erased from hundreds of genes, not only those that prevent flowering but also ones which control a large array of important functions in seeds, which are produced once the sperm is carried by the pollen to fuse with the plant egg cell.

"This actually makes a lot of sense from an ecological perspective" says Dr. Borg, first author of the paper. "Since pollen can spread over long distances, by wind or bees for example, and much of the "memory" carried by H3K27me3 is related to environmental adaptation, it makes sense that seeds should "forget" their dad's environment and instead remember their mother's, since they are most likely to spread and grow next to mom."

According to Dr. Berger "Like plants, animals also erase this epigenetic memory in sperm, but they do it by replacing histones with a completely different protein. This is one of the first examples of how a specialized histone variant can help reprogram and reset a single epigenetic mark while leaving others untouched. There are many more unstudied histone variants in both plants and animals, and we expect that aspects of this resetting mechanism we have discovered will be found in other organisms and developmental contexts."

Credit: 
Instituto Gulbenkian de Ciencia

Blood clotting abnormalities reveal COVID-19 patients at risk for thrombotic events

video: Journal of the American College of Surgeons research findings highlight early research on blood clotting evaluation work that may help identify and treat dangerous complications of COVID-19. Commentary by lead study author Franklin Wright, MD, FACS, University of Colorado School of Medicine.

Image: 
American College of Surgeons

CHICAGO (May 15, 2020): When researchers from the University of Colorado Anschutz Medical Campus, Aurora, used a combination of two specific blood-clotting tests, they found critically ill patients infected with Coronavirus Disease 2019 (COVID-19) who were at high risk for developing renal failure, venous blood clots, and other complications associated with blood clots, such as stroke. Their study, which was one of the first to build on growing evidence that COVID-19-infected patients are highly predisposed to developing blood clots, linked blood clotting measurements with actual patient outcomes. The research team is now participating in a randomized clinical trial of a drug that breaks down blood clots in COVID-19-infected patients. "This is an early step on the road to discovering treatments to prevent some of the complications that come with this disease," said Franklin Wright, MD, FACS, lead author of the research article and an assistant professor of surgery at the University of Colorado School of Medicine. Their research is published as an "article in press" on the Journal of the American College of Surgeons website ahead of print.

Patients who are critically ill, regardless of cause, can develop a condition known as disseminated intravascular coagulation (DIC). The blood of these patients initially forms many clots in small blood vessels. The body's natural clotting factors can form too much clot or eventually become unable to form any effective clot, leading to problems of both excessive clotting and excessive bleeding. However, in patients with COVID-19 the clotting appears to be particularly severe, and, as evidenced by case studies in China and elsewhere [1], clots in COVID-19 patients do not appear to dissipate, explained Dr. Wright.

Trauma acute care surgeons and intensive care physicians who treat trauma, transplant, and cardiothoracic surgery patients at UC Health University of Colorado Hospital saw the potential of using a specialized coagulation test to examine clotting issues in COVID-19 patients. Thromboelastography (TEG) is a whole blood assay that provides a broad picture of how an individual patient's blood forms clots, including how long clotting takes, how strong clots are, and how soon clots break down. TEG is highly specialized and used primarily by surgeons and anesthesiologists to evaluate the efficiency of blood clotting; it is not widely used in other clinical settings. "The COVID pandemic is opening doors for multidisciplinary collaboration so trauma acute care surgeons and intensivists can bring the tools they use in their day-to-day lives and apply them in the critical care setting to new problems," Dr. Wright said.

The researchers evaluated outcomes for all patients who had a TEG assay as part of their treatment for COVID-19 infection as well as other conventional coagulation assays, including ones that measure D-dimer levels. D-dimer is a protein fragment that is produced when a blood clot dissolves. D-dimer levels are elevated when large numbers of clots are breaking down.

A total of 44 patients treated for COVID-19 infection between March 22 and April 20 were included in the analysis. Those whose bodies were not breaking down clots most often required hemodialysis and had a higher rate of clots in the veins. These patients were identified by TEG assays showing no clot breakdown after 30 minutes and a D-dimer level greater than 2600 ng/mL. Eighty percent of patients with both findings were placed on dialysis, compared with 14 percent of those with neither finding. Patients with both findings also had a 50 percent rate of venous blood clots, compared with 0 percent for patients with neither finding.
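The two-test rule described above, flagging patients who show no clot breakdown on TEG after 30 minutes together with a D-dimer above 2600 ng/mL, amounts to a simple classifier. The sketch below is an illustration only, not the study's analysis code; the function name, the use of the TEG lysis measure as a percentage, and the patient records are all hypothetical:

```python
# Hypothetical sketch of the study's two-test risk rule: a patient is
# flagged when the TEG assay shows no clot breakdown at 30 minutes
# (here represented as a lysis percentage of 0) AND the D-dimer level
# exceeds 2600 ng/mL. Field names and data are illustrative.

def is_high_risk(lysis_percent: float, d_dimer_ng_ml: float) -> bool:
    """Flag patients whose clots are not breaking down."""
    return lysis_percent == 0.0 and d_dimer_ng_ml > 2600

patients = [
    {"id": "A", "lysis": 0.0, "d_dimer": 3100},  # both findings -> flagged
    {"id": "B", "lysis": 1.5, "d_dimer": 3000},  # clot breakdown present
    {"id": "C", "lysis": 0.0, "d_dimer": 1200},  # D-dimer below threshold
]

flagged = [p["id"] for p in patients if is_high_risk(p["lysis"], p["d_dimer"])]
print(flagged)  # -> ['A']
```

In the reported cohort, patients meeting both criteria had far worse outcomes (80% on dialysis, 50% venous clots) than those meeting neither.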

"These study results suggest there may be a benefit to early TEG testing in institutions that have the technology to identify COVID-19 patients who may need more aggressive anticoagulation therapy to prevent complications from clot formation," Dr. Wright said.

A clinical trial of one form of treatment is already underway. The Denver Health and Hospital Authority is leading a multi-center study that includes UC Health University of Colorado Hospital, National Jewish Health-St Joseph Hospital, Beth Israel Deaconess Medical Center, and Long Island Jewish Hospital in conjunction with Genentech, Inc., enrolling patients with COVID-19 infection in a randomized clinical trial of tissue plasminogen activator (tPA). This drug is a clot-busting, thrombolytic medicine that was first approved by the U.S. Food and Drug Administration in 1987 for the treatment of heart attack and later approved for acute massive pulmonary embolism and acute ischemic stroke [2]. The trial will assess the efficacy and safety of intravenous tPA in improving respiratory function and management of patients with aggressive blood clotting.

"This study suggests that testing whole blood clotting measurements may allow physicians to identify and treat patients with COVID-19 more effectively to prevent complications and encourage further research into therapies to prevent blood clots in these patients," Dr. Wright said.

Credit: 
American College of Surgeons

AJR details COVID-19 infection control, radiographer protection in CT exam areas

image: (HIS = hospital information system, RIS = radiology information system)

Image: 
American Journal of Roentgenology (AJR)

Leesburg, VA, May 15, 2020--In an open-access article published ahead-of-print by the American Journal of Roentgenology (AJR), a team of Chinese radiologists discussed modifications to the CT examination process and strict disinfection of examination rooms, while outlining personal protection measures for radiographers during the coronavirus disease (COVID-19) outbreak.

As Jieming Qu, Wenjie Yang, and colleagues at Shanghai Jiao Tong University Medical School Affiliated Ruijin Hospital noted, to undergo CT, patients must exit the fever clinic and proceed to an examination room elsewhere at the institution. Moreover, CT examination rooms are not designed according to the rule of three zones and two aisles--clean zone, semicontaminated zone, contaminated zone; patient aisle and health care worker aisle.

"We were able to urgently install a CT scanner in the fever clinic at the beginning of the outbreak, which allowed rapid screening and early diagnosis," Qu et al. wrote. A safe infection control strategy for examination of patients with suspected SARS-CoV-2 was also implemented, including reconstructing the area and planning the path a patient would take. Additionally, Qu, Yang, and team rerouted the walking pathway to be one-way, limiting ingress and egress while separating contaminated zones from clean zones.

Qu, Yang, and colleagues' extensive routine for examination room disinfection included using an air disinfector (maximum volume of 4000 m3/h) and a movable ultraviolet light (intensity higher than 70 μW/cm2 per meter); cleaning nonplastic equipment surfaces, radiation protection items, and doorknobs with a solution of at least 75% alcohol; washing plastic surfaces with soapy solution; and mopping the floor with a disinfectant containing 2000 mg of chlorine per liter of water. Similarly, all patient waste was considered infectious medical waste and managed accordingly.

Typically, CT scanning is performed by two radiographers. As Qu et al. explained: "The operating radiographer works in the locked control room and controls the scanner (contaminated area). The positioning radiographer works inside and outside the scanning room (contaminated area) and is responsible for communicating with and positioning the patient. The positioning radiographer is not allowed to enter into the control room until the shift ends."

Once a shift is finished, the authors of this AJR article noted, the positioning radiographer is allowed to enter the clean zone only after protective equipment has been properly discarded in the buffer zone.

Credit: 
American Roentgen Ray Society

Persistent inequitable exposure to air pollution in Salt Lake County schools

image: Locations of the 174 public schools included in the study and the PM 2.5 sensors.

Image: 
Mullen et al., Environmental Research (2020)

Salt Lake County, Utah's air pollution varies over the year, and at times it is the worst in the United States. The region's geography traps winter inversions and summertime smog throughout the Salt Lake Valley, but underserved neighborhoods, and their schools, experience the highest concentrations. Previous research has shown pollution disparities using annual averages of PM 2.5, the tiny inhalable particles that can damage lungs just hours after exposure. Children are especially at risk and experience more than health effects alone: exposure to PM 2.5 affects school attendance and academic success.

A new study drew on a community-university partnership, the University of Utah's Air Quality and U (AQ&U) network of nearly 200 PM 2.5 sensors. U researchers explored social disparities in air pollution in greater detail than ever before, and their findings reveal persistent social inequalities in Salt Lake County. The paper was posted online ahead of publication in the journal Environmental Research.

The researchers analyzed PM 2.5 levels at 174 public schools in Salt Lake County, Utah, under three different scenarios: relatively clean days, moderate inversion days, and major inversion days. Schools with predominantly minority students were disproportionately exposed to worse air quality under all scenarios. Charter schools and schools serving students from low-income households were disproportionately exposed when PM 2.5 was relatively good or moderate. The findings speak to the need for policies that protect school-aged children from environmental harm.

“The persistence of these injustices — from the pretty clean, but health-harming levels all the way up to the horrific air days—at schools serving racial/ethnic minority kids is unacceptable,” said Sara Grineski, U professor of sociology and environmental studies and senior author of the paper.

The authors expected social disparities on bad air days, but were surprised that they persisted on clean air days when PM 2.5 levels are still higher than recommended by the U.S. Environmental Protection Agency.

“What makes this project so novel is the community-U partnership that gave us access to this larger network of sensors and helped provide the detailed study. If we had relied on Utah Department of Air Quality, we’d only have had two monitors and would have missed the nuanced variability,” said Casey Mullen, a doctoral student at the U and lead author of the study.

A higher-resolution snapshot

The worst PM 2.5 episodes occur during the winter, when cold air settles into the Salt Lake Valley and high-pressure weather systems act as a lid that seals in particulate matter from vehicle exhaust, wood-burning fires and emissions from industrial facilities. Locals refer to these periods as inversions, which can last from a few days to a few weeks. The lowest elevations experience high concentrations of PM 2.5 for the longest time, disproportionately impacting those residential communities. The study compared the PM 2.5 levels at 174 public schools in 10-minute increments over 2-day periods during each of three events: a major winter inversion (poor air quality), a moderate winter inversion (moderate air quality) and a relatively clean fall day (good air quality). The extensive AQ&U network, made up of 190 PM 2.5 sensors, is extremely sensitive: each sensor collects PM 2.5 concentrations every second, then uploads 60-second averages to a database that the public can access through the U's AQ&U website: https://aqandu.org.
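The sensor pipeline just described, 1 Hz readings condensed into 60-second uploads, can be sketched in a few lines. This assumes the 60-second uploads are simple averages of the per-second readings (an assumption; the AQ&U pipeline's exact aggregation method is not detailed here), and the readings below are synthetic, not real AQ&U data:

```python
# Illustrative sketch: each sensor samples PM 2.5 once per second and
# uploads one averaged value per 60-second window. Synthetic data only.

def minute_averages(readings_per_second):
    """Collapse a list of 1 Hz PM 2.5 readings into 60-second means."""
    averages = []
    for start in range(0, len(readings_per_second), 60):
        window = readings_per_second[start:start + 60]
        averages.append(sum(window) / len(window))
    return averages

# Two minutes of synthetic data: a minute at 8.0 ug/m3, then a minute
# at 12.0 ug/m3, as a rising pollution episode might look.
readings = [8.0] * 60 + [12.0] * 60
print(minute_averages(readings))  # -> [8.0, 12.0]
```

Averaging at the sensor keeps upload volume manageable while preserving the minute-scale variability that two fixed regulatory monitors would miss.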

The researchers broke down the 174 Salt Lake County public schools with respect to race/ethnicity, economic status, and student age. They also distinguished between school types: Title I schools (those serving a majority of students from low-income households), charter schools, and alternative or special education schools. The average student body was 31% Hispanic, 15% non-Hispanic minority, and 54% white, and about 45% of the schools were Title I eligible. Just over half of the schools were primary schools, about 16% were charter schools, and about 5% were alternative or special education schools.

During relatively clean air days, racial/ethnic minority students were disproportionately exposed to high concentrations. At the school level, a 21% increase in the proportion of Hispanic students was associated with a 12% increase in the concentration of PM 2.5, and charter schools were exposed to 20% higher concentrations of PM 2.5 than non-charter schools. During moderate air quality days, charter schools, Title I schools, and schools with greater proportions of minority students were exposed to higher concentrations of PM 2.5. During bad air quality days, exposure concentrations were higher for schools with larger proportions of minority students.

“No one has yet looked at school type in terms of environmental justice. Charter schools are a new variable that intrigued us,” said Mullen. “It’s starting to build on some other story – why did we find these inequities in charter and Title I schools?”

Looking forward

This paper is one of many collaborations using the newly established AQ&U network.

“This is the first publication from such a diverse cross-disciplinary partnership arising from AQ&U, although we anticipate this is the first of many,” said Kerry Kelly, assistant professor in the Department of Chemical Engineering and co-author of the study. “We are enthusiastic about ongoing partnerships—to understand the effect of pollution microclimates on asthma exacerbations; to predict the severity of wildfire smoke plumes; and to engage student researchers and community partners in understanding the effect of sound walls on air quality.”

In future studies, the researchers hope to fill in even more gaps in the sensors to get a better picture of the social inequalities in Salt Lake County, Utah and in other areas, especially with regards to school-aged children.

“I see research like this continuing to build a wall of evidence that we have to do better in the way in which we regulate pollution exposure in the U.S. and worldwide,” said Grineski. “Evidence on top of evidence points to us having to do a better job of protecting people, especially kids, from pollution.”

Co-authors of the study include: Timothy Collins of the U’s Department of Geography; Wei Xing of the U’s School of Computing; Ross Whitaker and Miriah Meyer of the U’s School of Computing and the Scientific Computing and Imaging Institute (SCI); Tofigh Sayahi of the U’s Department of Chemical Engineering; Tom Becnel and Pierre-Emmanuel Gaillardon of the U’s Department of Electrical and Computer Engineering; and Pascal Goffin of SCI.

Journal

Environmental Research

DOI

10.1016/j.envres.2020.109543

Credit: 
University of Utah

Further evidence does not support hydroxychloroquine for patients with COVID-19

The anti-inflammatory drug hydroxychloroquine does not significantly reduce admission to intensive care or death in patients hospitalised with pneumonia due to covid-19, finds a study from France published by The BMJ today.

A randomised clinical trial from China also published today shows that hospitalised patients with mild to moderate persistent covid-19 who received hydroxychloroquine did not clear the virus more quickly than those receiving standard care. Adverse events were higher in those who received hydroxychloroquine.

Taken together, the results do not support routine use of hydroxychloroquine for patients with covid-19.

Hydroxychloroquine can reduce inflammation, pain, and swelling, and is widely used to treat rheumatic diseases. It is also used as an anti-malarial drug. Lab tests showed promising results, but accumulating trial and observational evidence has called into question whether there are any meaningful clinical benefits for patients with covid-19.

Despite this, hydroxychloroquine has already been included in Chinese guidelines on how best to manage the disease, and the US Food and Drug Administration (FDA) issued an emergency use authorization to allow the drug to be provided to certain hospitalized patients. The FDA has since warned against use outside clinical trials or hospital settings due to the risk of heart rhythm problems.

In the first study, researchers in France assessed the effectiveness and safety of hydroxychloroquine compared with standard care in adults admitted to hospital with pneumonia due to covid-19 who needed oxygen.

Of 181 patients, 84 received hydroxychloroquine within 48 hours of admission and 97 did not (control group).

They found no meaningful differences between the groups for transfer to intensive care, death within 7 days, or developing acute respiratory distress syndrome within 10 days.

The researchers say that caution is needed in the interpretation of their results, but that their findings do not support the use of hydroxychloroquine in patients hospitalised with covid-19 pneumonia.

In the second study, researchers in China assessed the effectiveness and safety of hydroxychloroquine compared with standard care in 150 adults hospitalised with mainly mild or moderate covid-19.

Patients were randomly split into two groups. Half received hydroxychloroquine in addition to standard care and the others received standard care only (control group).

By day 28, tests revealed similar rates of covid-19 in the two groups but adverse events were more common in those who received hydroxychloroquine. Symptom alleviation and time to relief of symptoms also did not differ meaningfully between the two groups.

While further work is needed to confirm these results, the authors say that their findings do not support the use of hydroxychloroquine to treat patients with persistent mild to moderate covid-19.

Credit: 
BMJ Group

COVID-19 death counts 'substantial underestimation' of actual deaths for some Italian regions

Official covid-19 death counts are likely to be a "substantial underestimation" of the actual number of deaths from the disease, at least for some Italian regions, concludes a study published by The BMJ today.

The findings, from an Italian city severely affected by the covid-19 pandemic, show that more residents died in March 2020 than in the entire previous year or in any single year since 2012, but that only about half of the deaths occurring during the recent outbreak were reported as confirmed covid-19 deaths.

The researchers say counting deaths from all causes (known as "all cause mortality") would yield a more complete picture of the pandemic's effects on population health.

The global spread of covid-19 has severely affected the Lombardy region of northern Italy. But although the reported death rates from covid-19 are high, the real figures could be even higher according to all cause mortality data.

In an effort to accurately determine deaths from covid-19, researchers analysed the change in all cause mortality over time in Nembro, a small city with a relatively stable population of around 11,500 in the province of Bergamo, Lombardy, northern Italy.

Their findings are based on monthly all cause mortality data between January 2012 and 11 April 2020, the number of confirmed deaths from covid-19 to 11 April 2020, and the weekly absolute number of deaths between 1 January and 4 April across recent years by age group and sex.

Monthly all cause mortality between January 2012 and February 2020 fluctuated around 10 per 1000 person years, with a maximum of 21.5 per 1000 person years.

In March 2020, monthly all cause mortality reached a peak of 154.4 per 1000 person years - for comparison, the corresponding rate for the same month in 2019 was only 14.3 per 1000 person years. For the first 11 days in April, this rate decreased to 23 per 1000 person years.
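The per-1000-person-year rates quoted above follow from simple arithmetic. The back-of-the-envelope check below is not the authors' method; it assumes the stable population of roughly 11,500 stated earlier and a 31-day March:

```python
# Back-of-the-envelope check of the mortality rates quoted in the text.
# Rate per 1000 person years = deaths / (population * years observed) * 1000.
# The ~11,500 population comes from the article; death counts are inferred.

def rate_per_1000_person_years(deaths, population, days):
    person_years = population * (days / 365.0)
    return deaths * 1000.0 / person_years

# How many March 2020 deaths would produce the reported 154.4 per
# 1000 person years in a town of ~11,500? Invert the formula:
march_deaths = 154.4 / 1000.0 * 11500 * (31 / 365.0)
print(round(march_deaths))  # -> 151, roughly 150 deaths in a single month
```

That implied toll of roughly 150 deaths in one month is consistent with the 161 deaths the study reports for 23 February to 4 April, and dwarfs the baseline of about 10 per 1000 person years.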

Of the 161 people who died between 23 February and 4 April 2020, none were aged 14 years or younger and 14 (8.7%) were aged between 15 and 64 years.

The observed increase in all cause deaths was largely driven by the increase in deaths among older people (65 years and older), especially men. Among those aged 75 years and older, 47 deaths were observed during the week of 8 March alone, 33 of which were in men.

This is a descriptive study, so it cannot establish a causal relationship, and the researchers acknowledge that some of the data might be provisional.

The authors point out that the steep increase in all cause mortality was even more pronounced after further analysis to test the robustness of the results, and their findings back up results from a recent larger report from more than 1000 Italian cities.

As such, they say that across Italian cities, all cause mortality has notably increased because of the covid-19 pandemic, "but this increase is not being completely captured by officially reported statistics on confirmed covid-19 deaths."

They point to several factors that might have contributed to this discrepancy, such as shortages of tests to confirm cause of death and patients dying of indirect consequences of covid-19, such as the healthcare system crisis.

"These results suggest that the full implications of the covid-19 pandemic can only be completely understood if, in addition to confirmed deaths related to covid-19, consideration is also given to all cause mortality in a given region and time frame," they conclude.

Credit: 
BMJ Group