
A vaccine targeting aged cells mitigates metabolic disorders in obese mice

image: Upper: Senescent T cells were defined as PD-1+ CD153+ cells among CD4+ T cells. The proportion of senescent T cells in the visceral adipose tissue (VAT) of the high-fat diet (HFD) control group was significantly increased compared with that in the VAT of the normal diet (ND) control group. CD153-CpG vaccination of HFD mice resulted in a significant reduction in the accumulation of senescent T cells in VAT.
Lower: The quantification of the FACS plots shown in (A) is presented.

Image: 
Osaka University

Osaka, Japan - Aging is a multifaceted process that affects our bodies in many ways. In a new study, researchers from Osaka University developed a novel vaccine that removes aged immune cells and demonstrated that vaccinating obese mice with it improves diabetes-associated metabolic derangements.

Aged, or senescent, cells are known to harm surrounding younger cells by creating an inflammatory environment. A specific type of immune cell, the T cell, can become senescent and accumulate in the fat tissues of obese individuals, causing chronic inflammation, metabolic disorders and heart disease. To reduce the negative effects of senescent cells on the body, senotherapy was developed to target and eliminate these rogue cells. However, because this approach does not discriminate between different types of senescent cells, it has remained unknown whether specifically depleting senescent T cells can reverse their adverse effects on organ physiology.

"The idea that eliminating senescent cells improves the organ dysfunction that we experience during aging is fairly new," says corresponding author of the study Hironori Nakagami. "Because senescent T cells can facilitate metabolic derangements similar to diabetes, we wanted to come up with a new approach to reduce the number of senescent T cells to then reverse the negative effects they have on glucose metabolism."

To achieve their goal, the researchers developed a novel vaccine targeting the surface protein CD153, which is present on the senescent T cells populating fat tissues, thereby ensuring that normal T cells are not affected. To test the effects of their vaccine, the researchers fed mice a high-fat diet to make them obese and ultimately to mimic the metabolic changes seen in diabetes. These include insulin resistance and impaired glucose metabolism, both of which can facilitate a deterioration of the eyes, kidneys, nerves and the heart. When they vaccinated these mice against CD153, the researchers observed a sharp decline of senescent T cells in the fat tissues of the mice, demonstrating the success of their approach.

But did it improve glucose metabolism in the obese mice? To investigate this, the researchers turned to a test that is widely used clinically to diagnose diabetes and performed an oral glucose tolerance test in the mice, in which blood glucose levels were measured for up to 2 hours after giving the animals a known amount of glucose to drink. Vaccination against CD153 was able to restore glucose tolerance in obese mice. Unvaccinated obese mice, however, continued to have difficulties metabolizing glucose after intake and took a much longer time to reach blood glucose levels similar to those of the vaccinated animals. The researchers also measured the extent of insulin resistance, which is a cornerstone of the metabolic changes seen in obesity and diabetes. Vaccinated mice showed significant improvements in insulin resistance as compared with the unvaccinated animals, demonstrating that the hormone that the body produces to lower blood glucose levels functioned properly.

"These are striking results that show how reducing senescent T cells in adipose tissues improves glucose metabolism of obese mice," says Nakagami. "Our findings provide new insights into removing specific senescent cells using specific vaccines and could potentially be used as a novel therapeutic tool for controlling glucose metabolism in obese individuals."

Credit: 
Osaka University

Functional in silico dissection of the brain during the natural wake-sleep cycle

image: General approach followed to construct the semi-empirical model.

Image: 
UPF

The human brain is a complex system comprising 10^10 non-linear units (neurons) that interact at 10^15 sites (synapses). Considering such an astonishing level of complexity and heterogeneity, it is surprising that the global dynamics of the brain self-organize into a discrete set of well-defined states.

These states are frequently placed along a unidimensional continuum. This continuum corresponds to the level of consciousness, which is reduced in states such as sleep, general anaesthesia or post-comatose disorders. The intuition behind the concept of "level of consciousness" is that consciousness is graded and uniform.

An alternative to this conception is the multidimensional and mechanistic characterization of brain states in terms of cognitive capacities, using computational models to reproduce the underlying neural dynamics.

This is the focus of a study published in the advanced online edition of the journal NeuroImage. The first author of this international study is Ignacio Pérez Ipiña, a researcher in the Department of Physics at the University of Buenos Aires, who collaborated with Gustavo Deco, ICREA research professor at the Department of Information and Communication Technologies (DTIC) and director of the Center for Brain and Cognition (CBC) at UPF, along with other researchers from research centres and universities in Germany, Argentina, Australia, Chile, Denmark and the UK.

The experimental protocol involved participation by a cohort of 63 healthy subjects. "We explore this alternative by introducing a semi-empirical model linking regional activation and long-range functional connectivity in the different brain states studied during the natural wake-sleep cycle", the authors claim. "Our model combines functional magnetic resonance imaging (fMRI) data, in vivo estimates of structural connectivity, and anatomically-informed priors to constrain the independent variation of regional activation", they add.

The functional segregation of the human brain into systems that are differentially activated during cognition has been known since the earliest days of neurology, and this knowledge was greatly advanced by the introduction of non-invasive neuroimaging tools. Due to this specialization, even if different brain states bring about global changes in brain metabolism, the functional consequences of these changes are likely to manifest regional dependence. Thus, "we performed a computational simulation of functional connectivity for the different levels of arousal in the wake-deep sleep progression", Gustavo Deco clarifies.

The best computational fit to the empirical data was achieved using priors based on functionally coherent networks, with the resulting model parameters dividing the cortex into regions showing opposite dynamical behaviour. In this study, frontoparietal regions approached the bifurcation from noise-driven fixed-point dynamics, while sensorimotor regions approached it from oscillatory dynamics.
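The two regimes described here can be illustrated with the noisy Hopf (Stuart-Landau) normal form commonly used in whole-brain models of this kind. The sketch below is our own illustration with arbitrary parameter values, not the study's code: below the bifurcation the region shows small noise-driven fluctuations around a fixed point, while above it the region oscillates.

```python
import numpy as np

def simulate_hopf(a, omega=2 * np.pi * 0.05, beta=0.02, dt=0.1, steps=5000, seed=0):
    """Euler-Maruyama integration of the noisy Hopf (Stuart-Landau) normal form.

    a < 0: noise-driven fluctuations around a stable fixed point;
    a > 0: sustained oscillations with amplitude ~ sqrt(a).
    """
    rng = np.random.default_rng(seed)
    x, y = 0.1, 0.0
    xs = np.empty(steps)
    for t in range(steps):
        r2 = x * x + y * y
        dx = (a - r2) * x - omega * y
        dy = (a - r2) * y + omega * x
        x += dx * dt + beta * np.sqrt(dt) * rng.standard_normal()
        y += dy * dt + beta * np.sqrt(dt) * rng.standard_normal()
        xs[t] = x
    return xs

# Below the bifurcation (a < 0) activity stays small and noise-driven;
# above it (a > 0) the region oscillates with a much larger amplitude.
quiet = simulate_hopf(a=-0.2)
oscillating = simulate_hopf(a=+0.2)
```

The single parameter `a` moves a region across the bifurcation, which is how such models can represent regionally heterogeneous dynamics with few degrees of freedom.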

"In agreement with human electrophysiological experiments, sleep onset induced subcortical deactivation, which was subsequently reversed for deeper stages. We simulated external perturbations, and identified the key regions relevant for the recovery of wakefulness from deep sleep. Our model represents sleep as a state with diminished perceptual gating and the latent capacity for global accessibility that is required for rapid arousals", Deco explains.

In conclusion, the study authors implemented a computational model that synthesizes different sources of empirical data to achieve a mechanistic and multidimensional description of the intermediate complexity of the different brain states visited during the progression from wakefulness to deep sleep. This research shows that using the proposed model, states of consciousness can be described in terms of multiple dimensions with interpretations given by the choice of anatomically-informed priors.

Credit: 
Universitat Pompeu Fabra - Barcelona

Novel pathology could improve diagnosis and treatment of Huntington's and other diseases

Bristol scientists have discovered a novel pathology that occurs in several human neurodegenerative diseases, including Huntington's disease.

The article, published in Brain Pathology, describes how SAFB1 expression occurs in both spinocerebellar ataxias and Huntington's disease and may be a common marker of these conditions, which have a similar genetic background.

SAFB1 is an important protein controlling gene regulation in the brain and is similar in structure to other proteins associated with neurodegenerative diseases of age. The team, from the University of Bristol's Translational Health Sciences, wanted to find out if this protein might be associated with certain neurodegenerative conditions.

The researchers analysed SAFB1 expression in post-mortem brain tissue from patients with spinocerebellar ataxias (SCAs), Huntington's disease (HD), multiple sclerosis (MS) and Parkinson's disease, as well as controls.
They found that SAFB1 becomes abnormally expressed in the nerve cells of brain regions associated with SCA and HD. Both of these conditions are associated with a specific pathology, called a polyglutamine expansion (an amino acid repeat), which occurs only in SCAs and HD. Accordingly, the same pathology was not seen in the Parkinson's disease, multiple sclerosis or control tissue.

"These novel findings highlight a previously unknown mechanism causing disease which, importantly, suggests SAFB1 may be a diagnostic marker for polyglutamine expansion diseases, such as HD," said lead author James Uney, Professor of Molecular Neuroscience at the University of Bristol.

"We were also able to demonstrate how SAFB1 binds the SCA1 gene carrying the disease-causing polyglutamine expansion (which causes spinocerebellar ataxia 1). As well as identifying a possible diagnostic marker, these findings open up the possibility of developing new therapeutic treatments for these rare but devastating neurodegenerative diseases.

"The next step is to establish whether inhibiting SAFB1 expression protects patients."

Professor Uney said there was scope in the future to broaden the study to include other diseases, such as Alzheimer's disease.

Credit: 
University of Bristol

Goodbye Northwestern Crow, hello Mexican Duck

The latest supplement to the American Ornithological Society's Checklist of North and Middle American Birds, published in The Auk: Ornithological Advances, includes several major updates to the organization of the continent's bird species, including the addition of the Mexican Duck and the removal of the Northwestern Crow. The official authority on the names and classification of the region's birds, the checklist is consulted by birdwatchers and professional scientists alike and has been published since 1886.

The Northwestern Crow has long been considered a close cousin of the more familiar and widespread American Crow, with a range limited to the Pacific Northwest. However, a recent study on the genetics of the two species prompted AOS's North American Classification Committee to conclude that the two species are actually one and the same. "People have speculated that the Northwestern Crow and the American Crow should be lumped for a long time, so this won't be a surprise to a lot of people," says the U.S. Geological Survey's Terry Chesser, chair of the committee. "Northwestern Crows were originally described based on size, being smaller than the American Crow, and behavior, but over the years the people who've looked at specimens or observed birds in the field have mostly come to the conclusion that the differences are inconsistent. Now the genomic data have indicated that this is really variation within a species, rather than two distinct species."

However, birdwatchers disappointed to lose the Northwestern Crow from their life lists can take solace in the addition of a new species to the official checklist: the Mexican Duck. "The checklist recognized Mexican Duck until 1973, when it was lumped with Mallard," says Chesser. "But the Mexican Duck is part of a whole complex of Mallard-like species, including Mottled Duck, American Black Duck, and Hawaiian Duck, and all of those are considered distinct species except for, until recently, the Mexican Duck. Now genomic data have been published on the complex and on the Mexican Duck and Mallard in particular, and they show that gene flow between them is limited, which was enough to convince the committee to vote for the split."

Additional changes introduced in this year's checklist supplement include a massive reorganization of a group of Central American hummingbirds known as the emeralds -- adding nine genera, deleting six others, and transferring seven additional species between already-recognized genera -- as well as an update to the criteria for adding introduced, non-native species to the list that raises the bar for introduced species to officially be considered established. The full checklist supplement is available at https://academic.oup.com/auk/article-lookup/doi/10.1093/auk/ukaa030.

Credit: 
American Ornithological Society Publications Office

Could your computer please be more polite? Thank you

image: Researchers at Carnegie Mellon University have developed an automated method for making communications more polite. This image shows one possible implementation of this method: automated suggestions for improving politeness in emails.

Image: 
Carnegie Mellon University

PITTSBURGH--In a tense time when a pandemic rages, politicians wrangle for votes and protesters demand racial justice, a little politeness and courtesy go a long way. Now researchers at Carnegie Mellon University have developed an automated method for making communications more polite.

Specifically, the method takes nonpolite directives or requests -- those that use either impolite or neutral language -- and restructures them or adds words to make them more well-mannered. "Send me the data," for instance, might become "Could you please send me the data?"

The researchers will present their study on politeness transfer at the Association for Computational Linguistics annual meeting, which will be held virtually beginning July 5.

The idea of transferring a style or sentiment from one communication to another -- turning negative statements positive, for instance -- is something language technologists have been doing for some time. Shrimai Prabhumoye, a Ph.D. student in CMU's Language Technologies Institute (LTI), said performing politeness transfer has long been a goal.

"It is extremely relevant for some applications, such as if you want to make your emails or chatbot sound more polite or if you're writing a blog," she said. "But we could never find the right data to perform this task."

She and LTI master's students Aman Madaan, Amrith Setlur and Tanmay Parekh solved that problem by generating a dataset of 1.39 million sentences labeled for politeness, which they used for their experiments.

The source of these sentences might seem surprising. They were derived from emails exchanged by employees of Enron, a Texas-based energy company that, until its demise in 2001, was better known for corporate fraud and corruption than for social niceties. But half a million corporate emails became public as a result of lawsuits surrounding Enron's fraud scandal and subsequently have been used as a dataset for a variety of research projects.

But even with a dataset, the researchers were challenged simply to define politeness.

"It's not just about using words such as 'please' and 'thank you,'" Prabhumoye said. Sometimes, it means making language a bit less direct, so that instead of saying "you should do X," the sentence becomes something like "let us do X."

And politeness varies from one culture to the next. It's common for native speakers of North American English to use "please" in requests to close friends, but in Arab culture it would be considered awkward, if not rude. For their study, the CMU researchers restricted their work to speakers of North American English in a formal setting.

The politeness dataset was analyzed to determine the frequency and distribution of words in the polite and nonpolite sentences. Then the team developed a "tag and generate" pipeline to perform politeness transfers. First, impolite or nonpolite words or phrases are tagged and then a text generator replaces each tagged item. The system takes care not to change the meaning of the sentence.

"It's not just about cleaning up swear words," Prabhumoye said of the process. Initially, the system had a tendency to simply add words to sentences, such as "please" or "sorry." If "Please help me" was considered polite, the system considered "Please please please help me" even more polite.

But over time the scoring system became more realistic and the changes became subtler. First person singular pronouns, such as I, me and mine, were replaced by first person plural pronouns, such as we, us and our. And rather than position "please" at the beginning of the sentence, the system learned to insert it within the sentence: "Could you please send me the file?"
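The structure of the tag-and-generate pipeline can be sketched as follows. The real system uses trained neural taggers and generators; the rules and function below are our own toy illustration of the two-stage idea, not the researchers' code:

```python
import re

# Toy nonpolite patterns and their polite rewrites. In the actual pipeline,
# a learned tagger marks nonpolite spans and a learned generator replaces them.
NONPOLITE_PATTERNS = {
    r"^send me\b": "could you please send me",
    r"^give me\b": "could you please give me",
    r"\byou should do\b": "let us do",
}

def politeness_transfer(sentence):
    """Tag a nonpolite span, then generate a replacement for it,
    leaving the rest of the sentence (and its meaning) intact."""
    s = sentence.strip().rstrip(".")
    lowered = s.lower()
    for pattern, replacement in NONPOLITE_PATTERNS.items():
        if re.search(pattern, lowered):
            rewritten = re.sub(pattern, replacement, lowered, count=1)
            return rewritten[0].upper() + rewritten[1:] + "?"
    return sentence  # already polite, or outside the toy rules

print(politeness_transfer("Send me the data."))
# -> Could you please send me the data?
```

The key property the researchers emphasize survives even in this toy version: only the tagged span is rewritten, so the content of the request is preserved.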

Prabhumoye said the researchers have released their labeled dataset for use by other researchers, hoping to encourage them to further study politeness.

Credit: 
Carnegie Mellon University

Excess neutrinos and missing gamma rays?

image: NASA Hubble Space Telescope image of Galaxy NGC 1068 with its active black hole shown as an illustration in the zoomed-in inset. A new model suggests that the corona around such supermassive black holes could be the source of high-energy cosmic neutrinos observed by the IceCube Neutrino Observatory.

Image: 
NASA/JPL-Caltech

The origin of high-energy cosmic neutrinos observed by the IceCube Neutrino Observatory, whose detector is buried deep in the Antarctic ice, is an enigma that has perplexed physicists and astronomers. A new model could help explain the unexpectedly large flux of some of these neutrinos inferred by recent neutrino and gamma-ray data. A paper by Penn State researchers describing the model, which points to the supermassive black holes found at the cores of active galaxies as the sources of these mysterious neutrinos, appears June 30, 2020 in the journal Physical Review Letters.

"Neutrinos are subatomic particles so tiny that their mass is nearly zero and they rarely interact with other matter," said Kohta Murase, assistant professor of physics and of astronomy and astrophysics at Penn State and a member of the Center for Multimessenger Astrophysics in the Institute for Gravitation and the Cosmos (IGC), who led the research. "High-energy cosmic neutrinos are created by energetic cosmic-ray accelerators in the universe, which may be extreme astrophysical objects such as black holes and neutron stars. They must be accompanied by gamma rays or electromagnetic waves at lower energies, and sometimes even gravitational waves. So, we expect the levels of these various 'cosmic messengers' that we observe to be related. Interestingly, the IceCube data have indicated an excess emission of neutrinos with energies below 100 teraelectronvolts (TeV), compared to the level of corresponding high-energy gamma rays seen by the Fermi Gamma-ray Space Telescope."

Scientists combine information from all of these cosmic messengers to learn about events in the universe and to reconstruct its evolution in the burgeoning field of "multimessenger astrophysics." For extreme cosmic events that create neutrinos, like massive stellar explosions and jets from supermassive black holes, this approach has helped astronomers pinpoint the distant sources, and each additional messenger provides further clues about the details of the phenomena.

For cosmic neutrinos above 100 TeV, previous research by the Penn State group showed that it is possible to have concordance with high-energy gamma rays and ultra-high-energy cosmic rays, which fits with a multimessenger picture. However, there is growing evidence for an excess of neutrinos below 100 TeV that cannot be explained so simply. Very recently, the IceCube Neutrino Observatory reported another excess of high-energy neutrinos in the direction of one of the brightest active galaxies, known as NGC 1068, in the northern sky.

"We know that the sources of high-energy neutrinos must also create gamma rays, so the question is: Where are these missing gamma rays?" said Murase. "The sources are somehow hidden from our view in high-energy gamma rays, and the energy budget of neutrinos released into the universe is surprisingly large. The best candidates for this type of source have dense environments, where gamma rays would be blocked by their interactions with radiation and matter but neutrinos can readily escape. Our new model shows that supermassive black hole systems are promising sites and the model can explain the neutrinos below 100 TeV with modest energetics requirements."

The new model suggests that the corona--the aura of superhot plasma that surrounds stars and other celestial bodies--around the supermassive black holes found at the cores of galaxies could be such a source. Analogous to the corona seen in a picture of the Sun during a solar eclipse, astrophysicists believe that black holes have a corona above the rotating disk of material, known as an accretion disk, that forms around the black hole through its gravitational influence. This corona is extremely hot (with a temperature of about one billion kelvin), magnetized, and turbulent. In this environment, particles can be accelerated, which leads to particle collisions that would create neutrinos and gamma rays, but the environment is dense enough to prevent the escape of high-energy gamma rays.

"The model also predicts electromagnetic counterparts of the neutrino sources in 'soft' gamma rays instead of high-energy gamma rays," said Murase. "High-energy gamma rays would be blocked, but this is not the end of the story. They would eventually be cascaded down to lower energies and released as 'soft' gamma rays in the megaelectronvolt range, but most of the existing gamma-ray detectors, like the Fermi Gamma-ray Space Telescope, are not tuned to detect them."

There are projects under development that are designed specifically to explore such soft gamma-ray emission from space. Furthermore, upcoming and next-generation neutrino detectors, KM3NeT in the Mediterranean Sea and IceCube-Gen2 in Antarctica, will be more sensitive to these sources. The promising targets include NGC 1068 in the northern sky, for which the excess neutrino emission was reported, and several of the brightest active galaxies in the southern sky.

"These new gamma-ray and neutrino detectors will enable deeper searches for multimessenger emission from supermassive black hole coronae," said Murase. "This will make it possible to critically examine if these sources are responsible for the large flux of mid-energy level neutrinos observed by IceCube as our model predicts."

Credit: 
Penn State

How stress affects bone marrow

image: Both Sca-1 and CD86 are expressed on hematopoietic stem progenitor cells (HSPCs) at steady state. However, under biological stresses, such as infection and inflammation, myeloid progenitors acquire Sca-1 expression, which makes the identification of HSPCs impossible. Because CD86 expression remains unchanged even under the stresses, CD86-based analysis enables observation of bona fide hematopoietic responses.

Image: 
Department of Biodefense Research, Medical Research Institute, TMDU

Researchers from Tokyo Medical and Dental University (TMDU) identify the protein CD86 as a novel marker of infection- and inflammation-induced hematopoietic responses

Tokyo, Japan - Hematopoiesis can be affected by biological stresses, such as infection, inflammation and certain medications. In a new study, researchers from Tokyo Medical and Dental University (TMDU) identified a novel cell surface marker that enables the accurate analysis of hematopoietic responses to biological stress.

Hematopoiesis includes the production of all three types of blood cells: red blood cells, white blood cells, and platelets. It is a dynamic process that reacts to disease processes in and outside the bone marrow--the place where blood cells are produced. Previously, hematopoietic studies mainly relied on the analysis of the protein Sca-1, which is expressed by hematopoietic stem cells and early hematopoietic progenitor cells, both of which are common progenitors of all three types of blood cells. While Sca-1 is not expressed by most late progenitor cells specific to one type of blood cell, recent reports have increasingly suggested that these cells start expressing Sca-1 again in times of biological stress (Figure 1), reducing the reliability of hematopoietic analyses based on Sca-1 expression.

"Accurate analysis of hematopoiesis is crucial to our understanding of the pathogenesis of various diseases," says corresponding author of the study Toshiaki Ohteki. "The goal of our study was to identify an alternative, stable marker that can be reliably used to study hematopoietic responses to stress situations."

To achieve their goal, Masashi Kanayama, a main contributor to this project, injected a bacterial toxin into mice to induce systemic bacterial infection and detected an increase of Sca-1-positive hematopoietic progenitor cells, suggesting that Sca-1-negative cells started expressing the protein as a response to infection. To identify a superior marker to Sca-1, he screened 180 cell surface proteins and identified the protein CD86 as a novel candidate marker. In contrast to Sca-1, CD86 expression did not increase significantly upon systemic bacterial infection (Figure 1), confirming its potential to distinguish early and late hematopoietic progenitor cells under stress conditions.

But could CD86 help clarify how biological stress affects hematopoietic responses? To investigate this, the researchers focused on erythropoiesis, the production of red blood cells, in mice injected with the bacterial toxin. CD86-based analysis identified an early activation phase of erythropoiesis in the bone marrow within 18-24 hours after toxin injection, while Sca-1-based analysis did not. Further analysis showed that the number of red blood cells in the bone marrow peaked at 18 hours and then decreased to basal levels by 72 hours. Conversely, the number of red blood cells in the blood began to increase by 24 hours. Intriguingly, the researchers found that the newly produced cells had the morphological characteristics of mature red blood cells, that is, a smaller cell size and the absence of a nucleus, confirming that the cells were not red blood cell precursors.

"These are striking results that show how CD86 can rectify the shortcomings of Sca-1 in the analysis of hematopoiesis," says Ohteki. "Our findings provide new insights into the use of CD86 as an alternative marker to Sca-1 for assessing bona fide hematopoietic responses under stress conditions."

Credit: 
Tokyo Medical and Dental University

Toward principles of gene regulation in multicellular systems?

A team of quantitative biology researchers from Northwestern University has uncovered new insights into the impact of stochasticity in gene expression, offering new evolutionary clues into organismal design principles in the face of physical constraints.

In cells, genes are expressed through transcription, a process where genetic information encoded in DNA is copied into messenger RNA (mRNA). The mRNA is then translated to make protein molecules, the workhorses of cells. This entire process is subject to bursts of natural stochasticity -- or randomness -- which can impact the outcome of biological processes that proteins carry out.

The researchers' new experimental and theoretical analyses studied a collection of genes in Drosophila, a genus of fruit flies, and found that gene expression is regulated by the frequency of these transcriptional bursts.

"It has been known for almost two decades that protein levels can demonstrate large levels of stochasticity owing to their small numbers, but this has never been empirically demonstrated in multicellular organisms during the course of their development," said Madhav Mani, assistant professor of engineering sciences and applied mathematics at the McCormick School of Engineering. "This work for the first time identifies the role of randomness in altering the outcome of a developmental process."

A paper outlining the work, titled "The Wg and Dpp Morphogens Regulate Gene Expression by Modulating the Frequency of Transcriptional Bursts," was published June 22 in the journal eLife. Mani is a co-corresponding author on the study along with Richard Carthew, professor of molecular biosciences in the Weinberg College of Arts and Sciences. Both are members of Northwestern's NSF-Simons Center for Quantitative Biology, which brings together mathematical scientists and developmental biologists to investigate the biology of animal development.

This study builds upon a recent paper in which the researchers studied the role of stochastic gene expression on sensory pattern formation in Drosophila. By analyzing experimental perturbations of Drosophila's senseless gene against mathematical models, the team determined the sources of the gene's stochasticity, and found that the randomness appears to be leveraged in order to accurately determine sensory neuron fates.

The researchers applied that understanding to this latest study using a technique called single molecule fluorescence in situ hybridization (smFISH) to measure nascent and mature mRNA in genes downstream of two key patterning factors, Wg and Dpp, responsible for the organ development of fruit fly wings. In comparing the measurements to their data models, the researchers found that, while each gene's pattern of expression is unique, the mechanism by which expression is regulated -- which the team named "burst frequency modulation" -- is the same.

"Our results show that proteins' levels of randomness are impacted by the physical structure of the genome surrounding the gene of interest by modulating the features of the 'software' that control the levels of gene expression," Mani said. "We developed an experimental approach to study a large collection of genes in order to discern overall trends as to how the stochastic software of gene regulation is itself regulated."

The observed pattern of gene regulation, Mani said, works like a stochastic light switch.

"Let's say you are quickly flipping a light switch on and off, but you want more brightness out of your bulb. You could either get a brighter bulb that produced more photons per unit time, or you could leave the switch 'on' more than 'off,'" Mani said. "What we found is that organisms control the amount of gene expression by regulating how often the gene is permitted to switch on, rather than making more mRNAs when it is on."
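The light-switch analogy corresponds to a standard bursty-transcription model: mRNAs are made in bursts whose frequency and size can each be tuned. The sketch below is our own illustration of that general model, not the study's code; it shows that doubling burst frequency (the switch flipped "on" more often) doubles mean expression while burst size stays fixed:

```python
import numpy as np

def bursty_mrna(burst_rate, burst_size, decay=1.0, t_max=2000.0, seed=0):
    """Gillespie-style simulation of bursty transcription:
    bursts arrive as a Poisson process (frequency = burst_rate),
    each producing a geometric number of mRNAs (mean = burst_size),
    and each mRNA decays at rate `decay`. Returns the time-averaged
    mRNA count, whose expected value is burst_rate * burst_size / decay."""
    rng = np.random.default_rng(seed)
    t, m, total = 0.0, 0, 0.0
    while t < t_max:
        rate = burst_rate + decay * m      # total event rate
        dt = rng.exponential(1.0 / rate)   # waiting time to next event
        total += m * dt                    # accumulate time-weighted count
        t += dt
        if rng.random() < burst_rate / rate:
            m += rng.geometric(1.0 / burst_size)  # a burst of transcripts
        else:
            m -= 1                                # one mRNA decays
    return total / t

# Doubling burst FREQUENCY at fixed burst size doubles mean expression,
# as in the "leave the switch on more often" analogy.
low = bursty_mrna(burst_rate=0.5, burst_size=5)
high = bursty_mrna(burst_rate=1.0, burst_size=5)
```

The same mean could be reached by doubling burst size instead (the "brighter bulb"), but the two strategies leave different fingerprints in the noise, which is how smFISH data can distinguish them.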

Carthew, director of the Center for Quantitative Biology, added that this mode of gene expression regulation was observed for multiple genes, which hints at the possibility of a broader biological principle where quantitative control of gene expression leverages the random nature of the process.

"From these studies, we are learning rules for how genes can be made more or less noisy," Carthew said. "Sometimes cells want to harness the genetic noise -- the level of variation in gene expression -- to make randomized decisions. Other times cells want to suppress the noise because it makes cells too variable for the good of the organism. Intrinsic features of a gene can imbue them with more or less noise."

While engineers are excited by the ability to control and manipulate biological systems, Mani said, more fundamental knowledge needs to be discovered.

"We only know the tip of the iceberg," Mani said. "We are far from a time when basic science is considered complete and all that is left is engineering and design. The natural world is still hiding its deepest mysteries."

Credit: 
Northwestern University

More than medicine: Pain-relief drug delivers choices for mothers in labor

image: For women who choose pain relief in labour, fentanyl has been shown to reduce pain intensity while enabling women to work with the contractions.

Image: 
Photo by Hu Chen on Unsplash

Choice and control are important factors for ensuring a positive childbirth experience, yet until recently, little was known about the impact of alternative routes of administering fentanyl - one of the pain relief drugs used during labour - on both mother and baby.

Now, world-first research from the University of South Australia confirms that nasal or subcutaneous (injected) administration of fentanyl is a safe option for both mothers and babies, ensuring greater choice of pain relief for women during childbirth.

The study is the first to assess fentanyl concentrations following subcutaneous fentanyl administration.

Testing fentanyl levels in 30 mother-baby pairs (via maternal and cord blood samples taken within 30 minutes of birth), the study found that, despite nasally administered levels of fentanyl being significantly higher than those given by injection, all babies had lower levels of fentanyl in their systems than their mothers, regardless of administration method.

All babies had 5-minute Apgar scores within normal ranges; none required admission to the nursery for special care, and levels of the drug were considered very low - well below those shown to depress breathing.

This is in contrast to cord concentrations of pethidine and norpethidine, which other studies have shown to be comparable to maternal levels and to significantly suppress a baby's behaviour in the first few weeks of life.

Lead researcher, UniSA's Dr Julie Fleet, says the findings are an important step in understanding pain relief options in labour, providing support for less invasive forms of drug administration.

"Many women worry about managing pain during labour and the impact that their choices might have for themselves and their newborn child," Dr Fleet says.

"For women who choose pain relief in labour, there are still very few options available - the most common are 'gas' (nitrous oxide and oxygen), injection of a narcotic or opioid (such as fentanyl, morphine or pethidine), or an epidural - but as with all analgesics, there are side effects.

"Negating and managing side effects is critical for both mother and baby, which means the need for choices in pain relief is all the more essential.

"Fentanyl is a popular choice for managing pain during labour because it provides rapid pain relief while not restricting mobility, and reduces the incidence of adverse side effects such as nausea, vomiting or sedation.

"It can also be administered via nasal spray or small injection under the skin, enabling women more control over their pain.

"The strength of this research is that it confirms that fentanyl can be used safely for both mother and baby - regardless of whether it is administered nasally or via injection - giving strong supportive evidence of its use as an alternative pain relief option."

In South Australia, subcutaneous administration of fentanyl is standard practice, with the nasal spray growing in popularity.

"Importantly, for women who choose pain relief in labour, fentanyl has been shown to reduce pain intensity while enabling women to work with the contractions.

"Additionally, women report it provides increased autonomy and satisfaction in birth - both important factors for ensuring a positive birthing experience."

Credit: 
University of South Australia

Findings weaken notion that size equals strength for neural connections

image: Neurons from the hippocampus region of a rodent brain (left); A zoomed in section of the neural dendrites show spines where many synaptic connections with other neurons are formed.

Image: 
Stephanie Barnes/MIT Picower Institute

Learning, memory and behavioral disorders can arise when the connections between neurons, called synapses, do not change properly in response to experience. Scientists have studied this "synaptic plasticity" for decades, but a new study by researchers at MIT's Picower Institute for Learning and Memory highlights several surprises about some of the basic mechanisms by which it happens. Getting to the bottom of what underlies some of those surprises, the research further suggests, could yield new treatments for a disorder called Fragile X that causes autism.

Two classic forms of synaptic plasticity are that synapses either get stronger or weaker and that the tiny spine structures that support them get bigger or smaller. For a long time, the field's working assumption has been that these functional and structural changes were closely associated: strengthening went along with an increase in spine size and weakening preceded spine shrinkage. But the study published in Molecular Psychiatry adds specific evidence to support a more recent view, backed by other recent studies, that those correlations are not always true.

"We saw these breakdowns of correlation between structure and function," said Mark Bear, Picower Professor in the Department of Brain and Cognitive Sciences and senior author of the study. "One conclusion is you can't use spine size as a proxy for synaptic strength - you can have weak synapses with big bulbous spines. We are not the only ones to make this case, but the new results in this study are very clear."

The study's co-lead authors are former lab members Aurore Thomazeau and Miquel Bosch.

Dimensions of dissociation

To conduct the study the team stimulated plasticity via two different neural receptors (called mGluR5 and NMDAR) under two different conditions (neurotypical rodents and ones engineered with the mutation that causes Fragile X). In Fragile X, Bear's lab has found the lack of the protein FMRP leads to excess synthesis of other proteins that cause synapses to weaken too much in a brain region called the hippocampus, which is a crucial area for memory formation.

The first surprise of the study was that activating mGluR5 receptors induced the weakening, called long-term depression (LTD), but did not lead to any spine shrinkage in either Fragile X or control mice for at least an hour. In other words, the structural change that was assumed to go along with the functional change didn't actually occur.

In the NMDAR case, the two forms of plasticity did occur together, both in control and fragile X rodents, but not without a few more surprises lurking just beneath the surface that further dissociated functional and structural plasticity. When the team blocked a flow of ions (and therefore electric current) in the NMDAR synapses, that only prevented the weakening, not the shrinking. To prevent the shrinking in control rodents, the researchers had to do something different: inhibit protein synthesis either directly or by inhibiting a regulatory protein called mTORC1.

"It was quite amazing to us," Bear said. "We are following up on that aggressively to better understand that signaling."

A new opportunity for Fragile X

If several of the surprises in the study are disruptive, Bear said, another one may provide new hope for treating Fragile X. That's because while Bear's lab has focused on intervening in the mGluR pathway to treat Fragile X, the new experiments involving NMDAR may reveal an additional avenue.

When the team tried to prevent spines from shrinking via NMDAR in Fragile X rodents by inhibiting protein synthesis or mTORC1 (like they did in the controls), they found it didn't work. It was as if there was already too much of some protein that promotes the shrinkage. The team was even able to replicate this Fragile X phenomenon in the controls by first stimulating mGluR5 - and an ensuing excess of protein synthesis - and then following up with the NMDAR activation.

As a nod to both the mystery and the disorder, Bear has begun to refer to this conjectured potential shrinkage-promoting molecule as "protein X."

"The question is what is protein X," Bear said. "The evidence is quite strong that there is a rapidly turned over protein X that is wreaking havoc in Fragile X. Now the hunt is on. We'll be really excited to find it."

Credit: 
Picower Institute at MIT

Breast cancer drug, olaparib, depletes store of immature eggs in mouse ovaries

image: In normal mouse primordial eggs (stained green), we never detect DNA damage. But, on top of killing primordial eggs, olaparib causes DNA damage in some eggs that survive, suggesting they will either soon die, or be too poor quality to give rise to a healthy baby.

Image: 
Human Reproduction

Australian researchers have shown for the first time that a new drug used to treat breast cancer patients damages the store of immature eggs in the ovaries of mice.

The authors of the study, which is published today (Wednesday) in Human Reproduction [1], one of the world's leading reproductive medicine journals, say the drug olaparib is being used to treat young as well as older women with breast cancer that is driven by mutations in the BRCA1/2 genes, but without knowing its effect on fertility. [2]

Their research shows that olaparib destroys a significant proportion (36%) of the immature eggs that are contained in structures called primordial follicles. Women are born with a finite number of follicles in their ovaries and during their reproductive lifespan some of the eggs (or oocytes) will start to grow to the stage at which they are released from the ovaries and can be fertilised by sperm. Therefore, a reduction in the store of follicles through damage from cancer treatment has the potential to affect a woman's fertility.

Olaparib has not been used on young women long enough to see how it affects their fertility, so this study in mice is an important indication of its effects.

First author of the study, Dr Amy Winship, a research fellow at Monash University's Biomedicine Discovery Institute (Melbourne, Australia), said: "Although there are differences between species, such as the number of eggs ovulated in a menstrual cycle, there are many important similarities that make the mouse an excellent model for studying the human ovary. The storage of primordial follicles, and the processes of activation, growth and ovulation are all the same.

"Fertility is very commonly overlooked in many safety tests in pre-clinical studies in animal models and also in human clinical trials for new cancer drugs. But we know this is an important and valid concern of young cancer patients and survivors, particularly as survival rates for many cancers are improving. We show for the first time that olaparib is harmful to the immature eggs stored in the ovaries that will give rise to the mature eggs required to sustain fertility and normal hormone levels."

Olaparib inhibits DNA repair by blocking the action of a family of enzymes called poly(ADP-ribose) polymerase or PARP, and so the drug is used to prevent cancer cells repairing themselves and continuing to replicate and grow. In 2018, olaparib was approved for use in women with BRCA1/2 breast cancer that had spread (metastasised) to other parts of the body. A randomised controlled trial, OlympiA [3], is currently investigating its use in women with early, potentially curable BRCA1/2 breast cancer that has not yet metastasised. Many of these will be young women who have yet to start or complete a family.

The researchers gave mice a single dose of chemotherapy (cyclophosphamide, doxorubicin, carboplatin or paclitaxel) or a placebo, followed by a daily dose of olaparib or a placebo for the next 28 days. Then the mice were killed humanely and the researchers counted the number of primordial follicles and growing follicles in the ovaries. They also counted the number of follicle remnants, in which the eggs had been destroyed.

"We found that olaparib significantly depleted primordial follicles by 36% compared to ovaries of mice who had not received the drug," said Dr Winship. "We detected a significant accumulation of primordial follicle remnants in mice given olaparib, while these were rarely detected in the ovaries of the untreated mice, indicating that olaparib is likely to destroy immature eggs. Olaparib did not affect growing eggs and follicles or hormone levels.

"If a drug kills growing eggs, ovulation and fertility might be temporarily impacted, but more eggs can be activated from the immature primordial stockpile and ovulation will resume as normal. In contrast, if the primordial eggs are killed, this is more serious. This can ultimately lead to complete infertility and early menopause once the stockpile is gone. We have shown for the first time, olaparib kills primordial, but not growing eggs and follicles."

Although mice treated with chemotherapy had reduced numbers of primordial follicles in their ovaries, the researchers found that olaparib did not exacerbate this loss when compared with mice that received chemotherapy but not olaparib.

In their paper, the researchers conclude: "Since direct measures of primordial follicle number are not possible in women, our data presented here have important clinical implications. Female cancer patients may present clinically with regular menstrual cycles and serum AMH concentrations within the normal range after olaparib treatment, but, unknowingly have a significantly depleted ovarian reserve of primordial follicles. Diminished ovarian reserve leads to infertility and premature menopause. Therefore, fertility preservation counselling should be considered for young female patients prior to olaparib treatment."

Anti-Mullerian Hormone (AMH) is often used to assess the reserve of follicles in a woman's ovary, but AMH levels were unaffected by olaparib, which could lead to a false sense of reassurance among women and their fertility doctors.

Professor Kelly-Anne Phillips is a medical oncologist at the Peter MacCallum Cancer Centre, Melbourne, Australia, and an author of the paper. She said: "Although the OlympiA trial of olaparib includes younger women with early breast cancer, at present the drug is approved only for use in women with metastatic breast cancer for whom fertility preservation is not appropriate. However, in future olaparib and similar drugs may be approved for use in young women, with the intention of curing them of the disease, and for these women fertility preservation is an important consideration. Therefore, it is very important to understand its effects on ovarian function and we encourage researchers to consider measuring ovarian function and fertility in future clinical trials so that we have data from humans to support or refute the findings of our study in mice."

Fertility preservation can include removing and freezing ovarian tissue, follicles or mature eggs before treatment with chemotherapy, olaparib or other anti-cancer treatments that have the potential to damage the ovaries and eggs.

Credit: 
European Society of Human Reproduction and Embryology

Novel software reveals molecular barcodes that distinguish different cell types

image: Dotted box represents a tissue sample containing four cell types. Filled and empty circles represent methylated and unmethylated CpGs, respectively.

Image: 
Image courtesy of the Waterland lab/Genome Biology, 2020.

There are about 75 different types of cells in the human brain. What makes them all different? Researchers at Baylor College of Medicine have developed a new set of computational tools to help answer this question. Although different cell types from the same organism carry the same DNA, they look and function differently because a different set of genes is active or inactive in each. Cells switch genes on or off by using epigenetic mechanisms, such as DNA methylation, which involves tagging genes with methyl chemical groups.

To better understand how epigenetic regulation works, researchers study DNA methylation signals in whole genome datasets. These datasets contain the sequences of the building blocks that make up the DNA in a cell population. However, when the tissue being studied, like the brain, is made up of many different cell types, existing analytical approaches cannot distinguish methylation signals arising from those different cell types.

Now, a new set of computational methods developed at Baylor allows researchers to identify cell-type specific methylation patterns - molecular barcodes - in complex cell mixtures. These new computational tools, published in the journal Genome Biology and available for free download, can be applied to existing whole-genome methylation datasets from any species. This opens exciting new possibilities to improve our understanding of how DNA methylation regulates cellular function.

Identifying cell type-specific molecular barcodes

"The current gold-standard approach to study DNA methylation is whole genome bisulfite sequencing (WGBS), a next-generation sequencing technology that determines DNA methylation of each cytosine, one of the DNA building blocks, in the entire genome," said co-corresponding author Dr. Cristian Coarfa, associate professor of molecular and cellular biology and part of the Center for Precision Environmental Health at Baylor.

WGBS studies typically report the average methylation level at each cytosine. In tissues made up of multiple cell types, however, this average reflects a mashup of the methylation level of each cell type in the mixture, obscuring cell-type specific differences.

"The key insight that motivated the current study is that the DNA sequence 'reads' in WGBS data are direct descendants of DNA molecules originating from different cells of the tissue. We postulated that the methylation 'patterns' we detect on tissue sequencing reads contain information about what cell types the reads originated from," said co-corresponding author Dr. Robert A. Waterland, professor of pediatrics - nutrition at the USDA/ARS Children's Nutrition Research Center at Baylor and Texas Children's Hospital. "To test this we developed software that identifies these cell type-specific methylation patterns within bulk WGBS data. This software is called Cluster-Based analysis of CpG methylation (CluBCpG)."

As one validation, the researchers used CluBCpG to analyze WGBS datasets from two types of human immune cells, B cells and monocytes. They were able to identify over 100,000 unique molecular barcodes within each cell type. Then, they applied their method to mixtures of reads from another WGBS dataset from these two cell types, from entirely different people.

"Just by counting occurrences of these molecular barcodes in the novel datasets, CluBCpG allowed us to precisely determine the percentage of B cells and monocytes in each mixture," said Dr. C. Anthony Scott, former postdoctoral researcher in the Waterland lab and co-first author on the paper. "We also showed that these cell-type specific signals are associated with cellular functions in different types of human and mouse brain cells and blood cells, and that they can even predict which genes are expressed."
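The counting idea can be sketched in a few lines (a hypothetical illustration of the principle, not the CluBCpG implementation; the pattern strings and cell-type sets are made up): reads whose methylation pattern matches a known cell-type-specific barcode are tallied, and the tallies give the mixture proportions.

```python
from collections import Counter

# Made-up cell-type-specific methylation patterns ('M' = methylated CpG,
# 'U' = unmethylated CpG) standing in for the barcodes CluBCpG identifies.
B_CELL_BARCODES   = {"MMUU", "MUMU"}
MONOCYTE_BARCODES = {"UUMM", "UMUM"}

def estimate_mixture(reads):
    """Estimate cell-type proportions in a bulk sample by counting how often
    each cell type's barcode patterns occur among the sequencing reads."""
    counts = Counter()
    for pattern in reads:
        if pattern in B_CELL_BARCODES:
            counts["B cell"] += 1
        elif pattern in MONOCYTE_BARCODES:
            counts["monocyte"] += 1
    informative = sum(counts.values())  # reads matching any known barcode
    return {cell: n / informative for cell, n in counts.items()}

# A 1:2 B cell/monocyte mixture, plus 10 reads matching no barcode.
reads = ["MMUU"] * 30 + ["UUMM"] * 60 + ["MMMM"] * 10
print(estimate_mixture(reads))  # B cells ~1/3, monocytes ~2/3
```

Uninformative reads are simply ignored; only barcode-bearing reads vote on the composition.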

In the last 10 years, scientists generated thousands of WGBS data sets costing millions of dollars, yet were unable to appreciate much of the information available in the data. "It's a bit like wearing noise-cancelling headphones to the symphony," said Waterland, also a professor of molecular and human genetics at Baylor. "Now, for the first time, researchers can 'tune in' to the full richness and complexity of WGBS data."

Boosting the information content of existing datasets

The CluBCpG software works together with a second development, a sophisticated machine-learning software package called Precise Read-Level Imputation of Methylation (PReLIM). This software 'fills in' missing information on sequencing reads that cover some of the sites in a region, increasing the information content of existing WGBS datasets by 50 to 100 percent.

"PReLIM learns from the hundreds of millions of reads in each WGBS dataset to predict the methylation state at missing sites on individual sequence reads," said Jack D. Duryea, former student in the Waterland lab and co-first author on the paper. "We showed that PReLIM's predictions are correct 95 percent of the time."
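A crude stand-in conveys what read-level imputation means (PReLIM itself uses machine learning on far richer features; this majority-vote version is purely illustrative): a read's missing CpG state is filled in from the states observed at that position on the other reads covering it.

```python
def impute_missing(reads):
    """Toy read-level imputation: fill each missing CpG state ('?') on a read
    with the majority state seen at that position across all reads covering it
    ('M' = methylated, 'U' = unmethylated)."""
    imputed = []
    for read in reads:
        filled = list(read)
        for i, state in enumerate(read):
            if state == "?":
                column = [r[i] for r in reads if r[i] != "?"]
                # Majority vote; default to unmethylated on a tie or no coverage.
                filled[i] = "M" if column.count("M") > column.count("U") else "U"
        imputed.append("".join(filled))
    return imputed

# Four reads over the same four CpG sites, each missing one site.
reads = ["MMU?", "MM?U", "M?UU", "?MUU"]
print(impute_missing(reads))  # every '?' filled in: ['MMUU', 'MMUU', 'MMUU', 'MMUU']
```

Each completed read then contributes a full methylation pattern, which is what boosts the usable information in an existing dataset.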

Since WGBS datasets cost thousands of dollars to generate, getting 50 to 100 percent more data - at no extra charge - is a big deal.

The researchers anticipate these new computational developments will be applied to study methylation differences in normal cells as well as in disease.

"For instance, these methods will provide better resolution in studies aiming to identify methylation differences between a healthy brain and one with a disease. We might be able to determine, for example, that epigenetic changes linked to a disease occur only in one specific type of brain cell, which would be a major step toward understanding a disease," Waterland said.

Credit: 
Baylor College of Medicine

RDA publishes final version of COVID-19 recommendations and guidelines on data sharing

image: Today, 30 June 2020, the Research Data Alliance publishes the final version of the RDA COVID-19 Recommendations and Guidelines for Data Sharing covering four research areas - clinical data, omics practices, epidemiology and social sciences. This document is also complemented by overarching areas focusing on legal and ethical considerations, research software, community participation and indigenous data.

Image: 
Research Data Alliance

In public health emergencies, and particularly the COVID-19 pandemic, it is fundamental that data is shared in a timely and accurate manner. This, coupled with the need to harmonise many diverse data infrastructures, makes it now more than ever imperative to share preliminary data and results early and often. It is clear that open research data is a key component of pandemic preparedness and response.

In late March, RDA received a direct request from one of its funders, the European Commission, to create global guidelines and recommendations for data sharing under COVID-19 circumstances. Over 600 data professionals and domain experts signed up and began work in early April 2020.

They have produced a rich set of detailed guidelines to help researchers and data stewards follow best practices to maximise the efficiency of their work, and to act as a blueprint for future emergencies; coupled with recommendations to help policymakers and funders to maximise timely, quality data sharing and appropriate responses in such health emergencies.

On 30 June 2020, RDA published the final version of the RDA COVID-19 Recommendations and Guidelines on data sharing covering four research areas - clinical data, omics practices, epidemiology and social sciences - complemented by overarching areas focusing on legal and ethical considerations, research software, community participation and indigenous data.

Some highlights of the common challenges that emerged from these areas include:

The unprecedented spread of the virus has prompted a rapid and massive research response with a diversity of outputs that pose a challenge to interoperability.

To make the most of global research efforts, findings and data need to be shared equally rapidly, in a way that is useful and comprehensible.

The challenge here, of course, is the trade-off between timeliness and precision. The speed of data collection and sharing needs to be balanced with accuracy, which takes time.

The lack of pre-approved data sharing agreements and archaic information systems hinder rapid detection of emerging threats and development of an evidence-based response.

While the research and data are abundant, multi-faceted, and globally produced, there is no universally adopted system or standard for collecting, documenting, and disseminating COVID-19 research outputs.

Furthermore, many outputs are not reusable by, or useful to, different communities if they have not been sufficiently documented and contextualised, or appropriately licensed.

Correspondingly, research software is developed and maintained in an ad hoc fashion. The software used for the analyses in papers is not cited consistently and, when it is made available, it is often placed in arbitrary locations with no guarantee of persistence.

The report specifically emphasises the importance of the following during the COVID-19 emergency response:

Sharing clinical data in a timely and trustworthy manner to maximise the impact of healthcare measures and clinical research during the emergency response;

encouraging people to publish their data alongside a paper (particularly important in reference to omics data);

underlining that epidemiology data underpin early response strategies and public health measures;

providing general guidelines to collect or link important social and behavioral data in all pandemic studies; 

evidencing the importance of sharing research software alongside the research data it analyses, and providing guidelines and best practices for enabling this;

offering general guidance to navigate the applicable rule of law and exploit relevant ethical frameworks relating to the collection, analysis and sharing of data in similar emergency situations;

looking at data management and sharing issues related to the technical, social, legal and ethical considerations from the community participation perspective.

The RDA COVID-19 activities were conducted under the RDA guiding principles of Openness, Consensus, Balance, Harmonisation, Community-driven, Non-profit and Technology-neutral. The results and outputs are open to all.

Credit: 
Research Data Alliance (RDA)

A new view of microscopic interactions

image: A team of scientists led by Arthur Suits at the University of Missouri has developed a new experimental approach to study chemistry.

Image: 
University of Missouri

When two cars collide at an intersection -- from opposite directions -- the impact is much different than when two cars -- traveling in the same direction -- "bump" into each other. In the laboratory, similar types of collisions can be made to occur between molecules to study chemistry at very low temperatures, or "cold collisions."

A team of scientists led by Arthur Suits at the University of Missouri has developed a new experimental approach to study chemistry using these cold "same direction" molecular collisions. Suits said their approach hasn't been done before.

"When combined with the use of a laser that 'excites' the molecules, our approach produces specific 'hot' states of molecules, allowing us to study their individual properties and provide more accurate experimental theories," said Suits, a Curators' Distinguished Professor of Chemistry in the College of Arts and Science. "This is a condition that does not occur naturally but allows for a better understanding of molecular interactions."

Suits equated their efforts to analyzing the results of a marathon race.

"If you only look at the average time it takes everyone to complete the Boston Marathon, then you don't really learn much detail about a runner's individual capabilities," he said. "By doing it this way we can look at the fastest 'runner,' the slowest 'runner,' and also see the range and different behaviors of individual 'runners,' or molecules in this case. Using lasers, we can also design the race to have a desired outcome, which shows we are gaining direct control of the chemistry."

Suits said this is one of the first detailed approaches of its kind in this field.

"Chemistry is really about the collisions of molecules coming together and what causes chemical reactions to occur," he said. "Here, instead of crossing two beams of molecules with each other as researchers have often done before, we are now pointing both beams of molecules in the same direction. By also preparing the molecules in those beams to be in specific states, we can study collisions in extreme detail that happen very slowly, including close to absolute zero, which is the equivalent of the low temperature states needed for quantum computing."

Credit: 
University of Missouri-Columbia

COVID-19: Study shows virus can infect heart cells in lab dish

image: Clive Svendsen, PhD, director of the Cedars-Sinai Board of Governors Regenerative Medicine Institute, at work in his laboratory.

Image: 
Cedars-Sinai

LOS ANGELES (June 30, 2020) - A new study shows that SARS-CoV-2, the virus that causes COVID-19 (coronavirus), can infect heart cells in a lab dish, indicating it may be possible for heart cells in COVID-19 patients to be directly infected by the virus. The discovery, published today in the journal Cell Reports Medicine, was made using heart muscle cells that were produced by stem cell technology.

Although many COVID-19 patients experience heart problems, the reasons are not entirely clear. Pre-existing cardiac conditions or inflammation and oxygen deprivation that result from the infection have all been implicated. But until now, there has been only limited evidence that the SARS-CoV-2 virus directly infects the individual muscle cells of the heart.

"We not only uncovered that these stem cell-derived heart cells are susceptible to infection by novel coronavirus, but that the virus can also quickly divide within the heart muscle cells," said Arun Sharma, PhD, a research fellow at the Cedars-Sinai Board of Governors Regenerative Medicine Institute and first and co-corresponding author of the study. "Even more significant, the infected heart cells showed changes in their ability to beat after 72 hours of infection."

The study also demonstrated that human stem cell-derived heart cells infected by SARS-CoV-2 change their gene expression profile, further confirming that the cells can be actively infected by the virus and activate innate cellular "defense mechanisms" in an effort to help clear out the virus.

While these findings are not a perfect replication of what is happening in the human body, this knowledge may help investigators use stem cell-derived heart cells as a screening platform to identify new antiviral compounds that could alleviate viral infection of the heart, according to senior and co-corresponding author Clive Svendsen, PhD.

"This viral pandemic is predominately defined by respiratory symptoms, but there are also cardiac complications, including arrhythmias, heart failure and viral myocarditis," said Svendsen, director of the Regenerative Medicine Institute and professor of Biomedical Sciences and Medicine. "While this could be the result of massive inflammation in response to the virus, our data suggest that the heart could also be directly affected by the virus in COVID-19."

Researchers also found that treatment with an ACE2 antibody was able to blunt viral replication on stem cell-derived heart cells, suggesting that the ACE2 receptor could be used by SARS-CoV-2 to enter human heart muscle cells.

"By blocking the ACE2 protein with an antibody, the virus is not as easily able to bind to the ACE2 protein, and thus cannot easily enter the cell," said Sharma. "This not only helps us understand the mechanisms of how this virus functions, but also suggests therapeutic approaches that could be used as a potential treatment for SARS-CoV-2 infection."

The study used human induced pluripotent stem cells (iPSCs), a type of stem cell that is created in the lab from a person's blood or skin cells. iPSCs can make any cell type found in the body, each one carrying the DNA of the individual. Tissue-specific cells created in this way are used for research and for creating and testing potential disease treatments.

"This work illustrates the power of being able to study human tissue in a dish," said Eduardo Marbán, MD, PhD, executive director of the Smidt Heart Institute, who collaborated with Sharma and Svendsen on the study. "It is plausible that direct infection of cardiac muscle cells may contribute to COVID-related heart disease."

The investigators also collaborated with co-corresponding author Vaithilingaraja Arumugaswami, DVM, PhD, an associate professor of molecular and medical pharmacology at the David Geffen School of Medicine at UCLA and member of the Eli and Edythe Broad Center of Regenerative Medicine and Stem Cell Research. Arumugaswami provided the novel coronavirus that was added to the heart cells, and UCLA researcher Gustavo Garcia Jr. contributed essential heart cell infection experiments.

"This key experimental system could be useful to understand the differences in disease processes of related coronaviral pathogens, SARS and MERS," Arumugaswami said.

Credit: 
Cedars-Sinai Medical Center