
New study finds a link between sleep apnea and increased risk of dementia

A new study by Monash University has linked obstructive sleep apnea (OSA) to an increased risk of dementia.

The study, published in the Journal of Alzheimer's Disease and led by Dr Melinda Jackson from the Turner Institute for Brain and Mental Health, found that severe OSA is linked to an increase in beta-amyloid, a protein that builds up on the walls of arteries in the brain and increases the risk of dementia.

The study involved 34 individuals with recently diagnosed untreated OSA and 12 individuals who were asymptomatic for sleep disorders. It explored associations between brain amyloid burden using a PET brain scan, and measures of sleep, demographics and mood.

The OSA group recorded a higher amyloid burden, poorer sleep efficiency and less time spent in stage N3 sleep (a regenerative period where your body heals and repairs itself).

OSA is a common sleep disorder affecting about one billion people worldwide. It is caused by the collapse of the airway during sleep, resulting in intermittent dips in oxygen levels and arousals from sleep.

"The significance of finding the association between increased brain amyloid in patients with OSA will allow for further research to explore in more detail the implications of treating OSA for reducing dementia risk," Dr Jackson said.

Credit: 
Monash University

Low fitness linked to higher depression and anxiety risk

People with low aerobic and muscular fitness are nearly twice as likely to experience depression, finds a new study led by UCL researchers.

Low fitness levels also predicted a 60% greater chance of anxiety, over a seven-year follow-up, according to the findings published in BMC Medicine.

Lead author, PhD student Aaron Kandola (UCL Psychiatry) said: "Here we have provided further evidence of a relationship between physical and mental health, and that structured exercise aimed at improving different types of fitness is not only good for your physical health, but may also have mental health benefits."

The study involved 152,978 UK Biobank participants aged 40 to 69. Their baseline aerobic fitness was tested using a stationary bike with increasing resistance, while their muscular fitness was measured with a grip strength test. They also completed a questionnaire gauging depression and anxiety symptoms.

Seven years later they were tested again for depression and anxiety symptoms, and the researchers found that high aerobic and muscular fitness at the start of the study was associated with better mental health seven years later.

People with the lowest combined aerobic and muscular fitness had 98% higher odds of depression, 60% higher odds of anxiety, and 81% higher odds of having either one of the common mental health disorders, compared to those with high levels of overall fitness.

The researchers accounted for potentially confounding factors at baseline such as diet, socioeconomic status, chronic illness, and mental illness symptoms.

Previous studies have found that people who exercise more are less likely to experience mental illnesses, but most studies rely on people self-reporting their activity levels, which can be less reliable than the objective physical fitness measures used here.

Senior author Dr Joseph Hayes (UCL Psychiatry and Camden and Islington NHS Foundation Trust) said: "Our findings suggest that encouraging people to exercise more could have extensive public health benefits, improving not only our physical health but our mental health too. Improving fitness through a combination of cardio exercise and strength and resistance training appears to be more beneficial than just focusing on aerobic or muscular fitness."

Aaron Kandola added: "Reports that people are not as active as they used to be are worrying, and even more so now that global lockdowns have closed gyms and limited how much time people are spending out of the house. Physical activity is an important part of our lives and can play a key role in preventing mental health disorders.

"Other studies have found that just a few weeks of regular intensive exercise can make substantial improvements to aerobic and muscular fitness, so we are hopeful that it may not take much time to make a big difference to your risk of mental illness."

Credit: 
University College London

Jacky dragon moms' time in the sun affects their kids

image: Jacky dragon

Image: 
UNSW Sydney. Photo: Lisa Schwanz.

A new study conducted at the University of New South Wales and published in the November/December 2020 issue of Physiological and Biochemical Zoology sheds light on a possible connection between an animal's environmental conditions and the traits of its offspring. The study, Maternal Temperature, Corticosterone, and Body Condition as Mediators of Maternal Effects in Jacky Dragons (Amphibolurus muricatus), focused on how maternal condition and stress hormone (corticosterone) levels in jacky dragons (Amphibolurus muricatus) potentially translate a mom's heat exposure to effects on her offspring.

Ectothermic species, such as lizards, often rely on external heat sources to manage their body temperature. Maternal temperature can affect offspring traits such as sex and growth rate. While those trait changes have been documented, the means by which information passes from parent to offspring are poorly understood.

Maternal effects theory, in particular, has helped answer a number of important biological questions related to evolution. However, the mechanisms by which those effects take place have not been nearly as well developed, leaving unanswered questions about proximate physical causes.

"Mechanisms by which thermal information can be passed onto offspring have been underexplored," writes the study's author, Gracie Liu. "Here, we investigated corticosterone as a potential mediator of thermal maternal effects."

The study placed female jacky dragons in two different thermal regimes - one allowing 7 hours of thermal basking treatment each day, the other 11 hours - then measured the levels of corticosterone in the subjects' blood and examined possible relationships with their offspring.

Corticosterone is a steroid hormone associated with the "stress" response, energy mobilization, and suppression of the immune system. Such hormones often serve as a potential connection between offspring phenotypes and the environment of the mother, since they transfer from mother to offspring and can play a role in physiology, behavior and similar factors.

The results indicated that such "thermal opportunity" does have an effect on mothers and their offspring. Specifically, lizards exposed to the longer 11-hour regime showed significantly higher corticosterone levels in their bloodstream than those exposed to the 7-hour regime.

However, it was corticosterone's connection to maternal body condition that led to increased reproductive output - including both the number of eggs in a given clutch and the overall size of the clutch - as well as increased offspring size at hatching. It did not, however, have a corresponding effect on the growth of those offspring or their sex.

More specifically, the basking treatment appeared to be interrelated with maternal corticosterone levels and body condition. That, in turn, had an interactive effect on the resulting clutch, which suggests that the combination of condition, corticosterone and overall maternal temperature could be conveying information about the mother's external environment into her offspring.

"These findings indicate that thermal opportunity alters physiology," Liu writes. "With potential consequences for fitness."

Credit: 
University of Chicago Press Journals

Hundreds of copies of Newton's Principia found in new census

image: Caltech's own copy of the first edition of the Principia is part of the Institute's Archives and Special Collections. In the 18th century, it belonged to French mathematician and natural philosopher Jean-Jacques d'Ortous de Mairan, whose signature can be seen in the left margin of the title page. More recently, it was in the collection of Caltech physicist Earnest Watson. The white "snake" seen at left helps hold the pages down.

Image: 
Caltech Archives

In a story of lost and stolen books and scrupulous detective work across continents, a Caltech historian and his former student have unearthed previously uncounted copies of Isaac Newton's groundbreaking science book Philosophiae Naturalis Principia Mathematica, known more colloquially as the Principia. The new census more than doubles the number of known copies of the famous first edition, published in 1687. The last census of this kind, published in 1953, had identified 187 copies, while the new Caltech survey finds 386 copies. Up to 200 additional copies, according to the study authors, likely still exist undocumented in public and private collections.

"We felt like Sherlock Holmes," says Mordechai (Moti) Feingold, the Kate Van Nuys Page Professor of the History of Science and the Humanities at Caltech, who explains that he and his former student, Andrej Svorenčík (MS '08) of the University of Mannheim in Germany, spent more than a decade tracing copies of the book around the world. Feingold and Svorenčík are co-authors of a paper about the survey published in the journal Annals of Science.

Moreover, by analyzing ownership marks and notes scribbled in the margins of some of the books, in addition to related letters and other documents, the researchers found evidence that the Principia, once thought to be reserved for only a select group of expert mathematicians, was more widely read and comprehended than previously thought.

"One of the realizations we've had," says Feingold, "is that the transmission of the book and its ideas was far quicker and more open than we assumed, and this will have implications on the future work that we and others will be doing on this subject."

In the Principia, Newton introduced the laws of motion and universal gravitation, "unifying the terrestrial and celestial worlds under a single law," says Svorenčík.

"By the 18th century, Newtonian ideas transcended science itself," says Feingold. "People in other fields were hoping to find a similar single law to unify their own respective fields. The influence of Newton, just like that of Charles Darwin and Albert Einstein, exerted considerable influence on many other aspects of life, and that is what made him such a canonical figure during the 18th century and beyond."

Principia Found Behind the Iron Curtain

Svorenčík says that the project was born out of a paper he wrote for a course in the history of science taught by Feingold. Originally from Slovakia, Svorenčík had written a term paper about the distribution of the Principia in Central Europe. "I was interested in whether there were copies of the book that could be traced to my home region. The census done in the 1950s did not list any copies from Slovakia, the Czech Republic, Poland, or Hungary. This is understandable as the census was done after the Iron Curtain descended, which made tracing copies very difficult."

To Svorenčík's surprise, he found many more copies than Feingold had expected. The summer after the class, Feingold suggested to Svorenčík that they turn his project into the first-ever complete, systematic search for copies of the first edition of the Principia. Their ensuing detective work across the globe turned up about 200 previously unidentified copies in 27 countries, including 35 copies in Central Europe. Feingold and Svorenčík even came across lost or stolen copies of the masterpiece; for example, one copy found with a bookseller in Italy was discovered to have been stolen from a library in Germany half a century earlier.

"We contacted the German library to let them know, but they were too slow to make a decision to buy back the copy or apprehend it somehow, so it ended up back on the market," says Feingold.

A Rare, Collectible Item

According to the historians, copies of the first edition of the Principia sell today for between $300,000 and $3,000,000 via auction houses like Christie's and Sotheby's, as well as on the black market. They estimate that some 600, and possibly as many as 750, copies of the book's first edition were printed in 1687.

The primary person behind the book's publication was Edmond Halley, a well-known English scientist who made several discoveries about our solar system, including the periodicity of what later became known as Halley's Comet. Feingold explains that, before the Principia was written, Halley had asked Newton for some calculations regarding elliptical orbits of bodies in our solar system. When Halley saw the calculations, "he got so excited, he rushed back to Cambridge and basically forced Newton to write the Principia," says Feingold. In fact, Halley funded the publication of the book's first edition.

Soon after its publication, the book was recognized as a work of genius. "Because Halley had already prepared the public for what was to come," says Feingold, "there was a widespread recognition that the Principia was a masterpiece." Later, a "mystique" about Newton started to develop, according to Feingold, exemplified in a story about two students walking in Cambridge and spotting Newton on the street. "'There goes a man,' one of them said, 'who wrote a book that neither he nor anybody else understands,'" says Feingold.

The new survey results call into question the idea that the Principia was incomprehensible and little read. Not only does the research show that there was a larger market for the book than once thought, it also demonstrates that people were digesting its contents to a greater extent than previously realized.

"When you look through the copies themselves, you might find small notes or annotations that give you clues about how it was used," says Svorenčík, who has personally inspected about 10 percent of the copies documented in their census. When traveling to conferences in different countries, he would make time to visit local libraries. "You look at the condition of the ownership marks, the binding, deterioration, printing differences, et cetera." Even without inspecting the books up close, the historians could trace who owned them through library records and other letters and documents, and learn how copies were shared.

"It's harder to show how much people engaged with a book than simply owned it, but we can look at the notes in the margins and how the book was shared," says Feingold. "You can assume that for each copy, there are multiple readers. It's not like today, where you might buy a book and are the only one to read it. And then we can look for an exchange of ideas between the people sharing copies. You start to put together the pieces and solve the puzzle."

Svorenčík and Feingold hope that their census, which they call preliminary, will yield information about other existing copies tucked away with private owners, book dealers, and libraries. Continuing this line of research into the future, the historians plan to further refine our understanding of how the Principia shaped 18th-century science.

Credit: 
California Institute of Technology

Researchers find a backup mechanism that removes cellular debris from the brain

image: Microglial ablation or microglial dysfunction activates phagocytic activity of astrocytes. Astrocytes possess phagocytic machinery and have the potential to compensate for microglia with dysfunctional phagocytic activity.

Image: 
Hiroyuki Konishi

Microglia -- the brain's immune cells -- play a primary role in removing cellular debris from the brain. According to a recent study by a Nagoya University-led research team in Japan, another kind of brain cell, the astrocyte, is also involved in removing debris as a backup to microglia. The finding, published recently in The EMBO Journal, could lead to new therapies that accelerate the removal of cellular debris from the brain and thereby reduce its detrimental effects on surrounding cells.

Even in a healthy brain, neurons die at a certain rate, which increases with age. As dead cells and cellular debris accumulate, they harm surrounding cells, which in turn accelerates neuron death and causes neurodegenerative diseases such as Alzheimer's disease. Microglia -- brain "phagocytes" (a type of cell that engulfs and absorbs bacteria and cellular debris) -- act to clear the danger, but the debris sometimes overwhelms the microglia. This has led to suggestions that another mechanism that helps remove cellular debris is also at work.

To clarify the nature of the alternative debris-clearing mechanism, a research team led by Dr. Hiroshi Kiyama and Dr. Hiroyuki Konishi of the Graduate School of Medicine at Nagoya University first investigated what would happen to microglial debris in the brains of mouse models in which microglial death was induced. As expected, the team observed that dead microglia were cleared, indicating that indeed another phagocyte was at work.

The researchers next analyzed the expression of molecules in the brains of the mouse models and identified astrocytes that play a role in the removal of microglial debris. Then, using mutant mice with phagocytosis-impaired microglia, they examined how astrocytes work when microglia don't function properly. The results showed that almost half of the cellular debris was engulfed by astrocytes, not by microglia. This indicates that astrocytes have the potential to compensate for microglial dysfunction.

The team concluded that not only are astrocytes capable of engulfing cellular debris, but also that they are likely to actually do this when microglia don't function properly.

The team next plans to clarify how astrocytes recognize microglial dysfunction and deploy their phagocytic function. Drs. Kiyama and Konishi say, "Further investigation of how to control astrocytic phagocytosis may lead to new therapies that accelerate efficient debris clearance from aged or injured brains."

Credit: 
Nagoya University

Swedish, Finnish and Russian wolves closely related

image: A wolf in Finland.

Image: 
Ilpo Kojola

The Scandinavian wolf originally came from Finland and Russia, and unlike many other European wolf populations its genetic constitution is virtually free from dog admixture. In addition, individuals have migrated into and out of Scandinavia. These findings have emerged from new research at Uppsala University in which genetic material from more than 200 wolves was analysed. The study is published in the journal Evolutionary Applications.

The origin of the Scandinavian wolf strain has long been a controversial topic. Previous genetic studies have indicated migration from the east, without being able to give unequivocal answers about the geographical provenance of this population. The new survey provides a clearer picture of how it formed.

"We can see that those wolves that founded the Scandinavian population in the 1980s were genetically the same as present-day wolves in Finland and Russian Karelia," says Hans Ellegren, professor of evolutionary biology at Uppsala University.

These results are a culmination of earlier research. In 2019, the same research group published a study in which they had analysed wolves' Y-chromosome DNA only - that is, the male-specific genes that can be passed on solely from fathers to their male offspring, showing paternal lineages over past generations. In the new, much more extensive study, also led by Hans Ellegren and Linnéa Smeds at Uppsala University, whole-genome sequences were analysed.

From time to time, new wolves migrate into Sweden from the east. Now, on the other hand, the scientists found genetic evidence for migration in the opposite direction: Scandinavian-born wolves among animals found in Finland.

"We've probably never had a specific Scandinavian population. Throughout the ages, wolves have likely moved back and forth between the Scandinavian peninsula and regions to the east," Ellegren says.

The researchers also sought answers to the question of whether there has been genetic mixing of dogs and Scandinavian wolves. Hybridisation between feral dogs and wolves is common in many parts of the world, and may be difficult to avoid. As late as in 2017, a hybrid wolf-dog litter was found in the county of Södermanland, southwest of Greater Stockholm. If such crossbreeds were allowed to reproduce, they would constitute a threat to the genomic integrity of the wolf strain.

When genetic material from Scandinavian and Finnish wolves was compared with that from some 100 dogs of various breeds, however, the scientists were unable to find any evidence that wolf-dog hybridisation has left its mark on the genetic composition of this wolf population - at least, no signs that recent crossbreeding has affected the wolves.

Credit: 
Uppsala University

Russian scientists created a chemical space mapping method and cracked the mystery of the Mendeleev number

image: Compound hardness map.

Image: 
Artem R. Oganov

Scientists had long tried to come up with a system for predicting the properties of materials based on their chemical composition, until they set their sights on the concept of a chemical space that places materials in a reference frame such that neighboring chemical elements and compounds plotted along its axes have similar properties. This idea was first proposed in 1984 by the British physicist David G. Pettifor, who assigned a Mendeleev number (MN) to each element. Yet the meaning and origin of MNs were unclear. Scientists from the Skolkovo Institute of Science and Technology (Skoltech) puzzled out the physical meaning of the mysterious MNs and suggested calculating them based on the fundamental properties of atoms. They showed that both MNs and the chemical space built around them were more effective than the empirical solutions proposed until then. Their research, supported by a grant from the Russian Science Foundation's (RSF) World-class Lab Research Presidential Program, was presented in The Journal of Physical Chemistry C.

Systematizing the enormous variety of chemical compounds, both known and hypothetical, and pinpointing those with a particularly interesting property is a tall order. Measuring the properties of all imaginable compounds in experiments or calculating them theoretically is downright impossible, which suggests that the search should be narrowed down to a smaller space.

David G. Pettifor put forward the idea of chemical space in an attempt to organize the knowledge about material properties. The chemical space is basically a reference frame where elements are plotted along the axes in a certain sequence such that the neighboring elements, for instance, Na and K, have similar properties. The points within the space represent compounds, so that the neighbors, for example, NaCl and KCl, have similar properties, too. In this setting, one area is occupied by superhard materials and another by ultrasoft ones. Having the chemical space at hand, one could create an algorithm for finding the best material among all possible compounds of all elements. To build their "smart" map, Skoltech scientists Artem R. Oganov and Zahed Allahyari came up with their own universal approach that boasts the highest predictive power as compared to the best-known methods.

For many years, scientists were clueless as to how Pettifor derived his MNs (if not empirically), and their physical meaning remained a nearly "esoteric" mystery.

"I had been wondering about what these MNs are for 15 years until I realized that they are most likely rooted in the atom's fundamental properties, such as radius, electronegativity, polarizability, and valence. While valence is variable for many elements, polarizability is strongly correlated with electronegativity. This leaves us with radius and electronegativity which can be reduced to one property through a simple mathematical transformation. And here we go: we obtain an MN that turns out to be the best way to describe all the properties of an atom, and by a single number at that," explains Artem R. Oganov, RSF grant project lead, a professor at Skoltech and MISiS, a Member of the Academia Europaea, a Fellow of the Royal Society of Chemistry (FRSC) and a Fellow of the American Physical Society (APS).

The scientists used the calculated MNs to arrange all the elements in a sequence that serves as both the abscissa and the ordinate axis. Each point in this space corresponds to the compounds of the corresponding pair of elements. In this space, using measured or predicted properties of compounds, one can map any specific characteristic, for example, hardness, magnetization, enthalpy of formation, etc. A property map thus produced clearly showed the areas containing the most promising compounds, such as superhard or magnetic materials.
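The press release does not give the actual MN formula or values, but the mapping idea can be sketched in a few lines of Python. In the toy example below, the atomic radii, electronegativities, the way they are combined into a single MN-like number, and the compound "hardness" values are all invented placeholders rather than the Skoltech results; the sketch only illustrates how ordering elements by such a number defines both axes of a property map.

```python
# Illustrative toy sketch of a "Mendeleev-number" property map.
# All numbers below are placeholders, NOT the values from the Skoltech study.
import numpy as np

# Hypothetical atomic data: (covalent radius in angstroms, Pauling electronegativity).
atoms = {"Na": (1.66, 0.93), "K": (2.03, 0.82), "Cl": (0.99, 3.16), "Br": (1.14, 2.96)}

def toy_mn(radius, electronegativity, w=0.5):
    # Placeholder one-number descriptor mixing radius and electronegativity;
    # the real transformation is the one derived in the paper.
    return w * electronegativity - (1.0 - w) * radius

# Order the elements by the toy MN; this single ordering labels both map axes.
order = sorted(atoms, key=lambda el: toy_mn(*atoms[el]))
index = {el: i for i, el in enumerate(order)}

# Hypothetical property values (e.g., hardness) for a few binary compounds.
compounds = {("Na", "Cl"): 2.5, ("K", "Cl"): 2.0, ("Na", "Br"): 2.2, ("K", "Br"): 1.8}

# Fill a symmetric grid: cell (i, j) holds the property of the compound of elements i and j.
grid = np.full((len(order), len(order)), np.nan)
for (a, b), value in compounds.items():
    grid[index[a], index[b]] = value
    grid[index[b], index[a]] = value

print("axis order:", order)
print(grid)
```

Neighboring cells of such a grid should then correspond to chemically similar compounds, which is what makes regions of promising materials stand out on the map.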

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Scientists speed up artificial organoid growth and selection

image: Robots facilitate stem cell production

Image: 
Daria Sokol/MIPT Press Office

The method currently used to produce stem cell-derived tissues has a very limited throughput. By semi-automating tissue differentiation, researchers from MIPT and Harvard have made the process nearly four times faster, without compromising on quality. Presented in Translational Vision Science & Technology, the new algorithm is also useful for analyzing the factors that affect cell specialization.

The retina is a tissue of the eye that consists of several layers of neurons forming a chain. It senses light and preprocesses visual information before feeding it to the brain.

Because of their limited potential for regeneration, the loss of retinal neurons leads to permanent blindness. Today about 15 million people in the U.S. alone have retinal degenerative diseases. This number is growing, mainly because of population aging.

Medical researchers are pursuing a number of approaches to address retinal health issues. Some of the options are neuroprotection, gene therapy, and cell replacement. While these approaches target different diseases and employ different mechanisms and methodologies, one thing is universally true: Their development requires massive amounts of retinal cells for research purposes.

It is possible to grow retinal tissue in vitro. This involves placing stem cell clusters into a special medium that induces the spontaneous formation of undeveloped neurons, followed by their differentiation into retinal cells. That process produces actual retinal neurons organized as complex tissue, without any external stimulation of development pathways during specialization.

The method has its limitations, though. One is the random nature of the initial neuron growth stimulation. Another inconvenience is that it takes 30 days for the artificial retina of a mouse to develop correctly, and up to one year for human organoids. The MIPT-Harvard team set out to address these problems by increasing the number of cells produced and improving their quality.

The researchers compared the quality of the robot- and human-grown cells by producing several thousand retinal tissue samples for automatic processing and the same number for manual handling. The biologists scanned the wells housing the tissue samples from the first group and analyzed the resulting images with a Python script they wrote for that specific task. The program determines the areas in the photos where the glow of the fluorescent protein is strongest. Since this protein is only produced in developing retinal cells, high fluorescence intensity points to the parts of the sample with the right tissue. That way the software can determine the amount of developing retina in each organoid.
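The team's actual script is not included in the press release; the snippet below is only a minimal sketch of what such a fluorescence-based analysis could look like. The folder name, file format, use of scikit-image and the Otsu threshold are assumptions made for illustration, not the MIPT-Harvard code.

```python
# Illustrative sketch only -- not the MIPT-Harvard script. Assumes grayscale
# well-scan images saved as TIFF files; uses scikit-image for thresholding.
import glob
import numpy as np
from skimage import io, filters, measure

def retina_fraction(image_path, min_region_px=500):
    """Estimate the fraction of a well image occupied by strongly fluorescent
    (presumably retinal) tissue."""
    img = io.imread(image_path, as_gray=True).astype(float)
    threshold = filters.threshold_otsu(img)   # separate bright fluorescence from background
    mask = img > threshold
    labels = measure.label(mask)
    # Keep only sizeable bright regions to ignore specks of noise.
    keep = np.zeros_like(mask)
    for region in measure.regionprops(labels):
        if region.area >= min_region_px:
            keep[labels == region.label] = True
    return keep.sum() / keep.size

if __name__ == "__main__":
    # Hypothetical folder of scanned wells; prints a per-sample retina score.
    for path in sorted(glob.glob("well_scans/*.tif")):
        print(path, f"{retina_fraction(path):.3f}")
```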

The automation algorithm was able to optimize cell production by simultaneously testing many systems -- without any adverse effect on tissue quality. The approach reduced the time researchers needed for cell processing from two hours to just 34 minutes.

"We implemented automated liquid change during retinal differentiation and showed it had no negative effect on cell specialization," commented Evgenii Kegeles of the Genome Engineering Lab at MIPT. "We also developed a tool for automatic retina identification and organoid classification, which we showed in action, optimizing cell specialization conditions and monitoring tissue quality."

"One of our goals in this research has been to scale up cell differentiation to enable a high-throughput tissue production for drug tests and cell transplantation experiments. Automated sample handling makes it possible to reduce the effort on the part of the personnel and produce several times more cells in the same period of time. With some slight modifications, the algorithm would be applicable to other organoids, not just the retina," Kegeles added.

"Numbers matter: The automation empowered us to produce trillions of retinal neurons for transplantation and we are excited to see the translation of our approach into routine cell manufacture," commented Petr Baranov from the Schepens Eye Research Institute of Massachusetts Eye and Ear.

Credit: 
Moscow Institute of Physics and Technology

Large volcanic eruption caused the largest mass extinction

image: The researchers found coronene-mercury enrichments in sedimentary rocks deposited in southern China and Italy 252 million years ago. Paired coronene-mercury enrichments are products of multiple phases of large igneous province volcanism. This, they say, could have led to the environmental changes that caused the disappearance of many terrestrial and marine animal species.

Image: 
Kunio Kaiho

Researchers in Japan, the US and China say they have found more concrete evidence of the volcanic cause of the largest mass extinction of life. Their research looked at two discrete eruption events: one that was previously unknown to researchers, and the other that resulted in large swaths of terrestrial and marine life going extinct.

There have been five mass extinctions since the divergent evolution of early animals 450-600 million years ago. The third was the largest and is thought to have been triggered by the eruption of the Siberian Traps - a large region of volcanic rock known as a large igneous province. But the correlation between the eruption and the mass extinction has not yet been clarified.

Sedimentary mercury enrichments, proxies for massive volcanic events, have been detected in dozens of sedimentary rocks from the end of the Permian. These rocks have been found deposited inland, in shallow seas and in central oceans, but uncertainty remains as to their interpretation. Mercury can be sourced either from direct atmospheric deposition of volcanic emissions or from riverine inputs following terrestrial organic matter oxidation when land/plant devastation - referred to as terrestrial ecological disturbance - occurs.

The largest mass extinction occurred at the end of the Permian - roughly 252 million years ago. It marked the transition from Paleozoic fauna - reptiles and marine animals such as brachiopods and trilobites - to Mesozoic dinosaurs and marine animals such as mollusks. Approximately 90% of species disappeared at the end of the Permian.

Kunio Kaiho, now professor emeritus at Tohoku University, led a team that looked into possible triggers of the largest mass extinction. They took sedimentary rock samples from two places - southern China and Italy - and analyzed the organic molecules and mercury (Hg) in them. They found two discrete coronene-Hg enrichments coinciding with the first terrestrial ecological disturbance and the following mass extinction in both areas.

"We believe this to be the product of large volcanic eruptions because the coronene anomaly was formed by abnormally high temperature combustion," says professor Kaiho. "High temperature magma or asteroid/comet impacts can make such a coronene enrichment.

"From the volcanic aspect, this could have occurred because of the higher temperature combustion of living and fossil organic matter from lava flows and horizontally intruded magma (sill) into the sedimentary coal and oil. The different magnitude of the two coronene-mercury enrichments shows that the terrestrial ecosystem was disrupted by smaller global environmental changes than the marine ecosystem. The duration between the two volcanic events is tens of thousands of years."

Huge volcanic eruptions can produce sulfuric acid aerosols in the stratosphere and carbon dioxide in the atmosphere, causing global climate changes. This rapid climate change is believed to be behind the loss of land and marine creatures.

Coronene is a highly condensed six-ring polycyclic aromatic hydrocarbon, which requires significantly more energy to form than smaller PAHs; therefore, high-temperature volcanic combustion can cause the coronene enrichments. This means that high-temperature combustion of hydrocarbons in the sedimentary rocks by laterally intruding magmas formed CO2 and CH4, and the resulting high pressure and eruptions induced global warming and the mass extinction. The coronene-mercury enrichments provide the first evidence that volcanic hydrocarbon combustion contributed to the extinction through global warming.

Kaiho's team is now studying other mass extinctions in the hopes of further understanding the cause and processes behind them.

Credit: 
Tohoku University

Researchers model source of eruption on Jupiter's moon Europa

image: This artist's conception of Jupiter's icy moon Europa shows a hypothesized cryovolcanic eruption, in which briny water from within the icy shell blasts into space. A new model of this process on Europa may also explain plumes on other icy bodies.

Image: 
Justice Blaine Wainwright

On Jupiter's icy moon Europa, powerful eruptions may spew into space, raising questions among hopeful astrobiologists on Earth: What would blast out from miles-high plumes? Could they contain signs of extraterrestrial life? And where in Europa would they originate? A new explanation now points to a source closer to the frozen surface than might be expected.

Rather than originating from deep within Europa's oceans, some eruptions may originate from water pockets embedded in the icy shell itself, according to new evidence from researchers at Stanford University, the University of Arizona, the University of Texas and NASA's Jet Propulsion Laboratory.

Using images collected by the NASA spacecraft Galileo, the researchers developed a model to explain how a combination of freezing and pressurization could lead to a cryovolcanic eruption, or a burst of water. The results, published Nov. 10 in Geophysical Research Letters, have implications for the habitability of Europa's underlying ocean - and may explain eruptions on other icy bodies in the solar system.

Harbingers of life?

Scientists have speculated that the vast ocean hidden beneath Europa's icy crust could contain elements necessary to support life. But short of sending a submersible to the moon to explore, it's difficult to know for sure. That's one reason Europa's plumes have garnered so much interest: If the eruptions are coming from the subsurface ocean, the elements could be more easily detected by a spacecraft like the one planned for NASA's upcoming Europa Clipper mission.

But if the plumes originate in the moon's icy shell, they may be less hospitable to life, because it is more difficult to sustain the chemical energy to power life there. In this case, the chances of detecting habitability from space are diminished.

"Understanding where these water plumes are coming from is very important for knowing whether future Europa explorers could have a chance to actually detect life from space without probing Europa's ocean," said lead author Gregor Steinbrügge, a postdoctoral researcher at Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth).

The researchers focused their analyses on Manannán, an 18-mile-wide crater on Europa that was created by an impact with another celestial object some tens of millions of years ago. Reasoning that such a collision would have generated a tremendous amount of heat, they modeled how melting and subsequent freezing of a water pocket within the icy shell could have caused the water to erupt.

"The comet or asteroid hitting the ice shell was basically a big experiment which we're using to construct hypotheses to test," said co-author Don Blankenship, senior research scientist at the University of Texas Institute for Geophysics (UTIG) and principal investigator of the Radar for Europa Assessment and Sounding: Ocean to Near-surface (REASON) instrument that will fly on Europa Clipper. "The polar and planetary sciences team at UTIG are all currently dedicated to evaluating the ability of this instrument to test those hypotheses."

The model indicates that as Europa's water transformed into ice during the later stages of the impact, pockets of water with increased salinity could be created in the moon's surface. Furthermore, these salty water pockets can migrate sideways through Europa's ice shell by melting adjacent regions of less brackish ice, and consequently become even saltier in the process.

"We developed a way that a water pocket can move laterally - and that's very important," Steinbrügge said. "It can move along thermal gradients, from cold to warm, and not only in the down direction as pulled by gravity."

A salty driver

The model predicts that when a migrating brine pocket reached the center of Manannán crater, it became stuck and began freezing, generating pressure that eventually resulted in a plume, estimated to have been over a mile high. The eruption of this plume left a distinguishing mark: a spider-shaped feature on Europa's surface that was observed by Galileo imaging and incorporated in the researchers' model.

"Even though plumes generated by brine pocket migration would not provide direct insight into Europa's ocean, our findings suggest that Europa's ice shell itself is very dynamic," said co-lead author Joana Voigt, a graduate research assistant at the University of Arizona, Tucson.

The relatively small size of the plume that would form at Manannán indicates that impact craters probably can't explain the source of other, larger plumes on Europa that have been hypothesized based on Hubble and Galileo data, the researchers say. But the process modeled for the Manannán eruption could happen on other icy bodies - even without an impact event.

"Brine pocket migration is not uniquely applicable to Europan craters," Voigt said. "Instead the mechanism might provide explanations on other icy bodies where thermal gradients exist."

The study also provides estimates of how salty Europa's frozen surface and ocean may be, which in turn could affect the transparency of its ice shell to radar waves. The calculations, based on imaging from Galileo from 1995 to 1997, show Europa's ocean may be about one-fifth as salty as Earth's ocean - a factor that will improve the capacity for the Europa Clipper mission's radar sounder to collect data from its interior.

The findings may be discouraging to astrobiologists hoping Europa's erupting plumes might contain clues about the internal ocean's capacity to support life, given the implication that plumes do not have to connect to Europa's ocean. However, the new model offers insights toward untangling Europa's complex surface features, which are subject to hydrological processes, the pull of Jupiter's gravity and hidden tectonic forces within the icy moon.

"This makes the shallow subsurface - the ice shell itself - a much more exciting place to think about," said co-author Dustin Schroeder, an assistant professor of geophysics at Stanford. "It opens up a whole new way of thinking about what's happening with water near the surface."

Credit: 
Stanford University

Radioactive elements may be crucial to the habitability of rocky planets

image: These illustrations show three versions of a rocky planet with different amounts of internal heating from radioactive elements. The middle planet is Earth-like, with plate tectonics and an internal dynamo generating a magnetic field. The top planet, with more radiogenic heating, has extreme volcanism but no dynamo or magnetic field. The bottom planet, with less radiogenic heating, is geologically 'dead,' with no volcanism.

Image: 
Illustrations by Melissa Weiss

The amount of long-lived radioactive elements incorporated into a rocky planet as it forms may be a crucial factor in determining its future habitability, according to a new study by an interdisciplinary team of scientists at UC Santa Cruz.

That's because internal heating from the radioactive decay of the heavy elements thorium and uranium drives plate tectonics and may be necessary for the planet to generate a magnetic field. Earth's magnetic field protects the planet from solar winds and cosmic rays.

Convection in Earth's molten metallic core creates an internal dynamo (the "geodynamo") that generates the planet's magnetic field. Earth's supply of radioactive elements provides more than enough internal heating to generate a persistent geodynamo, according to Francis Nimmo, professor of Earth and planetary sciences at UC Santa Cruz and first author of a paper on the new findings, published November 10 in Astrophysical Journal Letters.

"What we realized was that different planets accumulate different amounts of these radioactive elements that ultimately power geological activity and the magnetic field," Nimmo explained. "So we took a model of the Earth and dialed the amount of internal radiogenic heat production up and down to see what happens."

What they found is that if the radiogenic heating is more than the Earth's, the planet can't permanently sustain a dynamo, as Earth has done. That happens because most of the thorium and uranium end up in the mantle, and too much heat in the mantle acts as an insulator, preventing the molten core from losing heat fast enough to generate the convective motions that produce the magnetic field.

With more radiogenic internal heating, the planet also has much more volcanic activity, which could produce frequent mass extinction events. On the other hand, too little radioactive heat results in no volcanism and a geologically "dead" planet.

"Just by changing this one variable, you sweep through these different scenarios, from geologically dead to Earth-like to extremely volcanic without a dynamo," Nimmo said, adding that these findings warrant more detailed studies.

"Now that we see the important implications of varying the amount of radiogenic heating, the simplified model that we used should be checked by more detailed calculations," he said.

A planetary dynamo has been tied to habitability in several ways, according to Natalie Batalha, a professor of astronomy and astrophysics whose Astrobiology Initiative at UC Santa Cruz sparked the interdisciplinary collaboration that led to this paper.

"It has long been speculated that internal heating drives plate tectonics, which creates carbon cycling and geological activity like volcanism, which produces an atmosphere," Batalha explained. "And the ability to retain an atmosphere is related to the magnetic field, which is also driven by internal heating."

Coauthor Joel Primack, a professor emeritus of physics, explained that stellar winds, which are fast-moving flows of material ejected from stars, can steadily erode a planet's atmosphere if it has no magnetic field.

"The lack of a magnetic field is apparently part of the reason, along with its lower gravity, why Mars has a very thin atmosphere," he said. "It used to have a thicker atmosphere, and for a while it had surface water. Without the protection of a magnetic field, much more radiation gets through and the surface of the planet also becomes less habitable."

Primack noted that the heavy elements crucial to radiogenic heating are created during mergers of neutron stars, which are extremely rare events. The creation of these so-called r-process elements during neutron-star mergers has been a focus of research by coauthor Enrico Ramirez-Ruiz, professor of astronomy and astrophysics.

"We would expect considerable variability in the amounts of these elements incorporated into stars and planets, because it depends on how close the matter that formed them was to where these rare events occurred in the galaxy," Primack said.

Astronomers can use spectroscopy to measure the abundance of different elements in stars, and the compositions of planets are expected to be similar to those of the stars they orbit. The rare earth element europium, which is readily observed in stellar spectra, is created by the same process that makes the two longest-lived radioactive elements, thorium and uranium, so europium can be used as a tracer to study the variability of those elements in our galaxy's stars and planets.

Astronomers have obtained europium measurements for many stars in our galactic neighborhood. Nimmo was able to use those measurements to establish a natural range of inputs to his models of radiogenic heating. The sun's composition is in the middle of that range. According to Primack, many stars have half as much europium relative to magnesium as the sun, and many have up to two times more than the sun.

The importance and variability of radiogenic heating opens up many new questions for astrobiologists, Batalha said.

"It's a complex story, because both extremes have implications for habitability. You need enough radiogenic heating to sustain plate tectonics but not so much that you shut down the magnetic dynamo," she said. "Ultimately, we're looking for the most likely abodes of life. The abundance of uranium and thorium appear to be key factors, possibly even another dimension for defining a Goldilocks planet."

Using europium measurements of their stars to identify planetary systems with different amounts of radiogenic elements, astronomers can start looking for differences between the planets in those systems, Nimmo said, especially once the James Webb Space Telescope is deployed. "The James Webb Space Telescope will be a powerful tool for the characterization of exoplanet atmospheres," he said.

Credit: 
University of California - Santa Cruz

As cancer has evolved, it is time for cancer research to do the same

London, UK (10 Nov 2020) -- The observance of Lung Cancer Awareness Month in November affords an opportunity to take stock of current approaches to lung cancer research, and to cancer research more widely. While there have been improvements in understanding, prevention and overall survival, it is still shocking to note that more than 1.7 million people died of lung cancer in 2018 alone. This is the case despite the fact that lung cancer research has been relatively well-funded, which suggests an urgent need to rethink how we prioritise the funding available for cancer research. As the former European Commissioner for Research, Enterprise and Science said, "cancer has evolved and we need new tactics to match it."

For cancer, in particular, more than 95% of potential drugs fail in clinical trials.

Animal testing, with its poorly predictive preclinical models of human efficacy and safety, represents a major obstacle in the drug development pipeline and is the main reason that drug attrition rates are so high. It is still more frustrating to consider that predictive, human biology-based tools are increasingly available, yet receive only a meagre proportion of current research funding.

To investigate the extent to which modern, human-based non-animal approaches are supplanting animal models, research at Humane Society International led by Drs Lindsay Marshall and Marcia Triunfol compared studies using animal-dependent methods--so-called "xenograft" models in which patient tumour biopsies are injected into an animal--and human "organoids", which use patient tumour samples to create in vitro models that more closely resemble a patient's tumour.

Their preliminary analysis showed that for human organoids there has been a modest increase in outputs, as measured by the number of publications, funding, and publications associated with clinical trials. However, when the authors compared this to xenograft research, they found that animal-dependent research is still favoured and that the publications, funding and clinical trials associated with xenografts all outnumber those for organoids by at least 10 to 1.

Human organoid models recapitulate key characteristics of the original cancer and maintain the structure seen in the patient's tumour. This makes human organoid models better for studying how the patient will respond to specific drugs, and how the tumour will progress. Considering this, and the promise of human organoids for studying cancer, the authors of this research expressed their disappointment in learning that the U.S. National Institutes of Health awarded over five times more funding to animal xenograft-based research programs than to those involving human organoids.

The authors contend that, "There is a great need to level the 'playing field' for the human cell-based technologies compared to animals, given that, in the European Union, only 0.036% of the research budget is dedicated to these more human relevant, non-animal approaches. This deficit is adding hugely to the obstacles preventing animal replacement in human health research." They go on to say, "We would like to stress that funding should be made available to assess the capacity of the organoids in predicting patient responses or for evaluating novel, possibly combination therapies, and not for comparative studies aiming to 'validate' organoids against animals, given that the animal models are, and remain, unvalidated." A number of studies have described the pitfalls of using animal xenografts as a proxy model for a patient's tumour. These models are time-consuming, very expensive, and some studies have shown that they do not faithfully replicate the changes and abnormalities first seen in the original tumour, but rather create new ones.

With at least half of European cancer research funding coming from the public via charity donations, and U.S. taxpayers contributing heavily to their own government's research budget, there is certainly a case to be made for the responsibility scientists have to the public. Given the disapproval of many citizens concerning the use of animals in research, this might include reducing reliance on animal testing and accelerating the discovery and development of new, effective and affordable medicines.

Overall, it appears that the transition to more effective and more human-relevant non-animal technologies is proceeding at a very slow pace. Yet the work of Dr Marshall and colleagues indicates that a faster transition is entirely possible, as long as dedicated support for human biology-based approaches, more agile regulatory requirements and smarter clinical trials are put in place. The authors make several recommendations to accelerate this change, including the establishment of training programs for researchers, and greater transparency regarding the use of organoids and other human cell-based tools to develop drug testing regimes for patients. With lung cancer, together with breast and colorectal cancers, accounting for almost a third of all cancer mortality annually worldwide, and cancer research consuming millions of animals' lives, these would surely be small efforts for the potential gains to be made.

The article entitled "Patient-Derived Xenograft vs Organoids: A Preliminary Analysis of Cancer Research Output, Funding and Human Health Impact in 2014-2019" has been published in the journal Animals.

Credit: 
Humane Society International

Research news tip sheet: story ideas from Johns Hopkins Medicine

image: Illustration for Research Story Tips from Johns Hopkins Medicine

Image: 
Johns Hopkins Medicine

SCIENTISTS CREATE MAP OF GATEKEEPER THAT COULD HELP BRAIN CELLS SURVIVE STROKE

Media Contact: Rachel Butch, rbutch1@jhmi.edu

Researchers at Johns Hopkins Medicine have revealed the structure of a gatekeeping protein that could one day impact the treatment of conditions such as heart attack and stroke. Understanding the structure of the protein -- known as the proton-activated chloride channel (PAC) -- helps researchers develop ways to reduce permanent damage caused by conditions associated with acidosis, a condition marked by increased acidity in the blood.

A report of the study was published Nov. 4, 2020, in Nature.

"Knowing the structure of the PAC helps us understand how this acid-sensing protein works in different pH [the measure of acidity or basicity of a solution] and gives us potential ways to manipulate it for better medical outcomes," says Zhaozhu Qiu, Ph.D., assistant professor of physiology at the Johns Hopkins University School of Medicine and co-corresponding author of the study.

The PAC protein is activated when the environment around the cell becomes acidic. Typically, acidity in the body is held at normal levels through constant blood flow. However, when the circulatory system is disrupted by heart disease, stroke or some cancers, the area quickly becomes acidic as cellular waste builds up.

The Johns Hopkins Medicine-led research team first reported the gene sequences encoding this new "acid sensor" in Science last year. In the study, the researchers showed that stroke can over-activate PAC and kill brain cells in mice.

For the current study, Qiu's team worked with collaborators, led by Wei Lü, Ph.D., assistant professor of structural biology at the Van Andel Institute and co-corresponding author of the study, to collect two types of images of the PAC. One was taken in the protein's relaxed state -- at a cell's normal biological acidity level -- and the other in its active state, when the environment is highly acidic.

To create an accurate picture, they used a specialized, high-powered cryo-electron microscope, which supercools molecules so that they form precise, easily imaged structures at near-atomic resolution. The images showed that the PAC resembles a wedding bouquet, with parts that move in response to acidic pH. That movement is linked to the opening of the gate that enables chloride ions to flow in and out of the cell.

"The PAC structures are beautiful! Combined with functional studies, we revealed a completely new acid-sensing mechanism," says James Osei-Owusu, a doctoral student in Qiu's lab and co-first author of the study. "It sets an example for why getting a structure of the protein is really important and provides a blueprint of how the tiny molecular machine works."

The researchers plan to conduct further studies to test the protein-activated channel's sensitivity to acidic environments and screen for drugs that inhibit the chloride movement through it.

"PAC is widely distributed in many tissues," says Qiu. "Its function in healthy cells is still a big mystery. We hope to solve this puzzle in the near future."

STUDY SHOWS JOHNS HOPKINS MEDICINE DEVICE SAFELY TREATS BRAIN SWELLING 'UNDERCOVER'

Media Contact: Michael E. Newman, mnewma25@jhmi.edu

It's slightly shorter than a credit card and only as thick as a stack of seven pennies, but the medical device known as the valve-agnostic cranial implant (VACI) has proven large in its quality-of-life return for adult patients with hydrocephalus, a dangerous brain swelling. The team that created the VACI, led by Johns Hopkins Medicine researchers, recently announced preliminary findings from a multicenter clinical trial showing that -- when compared with traditional shunts used to remove the excess cerebrospinal fluid (CSF) associated with hydrocephalus -- the device successfully treats the condition with fewer complications, enables easier maintenance and monitoring without follow-up surgery, and gives the patient a more normal appearance.

The results are published in the October 2020 issue of The Journal of Craniofacial Surgery.

According to the National Institute of Neurological Disorders and Stroke, hydrocephalus is an abnormal buildup of CSF -- the clear, colorless liquid that protects and cushions the brain -- circulating in the brain's cavities (ventricles). Hydrocephalus occurs when the normal flow and absorption of CSF is blocked, leaving the excess fluid to widen and swell the ventricles. This puts pressure on the brain and keeps it from properly functioning, in turn leading to neurological damage and, in severe cases, death.

Hydrocephalus is most commonly treated in adults by implanting a 1-inch-thick shunt device onto the skull and draining the excess CSF through a tube into either the chest cavity or abdomen, where the fluid is absorbed. However, the traditional shunt -- a device that basically has not changed in design since its development over 60 years ago -- has a high risk of complications, such as skin breakdown, infection and long-term scalp pain; typically requires multiple surgeries for repair or replacement throughout a patient's lifetime; and forms a noticeable bump that many patients find aesthetically unpleasing.

"Our team knew there had to be a better solution, so we created the VACI, a pre-molded, computer-designed cranial implant that cradles a shunt invisibly within the 4 to 5 millimeters of skull space between the scalp and the brain," says Chad Gordon, D.O., director of neuroplastic and reconstructive surgery, and professor of plastic and reconstructive surgery at the Johns Hopkins University School of Medicine. "The recent clinical trial was conducted by surgical teams at various institutions to determine if the VACI could improve patient safety and minimize the complications often seen with traditional shunts."

In the trial, 25 adult patients with hydrocephalus -- 14 women and 11 men ranging in age from 22 to 84 -- were fitted with the VACI at four medical institutions. The patients were monitored for an average of 13 months, with 23 (92%) reporting no major scalp or shunt-related complications. One patient experienced a scalp wound over a catheter away from the device and another developed a CSF infection. Neither of these complications was related to the VACI.

"Based on its successful performance, we believe that the VACI is a newfound weapon against neurosurgical-induced deformities, postoperative complications and suboptimal surgical outcomes when treating adult hydrocephalus," says Gordon.

The VACI, now known by its trade name InvisiShunt, was first used in a patient in 2018. The device and the surgical procedure for its implantation are part of a new medical discipline being pioneered by Gordon and his Multidisciplinary Adult Cranioplasty Center team that uses the cranial bone space to house implanted devices -- a field they have dubbed "neuroplastic and reconstructive surgery."

The team's other achievements include the first-ever cranial implants with: (1) closed-loop direct brain neurostimulators for treating epilepsy, and (2) a "smart" wireless biosensor for continuous monitoring of pressure inside the skull after bone removal to relieve traumatic swelling.

"Both eliminate the risks associated with placing bulky devices under the scalp, thereby making the procedures safer and better tolerated by our neurosurgical patients," says Gordon.

Gordon is co-founder of Longeviti Neuro Solutions, which manufactures and markets the InvisiShunt under an arrangement approved by The Johns Hopkins University. Both Gordon and the university are entitled to royalty distributions for the technology. Of the other authors, Judy Huang, M.D., owns stock in the company and Erol Veznedaroglu, M.D., serves as a paid consultant to it.

Credit: 
Johns Hopkins Medicine

New study uses satellites and field studies to improve coral reef restoration

image: "Coral gardening" or "outplanting" has become a popular and promising solution for restoration.

Image: 
Arizona State University Center for Global Discovery and Conservation Science

Our planet's coral reef ecosystems are in peril from multiple threats. Anthropogenic CO2 has sparked a rise in global average sea surface temperatures, pushing reef survival beyond its upper thermal limits. Coastal development from industry, aquaculture, and infrastructure generates sedimentation and increased turbidity in coastal waters, which raises particulate organic carbon (POC) levels. Additionally, sedimentation reduces photosynthetically active radiation (PAR), the much-needed sunlight soaked up by the symbiotic algae corals rely on for food.

With most of the world's reefs under stress, "coral gardening" or "outplanting" has become a popular and promising solution for restoration. Outplanting involves transplanting nursery-grown coral fragments onto degraded reefs. When successful, outplanting helps build coral biomass and restore reef function; but even with thousands of corals outplanted each year, the results are mixed. Newly settled corals are particularly vulnerable to stressors such as pollution, unfavorable light conditions, and temperature fluctuations. Therefore, identifying which stressors have the greatest bearing on coral health and survival is crucial for ensuring successful reef restoration.

A recent study published in Restoration Ecology by researchers from Arizona State University's Center for Global Discovery and Conservation Science (GDCS) found evidence that POC levels are one of the most important factors in determining coral outplant survival. This finding suggests that potential coral outplanting sites should be selected in areas where sedimentation levels are low, away from coastal development, or where coastal development is carefully managed for reef conservation.

"New restoration protocols can use remotely sensed data of multiple oceanographic variables to assess the environmental history of a site. This will help evaluate and optimize site selection and give their outplants the best chance of survival.," said Shawna Foo, lead author and postdoctoral researcher at GDCS.

The study was based on an analysis of coral outplanting projects worldwide between 1987 and 2019. The team assessed satellite-based data on multiple oceanographic variables, including POC, PAR, salinity, sea surface temperature, and surface currents, to quantify each environmental driver's influence on coral outplant survival.

"Our results provide, for the first time, a clear set of conditions needed to maximize the success of coral restoration efforts. The findings are based on a vast global dataset and provide a critically needed compass to improving the performance of coral outplants in the future," said Greg Asner, co-author of the study and director of GDCS.

Notably, the researchers observed better survival rates for corals outplanted more than six kilometers from the coast. This finding has implications for many restoration projects, which are often located near land for ease of access, for example by diving operations. The researchers also found better coral recovery in water deeper than six meters; corals outplanted in shallower water showed elevated vulnerability to disturbance and bleaching. Overall, coral outplants had the greatest chance of survival in regions with stable PAR, lower POC levels, minimal temperature anomalies, greater water depth and greater distance from land. The researchers note that finding restoration sites with all of these characteristics could pose a challenge in some areas, but that considering all of the drivers in combination will greatly improve the chances of outplant survival.
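As a purely illustrative sketch (not the authors' analysis code), the site conditions described above can be expressed as a simple screening rule applied to candidate sites. The 6-kilometer and 6-meter figures come from the article; the POC, temperature-anomaly and PAR-variability thresholds below are placeholder assumptions, as are all variable names.

from dataclasses import dataclass

@dataclass
class CandidateSite:
    name: str
    distance_from_land_km: float   # distance to the nearest coastline
    depth_m: float                 # water depth at the outplanting site
    poc_mg_m3: float               # mean particulate organic carbon from satellite data
    sst_anomaly_c: float           # mean sea surface temperature anomaly
    par_variability: float         # variability of photosynthetically active radiation (0-1)

def passes_screen(site: CandidateSite,
                  max_poc: float = 40.0,         # assumed POC ceiling (mg/m^3), not from the paper
                  max_sst_anomaly: float = 0.5,  # assumed temperature-anomaly ceiling (deg C)
                  max_par_var: float = 0.2) -> bool:
    """Return True if a candidate site meets the survival-friendly conditions
    described in the article: more than 6 km from land, deeper than 6 m,
    low POC, stable PAR and minimal temperature anomalies."""
    return (site.distance_from_land_km > 6.0
            and site.depth_m > 6.0
            and site.poc_mg_m3 <= max_poc
            and site.sst_anomaly_c <= max_sst_anomaly
            and site.par_variability <= max_par_var)

# Example: screen candidate sites and shortlist those worth surveying in the field.
sites = [
    CandidateSite("Reef A", distance_from_land_km=8.2, depth_m=9.0,
                  poc_mg_m3=25.0, sst_anomaly_c=0.3, par_variability=0.1),
    CandidateSite("Reef B", distance_from_land_km=2.5, depth_m=4.0,
                  poc_mg_m3=80.0, sst_anomaly_c=0.9, par_variability=0.4),
]
shortlist = [s.name for s in sites if passes_screen(s)]
print(shortlist)  # ['Reef A']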

Credit: 
Arizona State University

A novel finding on Kabuki syndrome, a rare genetic disease

image: Cibio researchers at University of Trento

Image: 
University of Trento

Delayed growth, craniofacial dysmorphism, skeletal abnormalities, moderate intellectual disability and, often, congenital heart defects. This is how Kabuki syndrome manifests itself: a rare genetic disease with an incidence of about one case in every 30,000 births.

The cause of the disease was identified long ago: mutations in the KMT2D gene, which encodes MLL4, a protein involved in regulating chromatin, the complex of proteins and nucleic acids contained in the nucleus of cells. However, research still has a long way to go to identify new therapeutic approaches that could ameliorate the condition of Kabuki syndrome patients.

An Italian team has taken a step forward in this direction, bringing together biological, mathematical, physical and genetic expertise from several research institutions. The research was developed at the CIBIO Department of Cellular, Computational and Integrated Biology of the University of Trento, with contributions from the Italian Institute of Technology (IIT), the Telethon Institute of Genetics and Medicine (TIGEM) in Pozzuoli (Naples), the University of Naples Federico II, the Institute for High Performance Computing and Networking of the National Research Council in Naples (CNR-ICAR) and the Vita-Salute San Raffaele University of Milan. The project started at the "Romeo and Enrica Invernizzi" National Institute of Molecular Genetics (INGM) Foundation in Milan.

The study opens up new perspectives in the field of rare genetic diseases because it identifies how the structure and mechanical properties of the nucleus are altered in Kabuki syndrome patients. The findings will be published in the scientific journal Nature Genetics.

Alessio Zippo, who leads the team that conceived the study, explains: "Our research group has reproduced for the first time the onset of Kabuki syndrome in the laboratory. To do this, we used healthy human stem cells and introduced the genetic mutation that we find in patients' cells. By using cutting-edge technologies, we have discovered that the nuclear architecture is compromised due to an altered chromatin compartmentalization."

Furthermore, the study shows that the impaired formation of cartilage and bones derives from the inability of cells to respond to the mechanical signals that normally guide the process.

He continues: &laquoWe have identified and tested a therapy that restores the properties of the cells affected by the mutation, both in vitro and in vivo. It is about the inhibition of ATR, a nuclear protein that acts as a molecular sensor (mechano-sensor) in response to nuclear mechanical stimuli".

The next step will be to better define the therapeutic potential of targeting ATR to restore the functionality of stem cells and therefore the correct formation of cartilage and the appropriate lengthening of the bones in patients affected by the syndrome.

Credit: 
Università di Trento