Culture

Adelphi and OHIO researchers determine dinosaur replaced teeth as fast as sharks

video: This video shows the Majungasaurus skull in 3D, including unerupted teeth.

Image: 
Eric Lund/Joseph Groenke (Ohio University); Tom Pascucci and Sae Bom Ra (Adelphi University)

GARDEN CITY, N.Y. (Nov. 27, 2019) - A meat-eating dinosaur species (Majungasaurus) that lived in Madagascar some 70 million years ago replaced all its teeth every couple of months or so, as reported in a new study published today in the open-access journal PLOS ONE, surprising even the researchers.

In fact, Majungasaurus replaced its teeth roughly two to thirteen times faster than other carnivorous dinosaurs, says the paper's lead author Michael D. D'Emic, an assistant professor of biology at Adelphi University. Majungasaurus would form a new tooth in each socket approximately every two months.

"This meant they were wearing down on their teeth quickly, possibly because they were gnawing on bones," D'Emic said. "There is independent evidence for this in the form of scratches and gouges that match the spacing and size of their teeth on a variety of bones - bones from animals that would have been their prey." Importantly, the study also examined two other species of predatory dinosaur (Allosaurus and Ceratosaurus), providing an opportunity to consider tooth growth patterns at a broader scale.

Some animals today, including rodents, also gnaw on bones, D'Emic said. It's a way for them to ingest certain nutrients. It also requires exceptionally strong teeth - but Majungasaurus did not have those.

"That's our working hypothesis for why they had such elevated rates of replacement," D'Emic said. The rapid-fire tooth growth puts Majungasaurus in same league with sharks and big, herbivorous dinosaurs, he adds.

In collaboration with Patrick O'Connor, professor of anatomy at Ohio University, D'Emic used a collection of isolated fossil teeth to examine microscopic growth lines in the teeth. These growth lines are similar to tree rings, but instead of being deposited once a year, they are deposited daily. At the same time, the team used computerized tomography (CT) on intact jaws to visualize unerupted teeth growing deep inside the bones. This allowed them to estimate tooth-replacement rates in a large number of individual jaws so they could cross-check their results.
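
To make the logic of the method concrete, here is a minimal sketch (in Python) of how daily growth lines could be turned into a replacement-rate estimate. The subtraction rule follows the incremental-line reasoning described above, but the function names and all numbers are invented for illustration and are not data from the study.

```python
# Illustrative sketch: estimating tooth replacement rate from daily growth lines.
# All numbers are made up for demonstration; they are not figures from the paper.

def formation_time_days(growth_line_count):
    """Each incremental line is deposited once per day, so the line count
    approximates the number of days the tooth has been forming."""
    return growth_line_count

def replacement_rate_days(line_counts_in_socket):
    """Teeth in one socket form a 'conveyor belt': the age gap (in days)
    between successive unerupted teeth estimates how often a new tooth
    is initiated, i.e. the replacement rate for that socket."""
    ages = sorted(formation_time_days(c) for c in line_counts_in_socket)
    gaps = [b - a for a, b in zip(ages, ages[1:])]
    return sum(gaps) / len(gaps)

# Hypothetical socket with three teeth at different stages of formation:
socket = [35, 95, 150]  # growth lines counted in each tooth
print(f"Estimated replacement rate: one new tooth every "
      f"{replacement_rate_days(socket):.0f} days")  # roughly every 2 months
```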

The time-consuming process would not have been possible without the involvement of students at both OHIO and Adelphi. Graduate students Thomas Pascucci (Adelphi University) and Eric Lund (Ohio University) played important roles on the research team, conducting both the microscopic and computed tomography analyses at the heart of the study.

"As an interdisciplinary PhD student, being able to work on impactful, multi-institutional research utilizing novel approaches has been really influential and highlights the power of interdisciplinary approaches to answering tough scientific questions," Lund said.

"The ability to interface with colleagues across the state, country, or planet, particularly when we can include students in different parts of the research process, is a game changer when we consider collaborative research in the 21st Century," O'Connor stated. "This project addresses yet another aspect of the biology of Majungasaurus, and predatory dinosaurs more generally," he added, "heralding the next phase of research based on recent field discoveries."

Credit: 
Ohio University

New study shows a carnivorous dinosaur species regrew all its teeth every few months

image: Cover art by Sae Bom Ra, a 2019 graduate of Adelphi University

Image: 
Sae Bom Ra

Talk about high turnover.

A meat-eating dinosaur species that lived in Madagascar some 70 million years ago replaced all its teeth every couple of months or so, a new study has found, surprising even the researchers.

In fact, Majungasaurus replaced its teeth roughly two to 13 times faster than other carnivorous dinosaurs, says paper lead author Michael D. D'Emic, an assistant professor of biology at Adelphi University. Majungasaurus would form a new tooth in each socket every couple of months.

"This meant they were wearing down their teeth quickly, possibly because they were gnawing on bones," D'Emic says. "There is independent evidence for this in the form of scratches and gouges that match the spacing and size of their teeth on a variety of bones - bones from animals that would have been their prey."

Some animals today, including rodents, also gnaw on bones, D'Emic explains. It's a way for them to ingest certain nutrients. It also requires exceptionally strong teeth - but Majungasaurus did not have those.

"That's our working hypothesis for why they had such elevated rates of replacement," D'Emic says. The rapid-fire tooth growth puts Majungasaurus in same league with sharks and big, herbivorous dinosaurs, he adds.

Although at least a few hundred meat-eating dinosaur species roamed the Earth, researchers have analyzed tooth-replacement rates for only about a half-dozen of them, D'Emic says. He also has looked into tooth-replacement patterns in plant-eating dinosaurs.

"I'm hoping this latest project spurs more people to study other species. I bet that will reveal further surprises," he says. "And hopefully that will lead to a better understanding of how dinosaurs evolved to be successful for so long."

Importantly, the recent study examined two additional species of predatory dinosaur (Allosaurus and Ceratosaurus), providing an opportunity to consider tooth-growth patterns at a broader scale.

In collaboration with Patrick O'Connor, professor of anatomy at Ohio University, and Ph.D. student Eric Lund, D'Emic used a collection of isolated fossil teeth to examine microscopic growth lines in the teeth. These growth lines are similar to tree rings, but instead of being deposited once a year, they are deposited daily. At the same time, the team used computerized tomography (CT) on intact jaws to visualize unerupted teeth growing deep inside the bones. That allowed them to estimate tooth-replacement rates in a large number of individual jaws so they could cross-check their results.

The time-consuming process would not have been possible without the involvement of Adelphi University students: Former undergraduates Elizabeth Mardakhayava and Joanna Gavras and current graduate student Thomas Pascucci are co-authors on the paper.

"Future research will be able to use this study to estimate tooth-replacement rate in dinosaurs without destructively sampling teeth," Pascucci explains.

Credit: 
Adelphi University

Researchers finally grasp the work week of enzymes

Enzymes are used widely in our everyday lives. Like tiny soldiers, enzymes in washing powder work to dismantle fat stains from clothing, just as they are used to transform straw into bioethanol or act as miniature pharmaceutical factories.

Now, researchers from the University of Copenhagen's Department of Chemistry have discovered a way to monitor enzyme workflows. Their results have just been published in Scientific Reports.

"We have never been capable of witnessing what enzymes do while they work. It is the same as, not just being able to observe someone going to and from work, but having the ability to see what they are doing while at work and seeing how effective their work is," according to Søren Schmidt-Rasmussen Bohr, whose doctoral thesis is based upon the research.

He explains that being able to monitor enzymes and map their work week makes it possible to target the amino acid composition of enzymes, which directly controls their function.

Targeted enzyme development is a major area of research

Enzyme design is an area of research that has attracted immense international interest. So much so, that last year's Nobel Prize in Chemistry was awarded for the targeted optimisation of enzymes.

With an understanding of how various amino acids in enzymes work, one can begin to customize enzymes and make them far more effective. A few of the more evident examples include the design of enzymes that more efficiently convert straw into biofuels, as well as washing powders that get the job done with a lower concentration of more effective enzymes.

"Depending on the enzyme, it could be advantageous to either prolong the amount of time they work or make them more effective while at work. This will make many industrial processes both cheaper and greener," according to Associate Professor Nikos Hatzakis, who directs the research.

Lower drug costs, greener chemistry

What Søren Schmidt-Rasmussen Bohr hopes for most is that the new approach can be a step towards creating more effective enzymes for drug manufacturing, thereby reducing the toxic footprint of the pharmaceuticals industry. More efficient enzymes will result in fewer waste products and allow manufacturing at lower temperatures, which will reduce CO2 emissions.

"With better enzymes, one can simplify the chemical processes needed to manufacture pharmaceuticals, which will ultimately lead to a reduction in drug costs," asserts Søren Schmidt-Rasmussen Bohr.

Method observes enzyme location

The researchers have combined a method known as "Single Particle Tracking", whereby the positions and speeds of enzymes are observed, with advanced data processing that can predict how long enzymes are at work and how long they pause. In practice, advanced fluorescence microscopy is used to zoom in on the nanoscale and observe the movements of individual enzymes. Thereafter, statistical models are deployed to determine what the enzymes are actually up to as they move over and interact with fats.
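
As a rough illustration of that workflow (a toy example, not the authors' actual analysis, with invented positions and an invented threshold), one can classify each step of a tracked enzyme as "paused" or "moving" from its frame-to-frame displacement and then read off how long each state lasts:

```python
# Toy single-particle-tracking analysis: classify steps of one enzyme track
# as "paused" (slow, likely engaged with substrate) or "moving" (diffusing),
# then measure how long each state lasts. All values are invented.
import math

track = [(0.00, 0.00), (0.05, 0.02), (0.07, 0.03), (0.60, 0.40),
         (1.10, 0.90), (1.12, 0.91), (1.13, 0.93), (1.70, 1.40)]  # positions (um)
dt = 0.1            # seconds between frames (assumed)
threshold = 0.1     # um per frame; below this we call the enzyme "paused"

steps = [math.dist(a, b) for a, b in zip(track, track[1:])]
states = ["paused" if s < threshold else "moving" for s in steps]

# Collapse consecutive identical states into dwell times:
dwells, current, count = [], states[0], 1
for s in states[1:]:
    if s == current:
        count += 1
    else:
        dwells.append((current, count * dt))
        current, count = s, 1
dwells.append((current, count * dt))
print(dwells)  # e.g. [('paused', 0.2), ('moving', 0.2), ('paused', 0.2), ...]
```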

Until now, most targeted enzyme development has been accomplished by randomly swapping some amino acids for others, which has made it quite complicated to produce the best possible enzymes for a given purpose.

Credit: 
University of Copenhagen

New Cretaceous mammal fossil sheds light on evolution of middle ear

image: Reconstruction of Jeholbaatar kielanae.

Image: 
XU Yong

Researchers from the Institute of Vertebrate Paleontology and Paleoanthropology (IVPP) of the Chinese Academy of Sciences and the American Museum of Natural History (AMNH) have reported a new species of multituberculate - an extinct type of rodent-like Mesozoic mammal - with well-preserved middle ear bones from the Cretaceous Jehol Biota of China. The findings were published in Nature on November 27.

The new mammal, Jeholbaatar kielanae, has a middle ear that is distinct from those of its relatives. WANG Yuanqing and WANG Haibing from IVPP, along with MENG Jin from AMNH, proposed that the evolution of its auditory apparatus might have been driven by specialization for feeding.

Fossil evidence shows that postdentary bones were either embedded in the postdentary trough on the medial side of the dentary or connected to the dentary via an ossified Meckel's cartilage in early mammals, prior to their migration into the cranium as seen in extant mammals.

Detachment of the mammalian middle ear bones from the dentary occurred independently at least three times. But how and why this process took place in different clades of mammals remains unclear.

The Jeholbaatar kielanae specimen was discovered in the Jiufotang Formation in China's Liaoning Province (Jehol Biota). It displays the first well-preserved middle-ear bones in multituberculates, providing solid evidence of the morphology and articulation of these bony elements, which are fully detached from the dentary.

It reveals a unique configuration with more complete components than those previously reported in multituberculates, as well as a transitional stage in the evolution of the surangular - a "reptilian" jawbone.

In light of current evidence, scientists argue that the primary (malleus-incus) and secondary (squamosal-dentary) jaw joints co-evolved in allotherians, allowing a distinct palinal (anteroposterior) jaw movement while chewing.

Detachment of the middle ear's auditory apparatus would have been under strong selective pressure because it increased feeding efficiency, suggesting that the evolution of the middle ear was probably triggered by functional constraints on the feeding apparatus in allotherians.

Credit: 
Chinese Academy of Sciences Headquarters

Barbequed clams on the menu for ancient Puerto Ricans

image: Photographs of all shells analyzed in this study.

Image: 
Cardiff University

Scientists have reconstructed the cooking techniques of the early inhabitants of Puerto Rico by analysing the remains of clams.

Led by Philip Staudigel, who conducted the analysis as a graduate student at the University of Miami Rosenstiel School and is now a postdoctoral researcher at Cardiff University, the team has used new chemical analysis techniques to identify the exact cooking temperatures at which clams were cooked over 2500 years ago.

With cooking temperatures reaching around 200°C according to the new analysis, the team believe the early Puerto Ricans were partial to a barbeque rather than boiling their food as a soup.

The study, which also involved academics from the University of Miami and Valencia College, has been published today in the journal Science Advances.

Whilst the results throw new light on the cultural practices of the first communities to arrive on the island of Puerto Rico, they also provide at least circumstantial evidence that ceramic pottery technology was not widespread during this period of history - pottery would likely have been the only means by which the clams could have been boiled.

Lead author of the study Dr Philip Staudigel, currently at Cardiff University's School of Earth and Ocean Sciences, said: "Much of people's identity draws upon where they came from; one of the most profound expressions of this is in cooking. We learn to cook from our parents, who learned from their parents.

"In many parts of the world, written records extend back thousands of years, which often includes recipes. This is not the case in the Caribbean, as there were no written texts, except for petroglyphs. By learning more about how ancient Puerto Rican natives cooked their meals, we can relate to these long-gone peoples through their food."

In their study, the team analysed over 20 kg of fossilised clam shells, collected from an archaeological site in Cabo Rojo, Puerto Rico, at the University of Miami Rosenstiel School of Marine and Atmospheric Sciences Stable Isotope Lab.

The pre-Arawak population of Puerto Rico were the first inhabitants of the island, arriving sometime before 3000 BC from Central and/or South America. They lived primarily by fishing, hunting and gathering near the mangrove swamps and coastal areas where they had settled.

The fossilised shells, dating back to around 700 BC, were cleaned and turned into a powder, which was then analysed to determine its mineralogy, as well as the abundance of specific chemical bonds in the sample.

When certain minerals are heated, the bonds between atoms in the mineral can rearrange themselves, which can then be measured in the lab. The amount of rearrangement is proportional to the temperature to which the mineral is heated.

This technique, known as clumped isotope geochemistry, is usually used to determine the temperature at which an organism formed, but in this instance it was used to reconstruct the temperature at which the clams were cooked.

The abundance of bonds in the powdered fossils was then compared to clams which were cooked at known temperatures, as well as uncooked modern clams collected from a nearby beach.
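
In effect, the reference clams provide a calibration curve and the archaeological shells are read off it. Below is a minimal sketch of that logic, assuming a simple linear relation between the measured bond-rearrangement signal and cooking temperature; the real clumped-isotope calibration is nonlinear, and all numbers here are invented:

```python
# Illustrative calibration: map a bond-rearrangement signal to cooking
# temperature using clams cooked at known temperatures, then apply the fit
# to archaeological shells. Values are invented; real clumped-isotope
# thermometry uses a more complex, nonlinear calibration.

# (signal, known cooking temperature in C) from reference experiments:
reference = [(0.10, 25), (0.30, 100), (0.45, 150), (0.60, 200)]

# Ordinary least-squares fit of temperature = a * signal + b:
n = len(reference)
sx = sum(s for s, _ in reference)
sy = sum(t for _, t in reference)
sxx = sum(s * s for s, _ in reference)
sxy = sum(s * t for s, t in reference)
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

def cooking_temperature(signal):
    return a * signal + b

for shell_signal in (0.35, 0.55):   # hypothetical archaeological shells
    print(f"signal {shell_signal:.2f} -> ~{cooking_temperature(shell_signal):.0f} C")
```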

Results showed that the majority of clams were heated to temperatures greater than 100°C - the boiling point of water - but no greater than 200°C. The results also revealed a disparity between the cooking temperatures of different clams, which the researchers believe could be associated with a grilling technique in which the clams are heated from below, meaning the ones at the bottom were heated more than the ones at the top.

"The clams from the archaeological site appeared to be most similar to clams which had been barbequed," continued Dr Staudigel.

"Ancient Puerto Ricans didn't use cookbooks, at least none that lasted to the present day. The only way we have of knowing how our ancestors cooked is to study what they left behind. Here, we demonstrated that a relatively new technique can be used to learn what temperature they cooked at, which is one important detail of the cooking process."

Credit: 
Cardiff University

Scientists develop first implantable magnetic resonance detector

image: Illustration of the tiny MRT needle in the brain tissue.

Image: 
whitehoune - stock.adobe.com, MPI f. Biological Cybernetics, University of Stuttgart. Compositing: Martin Vötsch (design-galaxie.de).

A team of neuroscientists and electrical engineers from Germany and Switzerland has developed a highly sensitive implant that makes it possible to probe brain physiology with unparalleled spatial and temporal resolution. They introduce an ultra-fine needle with an integrated chip that can detect and transmit nuclear magnetic resonance (NMR) data on oxygen metabolism from nanoliter volumes of brain tissue. The breakthrough design will allow entirely new applications in the life sciences.

The group of researchers led by Klaus Scheffler from the Max Planck Institute for Biological Cybernetics and the University of Tübingen, as well as by Jens Anders from the University of Stuttgart, identified a technical bypass around the electrophysical limits of contemporary brain-scanning methods. Their development of a capillary monolithic nuclear magnetic resonance (NMR) needle combines the versatility of brain imaging with the accuracy of a very localized and fast technique for analyzing specific neuronal activity in the brain. "The integrated design of a nuclear magnetic resonance detector on a single chip greatly reduces the typical electromagnetic interference of magnetic resonance signals. This enables neuroscientists to gather precise data from minuscule areas of the brain and to combine them with information from spatial and temporal data of the brain's physiology," explains principal investigator Klaus Scheffler. "With this method, we can now better understand specific activity and functionalities in the brain."

According to Scheffler and his group, their invention may open up the possibility of discovering novel effects or typical fingerprints of neuronal activation, down to specific neuronal events in brain tissue. "Our design setup will allow scalable solutions, meaning the possibility of collecting data from more than a single area on the same device. The scalability of our approach will allow us to extend our platform by additional sensing modalities such as electrophysiological and optogenetic measurements," adds the second principal investigator Jens Anders.

The teams of Scheffler and Anders are confident that their technical approach may help disentangle the complex physiological processes within the neural networks of the brain, and that it may uncover additional benefits providing even deeper insights into the functionality of the brain. With their primary goal of developing new techniques that can specifically probe the structural and biochemical composition of living brain tissue, their latest innovation paves the way for future highly specific and quantitative mapping techniques of neuronal activity and bioenergetic processes in brain cells.

Credit: 
Max-Planck-Gesellschaft

Prostate cancer 'super responders' live for 2 years on immunotherapy

Some men with advanced prostate cancer who have exhausted all other treatment options could live for two years or more on immunotherapy, a major clinical trial has shown.

Researchers found that a small proportion of men were 'super responders' and were alive and well even after the trial had ended despite having had a very poor prognosis before treatment.

The study found that one in 20 men with end-stage prostate cancer responded to the immunotherapy pembrolizumab - but although the number who benefited was small, these patients sometimes gained years of extra life.

The most dramatic responses came in patients whose tumours had mutations in genes involved in repairing DNA, and the researchers are investigating whether this group might especially benefit from immunotherapy.

The phase II clinical trial was led globally by a team at The Institute of Cancer Research, London, and The Royal Marsden NHS Foundation Trust, and involved 258 men with advanced prostate cancer who had previously been treated and become resistant to androgen deprivation therapy and docetaxel chemotherapy.

The study is published today (Wednesday) in the Journal of Clinical Oncology and was funded by the drug's manufacturer Merck Sharp & Dohme.

Overall, 5 per cent of men treated with pembrolizumab saw their tumours actually shrink or disappear, while a larger group of 19 per cent had some evidence of tumour response with a decrease in prostate-specific antigen (PSA) level.

Among a group of 166 patients with particularly advanced disease and high levels of PSA, the average length of survival was 8.1 months with pembrolizumab.

Nine of these patients saw their disease disappear or partly disappear on scans. And of these, four were super-responders who remained on treatment at the end of study follow-up, with responses lasting for at least 22 months.

A second group of patients whose PSA levels were lower but whose disease had spread to the bone lived for an average of 14.1 months on pembrolizumab.

New larger trials are now under way to test whether men with DNA repair gene mutations in their tumours, or those whose cancer has spread to the bone, should receive pembrolizumab as part of their care.

The study also compared the effectiveness of pembrolizumab in men whose tumours had a protein called PD-L1 on the surface of their cancer cells and those whose tumours did not. Targeting PD-L1 activity with pembrolizumab takes the 'brakes' off the immune system, setting it free to attack cancer cells.

But the study found that testing for PD-L1 was not sufficient to tell which patients would respond to treatment. Men with PD-L1 in their tumours survived 9.5 months compared with 7.9 months for patients without PD-L1 in their tumours.

Identifying better tests to pick out who will respond best will be critical if pembrolizumab is to become a standard part of prostate cancer treatment.

Pembrolizumab was well tolerated, with 60 per cent of patients reporting side effects of any grade and only 15 per cent of patients experiencing grade 3-5 side effects.

Professor Johann de Bono, Regius Professor of Cancer Research at The Institute of Cancer Research, London, and Consultant Medical Oncologist at The Royal Marsden NHS Foundation Trust, said:

"Our study has shown that a small proportion of men with very advanced prostate cancer are super responders to immunotherapy and could live for at least two years and possibly considerably longer.

"We don't see much activity from the immune system in prostate tumours, so many oncologists thought immunotherapy wouldn't work for this cancer type. But our study shows that a small proportion of men with end-stage cancer do respond, and crucially that some of these men do very well indeed.

"We found that men with mutations in DNA repair genes respond especially well to immunotherapy, including two of my own patients who have now been on the drug for more than two years. I am now leading a larger-scale trial specifically for this group of patients and am excited to see the results."

Professor Paul Workman, Chief Executive of The Institute of Cancer Research, London, said:

"Immunotherapy has had tremendous benefits for some cancer patients and it's fantastic news that even in prostate cancer, where we don't see much immune activity, a proportion of men are responding well to treatment.

"A limitation with immunotherapy is that there's no good test to pick out those who are most likely to respond. It's encouraging to see testing for DNA repair mutations may identify some patients who are more likely to respond, and I'm keen to see how the new, larger trial in this group of patients plays out."

Credit: 
Institute of Cancer Research

Solving fossil mystery could aid quest for ancient life on Mars

image: A life-like structure created in the lab by chemical gardening.

Image: 
Sean McMahon

The search for evidence of life on Mars could be helped by fresh insights into ancient rocks on Earth.

Research which suggests that structures previously thought to be fossils may, in fact, be mineral deposits could save future Mars missions valuable time and resources.

Microscopic tubes and filaments that resemble the remains of tiny creatures may have been formed by chemical reactions involving iron-rich minerals, the study shows.

Previous research had suggested that such structures were among the oldest fossils on Earth.

The new findings could aid the search for extraterrestrial life during future missions to Mars by making it easier to distinguish between fossils and non-biological structures.

The discovery was made by a scientist from the University of Edinburgh who is developing techniques to seek evidence that life once existed on Mars.

Astrobiologist Sean McMahon created tiny formations in the lab that closely mimic the shape and chemical composition of iron-rich structures commonly found in Mars-like rocks on Earth, where some examples are thought to be around four billion years old.

Dr McMahon created the complex structures by mixing iron-rich particles with alkaline liquids containing the chemicals silicate or carbonate.

This process - known as chemical gardening - is thought to occur naturally where these chemicals abound. It can occur in hydrothermal vents on the seabed and when deep groundwater circulates through pores and fractures in rocks.

His findings suggest that structure alone is not sufficient to confirm whether or not microscopic life-like formations are fossils. More research will be needed to say exactly how they were formed.

The study, published in the journal Proceedings of the Royal Society B, was funded by the European Union's Horizon 2020 programme.

Dr Sean McMahon said: "Chemical reactions like these have been studied for hundreds of years but they had not previously been shown to mimic these tiny iron-rich structures inside rocks. These results call for a re-examination of many ancient real-world examples to see if they are more likely to be fossils or non-biological mineral deposits."

Credit: 
University of Edinburgh

Dance of the RNases: Coordinating the removal of RNA-DNA hybrids

image: S restricted expression and G2/M restricted expression.

Image: 
ill./©: IMB

Two research teams led by Professors Brian Luke and Helle Ulrich at the Institute of Molecular Biology have deciphered how two enzymes, RNase H2 and RNase H1, are coordinated to remove RNA-DNA hybrid structures from chromosomes. RNA-DNA hybrids are important for promoting normal cell activities like gene regulation and DNA repair, but having too many is also a risk for DNA damage and can lead to neurodegenerative disease and cancer. In their article, which was published today in Cell Reports, Brian and Helle show that the enzyme RNase H2 removes RNA-DNA hybrids primarily after DNA replication. Any remaining RNA-DNA structures are then removed by RNase H1, which acts independently of cell cycling.

DNA is normally found as a stable, double-stranded structure. However, DNA also sometimes interacts with RNA to form RNA-DNA hybrid structures that regulate gene expression and DNA repair. R-loops are a special type of RNA-DNA hybrid in which an RNA strand binds to one strand of a DNA molecule and pushes out the other DNA strand so that it is exposed as a single-stranded loop. R-loops can regulate gene activity, but also quickly become dangerous because incorrect removal can damage the DNA, potentially causing mutations. Therefore, excess R-loop formation can be toxic for cells - indeed, mutations in R-loop removal proteins are known to contribute to neuroinflammatory diseases and cancer.

R-loop removal is catalysed by the enzymes RNase H1 and RNase H2, which degrade the RNA strand. In addition, RNase H2 can also excise single ribonucleotides that are sometimes mistakenly incorporated into DNA by polymerases, in a process known as ribonucleotide excision repair (RER). Previous studies showed that mutation of RNase H2 disrupted genome stability more than RNase H1 mutation, suggesting that RNase H2 has a more important role in maintaining genome stability. However, it was never fully understood how these important enzymes are coordinated.

To dissect the distinct roles of RNase H1 and H2 in R-loop removal, the Luke lab engineered yeast to express RNase H1 and H2 only during specific phases of the cell cycle and then exposed them to methyl methanesulfonate (MMS), an agent that increases R-loop formation. Only yeast that could effectively remove R-loops would survive, while those with impaired R-loop removal would not survive.

With support from the Ulrich lab, they found that yeast expressing RNase H2 exclusively during G2 (the 'growth phase' of the cell cycle following DNA replication) were resistant to MMS, whereas yeast expressing RNase H2 only during S phase (the DNA replication phase) were more sensitive. This suggested that RNase H2 primarily acts to process R-loops during G2. In contrast, yeast expressing RNase H1 in either G2 or S phase were both able to survive in MMS. Surprisingly, RNase H2 expression in S phase actually induced more DNA damage, which required a special type of DNA repair called homologous recombination to fix. This pathway was not previously known to act during S phase. Therefore, this study may have revealed an unexplored repair pathway which counters damage caused by RNase H2 activity during DNA replication.

These results may explain why cells have evolved two different RNase H enzymes. Brian Luke clarifies: "We think that RNase H2 is the 'housekeeping' enzyme that repairs the majority of RNA-DNA hybrids, but it is strictly regulated by cell cycling and acts only in G2 phase, or after DNA replication." Brian and Helle speculate this may be because the additional RER activity of RNase H2 creates nicks in the DNA, which are a risk for double-strand breaks during DNA replication in S phase. Therefore, cells may have also evolved a secondary enzyme, RNase H1, which does not have RER activity and can act in all phases of the cell cycle, including S phase.

These findings help us to further understand how cells repair DNA damage associated with RNA-DNA hybrids and how impairment of this process contributes to disease.

Credit: 
Johannes Gutenberg Universitaet Mainz

Comprehensive reviews by leading experts focus on challenging areas of vitamin D research

Vitamin D is essential for human health. Although much is known about its important role in bone metabolism and in certain areas of non-skeletal health, there are many open questions and topics of debate. With the help of guest editor Terry Aspray, Calcified Tissue International has commissioned seven state-of-the-art expert reviews which provide insights into current consensus and new directions of research in a wide range of topics - including vitamin D's role in maternal health, rheumatoid arthritis, respiratory disease, and muscle.

Dr Aspray stated: "I am very impressed with the breadth of coverage of this topic, evident from these reviews, which cover a range of clinically relevant science relating to vitamin D from true experts in the field."

The reviews include:

Vitamin D Measurement, the Debates Continue, New Analytes Have Emerged, Developments Have Variable Outcomes

William D. Fraser, Jonathan C. Y. Tang, John J. Dutton, Inez Schoenmakers

This review describes developments in the measurement of the commonly analysed vitamin D metabolites in clinical and research practice. It describes current analytical approaches, discusses differences between assays, their origin, and how these may be influenced by physiological and experimental conditions. An overview of the value and application of the measurement of 1,25 dihydroxyvitamin D, 24,25 dihydroxyvitamin D and free 25OHD in the diagnosis of patients with abnormalities in vitamin D metabolism and for research purposes is provided.

Vitamin D Deficiency: Defining, Prevalence, Causes, and Strategies of Addressing

Kevin D. Cashman

Vitamin D deficiency, even assessed at the most conservative thresholds, is widespread in both low- and high-income country settings. This review addresses environmental factors and personal characteristics which prevent or impede dermal synthesis of vitamin D and looks at a number of strategies for addressing low dietary vitamin D intake and consequently lowering the risk of vitamin D deficiency.

Vitamin D, and Maternal and Child Health

Rebecca J. Moon, Justin H. Davies, Cyrus Cooper, Nicholas C. Harvey

Vitamin D has important roles in calcium metabolism and in the prevention of rickets and osteomalacia. Low levels of 25-hydroxyvitamin D are common in the general population and amongst pregnant women. This paper reviews the evidence to support routine vitamin D supplementation in childhood and pregnancy and points to the need for future research to focus on individual characteristics and genetic factors which may influence the response to supplementation.

Vitamin D, Autoimmune Disease and Rheumatoid Arthritis

Stephanie R. Harrison, Danyang Li, Louisa E. Jeffery, Karim Raza, Martin Hewison

The overall aim of this review is to provide a fresh perspective on the potential role of vitamin D in Rheumatoid Arthritis (RA) pathogenesis and treatment. It explores the immune activities of vitamin D that impact autoimmune disease, with specific reference to RA. As well as outlining the mechanisms linking vitamin D with autoimmune disease, the review also looks at the different studies that have linked vitamin D status to RA, and the current supplementation studies that have explored the potential benefits of vitamin D for its prevention or treatment.

Targeting Vitamin D Deficiency to Limit Exacerbations in Respiratory Diseases: Utopia or Strategy With Potential?

Karen Maes, Jef Serré, Carolien Mathyssen, Wim Janssens, Ghislaine Gayan-Ramirez

This review focuses on vitamin D as a potential candidate to treat or prevent exacerbations in Cystic Fibrosis, COPD, and asthma. Patients with such respiratory diseases often experience an acute worsening of respiratory symptoms (termed exacerbations), mostly triggered by a respiratory infection. Exacerbations often require hospitalization and are an important cause of mortality. Many patients do not benefit from existing therapies and suffer from recurrent events, and vitamin D deficiency is extremely prevalent in these patients.

The Latest Evidence from Vitamin D Intervention Trials for Skeletal and Non-skeletal Outcomes

Arvind Sami, Bo Abrahamsen

Vitamin D has long been considered a central part of the treatment paradigm for osteoporosis. Initial studies in high-risk populations with widespread vitamin D deficiency found a reduction of both vertebral and non-vertebral fractures. Subsequent studies in the general population have yielded mixed but mostly disappointing results both for skeletal and especially non-skeletal outcomes. Recent sequential trial meta-analyses suggest that future studies are likely to be futile given the overall disappointing result. However, mega-trials are still in progress, and additional results have been released. This narrative review aims to evaluate new literature to determine if there has been any substantial change in the message.

Vitamin D and Skeletal Muscle: Emerging Roles in Development, Anabolism and Repair

Christian M. Girgis

This review focuses on morphologic and functional roles of vitamin D in muscle, from strength to contraction, to development and ageing, and will characterise the controversy of the vitamin D receptor's (VDR) expression in skeletal muscle, central to our understanding of vitamin D's effects on this tissue.

Credit: 
International Osteoporosis Foundation

New study shows a minimum dose of hydromethylthionine could slow cognitive decline

ABERDEEN, Scotland and Singapore, 26 November, 2019 - In a paper published in today's online issue of the Journal of Alzheimer's Disease (DOI 10.3233/JAD-190772), TauRx has reported unexpected results of a pharmacokinetic analysis of the relationship between treatment dose, blood levels and pharmacological activity of the drug hydromethylthionine on the brain in over 1,000 patients with mild-to-moderate Alzheimer's disease. These results showed that, even at the lowest dose of hydromethylthionine previously tested in two Phase 3 global clinical trials (8 mg/day), the drug produced concentration-dependent effects on cognitive decline and brain atrophy.

Hydromethylthionine is the WHO-approved non-proprietary name for the compound previously referred to by TauRx as LMTM. The drug, taken as a tablet, blocks abnormal aggregation of tau protein in the brain [2,3], which is increasingly recognised as an important driver of clinical dementia [1]. In Phase 3 global clinical trials conducted in almost 1,700 patients with mild-to-moderate Alzheimer's disease between 2012 and 2016, hydromethylthionine was tested at doses of 150-250 mg/day against a low dose of 8 mg/day, which was intended only as a control to mask the discolouration of urine that can sometimes occur with the drug. The study designs were based on the findings from an earlier trial that used a different variant of the drug [6]. Surprisingly, there was no difference between the high doses and the low dose of hydromethylthionine on any of the clinical outcomes in the trials [4,5].

To further explore these results, the researchers conducted a new pharmacokinetic population analysis using plasma concentration data from 1,162 of the patients who participated in either of the two completed Phase 3 hydromethylthionine trials to measure how blood levels of the drug relate to its effects on the brain. Using a new assay, the researchers found that the effects of hydromethylthionine at the 8 mg/day dose were determined by the blood level, and that the majority of patients had high enough blood levels of the drug at this dose to produce meaningful reductions in cognitive decline and brain atrophy. They concluded that a slightly higher dose of hydromethylthionine of 16 mg/day would ensure that all patients would have the blood levels needed to maximise the drug's activity, since its effects plateau at higher concentrations and doses. The pharmacokinetic profile they found, typical of many drugs, now explains why the pharmacological effects of hydromethylthionine at the high doses tested in the trials were no better than those seen in patients with high blood levels at the 8 mg/day dose.
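
A saturating concentration-response (Emax) curve of the kind this paragraph describes makes the result intuitive: once blood levels pass the plateau, higher doses add little. The sketch below assumes the standard Emax form with invented parameters; it is not the model fitted in the paper:

```python
# Sketch of a saturating Emax concentration-response curve:
# effect(C) = EMAX * C / (EC50 + C). Once plasma concentration C is well
# above EC50, the effect plateaus, so raising the dose further gains little.
# Parameter values are invented for illustration.

EMAX = 1.0    # maximal effect (arbitrary units)
EC50 = 0.3    # concentration giving half-maximal effect (arbitrary units)

def effect(concentration):
    return EMAX * concentration / (EC50 + concentration)

for c in (0.1, 0.3, 1.0, 3.0, 10.0):  # rising plasma concentrations
    print(f"C = {c:5.1f}  ->  effect = {effect(c):.2f}")
# Output climbs quickly, then flattens: 0.25, 0.50, 0.77, 0.91, 0.97
```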

The analysis also showed that whilst hydromethylthionine has a similar concentration-response profile in patients taking the drug as an add-on therapy to the routinely used symptomatic treatments in Alzheimer's disease, the maximum effect in these patients was reduced by half. This finding supports the hypothesis that symptomatic drugs for this condition interfere with the disease-modifying treatment effects of hydromethylthionine. This hypothesis was initially proposed on the basis of the drug's Phase 3 trial results [4,5].

"Since we already have a substantial database supporting the safety and tolerability of hydromethylthionine in clinical trials of patients with mild-to-moderate Alzheimer's disease, the additional results of this analysis have given us the confidence to expand the scope of the new TauRx Lucidity clinical trial to confirm the potential efficacy of the hydromethylthionine 16 mg/day dose in these types of patients," said Prof. Claude Wischik, of Aberdeen University and executive chairman of TauRx Therapeutics Ltd.

He noted that hydromethylthionine is taken in a convenient oral form at home and does not require patients to attend clinics for intravenous infusions or injections, unlike various other Alzheimer's disease treatments currently being tested in clinical trials.

"In addition to the reduction in brain atrophy, we were surprised to see the large cognitive effects of treatment in the patient group with the higher blood levels of hydromethylthionine at the 8 mg daily dose," he added. "According to scores from the ADAS-cog scale, the effect was around 7.5 points, or three times that seen from current routine Alzheimer's treatments, and would be equivalent to an 85% reduction in cognitive decline over 65 weeks." The Alzheimer's Disease Assessment Scale-cognitive subscale (ADAS-Cog) is the standard cognitive scale used to measure neuropsychological changes in Alzheimer's disease clinical trials. A 4-point change is generally considered as indicating a clinically meaningful difference.

Professor George Perry, Editor-in-Chief of Journal of Alzheimer's Disease, commented: "The extensive data, experience, and now pharmacokinetics, highlight the potential of hydromethylthionine treatment as an important new avenue forward in Alzheimer's disease. The clinical benefit and reduction in brain atrophy greatly exceed those reported for other therapeutic routes."

Professor Serge Gauthier, Director of the Alzheimer Disease Research Unit, McGill Center for Studies in Aging, commented: "The researchers are aiming to confirm what they have found so far in the placebo-controlled trial that is now ongoing. Hydromethylthionine is the best hope we have right now for a disease-modifying drug acting on the tau pathology associated with Alzheimer's disease."

Credit: 
IOS Press

How do scars form? Fascia function as a repository of mobile scar tissue

image: Fascia cells (green) rising into dermal open wounds dragging their surrounding matrix (magenta).

Image: 
© Helmholtz Zentrum München / Donovan Correa-Gallegos

Abnormal scarring is a serious threat that can result in non-healing chronic wounds or fibrosis. Scars form when fibroblasts, a cell type of connective tissue, reach wounded skin and deposit plugs of extracellular matrix. Until now, the question of the exact anatomical origin of these fibroblasts had not been answered. In order to find potential ways of influencing the scarring process, the team of Dr. Yuval Rinkevich, Group Leader for Regenerative Biology at the Institute of Lung Biology and Disease at Helmholtz Zentrum München, aimed to finally answer it. Since it was already known that all scars derive from a fibroblast lineage expressing the Engrailed-1 gene - a lineage present not only in skin but also in fascia - the researchers investigated whether fascia might be the origin of these fibroblasts.

Fibroblast kit - ready to heal wounds

In order to find out, they used an array of techniques including genetic lineage tracing, anatomical fate mapping, and genetic ablation - a method that induces apoptosis (cell death) in selected cells - to eliminate the fascia fibroblasts. Without these cells, no matrix was incorporated into the wounds, and only abnormal, unhealthy scars formed.

In another approach, the team placed a porous film beneath the skin to prevent fascia fibroblasts from migrating upwards. This, however, led to chronic open wounds. The researchers concluded that fascia contains a specialized, prefabricated kit of sentry fibroblasts, embedded within a movable sealant, that preassembles all the cell types and matrix components needed to heal wounds. They hypothesize that the guided homing of these fascia fibroblasts initiates the hallmark response to external and internal injuries.

Scarring ensures survival

The new findings are important in the context of ensuring survival: in mammals, injury induces a universal fibrotic tissue response that quickly patches wounds with scars - and thus prevents infection and bleeding to death. The hitherto prevailing tenet in wound repair was that scars form de novo, with fibroblasts depositing extracellular matrix at sites of injury. With this study, the researchers could show that scars instead originate from reservoirs of matrix jelly that are dragged into open wounds by sentry fibroblasts embedded in the fascia. These novel findings contradict current paradigms of how wounds repair.

New methods of scarless regenerative healing

The knowledge that fascia is the origin of scars and the finding of new mechanisms of wound repair provide a novel therapeutic space to curtail pathological fibrotic responses and induce scarless regenerative healing across a range of medical settings.

"The findings of our research give fascia tissue a new role for future science. This will shift the attention of the scientific community to not only to look at fibroblasts in the dermis but also at native cells in the fascia when researching on wound healing," says Rinkevich.

Donovan Correa-Gallegos, PhD student at Helmholtz Zentrum München and first co-author of the study, comments: "Our new findings challenge and reconfigure the traditional view of the body's matrix system of connective tissue. This is opening up a new biological concept that radiates to a variety of aspects of scar-related disease."

Credit: 
Helmholtz Munich (Helmholtz Zentrum München Deutsches Forschungszentrum für Gesundheit und Umwelt (GmbH))

Problems of homophobia and transphobia in sport

The overarching aim of the Europe-wide joint project was to develop strategies and training measures in the field of sport in order to counter discrimination and violence related to sexual orientation or gender identity. Participation in sport at all levels should be made easier for lesbian, gay, bisexual, transgender and intersex people. In the first study, an online survey was used in which more than 5,500 LGBTI* people from all 28 EU states were asked about their experiences in sport. In the second study, representatives of 15 sports associations, sports federations and umbrella organisations from the five project countries were interviewed about their strategies for combating homo-/transphobic discrimination in sport.

The overwhelming majority of respondents perceive homophobia and transphobia to be a problem in sport; homophobic and transphobic language is widespread, especially in team sports. As a result, one third of those active in sport conceal their sexual orientation or gender identity within the context of their sporting activities. More than a third of those questioned were unable to name a single organisation or individual they could contact in the event of a negative experience or incident. "Discrimination against LGBTI* is a problem facing society as a whole," says Professor Ilse Hartmann-Tews, Director of Studies at the German Sports University, "which is why each one of us should feel responsible for creating a culture of respect." In the area of organised sport, the study recommends an open and proactive attitude towards questions of sexual and gender diversity on the part of all men and women active at every level of clubs, associations and sports federations.

The collaboration of five European project countries has lasted three years and will end on 31 December. Results were presented and discussed at various levels, including the final conference of OUTSPORT held in Budapest, an international conference on the situation of LGBTI* in sport in Barcelona, the sports committee of the NRW state parliament in Düsseldorf and the Federal Network Conference of Queer Sports Clubs (BuNT) in Hamburg.

Credit: 
German Sport University

Biodiversity and wind energy

image: Victim of wind power plant

Image: 
Christian Voigt, Leibniz-IZW

The replacement of fossil and nuclear energy sources for electricity production by renewables such as wind, sun, water and biomass is a cornerstone of Germany's energy policy. Amongst these, wind energy production is the most important component. However, energy production from wind is not necessarily ecologically sustainable. It requires relatively large spaces for the installation and operation of turbines, and significant numbers of bats and birds die in collisions with the rotors. For these reasons, the location and operation of wind energy plants are often in direct conflict with the legal protection of endangered species. The almost unanimous opinion of experts from local and central government authorities, environmental NGOs and expert offices is that the current mechanisms for the protection of bats in wind power projects are insufficient. This is one conclusion from a survey by the Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW) published in the "Journal of Renewable and Sustainable Energy".

More than 500 representatives of various stakeholders (expert groups) involved in the environmental impact assessment of wind turbines took part in the Leibniz-IZW survey. This group included employees of conservation agencies and government authorities, representatives of non-governmental organisations in the conservation sector, consultants, employees of wind energy companies and scientists conducting research on renewable energies or on biodiversity. The survey focused on assessments on the contribution of wind energy to the transformation of the energy system, on ecologically sustainable installation and operation of wind turbines and on possible solutions for the trade-off between tackling climate change and protecting biological diversity.

"We found both significant discrepancies and broad consensus among participants," states Christian Voigt, department head at Leibniz-IZW and first author of the survey. "The overwhelming majority of respondents confirmed that there is a direct conflict between green electricity and bat protection. Most importantly, they considered the protection of biodiversity to be just as important as the contribution to protect the global climate through renewable energy production." Most stakeholders agreed that small to moderate losses in the yield of wind power plants in terms of electricity production and thus in financial terms instigated by the consistent application of conservation laws must become acceptable. Possible shutdown times in electricity production should be compensated. "We will probably have to accept a higher price of green electricity for the purpose of the effective protection of bats in order to compensate for the shutdown times of wind turbines," Voigt sums up. "This does not address the unsolved issue of how to deal with habitat loss, especially when wind turbines are placed in forests."

The conflict between wind power projects and the objectives of biological conservation has intensified in recent years because the rapidly rising number of wind plants - there are now around 30,000 on mainland Germany - has made suitable locations scarce. As a result, new plants are increasingly erected in locations where conflicts with wildlife and the protection of wildlife are more likely, for example in forests. "According to members of conservation authorities, only about 25 % of wind turbines are operated under mitigation schemes such as a temporary halt of wind turbine operation during periods of high bat activity (for example during the migration season), at relatively low wind speeds and at high air temperatures, even though the legal framework that protects bats would require the enforcement of such measures," adds author Marcus Fritze of Leibniz-IZW. In addition, it became clear from the survey that members of the wind energy industry hold views on some aspects of the green-green dilemma that differ from those of other expert groups. "Representatives of the wind energy industry consider compliance with climate protection targets as more important than measures to protect species. All other expert groups disagree with this notion," said Fritze. "A consistent dialogue between all participants therefore seems particularly important in order to enable ecologically sustainable wind energy production."

The survey also showed that

more than 95% of respondents consider the transformation of the energy system ("Energiewende") to be important,

all expert groups agreed on aiming for an ecologically sustainable energy transition,

two thirds of stakeholders in the wind energy industry shared the view that wind energy production should be promoted more strongly than energy production from other renewable sources whereas 85% of representatives from the other stakeholders disagreed with this,

86% of the survey participants outside the wind energy sector gave green electricity no higher priority than the protection of wildlife whereas only 4% of representatives of the wind sector industry shared this opinion (almost half were undecided or consider wind power to be more important than biodiversity protection).

For the purpose of this survey, the authors selected bats as a representative group for all wildlife affected by wind turbines, because large numbers of bats die at turbines, they enjoy a high level of protection both nationally and internationally, and they therefore play an important role in planning and approval procedures for wind turbines. Indeed, the high collision rates of bats at wind turbines may be relevant to entire bat populations. The common noctule is the most frequent victim of wind turbines; this species is rated as declining by the Federal Agency for Nature Conservation in Germany. Furthermore, the results of years of research in the department headed by Voigt at the Leibniz-IZW show that the losses affect both local bat populations and migrating individuals. Thus, fatalities at wind turbines in Germany affect bat populations in Germany as well as populations in other European regions from where these bats originate.

On the basis of the survey results, the authors argue in favour of a stronger consideration of nature conservation objectives in the wind energy industry and for greater appreciation of the targets of biodiversity conservation. They suggest ways in which the cooperation of those involved in the planning and operation of wind power projects can be improved so that both wind energy production and the goals of biological conservation can be satisfied.

Credit: 
Forschungsverbund Berlin

Unique sled dogs helped the Inuit thrive in the North American Arctic

image: A team of Greenland sled dogs, descendants of the dogs of the Inuit of the North American Arctic.

Image: 
Article author Tatiana Feuerborn

Inuit sled dogs have changed little since people migrated with them to the North American Arctic from Siberia, across the Bering Strait, according to researchers who have examined DNA from dogs spanning that period. The legacy of these Inuit dogs survives today in Arctic sled dogs, making them one of the last remaining descendant populations of indigenous, pre-European dog lineages in the Americas.

The latest research is the result of nearly a decade's work by University of California, Davis, researchers in anthropology and veterinary genetics, who analyzed the DNA of hundreds of dogs' ancient skeletal remains to determine that the Inuit dog had significantly different DNA than other Arctic dogs, including malamutes and huskies.

The article, "Specialized sledge dogs accompanied the Inuit dispersal across the North American Arctic," was published Wednesday in the Proceedings of the Royal Society B: Biological Sciences. From UC Davis, authors include Christyann Darwent, professor of anthropology; Ben Sacks, adjunct professor and director of the Mammalian Ecology and Conservation Unit, Veterinary Genetics Laboratory, School of Veterinary Medicine; and Sarah Brown, a postdoctoral researcher. Lead author Carly Ameen is an archaeologist from the University of Exeter; Tatiana Feuerborn is with the Globe Institute in Denmark and Centre for Palaeogenetics in Sweden; and Allowen Evin is at the CNRS, Université de Montpellier, Institut des Sciences de l'Evolution in Montpellier, France. The list of authors includes many others from a large number of collaborating institutions.

Qimmiit (dogs in Inuktitut) were viewed by the Inuit as particularly well-suited to long-distance hauling of people and their goods across the Arctic and consuming local resources, such as sea mammals, for food.

The unique group of dogs helped the Inuit conquer the tough terrain of the North American Arctic 2,000 years ago, researchers said. Inuit dogs are the direct ancestors of modern Arctic sled dogs, and although their appearance has continued to change over time, they continue to play an important role in Arctic communities.

Experts examined the DNA from 921 dogs and wolves that lived during the last 4,500 years. Analysis of the DNA, together with the locations and time periods in which the remains were recovered archaeologically, shows that dogs from Inuit sites first occupied around 2,000 years ago were genetically different from dogs already in the region.

According to Sacks, "the genetic profiles of ancient dogs of the American Arctic dating to 2,000 years ago were nearly identical to those of older dogs from Siberia, but contrasted starkly with those of more ancient dogs in the Americas, providing an unusually clear and definitive picture of the canine replacement event that coincided with the expansion of Thule peoples across the American Arctic two millennia ago."

Preserving an important history

Research confirms that native peoples maintained their own dogs. By analyzing the shape of skeletal elements from 391 dogs, the study also shows that the Inuit had larger dogs with a proportionally narrower cranium than the earlier dogs belonging to pre-Inuit groups.

The National Science Foundation-funded portion of the research at UC Davis was inspired by Inuit activist and author Sheila Watt-Cloutier, who told Darwent about Inuit sled-dog culling undertaken by Canadian police in the 1950s and asked if there was a way to use scientific methods to tell the history and importance of sled dogs in the Arctic. Preservation of these distinctive Inuit dogs is likely a reflection of the highly specialized role that dogs played in both long-range transportation and daily subsistence practices in Inuit society.

Credit: 
University of California - Davis