Inheritance in plants can now be specifically controlled

image: An inversion (left) in thale cress (background) can be undone with CRISPR/Cas (center) to reactivate the exchange of genes (right) in that section.

Image: 
Michelle Rönspies/KIT

A new application of the CRISPR/Cas molecular scissors promises major progress in crop cultivation. At Karlsruhe Institute of Technology (KIT), researchers from the team of molecular biologist Holger Puchta have succeeded in modifying the sequence of genes on a chromosome using CRISPR/Cas. Working with a known chromosome modification in the model plant thale cress, they demonstrated for the first time worldwide how inversions of the gene sequence can be undone, allowing inheritance to be controlled in a targeted way. The results are published in Nature Communications (DOI: 10.1038/s41467-020-18277-z).

About 5,000 years ago, the genetic information of thale cress was altered in a way that has since spread widely and is of major interest to science. On chromosome 4 of the plant, a so-called inversion occurred: the chromosome broke at two points and was reassembled, but the broken-out section was reinserted rotated by 180°. As a result, the sequence of genes on this chromosome section was inverted. This chromosome mutation, known in research as "Knob hk4S," is an example of how evolution can not only modify the genetic material of organisms but also fix it for the long term. "In inverted sections, genes cannot be exchanged between homologous chromosomes during inheritance," molecular biologist Holger Puchta, KIT, explains.

Researchers Remove Obstacle to Crop Cultivation

Inversions do not only affect thale cress (Arabidopsis thaliana), a wild plant used as a model organism in genetics due to its completely decoded genome and its small chromosome number. Inversions can also be found in crop plants. There, they are an obstacle to cultivation, which uses modifications of the genetic material to produce maximum yields and good taste and to make plants resistant to diseases, pests, and extreme climatic conditions.

For the first time, researchers from the Chair for Molecular Biology and Biochemistry held by Puchta at KIT's Botanical Institute have now succeeded in undoing natural inversions. "We considerably extended the applications of the CRISPR/Cas molecular scissors," Puchta says. "We no longer use the scissors only for exchanging arms between chromosomes, but also for recombining genes on a single chromosome. For the first time, we have now demonstrated that it is possible to directly control inheritance processes. We can achieve genetic exchange in an area in which this has been impossible before. With this, we have established chromosome engineering as a new type of crop cultivation."

Molecular Scissors Precisely Cut the DNA

Together with researchers from the team of Professor Andreas Houben, Leibniz Institute of Plant Genetics and Crop Plant Research (IPK) in Gatersleben, and Professor Paul Fransz from the University of Amsterdam, KIT scientists took the most prominent natural inversion hk4S on chromosome 4 of thale cress and demonstrated how this inversion can be undone and how genetic exchange can be achieved in cultivation. Their findings are reported in Nature Communications. The researchers also think that it is possible to use CRISPR/Cas to produce new inversions, which would be another step towards combining desired traits and eliminating undesired properties in crop cultivation.

Holger Puchta is considered a pioneer of genome editing with molecular scissors, using the natural principle of mutation to precisely modify the genetic information in plants without introducing foreign DNA. His current project "Multidimensional CRISPR/Cas mediated engineering of plant breeding," CRISBREED for short, now focuses on the recombination of plant chromosomes by means of CRISPR/Cas technology. For this project, Puchta was granted the renowned Advanced Grant of the European Research Council (ERC) for the second time in a row. CRISPR (short for Clustered Regularly Interspaced Short Palindromic Repeats) designates a certain section of the DNA that carries the genetic information. Cas is an enzyme that recognizes this section and cuts the DNA precisely at that point in order to remove, insert, or exchange genes, recombine chromosomes, and, for the first time, modify the sequence of genes on them.

Credit: 
Karlsruher Institut für Technologie (KIT)

A difficult year for forests, fields and meadows

image: Near Davos, the researchers are observing how the coniferous forest responds to different climatic conditions.

Image: 
ETH Zurich

It was - once again - an unusually hot year: in 2018, large parts of Europe were beset by an extremely hot and dry summer. In Switzerland, too, the hot weather got people sweating - right on the heels of a string of unusually warm months. It was - at the time - the third hottest summer and the fourth warmest spring since measurements began in 1864.

A solid base of measurements

Obviously, such unusual weather conditions also had an impact on ecosystems. Scientists from the group led by Nina Buchmann, Professor of Grassland Sciences, have now used extensive measurement data to show exactly how forests, fields and meadows reacted to the exceptional conditions in 2018. The researchers evaluated measurements from five sites, all of which are part of the Swiss FluxNet initiative, explains Mana Gharun, a postdoc in Buchmann's group and the study's lead author: "The five sites cover all altitude levels from 400 to 2,000 metres above sea level. This means we've taken very different ecosystems into account."

At each of these sites, Buchmann's group has been taking measurements for years at very high temporal resolution of how much CO2, water vapour and other greenhouse gases are exchanged between plants, the atmosphere and the soil, right across the entire ecosystem. This allows the researchers to determine how the sites react to different climatic conditions.

Sharp drop in productivity

Their evaluation, which the researchers have just published in a special issue of the journal Phil Trans B, shows that the heat and drought of 2018 had a particularly severe impact on ecosystems at lower altitudes. In the mixed forest on the Lägeren mountain near Zurich and in the meadows close to Chamau, productivity fell by an average of 20 percent compared with the two previous years. The situation is different for ecosystems at higher altitudes: the coniferous forest near Davos, the meadow near Früebüel and the Weissenstein alpine pasture on the Albula Pass all benefited from warmer temperatures and a longer growing season. The more favourable growth conditions there led to higher productivity in these ecosystems.

However, respiration rates for plants and soil organisms also increased at almost all the sites. This means that while these systems absorbed more CO2 from the atmosphere, they also released more CO2 back into it. "Overall, this results in a lower net carbon uptake for the two forests and the Chamau meadow," Gharun notes. "This finding is unfortunate, since the general expectation is that under warmer conditions these ecosystems would act as carbon sinks to help mitigate climate change," she adds.
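
To make the bookkeeping behind that statement concrete: net carbon uptake is essentially the CO2 taken up by photosynthesis minus the CO2 released by plant and soil respiration, so a warm year can raise both terms and still shrink the net sink. The sketch below uses invented numbers purely for illustration; they are not measurements from the Swiss FluxNet sites.

```python
# Hypothetical illustration of the carbon bookkeeping described above.
# GPP = CO2 taken up by photosynthesis, Reco = CO2 released by respiration,
# NEP = net ecosystem productivity = GPP - Reco. All values are invented.
gpp_normal, reco_normal = 1500.0, 1100.0   # g C per m^2 per year (invented)
gpp_hot,    reco_hot    = 1600.0, 1350.0   # warmer year: both fluxes rise (invented)

nep_normal = gpp_normal - reco_normal
nep_hot = gpp_hot - reco_hot
print(f"net uptake, normal year: {nep_normal:.0f} g C/m^2/yr")
print(f"net uptake, hot year:    {nep_hot:.0f} g C/m^2/yr")
# Uptake rose (1500 -> 1600) but respiration rose more (1100 -> 1350),
# so the net carbon sink shrank from 400 to 250 g C/m^2/yr.
```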

Buchmann points out that it is still too early for a final assessment: "We definitely need long-term data series before we can put these findings in their proper context." She and her group have been collecting measurement data at the abovementioned sites for many years, so she has a good foundation for such long-term studies.

A lot of snow after the winter

What made 2018 exceptional was not just the warm temperatures in spring and summer, but also the heavy precipitation during the preceding winter: when spring came, the mountains were covered by snow, which then melted very quickly due to the warm conditions. This benefited the higher-altitude ecosystems in particular. In contrast, the situation at lower altitudes was more difficult, as the ecosystems there were unable to use the excess water from winter to build up a soil moisture reservoir for the summer. Accordingly, they suffered more from the summer drought and heat.

"Water availability is a decisive factor in how ecosystems survive periods of heat," Buchmann says. "Thus, it is important to look beyond the actual dry period when studying a drought." Another unsettling consideration is that the new CH2018 climate change scenarios predict more rain and less snow in winter. The higher levels of precipitation expected in the winter months is therefore of limited benefit to ecosystems when the water runs off quickly, rather than being stored as snow.

Stressed trees

Forests are now in a critical situation. There are several indications of this, one of which is that not only spruce trees but also old beech trees are now showing stress symptoms in many places across the Swiss Plateau. This is probably also due to the fact that the following year, 2019, was also warmer and drier than average. "What we are seeing in the forests is a memory effect," Buchmann explains, "so it's possible that the impacts of such periods may not show up until long after the actual extreme event."

How well the trees survive periods of drought and heat also depends on the depth at which they absorb water. Beech roots, for example, penetrate the soil to a depth of 50 or 60 centimetres and are therefore more likely to reach deeper moist layers. Spruce roots, on the other hand, reach a depth of only about 20 centimetres, making them more likely to be affected by droughts. "Things are going to get uncomfortable for lowland spruce in the medium term," Buchmann notes. "That's not a good forecast for forestry."

Gloomy outlook for farmers

What about the meadows? The two researchers have not yet found a memory effect there because meadows recover more quickly after a dry period. Nevertheless, meadows at lower altitudes produce significantly less forage in a year like 2018 - bad news for farmers. Grassland farming is the central pillar of Swiss agriculture. If less grass grows on meadows in the future because of increasing summer droughts, this will have direct consequences for milk and meat production.

Credit: 
ETH Zurich

How to have a blast like a black hole

image: Magnetic reconnection is generated by the irradiation of the LFEX laser into the micro-coil. The particle outflow accelerated by the magnetic reconnection is evaluated using several detectors. As an example of the results, proton outflows with symmetric distributions were observed.

Image: 
Osaka University

Researchers at the Institute of Laser Engineering at Osaka University have successfully used short but extremely powerful laser blasts to generate magnetic field reconnection inside a plasma. This work may lead to a more complete theory of X-ray emission from astronomical objects like black holes.

In addition to being subjected to extreme gravitational forces, matter being devoured by a black hole can also be pummeled by intense heat and magnetic fields. Plasmas, a fourth state of matter hotter than solids, liquids, or gases, are made of electrically charged protons and electrons that have too much energy to form neutral atoms. Instead, they bounce frantically in response to magnetic fields. Within a plasma, magnetic reconnection is a process in which twisted magnetic field lines suddenly "snap" and cancel each other, resulting in the rapid conversion of magnetic energy into particle kinetic energy. In stars, including our sun, reconnection is responsible for much of the coronal activity, such as solar flares. Owing to the strong acceleration, the charged particles in the black hole's accretion disk emit their own light, usually in the X-ray region of the spectrum.

To better understand the process that gives rise to the observed X-rays coming from black holes, scientists at Osaka University used intense laser pulses to create similarly extreme conditions in the lab. "We were able to study the high-energy acceleration of electrons and protons as the result of relativistic magnetic reconnection," says senior author Shinsuke Fujioka. "For example, the origin of emission from the famous black hole Cygnus X-1 can be better understood."

This level of light intensity is not easily obtained, however. For a brief instant, the laser required two petawatts of power, equivalent to one thousand times the electric consumption of the entire globe. With the LFEX laser, the team was able to achieve peak magnetic fields of a mind-boggling 2,000 teslas. For comparison, the magnetic fields generated by an MRI machine to produce diagnostic images are typically around 3 teslas, and Earth's magnetic field is a paltry 0.00005 teslas. The particles of the plasma were accelerated to such an extreme degree that relativistic effects needed to be considered.
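
A quick back-of-envelope check of those comparisons, using the field strengths quoted in the text and an assumed round figure for average global electric power consumption (roughly 2.5 terawatts; this value is an assumption, not from the article):

```python
# Sanity check of the comparisons above (field values as quoted in the text;
# the global electricity figure is a rough assumed average).
lfex_field_t = 2000.0        # peak magnetic field reported, in teslas
mri_field_t = 3.0            # typical clinical MRI field
earth_field_t = 0.00005      # Earth's magnetic field
print(f"vs MRI:   {lfex_field_t / mri_field_t:,.0f}x stronger")
print(f"vs Earth: {lfex_field_t / earth_field_t:,.0f}x stronger")

laser_power_w = 2e15                 # two petawatts
world_electric_power_w = 2.5e12      # assumed ~2.5 TW average global draw
ratio = laser_power_w / world_electric_power_w
print(f"laser power vs global electricity: ~{ratio:,.0f}x")
# ~800x -- the same order of magnitude as the 'one thousand times' quoted above.
```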

"Previously, relativistic magnetic reconnection could only be studied via numerical simulation on a supercomputer. Now, it is an experimental reality in a laboratory with powerful lasers," first author King Fai Farley Law says. The researchers believe that this project will help elucidate the astrophysical processes that can happen at places in the Universe that contain extreme magnetic fields.

Credit: 
Osaka University

New insight into mammalian stem cell evolution

image: The researchers compared 134 gene sets belonging to the pluripotency gene regulatory networks of 48 mammalian species, and found that this network is highly conserved across species.

Image: 
Mindy Takamiya/Kyoto University iCeMS

The genes regulating pluripotent stem cells in mammals are surprisingly similar across 48 species, Kyoto University researchers report in the journal Genome Biology and Evolution. The study also shows that differences among these 'gene regulatory networks' might explain how certain features of mammalian pluripotent stem cells have evolved.

Pluripotent stem cells can self-renew and give rise to all other types of cells in the body. Their characteristics are controlled by a network of regulatory genes and molecules, but little is known about how this network has evolved across mammals.

To this end, Ken-ichiro Kamei of Kyoto University's Institute for Integrated Cell-Material Sciences (iCeMS), with Miho Inoue-Murayama and Yoshinori Endo of the Wildlife Research Center, compared 134 gene sets belonging to the pluripotency gene regulatory networks of 48 mammalian species.

They found that this network has been highly conserved across species, meaning genetic sequences have remained relatively unchanged over the course of evolution. This high degree of conservation explains why human genetic sequences can reprogram other mammalian tissue cells to turn into pluripotent stem cells. However, since it is also evident that the regulating networks differ across mammals, there might be more efficient combinations of reprogramming factors for each species. Improving techniques for deriving induced pluripotent stem (iPS) cells from mammalian cells, including those from endangered species, could provide a big boost to research and conservation.
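
As a toy illustration of what 'highly conserved' means at the sequence level, one can compute the percentage identity between aligned orthologous sequences from two species; for conserved network genes this number stays high across mammals. The snippet below is not the comparative pipeline used in the study, and the two sequence fragments are invented placeholders.

```python
# Toy illustration of sequence conservation: percent identity between two
# aligned orthologous sequences. The sequences are invented placeholders,
# and this is not the analysis pipeline used in the study.
def percent_identity(seq_a: str, seq_b: str) -> float:
    """Share of aligned positions (ignoring gaps in either sequence) that match."""
    pairs = [(a, b) for a, b in zip(seq_a, seq_b) if a != "-" and b != "-"]
    matches = sum(a == b for a, b in pairs)
    return 100.0 * matches / len(pairs)

species_one_fragment = "ATGGCGGGACACCTGGCTTCGGAT"   # invented placeholder
species_two_fragment = "ATGGCTGGACACCTGGCTTCAGAT"   # invented placeholder
print(f"{percent_identity(species_one_fragment, species_two_fragment):.1f}% identical")
```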

"We have been trying to generate induced pluripotent stem cells from various mammalian species, such as the endangered Grévy's zebra and the bottlenose dolphin," says Kamei.

Interestingly, the team found relatively high evolutionary changes in genes just downstream of one of the core gene regulatory networks. "This could indicate that mammalian pluripotent stem cells have diversified more than we thought," says Inoue-Murayama.

The differences between gene regulatory networks in mammalian pluripotent stem cells might also be associated with unique adaptations.

For example, in the naked mole rat, a pluripotency regulatory gene has undergone positive selection and could be involved in the animal's extraordinary longevity and cancer resistance. The gene might also be involved in the development of the extremely sensitive hairs that help naked mole rats navigate underground.

The researchers also found evidence of positive selection for certain pluripotency gene regulatory network genes involved in the adaptation of large animals, such as the minke whale, the African elephant and the flying fox, to their environments. Surprisingly, these same genes are associated with cancer in other mammals. Since these large animals are known for being relatively resistant to cancer, the researchers suggest that the adaptive alterations these genes underwent in these animals somehow also changed some of their functions, thus giving this group a degree of cancer resistance.

The researchers say the study is among the first to compare the pluripotency gene regulatory networks across major taxa, and could be useful for evolutionary biology studies and for facilitating and improving the generation of induced pluripotent stem cells from new species.

Credit: 
Kyoto University

Rare hyperinflammatory syndrome in children with COVID-19 described

image: Petter Brodin, researcher at SciLifeLab and the Department of Women's and Children's Health, Karolinska Institutet, Sweden. Photo: Ulf Sirborn

Image: 
Ulf Sirborn

Researchers at Karolinska Institutet and Science for Life Laboratory in Sweden and Tor Vergata University of Rome in Italy have mapped the immune response in children affected by a rare but life-threatening inflammatory syndrome associated with COVID-19. The study, which is published in the scientific journal Cell, reveals that the inflammatory response differs from that in Kawasaki disease and severe acute COVID-19.

In the current SARS-CoV-2 pandemic, with very few exceptions, children have presented with mild symptoms. However, paediatricians have discovered a new, life-threatening hyperinflammatory syndrome resembling Kawasaki disease and named Multisystem Inflammatory Syndrome in Children associated with COVID-19, MIS-C.

In a new collaborative study, researchers have worked out the immunological aspects of this rare condition. They compared blood samples from 13 MIS-C patients treated at Karolinska University Hospital in Stockholm, Sweden, and Bambino Gesù Children's Hospital in Rome, Italy, with samples from 28 Kawasaki disease patients collected from 2017 to 2018, prior to COVID-19. The analyses also included samples from children with mild COVID-19.

"Our results show that MIS-C is truly a distinct inflammatory condition from Kawasaki disease, despite having some shared features," says Petter Brodin, paediatrician and researcher at the Department of Women's and Children's Health, Karolinska Institutet, and one lead author of the study. "The hyperinflammation and cytokine storm detected in children with MIS-C is also different from that seen in adult patients with severe, acute COVID-19, which we recently described in another publication."

When comparing MIS-C to these other inflammatory states, the researchers observed differences in the frequencies of specific immune cell populations, inflammatory cytokines and chemokines in the blood. Unlike children with Kawasaki disease and children with mild COVID-19, children who developed MIS-C lacked IgG antibodies against common cold coronaviruses. The researchers also found several autoantibodies that target the body's own proteins and that may contribute to the pathogenesis of MIS-C. They are now also looking into genetic risk factors for developing MIS-C after SARS-CoV-2 infection.

"There is an urgent need to better understand why a small minority of children infected with SARS-CoV-2 develop MIS-C, and we are adding a piece to the puzzle," says Dr Brodin. "Better knowledge of the pathogenesis is important for development of optimal treatments that can dampen the cytokine storm and hopefully save lives, as well as for vaccine development to avoid MIS-C caused by vaccination."

Credit: 
Karolinska Institutet

Gen Z not ready to eat lab-grown meat

Gen Z are the new kids on the block. A cohort of 5 million people born between 1995 and 2015, encompassing 20 percent of the Australian population and 2 billion people globally -- they're consumers to be reckoned with.

New research by the University of Sydney and Curtin University, published on 8 September in Frontiers in Nutrition, found that, despite great concern for the environment and animal welfare, 72 percent of Generation Z were not ready to accept cultured meat - defined in the survey as a lab-grown meat alternative produced by in-vitro cell cultures of animal cells, instead of from slaughtered animals.

However, despite their lack of enthusiasm for the new meat alternative, 41 percent believed it could be a viable nutritional source because of the need to transition to more sustainable food options and improve animal welfare.

"Our research has found that Generation Z - those aged between 18 and 25 - are concerned about the environment and animal welfare, yet most are not ready to accept cultured meat and view it with disgust," said the study's lead researcher, Dr Diana Bogueva from the University of Sydney's School of Chemical and Biomolecular Engineering.

Specifically, 59 percent of participants were concerned about the environmental impact of traditional livestock farming; however, many were not clear on what those impacts were, nor did they understand the associated resource depletion.

"In-vitro meat and other alternatives are important as they can help to reduce greenhouse emissions and lead to better animal welfare conditions. However, if cultured meat is to replace livestock-based proteins, it will have to emotionally and intellectually appeal to the Gen Z consumers. It may be through its physical appearance, but what seems to be more important is transparency around its environmental and other benefits," said Dr Bogueva.

Gen Z's concerns about cultured meat

The participants had several concerns relating to cultured meat, including its anticipated taste, feelings of disgust, health and safety, and whether it is a more sustainable option.

Societal concerns were also prevalent throughout the study, with a large number of respondents worried that eating cultured meat would be in conflict with perceptions of gender and national identity.

"Gen Z value Australia's reputation as a supplier of quality livestock and meat, and many view traditional meat eating as being closely tied to concepts of masculinity and Australian cultural identity," said Dr Bogueva.

Others were concerned about animal welfare, whereas some viewed cultured meat as a conspiracy orchestrated by the rich and powerful and were determined not to be convinced to consume it. Several participants were also unsure whether cultured meat was an environmentally sustainable option.

"Generation Z are also unsure whether cultured meat is actually more environmentally sustainable, described by several respondents as potentially "resource consuming" and not being "environmentally friendly"," said Dr Bogueva.

"The respondents were effectively divided into two groups: the "against" described cultured meat as "another thing our generation has to worry about" and questioned the motivations of those developing it, while supporters described it as "money invested for a good cause" and "a smart move" by people who are "advanced thinkers."

"This Generation has vast information at its fingertips but is still concerned that they will be left with the legacy of exploitative capitalism that benefits only a few at the expense of many. They have witnessed such behaviour resulting in climate change and are now afraid that a similar scenario may develop in relation to food, particularly as investors are pursuing broader adoption of cultured meat," Dr Bogueva said.

Gen Z's five main attitudes towards cultured meat

17 percent of respondents rejected all alternatives, including cultured meat, seeing it as chemically produced and heavily processed.

11 percent rejected all alternatives in favour of increased consumption of fruit and vegetables, saying they will stick with a vegetarian diet.

35 percent rejected cultured meat and edible insects but accepted plant-based alternatives because they "sounded more natural" and are "normal".

28 percent believed cultured meat was acceptable or possibly acceptable if the technology could be mastered.

A fifth group (9 percent) accepted edible insects but rejected cultured meat as it was too artificial and not natural like insects.

How the research was conducted

The researchers collected Generation Z's opinions of cultured meat via an online survey. 227 randomly selected Australia-based respondents were asked questions about their demographics, dietary preferences (such as how often they liked to eat meat), how they felt about cultured meat and whether they thought it was necessary to accept and consume, as well as their preference for different meat alternatives (such as insects, plant-based and cultured meat).

Credit: 
University of Sydney

Rubbing skin activates itch-relief neural pathway

image: Schematic diagram of mechanisms underlying itch relief by stroking skin. Rubbing or stroking of the skin activates vesicular glutamate transporter 3+-low threshold mechanoreceptors (VGLUT3+-LTMRs; red), followed by excitation of itch inhibitory interneurons (blue) in the superficial dorsal horn. The inhibitory interneurons use dynorphin as a neurotransmitter to inhibit pruritogen-responsive neurons (green).

Image: 
Sakai et al., JNeurosci 2020

Stop scratching: rubbing skin activates an anti-itch pathway in the spinal cord, according to research in mice recently published in JNeurosci.

It can be hard to resist the relief of scratching an itch, even though scratching damages skin, especially in sensitive areas like the eyes. But stroking can relieve an itch, too. Sakai et al. investigated the neural pathway behind this less-damaging form of itch relief.

The research team triggered the urge to scratch in mice by administering an itch-inducing chemical underneath their skin. The team then recorded the electrical response from dorsal horn neurons in the spinal cord while they stroked the animals' paws. The neurons fired more often as the mice were stroked and less often after the stroking ended. These neurons respond to both touch and itch, so the increase corresponds to the added touch, not increased itchiness, while the decrease corresponds to itch relief. The same decrease could be seen when the team directly stimulated touch-sensing neurons under the skin. However, inhibiting both sensory neurons and a subtype of anti-itch interneurons in the spinal cord failed to decrease the response from dorsal horn neurons, while activating sensory neurons stopped the mice from scratching. The results show that stroking sets off a cascade, activating sensory neurons under the skin that then activate anti-itch interneurons in the spinal cord, resulting in reduced dorsal horn neural activity and itch relief.

Credit: 
Society for Neuroscience

How do stone forests get their spikes? New research offers pointed answer

video: This video shows an experiment in which a dissolving block of candy develops into an array of sharp spikes. The block starts out with internal pores and is entirely immersed under water, where it dissolves and becomes a "candy forest" before collapsing.

Image: 
NYU's Applied Mathematics Lab

Stone forests--pointed rock formations resembling trees that populate regions of China, Madagascar, and many other locations worldwide--are as majestic as they are mysterious, created by uncertain forces that give them their shape.

A team of scientists has now shed new light on how these natural structures are created. Its research, reported in the latest issue of the journal Proceedings of the National Academy of Sciences (PNAS), also offers promise for the manufacturing of sharp-tipped structures, such as the micro-needles and probes needed for scientific research and medical procedures.

"This work reveals a mechanism that explains how these sharply pointed rock spires, a source of wonder for centuries, come to be," says Leif Ristroph, an associate professor at New York University's Courant Institute of Mathematical Sciences and one of the paper's co-authors. "Through a series of simulations and experiments, we show how flowing water carves ultra-sharp spikes in landforms."

The researchers, who included Michael Shelley, a professor at the Courant Institute, note that the study also illuminates a mechanism that explains the prevalence of sharply pointed rock spires in karst--a topography formed by the dissolution of rocks, such as limestone.

In their study, the scientists simulated the formation of these pinnacles over time through a mathematical model and computer simulations that took into account how dissolution produces flows and how these flows in turn affect dissolution, and thus the reshaping of a formation.

To confirm the validity of their simulations, the researchers conducted a series of experiments in NYU's Applied Mathematics Lab. Here, the scientists replicated the formation of these natural structures by creating sugar-based pinnacles, mimicking soluble rocks that compose karst and similar topographies, and submerging them in tanks of water. Interestingly, no flows had to be imposed, since the dissolving process itself created the flow patterns needed to carve spikes.

The experimental results reflected those of the simulations, thereby supporting the accuracy of the researchers' model. The authors speculate that these same events happen--albeit far more slowly--when minerals are submerged under water, which later recedes to reveal stone pinnacles and stone forests.
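
The feedback the authors describe, in which dissolution drives flows and those flows localize further dissolution, can be caricatured in a few lines of code. The sketch below is a deliberately crude toy, not the mathematical model from the PNAS paper: it simply assumes the surface of a soluble block recedes faster near its top, where the descending solute-rich boundary layer is thinnest in the experiments, and shows that a uniform cylinder then erodes into an upward-narrowing taper.

```python
import numpy as np

# Purely illustrative cartoon, not the authors' model: a soluble block is
# described by its radius profile r(z). We *assume* the surface recedes
# faster near the top and more slowly toward the base.
H, N = 1.0, 201
z = np.linspace(0.0, H, N)             # height above the base of the block
r = np.full(N, 0.2)                    # start from a uniform cylinder
recession_rate = 0.01 * (1.0 + z / H)  # assumed: dissolution speeds up with height

dt, steps = 0.05, 240                  # 12 time units in total
for _ in range(steps):
    r = np.maximum(r - recession_rate * dt, 0.0)   # radius shrinks, never below zero

for height in (0.0, 0.3, 0.6, 0.9):
    print(f"z = {height:.1f}: r = {r[int(height * (N - 1))]:.3f}")
# The cylinder ends up as an upward-narrowing taper -- a crude 'spike'.
```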

Credit: 
New York University

Vitamin D levels in the blood can predict future health risks and death

Free, circulating vitamin D levels in the blood may be a better predictor of future health risks in aging men, according to a study being presented at e-ECE 2020. These data suggest the free, precursor form of vitamin D found circulating in the bloodstream is a more accurate predictor of future health and disease risk, than the often measured total vitamin D. Since vitamin D deficiency is associated with multiple serious health conditions as we get older, this study suggests that further investigation into vitamin D levels and their link to poor health may be a promising area for further research.

Vitamin D deficiency is common in Europe, especially in elderly people. It has been associated with a higher risk for developing many aging-related diseases, such as cardiovascular disease, cancer and osteoporosis. However, there are several forms, or metabolites, of vitamin D in the body, and it is the total amount of these metabolites that is most often used to assess the vitamin D status of people. The prohormone 25-hydroxyvitamin D is converted to 1,25-dihydroxyvitamin D, which is considered the active form of vitamin D in our body. More than 99% of all vitamin D metabolites in our blood are bound to proteins, so only a very small fraction is free to be biologically active. Therefore, the free, active forms may be a better predictor of current and future health.

Dr Leen Antonio from University Hospitals Leuven in Belgium and a team of colleagues investigated whether the free metabolites of vitamin D were better health predictors, using data from the European Male Ageing Study, which was collected from 1,970 community-dwelling men, aged 40-79, between 2003 and 2005. The levels of total and free metabolites of vitamin D were compared with their current health status, adjusting for potentially confounding factors, including age, body mass index, smoking and self-reported health. The total levels of both free and bound vitamin D metabolites were associated with a higher risk of death. However, only free 25-hydroxyvitamin D was predictive of future health problems and not free 1,25-dihydroxyvitamin D.
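
As an illustration of the kind of adjusted analysis described above, relating a vitamin D measure to mortality while controlling for age, body mass index, smoking and self-reported health, here is a minimal sketch using the lifelines library. The data file and column names are hypothetical, and this is not the authors' code or their exact statistical model.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical data frame: one row per participant, with follow-up time in
# years, a death indicator, the vitamin D measure and the confounders named
# in the press release. File and column names are illustrative, not from the
# study; categorical variables are assumed to be numerically encoded.
df = pd.read_csv("emas_cohort.csv")

cph = CoxPHFitter()
cph.fit(
    df[["followup_years", "died", "free_25ohd",
        "age", "bmi", "smoker", "self_rated_health"]],
    duration_col="followup_years",   # time to death or censoring
    event_col="died",                # 1 = died during follow-up, 0 = censored
)
cph.print_summary()  # hazard ratio for free_25ohd, adjusted for the confounders
```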

Dr Antonio explains, "These data further confirm that vitamin D deficiency is associated with a negative impact on general health and can be predictive of a higher risk of death."

As this is an observational study, the causal relationships and underlying mechanisms remain undetermined. It was also not possible to obtain specific information about the causes of death of the men in the study, which may be a confounding factor.

"Most studies focus on the association between total 25-hydroxyvitamin D levels and age-related disease and mortality. As 1,25-dihydroxyvitamin D is the active form of vitamin D in our body, it was possible it could have been a stronger predictor for disease and mortality. It has also been debated if the total or free vitamin D levels should be measured. Our data now suggest that both total and free 25-hydroxyvitamin D levels are the better measure of future health risk in men," says Dr Antonio

Dr Antonio and her team are currently finalising the statistical analysis and writing a manuscript on these findings.

Credit: 
European Society of Endocrinology

International registries show PCI rates increased in Japan, US

Japan and the U.S. have both seen an increase in percutaneous coronary intervention (PCI) procedures, driven primarily by a rise in elective PCIs in Japan and in non-elective PCIs in the U.S., according to a study in the Journal of the American College of Cardiology. Since adoption of large-scale PCI trial results varies internationally, the study sought to analyze large national registries in both countries to illuminate international variation in PCI practice as a foundation for further quality improvement.

In a study looking at NCDR CathPCI Registry data in the U.S. and J-PCI registry data in Japan, researchers from the Japanese Association of Cardiovascular Intervention and Therapeutics in Tokyo and several U.S.-based hospitals compared temporal trends in procedural volume, patient characteristics, pre-procedural testing, procedural characteristics and quality metrics in the U.S. and Japan between 2013 and 2017.

Researchers found that PCI volume increased by 15.8% in the U.S.--from 550,872 in 2013 to 637,650 in 2017--primarily due to an increase in non-elective PCIs. In Japan, PCIs increased by 36%--from 181,750 in 2013 to 247,274 in 2017--primarily due to an increase in elective PCIs. Elective PCI rates were more than two-fold greater in Japan (72.7%) than in the U.S. (33.8%).
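
Those growth figures can be recomputed directly from the registry counts quoted above; a short check:

```python
# Reproducing the percentage changes quoted above from the raw counts.
us_2013, us_2017 = 550_872, 637_650
jp_2013, jp_2017 = 181_750, 247_274

us_growth = 100 * (us_2017 - us_2013) / us_2013
jp_growth = 100 * (jp_2017 - jp_2013) / jp_2013
print(f"U.S. PCI volume growth 2013-2017:  {us_growth:.1f}%")   # ~15.8%
print(f"Japan PCI volume growth 2013-2017: {jp_growth:.1f}%")   # ~36.1%
```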

Data also showed the ratio of non-elective vs. elective PCI and the performance of non-invasive stress testing in stable disease were lower in Japan than in the U.S. Computed tomography angiography was more commonly used in Japan.

Credit: 
American College of Cardiology

Changing what we eat could offset years of climate-warming emissions, new analysis finds

Plant protein foods--like lentils, beans, and nuts--can provide vital nutrients using a small fraction of the land required to produce meat and dairy. By shifting to these foods, much of the remaining land could support ecosystems that absorb CO2, according to a new study appearing in the journal Nature Sustainability.

In their study, the researchers analyzed and mapped areas where extensive production of animal-sourced food, which requires 83 percent of Earth's agricultural land, suppresses native vegetation, including forests.

The study highlights places where changing what people grow and eat could free up space for ecosystems to regrow, offsetting our CO2 emissions in the process.

"The greatest potential for forest regrowth, and the climate benefits it entails, exists in high- and upper-middle income countries, places where scaling back on land-hungry meat and dairy would have relatively minor impacts on food security," says Matthew Hayek, the principal author of the study and an assistant professor in New York University's Department of Environmental Studies.

Burning fossil fuels for energy emits CO2, warming the planet. When warming reaches 1.5 °C (2.7 °F) above pre-industrial levels, more severe impacts like droughts and sea level rise are expected. Scientists describe how much fossil fuel we can burn before hitting that limit using the global "carbon budget."

According to the authors' findings, vegetation regrowth could remove as much as nine to 16 years of global fossil fuel CO2 emissions, if demand for meat were to drastically plummet in the coming decades along with its massive land requirements. That much CO2 removal would effectively double Earth's rapidly shrinking carbon budget.
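
A rough back-of-envelope calculation shows why removing nine to 16 years of emissions is comparable to doubling the remaining budget. Both input figures below are assumed round numbers for illustration (annual global fossil CO2 emissions of about 36 gigatonnes and a remaining 1.5 °C budget on the order of 450 gigatonnes); they are not taken from the study.

```python
# Back-of-envelope check of the "9-16 years ~ doubling the carbon budget" claim.
# Both inputs are rough assumed round numbers, not figures from the study.
annual_emissions_gt = 36.0          # assumed Gt CO2 emitted per year
removal_low = 9 * annual_emissions_gt
removal_high = 16 * annual_emissions_gt
print(f"9-16 years of emissions ~ {removal_low:.0f}-{removal_high:.0f} Gt CO2")

budget_gt = 450.0                   # assumed remaining 1.5 degC budget, Gt CO2
print(f"relative to a ~{budget_gt:.0f} Gt CO2 budget: "
      f"{removal_low / budget_gt:.1f}x to {removal_high / budget_gt:.1f}x")
# Roughly 320-580 Gt CO2 of removal is comparable to the budget itself,
# i.e. it would roughly double the headroom, as the article states.
```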

"We can think of shifting our eating habits toward land-friendly diets as a supplement to shifting energy, rather than a substitute," says Hayek. "Restoring native forests could buy some much-needed time for countries to transition their energy grids to renewable, fossil-free infrastructure."

In their report, the authors emphasize that their findings are designed to assist locally tailored strategies for mitigating climate change. Although meat consumption in many countries today is excessive and continues to rise, raising animals remains critical in some places.

These considerations will be important as countries attempt to develop their economies sustainably, according to Colorado State University's Nathan Mueller, one of the study's co-authors.

"Land use is all about tradeoffs," explains Mueller, an assistant professor in the Department of Ecosystem Science and Sustainability and the Department of Soil and Crop Sciences. "While the potential for restoring ecosystems is substantial, extensive animal agriculture is culturally and economically important in many regions around the world. Ultimately, our findings can help target places where restoring ecosystems and halting ongoing deforestation would have the largest carbon benefits."

Recent proposals to cover much of Earth's surface in forests have generated controversy as a climate solution. Planting upward of a trillion trees would require a substantial physical effort. Additionally, poor planning could encourage uniform tree plantations, limit biodiversity, or deplete dwindling water in dry areas. Lastly, there is the challenge of finding enough land and keeping its trees safe from future logging or burning, which would release the stored carbon back into the atmosphere as CO2.

However, the researchers kept these potential problems in mind when devising their study.

"We only mapped areas where seeds could disperse naturally, growing and multiplying into dense, biodiverse forests and other ecosystems that work to remove CO2¬ for us," Hayek says. "Our results revealed over 7 million square kilometers where forests would be wet enough to regrow and thrive naturally, collectively an area the size of Australia."

Technological fixes for climate change may soon be on the horizon, like machinery that removes CO2 directly from the atmosphere or power plant exhaust pipes. Placing too much confidence in these technologies could prove dangerous, however, according to study co-author Helen Harwatt, a fellow of the Harvard Law School.

"Restoring native vegetation on large tracts of low yield agricultural land is currently our safest option for removing CO2," says Harwatt. "There's no need to bet our future solely on technologies that are still unproven at larger scales."

But the benefits of cutting back on meat and dairy reach far beyond addressing climate change.

"Reduced meat production would also be beneficial for water quality and quantity, wildlife habitat, and biodiversity," notes William Ripple, a co-author on the study and a professor of ecology at Oregon State University.

Recent events have also shone a spotlight on the importance of healthy ecosystems in preventing pandemic diseases with animal origins, such as COVID-19.

"We now know that intact, functioning ecosystems and appropriate wildlife habitat ranges help reduce the risk of pandemics," Harwatt adds. "Our research shows that there is potential for giving large areas of land back to wildlife. Restoring native ecosystems not only helps the climate; when coupled with reduced livestock populations, restoration reduces disease transmission from wildlife to pigs, chickens, and cows, and ultimately to humans."

Credit: 
New York University

'Wild West' mentality lingers in modern populations of US mountain regions

When historian Frederick Jackson Turner presented his famous thesis on the US frontier in 1893, he described the "coarseness and strength combined with acuteness and acquisitiveness" it had forged in the American character.

Now, well into the 21st century, researchers led by the University of Cambridge have detected remnants of the pioneer personality in US populations of once inhospitable mountainous territory, particularly in the West.

A team of scientists algorithmically investigated how landscape shapes psychology. They analysed links between the anonymised results of an online personality test completed by over 3.3 million Americans, and the "topography" of 37,227 US postal - or ZIP - codes.

The researchers found that living at both a higher altitude and an elevation relative to the surrounding region - indicating "hilliness" - is associated with a distinct blend of personality traits that fits with "frontier settlement theory".

"The harsh and remote environment of mountainous frontier regions historically attracted nonconformist settlers strongly motivated by a sense of freedom," said researcher Friedrich Götz, from Cambridge's Department of Psychology.

"Such rugged terrain likely favoured those who closely guarded their resources and distrusted strangers, as well as those who engaged in risky explorations to secure food and territory."

"These traits may have distilled over time into an individualism characterised by toughness and self-reliance that lies at the heart of the American frontier ethos" said Götz, lead author of the study.

"When we look at personality across the whole United States, we find that mountainous residents are more likely to have psychological characteristics indicative of this frontier mentality."

Götz worked with colleagues from the Karl Landsteiner University of Health Sciences, Austria, the University of Texas, US, the University of Melbourne in Australia, and his Cambridge supervisor Dr Jason Rentfrow. The findings are published in the journal Nature Human Behaviour.

The research uses the "Big Five" personality model, standard in social psychology, with simple online tests providing high-to-low scores for five fundamental personality traits of millions of Americans.

The mix of characteristics uncovered by the study's authors consists of low levels of "agreeableness", suggesting mountainous residents are less trusting and forgiving - traits that benefit "territorial, self-focused survival strategies".

Low levels of "extraversion" reflect the introverted self-reliance required to thrive in secluded areas, and a low level of "conscientiousness" lends itself to rebelliousness and indifference to rules, say researchers.

"Neuroticism" is also lower, suggesting an emotional stability and assertiveness suited to frontier living. However, "openness to experience" is much higher, and the most pronounced personality trait in mountain dwellers.

"Openness is a strong predictor of residential mobility," said Götz. "A willingness to move your life in pursuit of goals such as economic affluence and personal freedom drove many original North American frontier settlers."

"Taken together, this psychological fingerprint for mountainous areas may be an echo of the personality types that sought new lives in unknown territories."

The researchers wanted to distinguish between the direct effects of physical environment and the "sociocultural influence" of growing up where frontier values and identities still hold sway.

To do this, they looked at whether mountainous personality patterns applied to people born and raised in these regions that had since moved away.

The findings suggest some "initial enculturation" say researchers, as those who left their early mountain home are still consistently less agreeable, conscientious and extravert, although no such effects were observed for neuroticism and openness.

The scientists also divided the country at the edge of St. Louis - "gateway to the West" - to see if there is a personality difference between those in mountains that made up the historic frontier, such as the Rockies, and eastern ranges such as the Appalachians.

While mountains continue to be a "meaningful predictor" of personality type on both sides of this divide, key differences emerged. Those in the east are more agreeable and outgoing, while western ranges are a closer fit for frontier settlement theory.

In fact, the mountainous effect on high levels of "openness to experience" is ten times as strong in residents of the old western frontier as in those of the eastern ranges.

The findings suggest that, while ecological effects are important, it is the lingering sociocultural effects - the stories, attitudes and education - in the former "Wild West" that are most powerful in shaping mountainous personality, according to scientists.

They describe the effect of mountain areas on personality as "small but robust", but argue that complex psychological phenomena are influenced by many hundreds of factors, so small effects are to be expected.

"Small effects can make a big difference at scale," said Götz. "An increase of one standard deviation in mountainousness is associated with a change of around 1% in personality."

"Over hundreds of thousands of people, such an increase would translate into highly consequential political, economic, social and health outcomes."

Credit: 
University of Cambridge

First 'plug and play' brain prosthesis demoed in paralyzed person

In a significant advance, UC San Francisco Weill Institute for Neurosciences researchers working towards a brain-controlled prosthetic limb have shown that machine learning techniques helped an individual with paralysis learn to control a computer cursor using their brain activity without requiring extensive daily retraining, which has been a requirement of all past brain-computer interface (BCI) efforts. 

"The BCI field has made great progress in recent years, but because existing systems have had to be reset and recalibrated each day, they haven't been able to tap into the brain's natural learning processes. It's like asking someone to learn to ride a bike over and over again from scratch," said study senior author Karunesh Ganguly, MD, PhD, an associate professor in the UCSF Department of Neurology. "Adapting an artificial learning system to work smoothly with the brain's sophisticated long-term learning schemas is something that's never been shown before in a person with paralysis."

The achievement of "plug and play" performance demonstrates the value of so-called ECoG electrode arrays for BCI applicartions. An ECoG array comprises a pad of electrodes about the size of a post-it note that is surgically placed on the surface of the brain. They allow long-term, stable recordings of neural activity and have been approved for seizure monitoring in epilepsy patients. In contrast, past BCI efforts have used "pin-cushion" style arrays of sharp electrodes that penetrate the brain tissue for more sensitive recordings but tend to shift or lose signal over time. In this case, the authors obtained investigational device approval for long-term chronic implantation of ECoG arrays in paralyzed subjects to test their safety and efficacy as long-term, stable BCI implants. 

In their new paper, published September 7, 2020 in Nature Biotechnology, Ganguly's team documents the use of an ECoG electrode array in an individual with paralysis of all four limbs (tetraplegia). The participant is also enrolled in a clinical trial designed to test the use of ECoG arrays to allow paralyzed patients to control a prosthetic arm and hand, but in the new paper, the participant used the implant to control a computer cursor on a screen.

The researchers developed a BCI algorithm that uses machine learning to match brain activity recorded by the ECoG electrodes to the user's desired cursor movements. Initially, the researchers followed the standard practice of resetting the algorithm each day. The participant would begin by imagining specific neck and wrist movements while watching the cursor move across the screen. Gradually the computer algorithm would update itself to match the cursor's movements to the brain activity this generated, effectively passing control of the cursor over to the user. However, starting this process over every day put a severe limit on the level of control that could be achieved. It could take hours to master control of the device, and some days the participant had to give up altogether.

The researchers then switched to allow the algorithm to continue updating to match the participant's brain activity without resetting it each day. They found that the continued interplay between brain signals and the machine learning-enhanced algorithm resulted in continuous improvements in performance over many days. Initially there was a little lost ground to make up each day, but soon the participant was able to immediately achieve top level performance. 

"We found that we could further improve learning by making sure that the algorithm wasn't updating faster than the brain could follow -- a rate of about once every 10 seconds," said Ganguly, a practicing neurologist with UCSF Health and the San Francisco Veterans Administration Medical Center's Neurology & Rehabilitation Service. "We see this as trying to build a partnership between two learning systems -- brain and computer -- that ultimately lets the artificial interface become an extension of the user, like their own hand or arm."

Over time, the participant's brain was able to amplify patterns of neural activity it could use to most effectively drive the artificial interface via the ECoG array, while eliminating less effective signals -- a pruning process much like how the brain is thought to learn any complex task, the researchers say. They observed that the participant's brain activity seemed to develop an ingrained and consistent mental "model" for controlling the BCI interface, something that had never occurred with daily resetting and recalibration. When the interface was reset after several weeks of continuous learning, the participant rapidly re-established the same patterns of neural activity for controlling the device -- effectively retraining the algorithm to its former state.

"Once the user has established an enduring memory of the solution for controlling the interface, there's no need for resetting," Ganguly said. "The brain just rapidly convergences back to the same solution."

Eventually, once expertise was established, the researchers showed they could turn off the algorithm's need to update itself altogether, and the participant could simply begin using the interface each day without any need for retraining or recalibration. Performance did not decline over 44 days in the absence of retraining, and the participant could even go days without practicing and see little decline in performance. The establishment of stable expertise in one form of BCI control (moving the cursor) also allowed researchers to begin "stacking" additional learned skills -- such as "clicking" a virtual button -- without loss of performance.

Such immediate "plug and play" BCI performance has long been a goal in the field, but has been out of reach because the "pincushion-style" electrodes used by most researchers tend to move over time, changing the signals seen by each electrode. Also, because these electrodes penetrate brain tissue, the immune system tends to reject them, gradually impairing their signal. ECoG arrays are less sensitive than these traditional implants, but their long-term stability appears to compensate for this shortcoming. The stability of ECoG recordings may be even more important for long-term control of more complex robotic systems such as artificial limbs, a key goal of the next phase of Ganguly's research.

"We've always been mindful of the need to design technology that doesn't end up in a drawer, so to speak, but which will actually improve the day-to-day lives of paralyzed patients," Ganguly said. "These data show that ECoG-based BCIs could be the foundation for such a technology." 

Credit: 
University of California - San Francisco

Ancient bony fish forces rethink of how sharks evolved

video: Virtual three-dimensional model of the braincase of Minjinia turgenensis generated from CT scan

Image: 
Imperial College London/Natural History Museum

Sharks' non-bony skeletons were thought to be the template before bony internal skeletons evolved, but a new fossil discovery suggests otherwise.

The discovery of a 410-million-year-old fish fossil with a bony skull suggests the lighter skeletons of sharks may have evolved from bony ancestors, rather than the other way around.

Sharks have skeletons made of cartilage, which is around half the density of bone. Cartilaginous skeletons are known to have evolved before bony ones, but it was thought that sharks split from other animals on the evolutionary tree before this happened, keeping their cartilaginous skeletons while other fish, and eventually us, went on to evolve bone.

Now, an international team led by Imperial College London, the Natural History Museum and researchers in Mongolia have discovered a fish fossil with a bony skull that is an ancient cousin of both sharks and animals with bony skeletons. This could suggest the ancestors of sharks first evolved bone and then lost it again, rather than keeping their initial cartilaginous state for more than 400 million years.

The team published their findings today in Nature Ecology & Evolution.

Lead researcher Dr Martin Brazeau, from the Department of Life Sciences at Imperial, said: "It was a very unexpected discovery. Conventional wisdom says that a bony inner skeleton was a unique innovation of the lineage that split from the ancestor of sharks more than 400 million years ago, but here is clear evidence of bony inner skeleton in a cousin of both sharks and, ultimately, us."

Most of the early fossils of fish have been uncovered in Europe, Australia and the USA, but in recent years new finds have been made in China and South America. The team decided to dig in Mongolia, where there are rocks of the right age that have not been searched before.

They uncovered the partial skull, including the brain case, of a 410-million-year-old fish. It is a new species, which they named Minjinia turgenensis, and belongs to a broad group of fish called 'placoderms', out of which sharks and all other 'jawed vertebrates' - animals with backbones and mobile jaws - evolved.

When developing as foetuses, humans and other bony vertebrates have skeletons made of cartilage, like sharks, but a key stage in our development is when this is replaced by 'endochondral' bone - the hard bone that makes up our skeleton after birth.

Previously, no placoderm had been found with endochondral bone, but the skull fragments of M. turgenensis were "wall-to-wall endochondral". While the team are cautious not to over-interpret from a single sample, they do have plenty of other material collected from Mongolia to sort through and perhaps find similar early bony fish.

And if further evidence supports an early evolution of endochondral bone, it could point to a more interesting history for the evolution of sharks.

Dr Brazeau said: "If sharks had bony skeletons and lost it, it could be an evolutionary adaptation. Sharks don't have swim bladders, which evolved later in bony fish, but a lighter skeleton would have helped them be more mobile in the water and swim at different depths.

"This may be what helped sharks to be one of the first global fish species, spreading out into oceans around the world 400 million years ago."

Credit: 
Imperial College London

Improving European healthcare through cell-based interceptive medicine

image: Magnification of miniature chips: Single cells are encapsulated in tiny droplets and supplied with reagents for further processing.

Image: 
Felix Petermann, MDC

Hundreds of innovators, research pioneers, clinicians, industry leaders and policy makers from all around Europe are united by a vision of how to revolutionize healthcare. In two publications - a perspective article in the journal Nature and the LifeTime Strategic Research Agenda - they now present a detailed roadmap of how to leverage the latest scientific breakthroughs and technologies over the next decade, to track, understand and treat human cells throughout an individual's lifetime.

The LifeTime initiative, co-coordinated by the Max Delbrueck Center for Molecular Medicine in the Helmholtz Association (MDC) in Berlin and the Institut Curie in Paris, has developed a strategy to advance personalized treatment for five major disease classes: cancer, neurological, infectious, chronic inflammatory and cardiovascular diseases. The aim is a new age of personalized, cell-based interceptive medicine for Europe, with the potential of improved health outcomes and more cost-effective treatment that would profoundly change a person's healthcare experience.

Earlier detection and more effective treatment of diseases

To form a functioning, healthy body, our cells follow developmental paths during which they acquire specific roles in tissues and organs. But when they deviate from their healthy course, they accumulate changes that lead to disease and remain undetected until symptoms appear. At this point, medical treatment is often invasive, expensive and inefficient. However, we now have the technologies to capture the molecular makeup of individual cells and to detect the emergence of disease or therapy resistance much earlier.

Using breakthrough single-cell and imaging technologies in combination with artificial intelligence and personalized disease models will allow us to not only predict disease onset earlier, but also to select the most effective therapies for individual patients. Targeting disease-causing cells to intercept disorders before irreparable damage occurs will substantially improve the outlook for many patients and has the potential of saving billions of Euros of disease-related costs in Europe.

A detailed roadmap for implementing LifeTime

The perspective article "The LifeTime initiative and the future of cell-based interceptive medicine in Europe" and the LifeTime Strategic Research Agenda (SRA) explain how these technologies should be rapidly co-developed, transitioned into clinical settings and applied to the five major disease areas. Close interactions between European infrastructures, research institutions, hospitals and industry will be essential to generate, share and analyze LifeTime's big medical data across European borders. The initiative's vision advocates ethically responsible research to benefit citizens all across Europe.

According to Professor Nikolaus Rajewsky, scientific director of the Berlin Institute for Medical Systems Biology at the Max Delbrueck Center for Molecular Medicine and coordinator of the LifeTime Initiative, the LifeTime approach is the way into the future: "LifeTime has brought together scientists across fields - from biologists to clinicians, data scientists, engineers, mathematicians, and physicists - to enable a much improved understanding of molecular mechanisms driving health and disease. Cell-based medicine will allow doctors to diagnose diseases earlier and intercept disorders before irreparable damage has occurred. LifeTime has a unique value proposition that promises to improve the European patient's health."

Dr. Geneviève Almouzni, director of research at CNRS, honorary director of the research center of Institut Curie in Paris and co-coordinator of the LifeTime Initiative, believes that the future with LifeTime offers major social and economic impact: "By implementing interceptive, cell-based medicine we will be able to considerably improve treatment across many diseases. Patients all over the world will be able to lead longer, healthier lives. The economic impact could be tremendous, with billions of Euros saved from productivity gains for cancer alone, and significantly shortened ICU stays for Covid-19. We hope EU leaders will realize we have to invest in the necessary research now."

Credit: 
Max Delbrück Center for Molecular Medicine in the Helmholtz Association