Earth

Origin of childhood cancer malignant rhabdoid tumour discovered

The first proof of the origin of malignant rhabdoid tumour (MRT), a rare childhood cancer, has been discovered by researchers at the Wellcome Sanger Institute, the Princess Máxima Center for Pediatric Oncology in the Netherlands, and their collaborators.

The study, published today (3 March 2021) in Nature Communications, found that MRT arises from developmental cells in the neural crest whose maturation is blocked by a genetic defect. The team also identified two drugs that could be used to overcome this block and resume normal development, bringing hope of new treatments for the disease.

Malignant rhabdoid tumour (MRT) is a rare soft tissue cancer that predominantly affects infants. Although these tumours may arise in any part of the body, they usually form in the kidney and the brain. MRT is one of the childhood cancers with the poorest outcomes.

The rarity of MRT, with only 4-5 cases per year in the UK, combined with its aggressiveness, make clinical trials extremely difficult. Until now, the origin of MRT has not been known and no reliably effective treatment currently exists. This new study sought to discover the root of MRT in the hope of identifying new treatments for the disease.

For this study, two cases of MRT were whole genome sequenced at the Wellcome Sanger Institute, alongside corresponding normal tissues. The researchers then conducted phylogenetic analyses of the somatic mutations in the diseased and healthy tissue, in order to 'reconstruct' the timeline of normal and abnormal development.

The analyses confirmed that MRT develops from progenitor cells on their way to becoming Schwann cells, a cell type found in the neural crest, due to a mutation in the SMARCB1 gene. This mutation blocks the normal development of these cells, which can then go on to form MRT.

Researchers at the Princess Máxima Center for Pediatric Oncology then inserted the intact SMARCB1 gene into patient-derived MRT organoids, artificial tumours grown in the laboratory from the patients' original tumours, to successfully overcome the maturation block that had prevented normal development and led to cancer. Based on single-cell mRNA analyses and predictions made from these experiments, the researchers then identified two existing medicines that overcome the maturation block and could thus be used to treat children with MRT.

Dr Jarno Drost, co-lead author of the study from the Princess Máxima Center for Pediatric Oncology, said: "To be able to identify where malignant rhabdoid tumour (MRT) comes from for the first time is an important step in being able to treat this disease, but to confirm that it is possible to overcome the genetic flaw that can cause these tumours is incredibly exciting. The fact that two drugs already exist that we think can be used to treat the disease gives us hope that we can improve outcomes for children diagnosed with MRT."

Professor Richard Grundy, Chair of the Children's Cancer and Leukaemia Group, said: "It is fantastic to see this collaborative research bearing highly translatable outcomes in a childhood cancer with a currently poor prognosis. It emphasises the significant benefit of a National Tumour banking system that allows collection of rarer tumours and, in turn, the best use of such precious tissue through agreement of the CCLG Biological Studies committee that oversees this resource. For this to result in such a meaningful outcome gives new hope to children with malignant rhabdoid tumour (MRT)."

MRT remains a cancer with very poor outcomes for which few new treatments are on the horizon. This study harnesses cutting-edge quantitative and experimental methods to identify a potential new treatment for MRT.

Dr Sam Behjati, co-lead author of the study from the Wellcome Sanger Institute, said: "We began our enquiry into the origins of malignant rhabdoid tumours in late 2019, so we have gone from hypothesis to discovery of origin to possible treatments for the disease in just over a year. This was possible due to all the leading-edge tools available to us, from organoid technology to single-cell mRNA sequencing to drug screen databases. I hope this study will serve as the blueprint for discovering the origin of other childhood cancers and, ultimately, lead to better outcomes for children affected by these awful diseases."

Credit: 
Wellcome Trust Sanger Institute

Researchers introduce a new generation of tiny, agile drones

image: Insects' remarkable acrobatic traits help them navigate the aerial world, with all of its wind gusts, obstacles, and general uncertainty.

Image: 
Courtesy of Kevin Yufeng Chen

If you've ever swatted a mosquito away from your face, only to have it return again (and again and again), you know that insects can be remarkably acrobatic and resilient in flight. Those traits help them navigate the aerial world, with all of its wind gusts, obstacles, and general uncertainty. Such traits are also hard to build into flying robots, but MIT Assistant Professor Kevin Yufeng Chen has built a system that approaches insects' agility.

Chen, a member of the Department of Electrical Engineering and Computer Science and the Research Laboratory of Electronics, has developed insect-sized drones with unprecedented dexterity and resilience. The aerial robots are powered by a new class of soft actuator, which allows them to withstand the physical travails of real-world flight. Chen hopes the robots could one day aid humans by pollinating crops or performing machinery inspections in cramped spaces.

Chen's work appears this month in the journal IEEE Transactions on Robotics. His co-authors include MIT PhD student Zhijian Ren, Harvard University PhD student Siyi Xu, and City University of Hong Kong roboticist Pakpong Chirarattananon.

Typically, drones require wide open spaces because they're neither nimble enough to navigate confined spaces nor robust enough to withstand collisions in a crowd. "If we look at most drones today, they're usually quite big," says Chen. "Most of their applications involve flying outdoors. The question is: Can you create insect-scale robots that can move around in very complex, cluttered spaces?"

According to Chen, "The challenge of building small aerial robots is immense." Pint-sized drones require a fundamentally different construction from larger ones. Large drones are usually powered by motors, but motors lose efficiency as you shrink them. So, Chen says, for insect-like robots "you need to look for alternatives."

The principal alternative until now has been employing a small, rigid actuator built from piezoelectric ceramic materials. While piezoelectric ceramics allowed the first generation of tiny robots to take flight, they're quite fragile. And that's a problem when you're building a robot to mimic an insect -- foraging bumblebees endure a collision about once every second.

Chen designed a more resilient tiny drone using soft actuators instead of hard, fragile ones. The soft actuators are made of thin rubber cylinders coated in carbon nanotubes. When voltage is applied to the carbon nanotubes, they produce an electrostatic force that squeezes and elongates the rubber cylinder. Repeated elongation and contraction causes the drone's wings to beat -- fast.

Chen's actuators can flap nearly 500 times per second, giving the drone insect-like resilience. "You can hit it when it's flying, and it can recover," says Chen. "It can also do aggressive maneuvers like somersaults in the air." And it weighs in at just 0.6 grams, approximately the mass of a large bumble bee. The drone looks a bit like a tiny cassette tape with wings, though Chen is working on a new prototype shaped like a dragonfly.

Building insect-like robots can provide a window into the biology and physics of insect flight, a longstanding avenue of inquiry for researchers. Chen's work addresses these questions through a kind of reverse engineering. "If you want to learn how insects fly, it is very instructive to build a scale robot model," he says. "You can perturb a few things and see how it affects the kinematics or how the fluid forces change. That will help you understand how those things fly." But Chen aims to do more than add to entomology textbooks. His drones can also be useful in industry and agriculture.

Chen says his mini-aerialists could navigate complex machinery to ensure safety and functionality. "Think about the inspection of a turbine engine. You'd want a drone to move around [an enclosed space] with a small camera to check for cracks on the turbine plates."

Other potential applications include artificial pollination of crops or completing search-and-rescue missions following a disaster. "All those things can be very challenging for existing large-scale robots," says Chen. Sometimes, bigger isn't better.

Credit: 
Massachusetts Institute of Technology

Researchers investigate imaginary part in quantum resource theory

image: Experimental results for local state discrimination

Image: 
WU Kangda et al.

Recently, a research team led by academician GUO Guangcan at the CAS Key Laboratory of Quantum Information of the University of Science and Technology of China (USTC) has made important progress in quantum information theory. Prof. LI Chuanfeng and Prof. XIANG Guoyong from the team, in cooperation with Dr. Streltsov from the University of Warsaw, investigated the imaginary part of quantum theory as a resource and obtained several important results. The results have been jointly published as an Editors' Suggestion in Physical Review Letters and in Physical Review A.

Complex numbers are a mathematical tool widely used in mechanics, electrodynamics, optics and other fields of physics to provide an elegant formulation of the corresponding theory. The birth of quantum mechanics gave a unified picture of wave and particle and further strengthened the prominent role of complex numbers in physics. However, whether complex structures are truly necessary for quantum mechanics has long been debated by physicists.

The researchers treated the imaginary part of quantum states as a quantum resource and revealed its irreplaceable role in the local discrimination of bipartite quantum states. Furthermore, within the framework of quantum resource theory, they studied how to quantify this resource and how it can be transformed under various free operations. They solved the problems of quantifying the resource through a robustness measure, of transforming single-qubit states under free operations, and of the probability of converting arbitrary pure states into one another under free operations.

Using two-photon entangled states prepared via parametric down-conversion, the researchers then measured and compared the success probability of locally discriminating quantum states when using only real measurement bases versus general (complex) measurement bases. They observed an increase in the success probability when complex measurement bases were used, verifying the important role of complex numbers in quantum mechanics.

This work demonstrates that the imaginary part is indispensable in the theory of quantum mechanics. One reviewer highly recommended the work, writing: "I find that the quantum imaginarity can be considered as a stronger form of quantum coherence ... I also think that the results in the manuscript will stimulate the research on the quantum foundation and the quantum resource theories with a richer structure".

Credit: 
University of Science and Technology of China

Researchers discover SARS-CoV-2 inhibitors

image: The main protease of SARS-CoV-2 with one of the newly developed inhibitors in the active centre. The individual domains of the protein are shown in different colours, the inhibitor in pink.

Image: 
© V. Namasivayam/Pharmazeutisches Institut/Uni Bonn

A research team of pharmacists at the University of Bonn has discovered two families of active substances that can block the replication of the SARS-CoV-2 coronavirus. The drug candidates are able to switch off the key enzyme of the virus, the so-called main protease. The study is based on laboratory experiments. Extensive clinical trials are still required for their further development as therapeutic drugs. The results have now been published in the journal Angewandte Chemie.

In order for the SARS-CoV-2 coronavirus to replicate, it relies on the main protease as a key enzyme. The virus first has its genome translated from RNA into a large protein strand. The viral main protease then cuts this protein chain into smaller pieces, from which the new virus particles are formed. "The main protease is an extremely promising starting point for coronavirus drug research," says Prof. Dr. Christa E. Müller of the Pharmaceutical Institute at the University of Bonn. "If this enzyme is blocked, viral replication in the body's cells is stopped." The researcher is a member of the Transdisciplinary Research Area "Life and Health" at the University of Bonn.

The pharmaceutical chemists designed a large number of potential inhibitors based on the structure of the main protease and the mechanism by which the important virus-replicating enzyme works. "A suitable inhibitor must bind sufficiently tightly to the main protease to be able to block its active site," says Prof. Dr. Michael Gütschow, who heads an independent research group on such inhibitors at the Pharmaceutical Institute of the University of Bonn.

Fluorescent test system

Then the experimental phase began. The researchers developed a new test system for high-throughput screening. They offered the main protease a substrate to which a reporter molecule was coupled. When the protease catalytically cleaved this coupling, the fluorescence of the product was measurable. However, if a simultaneously administered inhibitor successfully blocked the activity of the protease, there was no fluorescence. "For most of the test compounds, we observed no enzyme inhibition. But on rare occasions in our comprehensive tests, fluorescence was suppressed: These were the hits we had hoped for in our search for inhibitors of the viral protease," reports Gütschow.

Like chewing gum at the catalytic center

The researchers' high-throughput screening showed two classes of drugs that appeared to be particularly promising. Customized compounds of both classes were then newly synthesized. They stick to the main protease like chewing gum and block the crucial catalytic center, which prevents the main protease from carrying out its role in virus replication. "Some of the compounds even have another effect," Müller reports. "They also inhibit a human enzyme that helps the virus enter body cells."

The participants contributed very different expertise to the study. "Only through great collaboration have we been able to design, synthesize and biochemically characterize suitable drug candidates," says Gütschow. "The best compounds represent promising lead structures for drug development," according to Müller. However, extensive clinical trials have yet to prove whether these candidates also inhibit SARS coronavirus-2 replication in humans, Gütschow adds.

Credit: 
University of Bonn

USTC detects a sharp rise in detection rate of broad absorption line variations

The distribution of gas around black holes and in the interstellar medium is a key factor in understanding the growth of supermassive black holes and the evolution of their host galaxies. However, the gas density, a crucial parameter, is hard to determine reliably, because the general method is not applicable to all quasars.

Researchers from the University of Science and Technology of China (USTC) of the Chinese Academy of Sciences (CAS) have for the first time detected a "sharp rise" signature in the detection rate of broad absorption line (BAL) variations, from which the density of the ionized gas can be deduced. The work was published in The Astrophysical Journal Letters on January 11, 2021.

The ionization state of a gaseous outflow requires a period of time, the recombination timescale (trec), to respond to changes in the ionizing continuum. This timescale is inversely proportional to the gas density.

Accordingly, a previous study by the group of Prof. WANG Tinggui and Prof. LIU Guilin at USTC proposed that the gas density can be determined by measuring trec.

They assumed that the probability of detecting variability in a BAL with recombination timescale trec over an observational time interval (ΔT) is a step function: in other words, the BAL variability can be detected only when trec is shorter than ΔT.
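In schematic terms (the notation here is generic rather than taken from the papers), two relations underpin the method:

trec ∝ 1 / n, where n is the density of the ionized gas

P(a BAL variation is detected over an interval ΔT) ≈ 1 if ΔT > trec, and ≈ 0 if ΔT < trec

The observational interval ΔT at which the detection rate rises sharply therefore marks trec and, through the first relation, the gas density.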

Following this method, the researchers detected the "sharp rise" signature in the detection rates of several different BALs in the quasar SDSS J141955.26+522741.1 from the Sloan Digital Sky Survey Data Release 16 (SDSS DR16), indicating that the measurement method is reliable.

The researchers also found, for the first time, that the detection-rate curve can be used to distinguish gaseous components that have different densities but the same velocity and location, refining the group's earlier method of measuring gas density through trec.

Credit: 
University of Science and Technology of China

Chemists boost boron's utility

CAMBRIDGE, MA -- Boron, a metalloid element that sits next to carbon in the periodic table, has many traits that make it potentially useful as a drug component. Nonetheless, only five FDA-approved drugs contain boron, largely because molecules that contain boron are unstable in the presence of molecular oxygen.

MIT chemists have now designed a boron-containing chemical group that is 10,000 times more stable than its predecessors. This could make it possible to incorporate boron into drugs and potentially improve the drugs' ability to bind their targets, the researchers say.

"It's an entity that medicinal chemists can add to compounds they're interested in, to provide desirable attributes that no other molecule will have," says Ron Raines, the Firmenich Professor of Chemistry at MIT and the senior author of the new study.

To demonstrate the potential of this approach, Raines and his colleagues showed that they could improve the protein-binding strength of a drug that is used to treat diseases caused by the misfolding of a protein called transthyretin.

MIT graduate student Brian Graham and former graduate student Ian Windsor are the lead authors of the study, which appears this week in the Proceedings of the National Academy of Sciences. Former MIT postdoc Brian Gold is also an author of the paper.

Hungry for electrons

Boron is most commonly found in the Earth's crust in the form of minerals such as borax. It contains one fewer electron than carbon and is hungry for additional electrons. When boron is incorporated into a potential drug compound, that hunger for electrons often leads it to interact with an oxygen molecule (O2) or another reactive form of oxygen, which can destroy the compound.

The boron-containing drug bortezomib, which prevents cells from being able to break down used proteins, is an effective cancer chemotherapy agent. However, the drug is unstable and is destroyed readily by oxygen.

Previous research has shown that the stability of boron-containing compounds can be increased by appending benzene, a six-carbon ring. In 2018, Raines and his colleagues used this approach to create a modified version of a drug called darunavir, a protease inhibitor used to treat HIV/AIDS. They found that this molecule bound to HIV protease much more tightly than the original version of darunavir. However, later studies revealed that the molecule still did not survive for long under physiological conditions.

In the new paper, the researchers decided to use a chemical group called a carboxylate to further anchor boron within a molecule. An oxygen atom in the carboxylate forms a strong covalent bond -- a type of bond that involves sharing pairs of electrons between atoms -- with boron.

"That covalent bond pacifies the boron," Raines says. "The boron can no longer react with an oxygen molecule in the way that boron in other contexts can, and it still retains its desirable properties."

One of those desirable properties is the ability to form reversible covalent bonds with the target of the drug. This reversibility could prevent drugs from permanently locking onto incorrect targets, Raines says. Another useful feature is that the boron-containing group -- also known as benzoxaboralone -- forms many weaker bonds called hydrogen bonds with other molecules, which helps to ensure a tight fit once the right target is located.

Greater stability

Once they showed that benzoxaboralone was significantly more stable than boron in other contexts, the researchers used it to create a molecule that can bind to transthyretin. This protein, which carries hormones through the bloodstream, can cause amyloid diseases when it misfolds and clumps. Drugs that bind to transthyretin can stabilize it and prevent it from clumping. The research team showed that adding benzoxaboralone to an existing drug helped it to bind strongly with transthyretin.

Benzoxaboralone may offer medicinal chemists a useful tool that they can explore in many different types of drugs that bind to proteins or sugar molecules, Raines says. His lab is now working on a new version of darunavir that incorporates benzoxaboralone. They recently developed a way to synthesize this compound and are now in the process of measuring how strongly it binds to HIV protease.

"We are working hard on this because we think that this scaffold will provide much greater stability and utility than any other presentation of boron in a biological context," Raines says.

Credit: 
Massachusetts Institute of Technology

Ecosystems across the globe 'breathe' differently in response to rising temperatures

Land stores vast amounts of carbon, but a new study led by Cranfield University's Dr Alice Johnston suggests that how much of this carbon enters the atmosphere as temperatures rise depends on how far that land sits from the equator.

Ecosystems on land are made up of plants, soils, animals, and microbes - all growing, reproducing, dying, and breathing in a common currency; carbon. And how much of that carbon is breathed out (also known as ecosystem respiration) compared to how much is stored (through primary production) has impacts for climate change.

A key concern is that if more carbon is respired than stored, the rate of climate change could accelerate even further. Yet a big assumption is made in the models used to predict climate change - that ecosystem respiration rises with temperature at the same rate (doubling for a temperature rise of 10 °C) irrespective of the ecosystem itself. A new study, "Temperature thresholds to ecosystem respiration at a global scale", published in Nature Ecology & Evolution by an international group of scientists led by Dr Alice Johnston at Cranfield University and Professor Chris Venditti at the University of Reading, however, suggests there are two major 'thresholds' to this relationship.
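One common way of writing the assumption described above (the symbols are generic and not taken from the study) is the Q10 form of ecosystem respiration:

R(T) = R_ref × Q10^((T − T_ref) / 10), with Q10 = 2

so that respiration doubles for every 10 °C of warming above a reference temperature T_ref, whatever the ecosystem. The 'thresholds' reported in the new study are departures from this single global relationship.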

The study, funded by the Leverhulme Trust, shows that ecosystem respiration doesn't rise as strongly with temperature in warmer (Mediterranean and tropical) climates compared to mild (temperate) climates but shows an extreme rise with temperature in cold (boreal and tundra) climates. This finding contradicts several studies showing a static temperature-respiration relationship globally but agrees with observations made within different ecosystems.

First author Dr Alice Johnston, Lecturer in Environmental Data Science at Cranfield University, said: "Ecosystems are extremely complex, and there is massive variation in how many and what type of plants, animals and microbes are present in one field compared to the next let alone across global ecosystems. Given those shifting patterns in biodiversity we would expect changes in how ecosystem respiration responds to temperature because different species exhibit different temperature sensitivities. Our study is very simple and doesn't capture all of that variation, but it does capture three distinct differences in the ecosystem respiration pattern across 210 globally distributed sites.

"Fundamentally, our results show that temperature has a weak effect on ecosystem respiration in Mediterranean and tropical ecosystems, a well understood effect in temperate ecosystems, and an outsized effect in boreal and tundra ecosystems. On the one hand, that's a concern because huge stores of carbon in cold climates could be released with rising temperatures. On the other hand, if CO2 fertilisation in tropical regions promote primary production and rising temperatures inhibit ecosystem respiration, warmer climates could become an even more important carbon sink. Regardless, this study really indicates that we need to better understand the causes of these thresholds and the vital role that biodiversity loss could be playing. That would not only improve climate change forecasts, but further incentivise conservation efforts."

Professor Chris Venditti, Professor of Evolutionary Biology at University of Reading, added: "The impact of plant diversity on the terrestrial carbon cycle is far better known than animal diversity. In the future, we need to focus our attention on identifying general but realistic ways to integrate whole community complexity into climate models. That way, we can determine biodiversity loss or gain tipping points beyond which biosphere carbon sinks are enhanced or diminished."

Credit: 
Cranfield University

'Best case' goals for climate warming could still result in massive wildfire risk

image: In a new study, scientists have found that, by projecting fire weather conditions under two different warming levels, an additional half-degree of warming could drastically increase the likelihood and significance of blazes worldwide

Image: 
Pexels

Under the 2015 Paris Agreement, the United Nations Framework Convention on Climate Change agreed to pursue efforts to limit the temperature increase to 2.0°C and, ideally, to 1.5°C, over preindustrial levels. However, even before that treaty was signed, scientists had already warned that those "best case" targets were unlikely to be achievable. Consequently, many fire weather studies are built with models that simulate much higher levels of climate warming.

Recently, researchers from South Korea, Japan, and the United States projected fire weather conditions under two warming levels -- one in which the global climate warms by 1.5°C and the other by 2°C -- and found that even just a half-degree of additional warming could significantly increase the likelihood and significance of wildfires.

"When it comes to the conditions that make wildfires more likely, a little bit of warming goes a long way," explained lead author Rackhun Son, Ph.D. candidate at Gwangju Institute of Science and Technology (GIST), Korea, "but, of course, this is troubling, because it is quite unlikely that we will only be experiencing a little bit of warming."

"Although it is reasonable to look at fire weather under more extreme circumstances, there is little sense in making goals without a good understanding of what might happen if you were to reach those goals," said co-author Seung-Hee Kim of Chapman University, "so, we asked 'what would happen if we did reach these goals? Would the fire weather conditions not become as severe?'"

That answer is complex, but this study's key finding is that just half a degree of additional warming would likely create a notably greater danger of fire on the most widely inhabited continents, with dangers particularly concentrated in the Amazon rainforest and African savanna, and around the Mediterranean. "We also provided evidence that places like Australia and Indonesia are likely to reach peak levels of fire susceptibility even before we reach that lower threshold," said co-author Simon Wang of Utah State University.

The study does provide a silver lining of hope to this cloud of danger. Commenting on the implications of the findings, Wang said, "If we were somehow able to suppress this extra half a degree of warming, we could reduce climate-driven extreme fire activities in many places, potentially saving many lives and billions of dollars."

Credit: 
GIST (Gwangju Institute of Science and Technology)

How to track the variants of the pandemic faster

image: Specialists in viral and genetic analysis, led by Swiss scientists Dr. Emma Hodcroft at the University of Bern and Prof. Christophe Dessimoz at University of Lausanne, alongside Dr. Nick Goldman at EMBL-EBI in the UK, lay out the 'bioinformatics bottlenecks' that are hindering response to the SARS-CoV-2 pandemic, and propose ways to 'clear the road' for better tools and approaches.

Image: 
© Oliver Hochstrasser

"What scientists have achieved in a year since the discovery of a brand-new virus is truly remarkable," says Emma Hodcroft from the Institute of Social and Preventive Medicine (ISPM) of the University of Bern, first author on the piece, "but the tools scientists are using to study how SARS-CoV-2 is transmitting and changing were never designed for the unique pressures - or volumes of data - of this pandemic."

SARS-CoV-2 is now one of the most sequenced pathogens of all time, with over 600,000 full-genome sequences having been generated since the pandemic began, and over 5,000 new sequences coming in from around the world every day. However, the analysis and visualization tools used today (including Nextstrain, co-developed by Prof. Richard Neher's group at the SIB and the University of Basel) were never designed to handle the volume and speed of sequences being generated today, or the scale of involvement with the public health response. "Across the world, genomic surveillance rests on the initiative of academic researchers to find essential answers. Public health decision making would benefit from a more sustainable collaboration framework," says Christophe Dessimoz at SIB and the University of Lausanne.

What improved sequencing would enable

The genetic sequences from SARS-CoV-2 hold valuable information for implementing effective pandemic policies and staying ahead of the virus. Comparing how many mutations different samples share, for example, allows scientists to track the transmission of the virus - helping to identify super-spreading events and international spread. But at the moment it can be hard to combine this genetic information with other key variables - like who attended an event, and when symptoms appeared - which could help make these methods even more informative.

The 'R-number' has gone from a scientific concept to a household word in the last year - it measures the average number of people an infected person will transmit the virus to. Here, sequences can help too, by picking apart imported cases from local transmission. This allows for a more accurate estimate of Re, but requires high levels of sequencing and complex analyses, which are currently not widely implemented.

Finally, sequencing is the only way to identify and track the many mutations that arise in SARS-CoV-2. While mutations are a normal part of virus life, scientists need to know which are harmless variations and which could change the virus' transmissibility or clinical outcome. Combining sequences, lab work, and computational predictions could allow for a better understanding of mutational impacts, but there's little framework to help these different specialities work together. "The viral data - sequences and associated metadata - must be determined, gathered and harmonized thanks to stable infrastructures compatible with the principles of Open Data to facilitate peer-review by the community and their reuse", says Christophe Dessimoz at SIB and the University of Lausanne, last author of the comment.

Benefits for Switzerland

"In Switzerland, the population could benefit from more systematic and representative sequencing, for example through better contract tracing, targeted isolation and quarantine of smaller regions, and guiding the closing and opening of schools based on the presence of certain variants", explains Emma Hodcroft. Harmonisation of health data practices is also a critical topic. Switzerland is already putting a lots of efforts at the national level through the Swiss Personalized Health Network (SPHN).

The researchers are convinced that Switzerland's potential in terms of expertise and infrastructure is just waiting to be tapped, to the benefit of public health. "The tools to enable research are there, and researchers have self-organized and taken the first step: to scale up and sustain these efforts to bring research and public health closer together, we rely on sustainable public funding", says Christophe Dessimoz.

Credit: 
University of Bern

Learning about health from trusted sources may help teens battle depression

UNIVERSITY PARK, Pa. -- Depression can be a common problem for teens and adolescents, and while many treatments exist, they don't always work for everyone. A new study found that feeling more informed about their health may help teens take better care of themselves, leading to fewer depressive symptoms.

The researchers also found that trust played a role in whether receiving health information improved depression. The more that adolescents trusted their parents or teachers as a credible source of health information, the more likely they were to experience less depression.

Additionally, even though adolescents reported that they trusted traditional media -- like TV, radio and newspapers -- more than online content, only content from social media or websites resulted in actual changes in behavior.

Bu Zhong, associate professor of communications at Penn State, said the findings -- recently published in the journal Child: Care, Health and Development -- suggest that while adolescents are probably taught to be skeptical of online content, websites and social media have the potential to powerfully affect adolescent health.

"The kids weren't purposefully being misleading when they said they didn't trust information online, even though that information was ultimately linked with lower depression," Zhong said. "They were probably told by their parents and teachers to be wary of information found online or on social media. But our research found that online content has a strong impact on their health behavior and depression mitigation strategies, which are not found in the traditional media content."

According to the researchers, depression is one of the most common mental disorders among adolescents in China. In the United States, the National Institute of Mental Health reported that 13.3% of the U.S. population aged 12 to 17 in 2017 had at least one major depressive episode.

Additionally, previous work has shown that depression increases the risk of adolescents experiencing poorer school performance and social withdrawal, along with an increased risk of self-harm and suicide ideation.

"This study was actually inspired by my students, after several of them came to me really stressed out," Zhong said. "I know firsthand how widespread depression can be among students, so I was interested in what kind of health information people shared with the young people and if it can help them cope with depression."

The researchers recruited 310 adolescents from elementary, middle and high schools in North China for the study. Participants answered questions about health information -- such as seminars, classes, pamphlets and other media -- they had recently consumed, including its quality and whether the source and information were credible.

They also answered questions about their health, including their symptoms of depression and whether consuming health information led to changes in their behavior, such as whether they felt it helped them prevent disease and if it increased their likelihood to discuss and share health information with friends.

The researchers found that the older participants were, the more likely they were to be depressed. Additionally, participants with higher GPAs were also more likely to be depressed. Zhong said this could be because the longer students were in school, and the better their grades were, the more pressure they likely felt to succeed.

However, the more frequently participants used social media, the more likely they were to change their health behaviors, which led to less depression.

Lastly, adolescents felt more depressed when their mothers had a higher level of education, but less depressed when their fathers had a higher level of education. Zhong described this finding as the "tiger mom effect."

"These findings don't mean that a mom's education causes their kids' depression, but one interpretation could be that it may not be a good idea for moms to dominate their children's school life and push them too hard," Zhong said. "Kids may do much better at school and more importantly are less likely to experience stress or other depressive symptoms. Parents may learn from each other in educating their teen children."

Overall, the researchers said the results suggest that health information has the potential to be strategically used to help mitigate depression in teens and adolescents.

"Our research is interested in providing long-term health outcomes, not just temporary relief," Zhong said. "So we're looking for anything in addition to drugs, in addition to therapy, that can help people with their depression, and this offers another possibility. It may not be able to remove all the stressors causing teen depression, but it's possible we could equip adolescents with better health information gathering skills to help battle depression."

Credit: 
Penn State

High end of climate sensitivity in new climate models seen as less plausible

image: The researchers found that models with lower climate sensitivity are more consistent with observed temperature differences, particularly between the northern and southern hemispheres. The graph shows changes in the annual global-mean surface temperature (a) and the temperature difference between the northern and southern hemispheres (b) from 1850 to 2000. The red line represents high climate-sensitivity models, while the blue line represents models with low climate sensitivity. The black line shows observed temperature fluctuations collected by NASA's Goddard Institute for Space Studies Surface Temperature Analysis project, which more closely follow the blue line when it comes to interhemispheric temperature. The gray backgrounds indicate years when the difference between the high and low climate-sensitivity models is significant.

Image: 
Image by Chenggong Wang, Program in Atmospheric and Oceanic Sciences, Princeton University

A recent analysis of the latest generation of climate models -- known as CMIP6 -- provides a cautionary tale on interpreting climate simulations as scientists develop more sensitive and sophisticated projections of how the Earth will respond to increasing levels of carbon dioxide in the atmosphere.

Researchers at Princeton University and the University of Miami reported that newer models with a high "climate sensitivity" -- meaning they predict much greater global warming from the same levels of atmospheric carbon dioxide as other models -- do not provide a plausible scenario of Earth's future climate.

Those models overstate the global cooling effect that arises from interactions between clouds and aerosols and project that clouds will moderate greenhouse gas-induced warming -- particularly in the northern hemisphere -- much more than climate records show actually happens, the researchers reported in the journal Geophysical Research Letters.

Instead, the researchers found that models with lower climate sensitivity are more consistent with observed differences in temperature between the northern and southern hemispheres, and, thus, are more accurate depictions of projected climate change than the newer models. The study was supported by the Carbon Mitigation Initiative (CMI) based in Princeton's High Meadows Environmental Institute (HMEI).

These findings are potentially significant when it comes to climate-change policy, explained co-author Gabriel Vecchi, a Princeton professor of geosciences and the High Meadows Environmental Institute and principal investigator in CMI. Because models with higher climate sensitivity forecast greater warming from greenhouse gas emissions, they also project more dire -- and imminent -- consequences such as more extreme sea-level rise and heat waves.

The high climate-sensitivity models forecast an increase in global average temperature from 2 to 6 degrees Celsius under current carbon dioxide levels. The current scientific consensus is that the increase must be kept under 2 degrees to avoid catastrophic effects. The 2015 Paris Agreement sets a more ambitious threshold of 1.5 degrees Celsius.

"A higher climate sensitivity would obviously necessitate much more aggressive carbon mitigation," Vecchi said. "Society would need to reduce carbon emissions much more rapidly to meet the goals of the Paris Agreement and keep global warming below 2 degrees Celsius. Reducing the uncertainty in climate sensitivity helps us make a more reliable and accurate strategy to deal with climate change."

The researchers found that both the high and low climate-sensitivity models match global temperatures observed during the 20th century. The higher-sensitivity models, however, include a stronger cooling effect from aerosol-cloud interaction that offsets the greater warming due to greenhouse gases. Moreover, the models have aerosol emissions occurring primarily in the northern hemisphere, which is not consistent with observations.

"Our results remind us that we should be cautious about a model result, even if the models accurately represent past global warming," said first author Chenggong Wang, a Ph.D. candidate in Princeton's Program in Atmospheric and Oceanic Sciences. "We show that the global average hides important details about the patterns of temperature change."

In addition to the main findings, the study helps shed light on how clouds can moderate warming both in models and the real world at large and small scales.

"Clouds can amplify global warming and may cause warming to accelerate rapidly during the next century," said co-author Wenchang Yang, an associate research scholar in geosciences at Princeton. "In short, improving our understanding and ability to correctly simulate clouds is really the key to more reliable predictions of the future."

Scientists at Princeton and other institutions have recently turned their focus to the effect that clouds have on climate change. Related research includes two papers by Amilcare Porporato, Princeton's Thomas J. Wu '94 Professor of Civil and Environmental Engineering and the High Meadows Environmental Institute and a member of the CMI leadership team, that reported on the future effect of heat-induced clouds on solar power and how climate models underestimate the cooling effect of the daily cloud cycle.

"Understanding how clouds modulate climate change is at the forefront of climate research," said co-author Brian Soden, a professor of atmospheric sciences at the University of Miami. "It is encouraging that, as this study shows, there are still many treasures we can exploit from historical climate observations that help refine the interpretations we get from global mean-temperature change."

Credit: 
Princeton University

Drug target could fight Parkinson's and Alzheimer's disease

image: An artist's impression of the brain, made from images of the SARM1 protein.

Image: 
The University of Queensland

Neurodegenerative disorders such as Parkinson's and Alzheimer's disease are in the firing line after researchers identified an attractive therapeutic drug target.

An international collaboration, co-led by University of Queensland researchers, has isolated and analysed the structure and function of a protein found in the brain's nerve fibres called SARM1.

Dr Jeff Nanson said the protein was activated when nerve fibres were damaged by injury, disease, or as a side effect of certain drugs.

"After a damaging incident occurs, this protein often induces a form of nerve fibre degeneration - known as axon degeneration - a 'self-destruct' mechanism of sorts," Dr Nanson said.

"This is a key pathological feature of many terrible neurodegenerative diseases, such as Parkinson's and Alzheimer's disease, and also amyotrophic lateral sclerosis (ALS), traumatic brain injury, and glaucoma.

"There are currently no treatments to prevent this nerve fibre degeneration, but now we know that SARM1 is triggering a cascade of degeneration we can develop future drugs to precisely target this protein.

"This work will hopefully help design new inhibiting drugs that could stop this process in its tracks."

Professor Bostjan Kobe said the researchers analysed the structure of the protein and defined its three-dimensional shape using X-ray crystallography and cryo-electron microscopy.

"With X-ray crystallography, we make proteins grow into crystals, and then shoot X-rays at the crystals to get diffraction," Professor Kobe said.

"And with cryo-electron microscopy, we freeze small layers of solution and then visualise protein particles by a beam of electrons.

"The resulting 3D images of SARM1's ring-like structure were simply beautiful, and truly allowed us to investigate its purpose and function.

"This visualisation was a highly collaborative effort, working closely with our partners at Griffith University and our industry partners."

The researchers hope that the discovery is the start of a revolution in treatments for neurodegenerative disorders.

"It's time we had effective treatments for these devastating disorders," Dr Nanson said.

"We know that these types of diseases are strongly related to age, so in the context of an ageing population here in Australia and globally, these diseases are likely to increase.

"It's incredibly important that we understand how they work and develop effective treatments."

Credit: 
University of Queensland

Creamy or gritty?

There's more to taste than flavor. Let ice cream melt, and the next time you take it out of the freezer you'll find its texture icy instead of the smooth, creamy confection you're used to. Though its flavor hasn't changed, most people would agree the dessert is less appetizing.

UC Santa Barbara Professor Craig Montell and postdoctoral fellow Qiaoran Li have published a study in Current Biology providing the first description of how certain animals sense the texture of their food based on grittiness versus smoothness. They found that, in fruit flies, a mechanosensory channel relays this information about a food's texture.

The channel, called TMEM63, is part of a class of receptors that appear in organisms from plants to humans. As a result, the new findings could help shed light on some of the nuances of our own sense of taste.

"We all appreciate that food texture impacts food appeal," said Montell, Duggan and Distinguished Professor in the Department of Molecular, Cellular, and Developmental Biology. "But this is something that we don't understand very well."

Li and Montell focused on fruit flies to investigate the molecular and cellular mechanisms behind the effect of grittiness on food palatability. "We found that they, like us, have food preferences that are influenced by this textural feature," Montell said. They devised a taste test in which they added small particles of varying size to sugary food, finding that the flies preferred particles of a specific size.

In prior work, Montell and his group have elucidated the nuances at play in the sense of taste. In 2016, they found a channel that enables flies to sense the hardness and viscosity of their food by the movement of the small bristles on their tongue, or labellum. More recently they discovered the mechanisms by which cool temperatures reduce palatability.

Now the researchers sought to identify a receptor that's required for sensing grittiness. They figured that it would be a mechanically-activated channel triggered when particles lightly bent a fly's taste hairs. However, the inactivation of known receptors had no impact on preference for food based on smoothness and grittiness.

The authors then considered the protein TMEM63. "Fly TMEM63 is part of a class of mechano-sensors conserved from plants to humans," Montell said, "but its roles in animals were unknown."

With only the suspicion that it might relay information about grittiness, Li and Montell inactivated the gene that codes for TMEM63 and compared the behavior of the mutant flies with wild-type animals.

After withholding food from the animals for a couple of hours, the researchers measured their interest in various sugar solutions mixed with particles of different sizes. They used the degree to which a fly extended its proboscis to measure the animal's interest in the food it was presented. Li and Montell discovered that without TMEM63, the flies couldn't distinguish between a solution of pure sugar water and one containing small silica spheres around 9 micrometers in diameter, which is the flies' preferred level of grittiness.

When they added chemicals to make the sugar solution less pleasant -- a mild acid, caffeine or moderate amounts of salt -- the microspheres reversed the flies' aversion. But not in flies lacking TMEM63. Upon restoring the gene that codes for this channel in the mutant flies' labella, the animals regained their ability to detect grittiness.

"It wasn't known before this study that flies could even discriminate foods on the basis of grittiness," Montell noted. "Now that we found that the mechanosensitive channel is TMEM63, we have uncovered one role for this protein in an animal."

The TMEM63 channel functions in a single multi-dendritic neuron (md-L) in each of the two labella at the end of the fly's proboscis. The neuron senses movements of most of the taste bristles on the labella. When the bristles move slightly upon interacting with a food particle, the TMEM63 channel is activated, stimulating the neuron, which relays the sensation to the brain. Because one neuron connects to many bristles, it likely can't convey positional data about individual particles, only a gestalt sense of grittiness.

By applying light force to these bristles -- mimicking the action of small particles in a gritty solution -- Montell and Li could activate the md-L neuron. However, the same procedure showed no effect in flies with TMEM63 knocked out. Interestingly, both groups of animals could detect larger forces on their bristles, such as might be caused by hard or viscous foods.

Montell's team had previously shown that another channel called TMC, which is also expressed in md-L neurons, is important for detecting these larger forces. Both TMEM63 and TMC relay textural information about food, and even activate the same neuron. However, Li and Montell's results revealed that the two channels have distinct roles.

Texture can provide a lot of information about food. It can indicate freshness or spoilage. For instance, fruit often become mealy when they start to spoil. "Animals use all the sensory information that they can to evaluate the palatability of food," Montell said. "This includes not only its chemical makeup, but color, smell, temperature and a variety of textural features.

"The 9 micrometer particle size that flies most enjoy corresponds in size to some of their preferred foods, like budding yeast and the particles in their favorite fruit," he explained.

Montell suggested TMEM63 almost certainly has many other roles in animals. "This protein is conserved in humans," Montell said. "We don't know if it has a role in texture sensation in humans, but some kind of mechanically-sensitive channel probably does."

Credit: 
University of California - Santa Barbara

New cell line could lead to more reliable vaccine development to fight costly pig virus

image: Jianqiang Zhang

Image: 
Iowa State University News Service

AMES, Iowa - Vaccines are an important tool in fighting porcine reproductive and respiratory syndrome (PRRS), but the fast-mutating virus that causes the disease sometimes requires the production of autogenous vaccines tailored to particular variants.

The production of autogenous vaccines depends on the ability of scientists to isolate the virus, but sometimes that's a tricky process. A new study from an Iowa State University researcher shows that a new cell line may offer a better alternative to the cell line most commonly used to isolate the PRRS virus. That could lead to more reliable processes for creating autogenous vaccines, but most autogenous vaccine producers would have to make dramatic changes to their processes in order to adopt the new cell line, said Jianqiang Zhang, associate professor of veterinary diagnostic and production animal medicine and lead author of the study.

The article was published this month in the peer-reviewed Journal of Clinical Microbiology.

PRRS is an infectious disease in pigs that costs pork producers hundreds of millions of dollars every year to contain. Currently available commercial PRRS vaccines don't always provide effective protection to pigs due to high genetic and antigenic diversity among PRRS strains, Zhang said. So swine veterinarians often request diagnostic laboratories to isolate the virus from clinical samples to produce farm-specific autogenous vaccines.

Scientists have to grow the virus in a cell culture to isolate it successfully, and the most commonly used cell line for PRRS isolation is referred to as MARC-145, a cell line that originates in the kidneys of monkeys.

In the new study, Zhang analyzed a different cell line, referred to as ZMAC. The ZMAC cell line, derived from alveolar macrophages in pigs' lungs, was developed by Dr. Federico Zuckerman at the University of Illinois and is a patented cell line. The Iowa State University Veterinary Diagnostic Laboratory obtained the ZMAC cell line from Aptimmune Biologics, Inc., a swine disease vaccine company that licensed the ZMAC cell line from University of Illinois. The study looked at the ability of the ZMAC line to culture the virus and then compared the results to the MARC-145 line. The study demonstrated that the ZMAC cell line can significantly improve the success rate for isolating the PRRS virus.

Using the ZMAC cell line resulted in successful isolation of viruses 57.6% of the time, while the MARC-145 cell line was successful 26.3% of the time, according to the findings.

"That's potentially good for producers and veterinarians to produce farm-specific vaccines, but there's a challenge because not all companies have adopted this ZMAC cell line," Zhang said.

If scientists can't isolate a particular virus variant, then they can't produce autogenous vaccines to target that variant. So a cell line that successfully isolates the virus more often could be a valuable tool in producing more tailor-made vaccines.

"However, it is noteworthy that, when PRRS virus isolates obtained in ZMAC cell line were adapted to grow in MARC-145 cell line, only 57.3% of them grew and 42.7% did not grow", Zhang said. "Considering that the vast majority of autogenous vaccine companies still rely on the MARC-145 cell line in their vaccine production systems, it may happen that some of them cannot produce autogenous vaccines even if a PRRS virus isolate is obtained in ZMAC cell line. It remains to be seen how readily they'll adopt the use of the new ZMAC cell line."

In the meantime, the research team notes that further study of these cell lines could fill in gaps in their understanding of why the ZMAC cells isolated the virus at a higher rate.

"Overall, a better PRRS virus isolation outcome can be achieved in ZMAC cells than in MARC-145 cells," the study says. "The details of the mechanisms remain to be elucidated. It is suspected that the mechanisms are related to virus genetic diversity and the interaction between viral proteins and host cell receptors." 

Credit: 
Iowa State University

Unusual earthquakes highlight central Utah volcanoes

image: Volcanic basalt rocks in the Black Rock Desert, Utah.

Image: 
Paul Gabrielsen



If you drive south through central Utah on Interstate 15 and look west somewhere around Fillmore, you'll see smooth hills and fields of black rock. The area is, aptly, named the Black Rock Desert. It may not look like much, but you're looking at some of Utah's volcanoes.

A pair of earthquake sequences, in September 2018 and April 2019, focused scientists' attention on the Black Rock Desert. The sequences, which included the main quakes and their aftershocks, were very different from the Magna earthquake that shook the Wasatch Front in 2020 and from other Utah earthquakes. The Black Rock sequences were captured by the Utah Regional Seismic Network and by a nearby temporary deployment of seismic equipment that was monitoring a geothermal well. Earthquakes in the Black Rock Desert are rare, and the seismic recordings from these events provide a glimpse into a volcanic system that, while showing no signs of erupting, is still active. A study of the earthquake sequences is published in Geophysical Research Letters.

"The results showed us that we should give more attention to the Black Rock area," says Maria Mesimeri, a postdoctoral research associate with the University of Utah Seismograph Stations. "We need to improve seismic and volcanic monitoring in this area, so that we are aware of small changes that may occur."

Not your typical earthquakes


Location of the Black Rock desert in Utah. Orange triangles show the location of University of Utah Seismograph Stations. Black dots show the locations of Utah earthquakes.

The earthquake sequences, with main shocks of magnitude 4.0 and 4.1 respectively, were picked up by both the Utah Regional Seismic Network and a dense temporary network of seismometers deployed as part of Utah FORGE, an experimental geothermal project funded by the U.S. Department of Energy and operated by the University of Utah, located about 19 miles south of the Black Rock Desert near Milford, Utah. The temporary network allowed researchers to detect more aftershocks than usual. For example, the regional network detected 19 earthquakes as part of the April 2019 sequence. But the dense temporary network detected an additional 35 quakes. Each additional aftershock provided a bit more information for seismologists studying the sequence.

The Black Rock sequences showed some interesting features that set them apart from the 2020 Magna sequence and other Utah earthquake sequences. While the initial Magna quake occurred at a depth of about six miles below the surface, a typical depth for Utah earthquakes, the Black Rock quakes were much shallower--around 1.5 miles below the surface.

"Because these earthquakes were so shallow," Mesimeri says, "we could measure surface deformation [due to the quakes] using satellites, which is very unusual for earthquakes this small."

Also, Mesimeri and her colleagues found, the quakes produced much lower-frequency seismic energy than usually seen in Utah quakes. And one of the main types of seismic waves, shear waves or S-waves, wasn't detected in the Black Rock sequences.

Volcanoes? In Utah?


Volcanic rocks found in the Black Rock Desert, Utah.

All of these signs point to the Black Rock sequences having a very different origin than the Magna sequence, which was generated by movement of the Wasatch Fault. The Black Rock quakes, on the other hand, may have been generated by ongoing activity in the Black Rock volcanic field.

What are volcanoes doing in the middle of Utah? The Wasatch Mountains (and Wasatch Fault) form the eastern margin of a region called the Basin and Range province that stretches west to the Sierra Nevada. The province is being stretched apart by plate tectonics, and that stretching thins the crust, allowing more heat to rise up from the Earth's interior. In the Black Rock area, that heat resulted in eruption of basalt lava up until around 9,000 to 12,000 years ago.

So what do these earthquake sequences mean for the volcanoes of the Black Rock Desert?

"Our findings suggest that the system is still active and that the earthquakes were probably the result of fluid-related movement in the general area," Mesimeri says, referring to potentially magma or heated water.  "The earthquakes could be the result of the fluid squeezing through rock or the result of deformation from fluid movement that stressed the surface faults."

Activity in a volcanic field does not mean eruption, and Mesimeri says that there's no evidence that any eruption is imminent in the Black Rock Desert. But, she says, it's an area that geoscientists may want to monitor a little more closely.

Credit: 
University of Utah