Tech

Discovery reveals how plants make cellulose for strength and growth

New research from the University of Virginia School of Medicine reveals how plants create the load-bearing structures that let them grow--much like how building crews frame a house.

Funded by the U.S. Department of Energy, the new discovery unveils the molecular machinery that plants use to weave cellulose chains into cable-like structures called "microfibrils." These microfibrils provide crucial support to the cell walls of land plants and allow them to build up pressure inside their cells. This pressure lets plants grow towards the sky.

"Cellulose is the most abundant naturally produced polymer, and its building block, glucose, is a direct product of photosynthesis that captures carbon dioxide from the atmosphere," said researcher Jochen Zimmer, DPhil, of UVA's Department of Molecular Physiology and Biological Physics. "Understanding, on a molecular level, how cellulose is produced enables us to tailor its biosynthesis to alter the physical properties of cellulose, optimize carbon sequestration or extract the stored energy to fuel man-made processes."

Constructing Cellulose

Cellulose is tough stuff and has accompanied and shaped human evolution from its beginning. It is used to make building materials, clothes, paper, food additives and even medical tools. The polymer does not dissolve in water, and microbes have a very hard time breaking it down. These are just a few examples of cellulose's unique material properties.

Zimmer and his colleagues have shed light on how plants create this essential material. Scientists have known that cellulose is made of molecules of glucose, a simple sugar, chained together, but the new research maps out the molecular machinery plants use to do this. In essence, the scientists have created a blueprint of the factories plants use to make cellulose and to transport it to their cell surfaces. These factories are known as cellulose synthase complexes, and they sit inside the cell membrane to enable traffic across the cell boundary.

The factories, the researchers found, produce three cellulose chains with parts located inside the cell. They also transport the polymers to the cell surface through channels that traverse the cell boundary. These channels release the cellulose chains toward a single exit point to align them into thin fibrillar "protofibrils." Protofibrils emerge, like toothpaste from a tube, as a strand. They are then assembled with many others into microfibrils to perform their essential functions in the cell wall.

Cellulose proto- and microfibrils are only a few nanometers thick -- a nanometer is a billionth of a meter. But their strength is in their numbers. Plants make microfibril after microfibril to support their cells. When assembled, the resulting structure is very strong. You might think of it like how pieces of dry straw can be packed to make a durable, waterproof thatched roof.

The cellulose factories are far, far too small to be seen by a conventional light microscope. To map them out, Zimmer and his colleagues tapped the power of UVA's Titan Krios electron microscope. This is a machine so sensitive that it is buried deep underground, encased in tons of concrete, to spare it even the slightest vibrations. It allows scientists to reveal a fascinating molecular world previously concealed from human view.

In this case, it has allowed the research team to provide the first glimpse of the production and assembly of the world's most abundant biopolymer.

"We are already facing rapidly changing environmental conditions that impact agriculture and food security around the world. In the future, understanding how plants operate on a molecular level will be increasingly important for population health," Zimmer said. "It is now more important than ever to invest in plant sciences."

Credit: 
University of Virginia Health System

Care for cats? So did people along the Silk Road more than 1,000 years ago

image: Cats as we know them today accompanied pastoralists in Kazakhstan more than 1,000 years ago.

Image: 
Maike Glöckner / MLU

Common domestic cats, as we know them today, might have accompanied Kazakh pastoralists as pets more than 1,000 years ago. This has been indicated by new analyses done on an almost complete cat skeleton found during an excavation along the former Silk Road in southern Kazakhstan. An international research team led by Martin Luther University Halle-Wittenberg (MLU), Korkyt-Ata Kyzylorda State University in Kazakhstan, the University of Tübingen and the Higher School of Economics in Russia has reconstructed the cat's life, revealing astonishing insights into the relationship between humans and pets at the time. The study will appear in the journal "Scientific Reports".

The tomcat - which was examined by a team led by Dr Ashleigh Haruda from the Central Natural Science Collections at MLU - did not have an easy life. "The cat suffered several broken bones during its lifetime," says Haruda. And yet, based on a very conservative estimate, the animal had most likely made it past its first year of life. For Haruda and her colleagues, this is a clear indication that people had taken care of this cat.

During a research stay in Kazakhstan, the scientist examined the findings of an excavation in Dzhankent, an early medieval settlement in the south of the country which had been mainly populated by the Oghuz, a pastoralist Turkic tribe. There she discovered a very well-preserved skeleton of a cat. According to Haruda, this is quite rare because normally only individual bones of an animal are found during an excavation, which prevents any systematic conclusions from being drawn about the animal's life. The situation is different when it comes to humans since usually whole skeletons are found. "A human skeleton is like a biography of that person. The bones provide a great deal of information about how the person lived and what they experienced," says Haruda. In this case, however, the researchers got lucky: after its death, the tomcat was apparently buried and therefore the entire skull including its lower jaw, parts of its upper body, legs and four vertebrae had been preserved.

Haruda worked together with an international team of archaeologists and ancient DNA specialists. An examination of the tomcat's skeleton revealed astonishing details about its life. First, the team took 3D images and X-rays of its bones. "This cat suffered a number of fractures, but survived," says Haruda. Isotope analyses of bone samples also provided the team with information about the cat's diet. Compared to the dogs found during the excavation and to other cats from that time period, this tomcat's diet was very high in protein. "It must have been fed by humans since the animal had lost almost all its teeth towards the end of its life."

DNA analyses also proved that the animal was indeed likely to be a domestic cat of the Felis catus L. species and not a closely related wild steppe cat. According to Haruda, it is remarkable that cats were already being kept as pets in this region around the 8th century AD: "The Oghuz were people who only kept animals when they were essential to their lives. Dogs, for example, can watch over the herd. They had no obvious use for cats back then," explains the researcher. The fact that people at the time kept and cared for such "exotic" animals indicates a cultural change, which was thought to have occurred at a much later point in time in Central Asia. The region was thought to have been slow in making changes with respect to agriculture and animal husbandry.

The Dzhankent settlement, where the remains of the cat were found, was located along the Silk Road, an ancient network of important caravan routes that connected Central and East Asia with the Mediterranean region by land. According to Haruda, the find is also an indication of cultural exchange between the regions located along the Silk Road.

Credit: 
Martin-Luther-Universität Halle-Wittenberg

The spin state story: Observation of the quantum spin liquid state in novel material

image: A QSL state can now be experimentally observed, advancing our knowledge of spin behavior and its potential integration into next-generation "spintronic" devices.

Image: 
Tokyo University of Science

Aside from the deep understanding of the natural world that quantum physics theory offers, scientists worldwide are working tirelessly to bring forth a technological revolution by leveraging this newfound knowledge in engineering applications. Spintronics is an emerging field that aims to surpass the limits of traditional electronics by using the spin of electrons, which can be roughly seen as their angular rotation, as a means to transmit information.

But the design of devices that can operate using spin is extremely challenging and requires the use of new materials in exotic states--even some that scientists do not fully understand and have not experimentally observed yet. In a recent study published in Nature Communications, scientists from the Department of Applied Physics at Tokyo University of Science, Japan, describe a newly synthesized compound with the formula KCu6AlBiO4(SO4)5Cl that may be key in understanding the elusive "quantum spin liquid (QSL)" state. Lead scientist Dr Masayoshi Fujihala explains his motivation: "Observation of a QSL state is one of the most important goals in condensed-matter physics as well as the development of new spintronic devices. However, the QSL state in two-dimensional (2D) systems has not been clearly observed in real materials owing to the presence of disorder or deviations from ideal models."

What is the quantum spin liquid state? In antiferromagnetic materials below specific temperatures, the spins of electrons naturally align into large-scale patterns. In materials in a QSL state, however, the spins are disordered in a way similar to how molecules in liquid water are disordered in comparison to crystalline ice. This disorder arises from a structural phenomenon called frustration, in which there is no possible configuration of spins that is symmetrical and energetically favorable for all electrons. KCu6AlBiO4(SO4)5Cl is a newly synthesized compound whose copper atoms are arranged in a particular 2D pattern known as the "square kagome lattice (SKL)," an arrangement that is expected to produce a QSL state through frustration. Professor Setsuo Mitsuda, co-author of the study, states: "The lack of a model compound for the SKL system has obstructed a deeper understanding of its spin state. Motivated by this, we synthesized KCu6AlBiO4(SO4)5Cl, the first SKL antiferromagnet, and demonstrated the absence of magnetic ordering at extremely low temperatures--a QSL state."

However, the experimental results obtained could not be replicated through theoretical calculations using a standard "J1-J2-J3 SKL Heisenberg" model. This approach considers the interactions between each copper ion in the crystal network and its nearest neighbors. Co-author Dr Katsuhiro Morita explains: "To try to eliminate the discrepancy, we calculated an SKL model considering next-nearest-neighbor interactions using various sets of parameters. Still, we could not reproduce the experimental results. Therefore, to understand the experiment correctly, we need to calculate the model with further interactions."
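For readers unfamiliar with the notation, the "J1-J2-J3 SKL Heisenberg" model mentioned above takes the generic form below (a schematic rendering, not reproduced from the paper; J1, J2 and J3 denote the three inequivalent exchange couplings of the square kagome lattice, and each angle-bracketed sum runs over the corresponding set of spin pairs):

```latex
H = J_1 \sum_{\langle i,j \rangle_1} \mathbf{S}_i \cdot \mathbf{S}_j
  + J_2 \sum_{\langle i,j \rangle_2} \mathbf{S}_i \cdot \mathbf{S}_j
  + J_3 \sum_{\langle i,j \rangle_3} \mathbf{S}_i \cdot \mathbf{S}_j
```

Frustration arises because, on the lattice's corner-sharing triangles, no arrangement of the spins can minimize every antiferromagnetic term at once.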

This disagreement between experiment and calculations highlights the need for refining existing theoretical approaches, as co-author Prof Takami Tohyama concludes: "While the SKL antiferromagnet we synthesized is a first candidate to investigate SKL magnetism, we may have to consider longer-range interactions to obtain a quantum spin liquid in our models. This represents a theoretical challenge to unveil the nature of the QSL state." Let us hope physicists manage to tackle this challenge to bring us yet another step closer to the wonderful promise of spintronics.

Credit: 
Tokyo University of Science

Study of supercooled liquids contributes to better understanding of phase change processes

image: Schematic procedure for evaluating the quantities k+ and k−, taking into account the identification (label) numbers of the particles of a crystalline nucleus. Particles detached from the nucleus surface are shown in red, whereas attached particles are colored green.

Image: 
Kazan Federal University

The authors propose a new quantitative approach to better measure the crystal growth rate in supercooled liquids. The approach is based on a unique statistical algorithm used in molecular dynamics simulation.

Crystallization occurs in matter in a supercooled liquid or amorphous state. According to classical theory, the process proceeds through the formation of crystalline-phase seeds called nuclei. The crystal nucleation rate and crystal growth rate in such matter are determined by a number of kinetic factors, chief among them the frequency with which atoms attach to a nucleus and the frequency with which they detach from its surface. When attachment outpaces detachment, the crystal grows steadily. However, existing experimental methods cannot measure these kinetic factors directly, owing to the difficulty of identifying atoms of the different phases within the volume of the system. Molecular dynamics simulation is therefore the most suitable and affordable method.

The researchers were faced with the task of making an accurate assessment of kinetic factors and constructing their dependence on the size of crystalline nuclei based on molecular dynamics calculations. For this, an algorithm was developed that tracks atomic rearrangements near the surface of each growing nucleus on the fly. Tracking takes place according to the identification numbers that are assigned to each atom. These numbers make it possible to distinguish between crystal atoms and atoms of the parent disordered phase. With this approach, the accuracy of the calculations is orders of magnitude higher compared to existing methods for estimating kinetic rate factors. This accuracy is achieved due to the fact that the calculations are carried out directly without the use of any model functions and adjustable parameters.

Based on the performed calculations, the researchers were able to estimate the crystal growth rate for the well-known Lennard-Jones model system, following the basic definition: the difference between the attachment and detachment frequencies of atoms. This made it possible to reveal a stationary regime in the dependence of the crystalline nucleus's growth rate on its size. The results are in good agreement with the predictions of the classical theory of crystal growth.
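The on-the-fly bookkeeping described above can be sketched in a few lines of Python (an illustrative reconstruction based on the description here, not the authors' code): given the sets of atom identification numbers belonging to a nucleus in two consecutive simulation frames, the attachment and detachment frequencies follow from simple set differences.

```python
def rate_factors(ids_prev, ids_curr, dt):
    """Estimate the attachment (k_plus) and detachment (k_minus)
    frequencies from the sets of atom ID numbers belonging to a
    crystalline nucleus in two frames separated by time dt."""
    attached = ids_curr - ids_prev   # atoms newly joined to the nucleus
    detached = ids_prev - ids_curr   # atoms that left the nucleus surface
    return len(attached) / dt, len(detached) / dt

# The growth rate follows the basic definition: the difference between
# the attachment and detachment frequencies.
ids_t0 = {101, 102, 103, 104, 105}
ids_t1 = {102, 103, 104, 105, 106, 107}  # atom 101 left; 106 and 107 joined
k_plus, k_minus = rate_factors(ids_t0, ids_t1, dt=1.0)
growth_rate = k_plus - k_minus
```

Because the counting is done directly on particle labels, no model functions or adjustable parameters enter the estimate, which is the source of the accuracy gain the authors describe.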

The results of the study can be used to develop more accurate methods for estimating the rate of phase transitions in systems with diverse physicochemical properties, such as ionic liquids, molecular liquids, polymer systems, and colloidal solutions. They can also inform practical methods for controlling crystallization and melting processes, which is important in fields ranging from metallurgy and microelectronics to pharmaceuticals. In addition, the results will be useful in developing a rigorous theory describing how kinetic rate factors depend on nucleus size and crystallization time. Another interesting and less studied area is crystal decay.

Further research will be aimed at a detailed study of the mechanisms by which crystalline nuclei decay, with a focus on identifying the factors that affect the decay rate and stability of crystals. The authors also aim to develop a rigorous, universal kinetic theory describing crystallization rate factors. The results will be applied in the computer-aided design of crystalline materials with the desired physical and mechanical properties.

Credit: 
Kazan Federal University

Daytime aardvark sightings are a sign of troubled times

image: A camera trap photo showing an aardvark leaving its burrow for feeding at night.

Image: 
Nora Weyer/Wits University

Aardvarks occur across most of sub-Saharan Africa, but very few people have seen one, because they are solitary, mostly active at night, and live in burrows. They use their spade-like claws to build these burrows and to dig up ants and termites on which they feed. However, seeing aardvarks feeding in the day is becoming more common in the drier parts of southern Africa. While catching sight of an aardvark is a delight for many a wildlife enthusiast, researchers from the Wildlife Conservation Physiology laboratory at the University of the Witwatersrand (Wits) warn that seeing aardvarks in the daytime does not bode well for this secretive animal.

New research by the team from Wits, with collaborators from the University of Cape Town and University of Pretoria, reveals what a shift from night-time to daytime activity means for the well-being of aardvarks in a warming and drying world. The researchers studied aardvarks living at Tswalu, a reserve in the Kalahari that lies at the edge of the aardvark's distribution and provides support and infrastructure for researchers through the Tswalu Foundation. The results are published in the journal Frontiers in Physiology.

Using biologgers, the researchers recorded body temperature and activity of aardvarks for three years, during which Dr Nora Weyer followed the aardvarks as part of her PhD research.

Assisted by satellite imaging that showed her how droughts affected the vegetation, Weyer was able to connect changes in aardvark behaviour and body temperature to what was happening in the aardvarks' environment.

Weyer's research confirmed earlier findings by the team that there are times when the aardvarks switched their feeding to the day, and showed, for the first time, that drought caused that switch. "We suspected that it was drought," says co-worker Dr Robyn Hetem, "but we needed a long-term, comprehensive data set to confirm that it really was drought causing this unusual behaviour."

The Kalahari is arid at the best of times, but drought killed the vegetation that fed the ants and termites. Most of the ants and termites disappeared, leaving the aardvarks starving. "It was heart-breaking to watch our aardvarks waste away as they starved," says Weyer.

By shifting their activity from the cold nights to the warm days during dry winter months, aardvarks can save some of the energy needed to keep their body temperatures up. But those energy savings were not enough to see the aardvarks through a particularly bad drought in which many aardvarks died.

"Aardvarks have coped with the Kalahari's harsh environment in the past, but it is getting hotter and drier, and the current and future changes to our climate might be too much for the aardvarks to bear," says Weyer. "Because the Kalahari is such a unique and potentially vulnerable ecosystem, we need to better understand whether its animals can cope with the increasingly dry conditions," says Professor Andrea Fuller, co-worker and project leader of the Kalahari Endangered Ecosystem Project (KEEP).

Disappearance of aardvarks from the Kalahari would be devastating for many other animals in this ecosystem. The large burrows which aardvarks build provide important shelters for many other species that cannot dig their own burrows, earning the aardvark the title of 'ecosystem engineer'.

"Unfortunately, the future looks grim for Kalahari aardvarks and the animals that use their burrows. Tackling climate change is key, but there is no quick fix", says Weyer. What conservationists do know is that any solution will require a much better understanding of what capacities animals have to cope with drought. And that means many more long-term comprehensive studies of physiology and behaviour, like the study that Dr Weyer and her colleagues carried out at Tswalu.

Credit: 
University of the Witwatersrand

No association found between exposure to mobile devices and brain volume alterations in adolescents

How does the use of mobile devices affect children's brains? A team from the Barcelona Institute for Global Health (ISGlobal), a centre supported by the "la Caixa" Foundation, has conducted the first epidemiological study to explore the relationship between brain volume in preadolescents--more than 2,500 Dutch children--and different doses of radiofrequency electromagnetic fields (RF-EMF). No association was found, although the authors did not rule out the possibility of an association between using mobile devices with a wireless Internet connection and smaller volume of the caudate nucleus.

The potential negative health consequences associated with children's use of mobile devices have been a matter of concern for some time. Exposure to RF-EMF is of particular interest, since the preadolescent brain is still developing and children will experience long periods of exposure to RF-EMF if they use mobile devices throughout their lives.

Most previous research on this subject has separately assessed the association between brain development and different RF-EMF sources, without finding clear associations. The new study, published in Environment International, aimed to investigate brain volume alterations using an integrative approach that considered multiple sources of RF-EMF. This approach allowed a more comprehensive assessment of the possible impact of RF-EMF exposure on the adolescent brain.

The study used data on more than 2,500 children aged 9-12 years from the Generation R Study, a birth cohort based in Rotterdam, the Netherlands. Parents completed a questionnaire on their children's use of mobile devices. RF-EMF doses to the brain from different sources were estimated and grouped according to three exposure patterns: telephone calls, screen activities, and other environmental factors such as mobile telephone antennas. Magnetic resonance imaging (MRI) scans were used to determine the volume of various parts of the brain.

The authors found no association between alterations in total or lobe-specific brain volume and overall RF-EMF dose. Nor was brain volume associated with the use of mobile devices for telephone calls, which are the primary contributors of RF-EMF exposure to the brain. However, a link was found between smaller volume of the caudate nucleus--a brain structure involved in memory and coordination of movements--and RF-EMF dose from the use of devices with screens (mobile phones, tablets and laptops) with a wireless Internet connection.

"The main objective of the study was to determine whether there were any associations between exposure to RF-EMF and brain volumes," commented ISGlobal researcher Alba Cabré, lead author of the study. "Our findings show that this is not the case. The possible association between the RF-EMF dose received through the use of these devices for screen activities and the volume of the caudate nucleus is a secondary finding for which we currently have no explanation. When you surf the Internet on a mobile phone, tablet or laptop using a wireless connection, the brain's exposure to RF-EMF is much lower than it is when you make phone calls, for example, because of the distance between the device and your head. In any case, this result should be interpreted with great caution, since the influence of other factors and the possibility of a chance finding cannot be ruled out."

RF-EMF Exposure or Use-Related Factors?

One possible explanation for the findings, besides the brain's exposure to RF-EMF, is the influence of social or individual factors related to certain uses of mobile devices. ISGlobal researcher Mònica Guxens, coordinator of the study, commented: "We cannot rule out the possibility that brain alterations may somehow be related to the way in which children use mobile devices." She added: "More research is needed on mobile communication devices and their possible associations with brain development, regardless of whether the relationship is due to RF-EMF exposure or other factors related to the use of these devices."

The average overall whole-brain dose of RF-EMF was estimated at 84.3 mJ/kg/day. The highest overall lobe-specific dose was estimated in the temporal lobe (307.1 mJ/kg/day). Both doses are well below the maximum values recommended by the International Commission on Non-Ionising Radiation Protection (ICNIRP).

Credit: 
Barcelona Institute for Global Health (ISGlobal)

Extreme rainfall events cause top-heavy aquatic food webs

image: Scientists used the insect larvae that live in the water trapped by bromeliad plants as a model ecosystem, discovering that food webs become top-heavy with predators when there are large day-to-day variations in rainfall.

Image: 
The Bromeliad Working Group/UBC.

An expansive, multi-site ecology study led by UBC has uncovered new insights into the effects of climate change on the delicate food webs of the neotropics.

In research recently outlined in Nature, scientists across seven different sites throughout Central and South America replicated the extreme rainfall events predicted by climate change science. Using the insect larvae that live in the water trapped by bromeliad plants as a model ecosystem, they found that food webs became top-heavy with predators when there were large day-to-day variations in rainfall.

"This has knock-on effects for all parts of the rainforest system, because the larval insects in the bromeliads are destined to become winged adults that then are part of the forest ecosystem around them," said co-author Diane Srivastava, professor of zoology in UBC's faculty of science, who established the Bromeliad Working Group, an international consortium of researchers who conducted the research.

To attain their results, scientists in sites spread across Argentina, Brazil, Colombia, Costa Rica, French Guiana and Puerto Rico performed identical experiments on bromeliads--large flowering tropical plants that trap water and provide a habitat for many aquatic insects and larvae. The bromeliads were covered with rain shelters, and researchers watered them on strict schedules to replicate 30 different rainfall patterns in each site.

"This is the first study, to my knowledge, where we have a replicated study of how precipitation patterns affect an entire food web in multiple sites," said Srivastava. "Every day we'd run around with a watering can with a list of how much water each bromeliad should get on each day. We had a customized rainfall schedule for each bromeliad in every field site."

While the researchers found that extreme rainfall patterns resulted in top-heavy food webs, the opposite was true when rainfall was delivered on an even schedule, with similar amounts of water delivered to the plants every day. Under those conditions, there were fewer predators and more prey among the larval insects.
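One way to construct such watering schedules can be sketched as follows (purely illustrative; the function name, volumes and the Gaussian draw are assumptions, not the Bromeliad Working Group's actual protocol). The idea is to deliver the same total volume of water under every treatment while varying only the day-to-day dispersion.

```python
import random

def rainfall_schedule(total_ml, days, cv, seed=0):
    """Spread a fixed total water volume across `days` waterings with a
    chosen day-to-day variability (coefficient of variation `cv`).
    cv=0 gives a perfectly even schedule; larger cv concentrates the
    same total into fewer, heavier watering events."""
    rng = random.Random(seed)
    mean = total_ml / days
    # Draw non-negative daily amounts around the mean, then rescale so
    # every schedule delivers exactly the same total volume.
    raw = [max(0.0, rng.gauss(mean, cv * mean)) for _ in range(days)]
    scale = total_ml / sum(raw)
    return [x * scale for x in raw]

even = rainfall_schedule(300, 30, cv=0.0)      # same amount every day
extreme = rainfall_schedule(300, 30, cv=1.5)   # large day-to-day swings
```

Holding the total constant while varying only the dispersion is what lets a design like this attribute any change in the food web to rainfall variability rather than to total rainfall.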

"We were actually expecting to see the opposite pattern," said Srivastava. "We often think of predators being the most sensitive to environmental change, but we got the opposite result. One reason may be that, when the water level in the bromeliad went down during drier days, there was less aquatic habitat, so the prey was condensed into a small amount of water together with their predators. This can really benefit predators and disadvantage prey."

These findings can be extrapolated to other rainfall-dependent aquatic ecosystems, said Srivastava. "In any small pond or lake which is primarily determined by rainfall, we can expect to see a similar effect. We should be concerned about these findings, because we've shown that these extreme perturbations in rainfall really do affect the flow of energy through the food web."

Credit: 
University of British Columbia

Black individuals at higher risk for contracting COVID-19, according to new research

image: In a study, Black race was associated with increased COVID-19 infection risk.

Image: 
ATS

July 08, 2020 - Results of an analysis published in the Annals of the American Thoracic Society found that Black individuals were twice as likely as White individuals to test positive for COVID-19. The average age of all participants in the study was 46. However, those infected were on average 52 years old, compared to those who tested negative, who were 45 years old on average.

"Association of Black Race with Outcomes in COVID-19 Disease: A Retrospective Cohort Study" is not the first study to examine race. However, it provides further evidence that, while anyone can get COVID-19, race is indeed a factor in the extent to which some populations are affected. Of the 4,413 individuals tested, 17.8 percent tested positive. Of those who tested positive, 78.9 percent were Black while 9.6 percent were White.

Study author Ayodeji Adegunsoye, MD, MS, assistant professor of medicine at the University of Chicago, sees logic in the results of the analysis as it relates to the infection rates along racial lines: "I think this really amplifies how pre-existing socioeconomic and health care disparities affect outcomes in the population. We already know that the common comorbidities that have been associated with COVID such as hypertension and diabetes disproportionately affect the Black community. So, it wasn't too surprising that COVID-19 seemed to more commonly affect Black individuals as well."

In addition, noted Dr. Adegunsoye, given that Black individuals are overly represented in the service industry, and therefore more likely to be essential workers, their risk of exposure to COVID-19 is greater: "Even during precautionary lockdowns to reduce spread, these jobs were often deemed essential services, and included jobs such as bus drivers, janitors, city sanitation workers, hospital food production personnel, security guards, etc. so it wasn't too surprising that Black people were disproportionately infected and subsequently hospitalized with the virus."

The results showing that the individuals who tested positive were older than their counterparts who tested negative are consistent with reports of infection rates in the U.S. and elsewhere. "We have observed that for various reasons, older individuals are more likely to develop severe symptoms when they get infected and therefore they are more likely to get tested for COVID-19," said Dr. Adegunsoye.

"It's a vicious cycle of sorts, as older people are more likely to have hypertension and other comorbid diseases, which further increase the risk for hospitalization with COVID. Even after accounting for their older age, Black patients were still at significantly increased risk of COVID-19 infection and hospitalization."

In addressing the disparity in COVID-19 infection rates, Dr. Adegunsoye proposes making COVID-19 screening free and widely accessible. He hopes that there will be an increase in policy decisions that result in increased funding for community-led prevention efforts as well as "improved public enlightenment campaigns targeted at minorities to reduce the risk of developing hypertension and diabetes."

These measures, together with renewed strategic focus on reducing health inequities, will improve the lives of all Americans.

Credit: 
American Thoracic Society

A helping hand for cancer immunotherapy

image: Ze'ev Ronai, Ph.D., professor in the Tumor Initiation and Maintenance Program at Sanford Burnham Prebys Medical Discovery Institute and senior author of the study

Image: 
Sanford Burnham Prebys Medical Discovery Institute

Scientists at Sanford Burnham Prebys Medical Discovery Institute have demonstrated the therapeutic potential of PRMT5 inhibitors to sensitize unresponsive melanoma to immune checkpoint therapy. PRMT5 inhibitors are currently in clinical trials in oncology, and this research provides a strong rationale for evaluating the drugs in tumors that are not responsive to immune checkpoint therapy. The study was published in Science Translational Medicine.

"Our study reveals that PRMT5 enables tumors to hide from the immune system by controlling two immune signaling pathways," says Ze'ev Ronai, Ph.D., professor in the Tumor Initiation and Maintenance Program at Sanford Burnham Prebys and senior author of the study. "We found that inhibiting PRMT5 enhances both antigen presentation and the activation of innate immunity, prerequisites for effective immune checkpoint therapy. We are optimistic that this research will lead to a near-term, much-needed breakthrough for people with tumors that do not respond to checkpoint inhibitor therapy."

Immunotherapy, which harnesses the power of an individual's immune system to destroy tumors, has revolutionized the treatment of certain cancers. For some people with advanced melanoma, the treatment has extended survival to years instead of months. However, immunotherapy only works for about 40% of people with advanced melanoma. Scientists are working to uncover new approaches that would make the treatment effective for more people with more cancer types.

Turning "cold" tumors "hot"

Scientists have dubbed tumors that don't respond to immunotherapy "cold" tumors, and are working to find approaches that make the cancer responsive to treatment--or "hot." Approaches that achieve this goal represent major advances for the field.

In the study, the scientists used mouse models of melanoma to show that combining a PRMT5 inhibitor with anti-PD-1 therapy--one of the more widely used immune checkpoint therapies--successfully turned the unresponsive "cold" tumor "hot." Mice that normally didn't respond to anti-PD-1 therapy survived longer and had smaller tumors after also receiving the PRMT5 inhibitor--which the scientists showed was due to the enhanced ability of the immune system to attack the tumor.

"The development of immune checkpoint inhibitors has been a major step forward in the treatment of advanced melanoma, with more than half of people alive for five years or longer. Yet, there is a substantial number who have melanoma that is resistant to this therapeutic approach, and addressing that unmet need in a mechanism-based way is a therapeutic imperative," says Jedd Wolchok, M.D., Ph.D., chief of the Immuno-Oncology Services at Memorial Sloan Kettering Cancer Center. "This study's findings indicate that we should immediately begin exploring a clinical trial testing the effectiveness of PRMT5 inhibitors combined with immune checkpoint inhibitors in people who do not respond, or have stopped responding, to currently available immunotherapy."

The researchers identified two cellular signaling pathways that were suppressed by PRMT5 and allowed the tumor to escape detection by the immune system. One pathway is responsible for antigen presentation, and the second pathway controls cytokine production and innate immunity. Together, these pathways determine the degree of the tumor's ability to escape from the immune system. Thus, their inhibition makes tumors that were not seen by the immune system--and therefore were unresponsive to immune checkpoint therapy--visible and sensitive to immune attack.

The study further demonstrated that PRMT5-controlled immune signaling pathways are associated with the survival of people with melanoma--indicating that patients may be stratified for PRMT5 inhibitors based on expression of these pathway components.

"The pathways we uncovered are expected to also define how different tumor types respond, or not, to immune checkpoint therapy. Thus, these findings are likely to have clinical relevance to tumors other than melanoma that failed immune checkpoint therapy," says Ronai.

Credit: 
Sanford Burnham Prebys

Scaling up the quantum chip

MIT researchers have developed a process to manufacture and integrate "artificial atoms," created by atomic-scale defects in microscopically thin slices of diamond, with photonic circuitry, producing the largest quantum chip of its type.

The accomplishment "marks a turning point" in the field of scalable quantum processors, says Dirk Englund, an associate professor in MIT's Department of Electrical Engineering and Computer Science. Millions of quantum processors will be needed to build quantum computers, and the new research demonstrates a viable way to scale up processor production, he and his colleagues note.

Unlike classical computers, which process and store information using bits represented by either 0s or 1s, quantum computers operate using quantum bits, or qubits, which can represent 0, 1, or both at the same time. This strange property allows quantum computers to simultaneously perform multiple calculations, solving problems that would be intractable for classical computers.
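As a rough illustration (a toy numerical model, not the MIT team's hardware), a single qubit's state can be described by two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1:

```python
import math

# A qubit state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# |a|^2 is the probability of measuring 0; |b|^2 is the probability of measuring 1.
zero = (1 + 0j, 0 + 0j)                       # definite 0, like a classical bit
one = (0 + 0j, 1 + 0j)                        # definite 1
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))   # equal superposition of 0 and 1

def measure_probabilities(state):
    """Return (P(0), P(1)) for a single-qubit state."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

p0, p1 = measure_probabilities(plus)
# The superposition state yields 0 or 1 with equal probability,
# whereas `zero` and `one` behave like ordinary classical bits.
```

The power of a quantum processor comes from entangling many such qubits, so the joint state needs 2^n amplitudes for n qubits; that exponential growth is what makes large-scale simulation on classical machines intractable.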

The qubits in the new chip are artificial atoms made from defects in diamond, which can be prodded with visible light and microwaves to emit photons that carry quantum information. The process, which Englund and his team describe in Nature, is a hybrid approach, in which carefully selected "quantum micro chiplets" containing multiple diamond-based qubits are placed on an aluminum nitride photonic integrated circuit.

"In the past 20 years of quantum engineering, it has been the ultimate vision to manufacture such artificial qubit systems at volumes comparable to integrated electronics," Englund says. "Although there has been remarkable progress in this very active area of research, fabrication and materials complications have thus far yielded just two to three emitters per photonic system."

Using their hybrid method, Englund and colleagues were able to build a 128-qubit system -- the largest integrated artificial atom-photonics chip yet.

Other authors on the Nature paper include MIT researchers Noel H. Wan, Tsung-Ju Lu, Kevin C. Chen, Michael P. Walsh, Matthew E. Trusheim, Lorenzo De Santis, Eric A. Bersin, Isaac B. Harris, Sara L. Mouradian and Ian R. Christen; with Edward S. Bielejec at Sandia National Laboratories.

Quality control for chiplets

The artificial atoms in the chiplets consist of color centers in diamonds, defects in diamond's carbon lattice where adjacent carbon atoms are missing, with their spaces either filled by a different element or left vacant. In the MIT chiplets, the replacement elements are germanium and silicon. Each center functions as an atom-like emitter whose spin states can form a qubit. The artificial atoms emit colored particles of light, or photons, that carry the quantum information represented by the qubit.

Diamond color centers make good solid-state qubits, but "the bottleneck with this platform is actually building a system and device architecture that can scale to thousands and millions of qubits," Wan explains. "Artificial atoms are in a solid crystal, and unwanted contamination can affect important quantum properties such as coherence times. Furthermore, variations within the crystal can cause the qubits to be different from one another, and that makes it difficult to scale these systems."

Instead of trying to build a large quantum chip entirely in diamond, the researchers decided to take a modular and hybrid approach. "We use semiconductor fabrication techniques to make these small chiplets of diamond, from which we select only the highest quality qubit modules," says Wan. "Then we integrate those chiplets piece-by-piece into another chip that 'wires' the chiplets together into a larger device."

The integration takes place on a photonic integrated circuit, which is analogous to an electronic integrated circuit but uses photons rather than electrons to carry information. Photonics provides the underlying architecture to route and switch photons between modules in the circuit with low loss. The circuit platform is aluminum nitride, rather than the traditional silicon of some integrated circuits.

Using this hybrid approach of photonic circuits and diamond chiplets, the researchers were able to connect 128 qubits on one platform. The qubits are stable and long-lived, and their emissions can be tuned within the circuit to produce spectrally indistinguishable photons, according to Wan and colleagues.

A modular approach

While the platform offers a scalable process to produce artificial atom-photonics chips, the next step will be to "turn it on," so to speak, to test its processing skills.

"This is a proof of concept that solid-state qubit emitters are very scalable quantum technologies," says Wan. "In order to process quantum information, the next step would be to control these large numbers of qubits and also induce interactions between them."

The qubits in this type of chip design wouldn't necessarily have to be these particular diamond color centers. Other chip designers might choose other types of diamond color centers, atomic defects in other semiconductor crystals like silicon carbide, certain semiconductor quantum dots, or rare-earth ions in crystals. "Because the integration technique is hybrid and modular, we can choose the best material suitable for each component, rather than relying on natural properties of only one material, thus allowing us to combine the best properties of each disparate material into one system," says Lu.

Finding a way to automate the process and demonstrate further integration with optoelectronic components such as modulators and detectors will be necessary to build even bigger chips necessary for modular quantum computers and multichannel quantum repeaters that transport qubits over long distances, the researchers say.

Credit: 
Massachusetts Institute of Technology

Blood-based biomarker can detect, predict severity of traumatic brain injury

image: location of neurofilament light chain on the non-myelinated section of the axon

Image: 
Pashtun Shahim, MD, PhD, NIH Clinical Center

A study from the National Institutes of Health confirms that neurofilament light chain as a blood biomarker can detect brain injury and predict recovery in multiple groups, including professional hockey players with acute or chronic concussions and clinic-based patients with mild, moderate, or severe traumatic brain injury. The research was conducted by scientists at the NIH Clinical Center, Bethesda, Maryland, and published in the July 8, 2020, online issue of Neurology.

After a traumatic brain injury, neurofilament light chain breaks away from neurons in the brain and collects in the cerebrospinal fluid (CSF). The scientists confirmed that neurofilament light chain also collects in the blood in levels that correlate closely with the levels in the CSF. They demonstrated that neurofilament light chain in the blood can detect brain injury and predict recovery across all stages of traumatic brain injury.

"Currently, there are no validated blood-based biomarkers to provide an objective diagnosis of mild traumatic brain injury or to predict recovery," said Leighton Chan, M.D., M.P.H., chief of the Rehabilitation Medicine Department at the NIH Clinical Center. "Our study reinforces the need and a way forward for a non-invasive test of neurofilament light chain to aid in the diagnosis of patients and athletes whose brain injuries are often unrecognized, undiagnosed or underreported. "

The study examined multiple groups including professional hockey players in Sweden with sports-related concussions, hockey players without concussions, hockey players with persistent post-concussion symptoms, non-athlete controls, and clinic-based patients at the NIH Clinical Center who were healthy or with acute, subacute, and chronic mild traumatic brain injuries. The study showed that neurofilament light chain in the blood:

- Correlated closely with CSF neurofilament light chain in hockey players with concussions and in non-athlete healthy controls, suggesting that blood neurofilament light chain could be used instead of CSF neurofilament light chain.

- Demonstrated strong diagnostic ability for sports-related concussions: it could distinguish hockey players with concussions from hockey players without concussions, and could distinguish clinic-based patients with mild, moderate, and severe traumatic brain injuries from each other and from controls. This is significant because there is an unmet need for an easy, accessible blood biomarker to determine, at the time of injury or in the chronic phase, whether a person has a concussion or signs of a traumatic brain injury.

- Could distinguish with high accuracy hockey players who could return to play after 10 days from those who developed persistent post-concussion symptoms and eventually retired from the game. In the clinic-based cohort, patients with worse functional outcomes had higher blood neurofilament light chain levels. This is significant because there is an unmet need for a blood biomarker that can help clinicians determine when athletes can safely return to play or when patients can return to work or resume daily activities.

- Showed significantly increased levels in clinic-based patients five years after a single mild, moderate, or severe traumatic brain injury, compared to healthy controls. This suggests that even a single mild traumatic brain injury (without visible signs of structural damage on a standard clinical MRI) may cause long-term brain injury, and that serum neurofilament light chain could be a sensitive biomarker even that far out from the initial injury.

"This study is the first to do a detailed assessment of serum neurofilament light chain and advanced brain imaging in multiple cohorts, brain injury severities, and time points after injury," said the study's lead author, Pashtun Shahim, M.D., Ph.D., NIH Clinical Center. "Our results suggest that serum neurofilament light chain may provide a valuable compliment to imaging by detecting underlying neuronal damage which may be responsible for the long-term symptoms experienced by a significant number of athletes with acute concussions, and patients with more severe brain injuries."

The study was funded by the Intramural Research Program at NIH, the Department of Defense Center for Neuroscience and Regenerative Medicine at the Uniformed Services University, and the Swedish Research Council.

Traumatic brain injury is a leading cause of death and disability in the United States, accounting for more than 2.87 million emergency department visits, hospitalizations and deaths annually. While the majority of traumatic brain injuries are classified as mild (also known as concussions), the condition remains difficult to diagnose. A wide range of behavioral and observational tests can help determine a patient's injuries, but most rely on the patient to self-report signs and symptoms. Imaging also has limitations in detecting micro-structural injuries in the brain.

Credit: 
NIH/Clinical Center

NfL outperforms other blood biomarkers to predict and diagnose traumatic brain injury

image: Traumatic brain injury protein biomarkers in neuron and astrocyte cells: neurofilament light chain, glial fibrillary acidic protein, tau, ubiquitin c-terminal hydrolase-L1

Image: 
Pashtun Shahim, MD, PhD, NIH Clinical Center

A study from the National Institutes of Health showed that neurofilament light chain (NfL) delivered superior diagnostic and prognostic performance as a blood biomarker for mild, moderate, and severe traumatic brain injury (TBI) when compared to blood proteins glial fibrillary acidic protein, tau, and ubiquitin c-terminal hydrolase-L1. The research was conducted by scientists at the NIH Clinical Center, Bethesda, Maryland, and published in the July 8, 2020, online issue of Neurology.

"This study confirms the sensitivity of serum neurofilament light chain and its value as a biomarker of choice for all stages of brain injury, even when measured months to years after a single mild, moderate or severe traumatic brain injury," said Leighton Chan, M.D., M.P.H., chief of the Rehabilitation Medicine Department at the NIH Clinical Center.

The scientists selected and studied four proteins from the brain that collect in the blood after a TBI, using samples from patients at the NIH Clinical Center who had mild, moderate, or severe injury. The proteins were compared on their ability to distinguish patients with TBI from each other and from controls, detect brain injury from 30 days to five years after injury, predict functional outcomes, and complement advanced brain imaging.

Serum NfL was better than the other proteins at distinguishing patients with mild, moderate, and severe TBI from each other and from controls at a median of seven months after injury. Additionally, serum NfL was the only protein associated with functional outcomes, with higher blood concentrations in patients with worse outcomes.

Serum NfL was the only protein that distinguished TBI patients from uninjured controls with high accuracy even months to years after the injury. This result suggests that a single TBI may cause long-term neuroaxonal degeneration, which may be detected by measuring serum NfL.

Serum NfL also had a stronger association with advanced brain imaging, such as diffusion tensor MRI scans, than the other proteins. This suggests that serum NfL could offer clinicians an easier, faster, and more cost-effective diagnostic and prognostic option than advanced brain imaging.

"Currently, there is no validated biomarker that can reliably detect the subtle signs of brain injury months to years after a traumatic brain injury," said lead author, Pashtun Shahim, M.D., Ph.D. "Our study shows that the amount of serum NfL was higher even at five years after a single traumatic brain injury, while the other proteins we measured in this study, although, detectable in blood, were not high enough to distinguish patients from controls."

Credit: 
NIH/Clinical Center

Abnormal cells in early-stage embryos might not preclude IVF success

The presence of an abnormal number of chromosomes in the genetic profile of early-stage embryos may be far more common - and potentially less threatening - during normal human development than is currently appreciated, according to new research from Johns Hopkins University biologists.

The findings could have clinical implications for the in-vitro fertilization field, where debate still rages about the efficacy of implanting embryos with cells featuring too few or too many chromosomes - a state called "aneuploidy." Until recently, these embryos were destroyed during the typical IVF process. But the researchers found that eight out of every 10 potentially healthy embryos they studied contained those abnormalities.

Even embryos with "mosaic" profiles featuring both normal and aneuploid cells are often labeled as fully abnormal and discarded. But researchers have struggled to determine what percentage of embryos feature mosaic aneuploidy, and what its consequences are for development.

"Clinicians have wrestled with the decision to transfer embryos featuring mosaic aneuploidy when no other embryos are available," said Rajiv McCoy, a biology professor and senior author of the research published today in Genome Research. "In recent years some have implanted such embryos and reported healthy births, indicating embryos may have resilience or self-correction of mosaicism."

Previous studies have reported anywhere from 4% to 90% of human embryos with mosaic chromosome counts. Such a wide range is the result of discrepant research using the most prevalent screening method in IVF, called "preimplantation genetic testing," or PGT. The method uses a biopsy to pluck just five cells from the outer, placenta-to-be layer of an embryo to determine whether the embryo is normal or abnormal, and thus whether it is implanted or discarded.

This approach has provoked debate because it assumes that a biopsy is representative of "the embryo as a whole and predictive of its developmental outcomes," the paper states.

"What we found is that low level mosaicism is common," said Margaret R. Starostik, a graduate student in Biology and lead author of the study. "It may be a normal phenotype."

Unlike the small snapshot taken by PGT, McCoy's lab applied a novel statistical technique to probe a far more extensive, existing dataset comprising single-cell RNA sequencing of 74 embryos. The process "provides an embryo-wide census of aneuploidy across early development and quantifies parameters of chromosomal mosaicism that have proven elusive to biopsy-based studies," the paper states.

The result: 80% of the embryos studied contained at least one aneuploid cell across all cell types and developmental stages. In addition, the findings show that the aneuploidy rates are similar across different types of cells of early embryos, but that differences may emerge during later stages of development.

"Hopefully we can move on from the debate about whether mosaicism is common or not to understanding what are the features of mosaicism that are associated with good or bad outcomes in pregnancy," McCoy said. His lab is currently developing methods to distinguish these forms of aneuploidy and their mechanisms of origin using prenatal testing data.

Credit: 
Johns Hopkins University

A bioartificial system acts as 'dialysis' for failing livers in pigs

video: A video showing the air liquid bioartificial liver expanding liver progenitor-like cells. This material relates to a paper that appeared in the Jul. 8, 2020, issue of Science Translational Medicine, published by AAAS. The paper, by W.-J. Li at Jiaotong University School of Medicine in Shanghai, China; and colleagues was titled, "An extracorporeal bioartificial liver embedded with 3D-layered human liver progenitor-like cells relieves acute liver failure in pigs."

Image: 
[W.-J. Li <i>et al., Science Translational Medicine</i> (2020)]

A bioartificial system that incorporates enhanced liver cells can act as an analogous form of dialysis for the liver in pigs, effectively carrying out the organ's detoxifying roles and preventing further liver damage in animals with acute liver failure. With further development, the system could provide a new and much needed treatment for patients with acute liver failure, which can have a mortality rate of up to 80%. Acute liver failure can be caused by factors ranging from drug overdoses to liver surgery. The only effective intervention for acute liver failure is liver transplantation, but this approach is often impractical due to shortages of donor organs, the need for immunosuppressive drugs, and the fact that some patients are considered incompatible for transplantation.

To address this treatment gap, Wei-Jian Li and colleagues created their air-liquid interactive bioartificial liver (Ali-BAL), a bioreactor-based extracorporeal system that cultures liver progenitor-like cells on scaffolds to provide complete liver function. When connected to the body's vasculature, the Ali-BAL takes in blood through a series of pumps and plasma filters, exposes it to the liver cells in the bioreactor, and returns detoxified blood to the body.

The scientists tested their system in a pig model of acute liver failure induced by drug overdose and found that three hours of treatment was enough to detoxify compounds such as ammonia, prevent liver inflammation, and enhance liver regeneration. The treatment was also safe and greatly improved survival, with 5 out of 6 treated pigs surviving versus 1 out of 6 untreated pigs. Li et al. note that further experiments should test their system's effectiveness against other forms of acute liver failure, such as that induced by surgical procedures, before moving to clinical studies in humans.

Credit: 
American Association for the Advancement of Science (AAAS)

UBCO kindness researcher challenges the notion of mean teens

A UBC Okanagan researcher is hoping to flip the switch on the preconceived stereotype that teens are mean.

Associate Professor John-Tyler Binfet, a researcher in the School of Education, says teenagers often receive a negative reputation, sometimes showcased in mainstream media reports of bullying, cyber harassment or schoolyard battles.

Binfet's new research seeks to disrupt that notion by showing how adolescents demonstrate kindness.

"There's been a shift in schools in recent years to move away from anti-bullying initiatives to efforts that embrace and promote pro-social behaviour," says Binfet. "There is an emphasis on kindness throughout school curriculum, but little is known about how youth actually enact kindness."

Binfet and his research team surveyed 191 Grade 9 Okanagan Valley students to determine the extent to which they see themselves as kind in online and face-to-face interactions. The students were then asked to plan and complete five kind acts over one week.

In total, the students accomplished 943 acts of kindness, with 94 per cent of the participants completing three or more of their assigned acts. The kind acts ranged from helping with chores and being respectful to complimenting or encouraging others and giving away items like pencils or money for the vending machine.

"When encouraged to be kind, they surpassed expectations. It was interesting to see how adolescents support others with nuanced ways of helping that included helping generally, physically, emotionally and with household chores," says Binfet. "As educators and parents model kindness or provide examples of kindness, showcasing examples of subtle acts might make being kind easier for adolescents to accomplish."

The majority of the participants enacted kindness to people they know, most frequently to family, friends and other students. As the bulk of the kind acts took place at the school, the findings show positive effects for school climate, student-to-student relationships and student behaviour.

Following the one-week challenge, participants were surveyed once again to see how their perception of their own kindness had changed. The findings showed a significant increase in their self-ratings of face-to-face and online kindness.

"This has implications for school-based initiatives seeking to encourage kindness among students who may say, 'but I'm already kind'," says Binfet. "The findings suggest that by participating in a short kindness activity, students' perceptions of themselves as kind may be boosted."

For years, Binfet's research has focused on counterbalancing the bullying literature to elevate the discussion of kindness. Through this latest research, his goal is to challenge the negative stereotypes of teens.

"I think adolescents can be misperceived, especially in schools. By understanding how they show kindness, parents, educators and researchers can gain insight as to how they actualize pro-social behaviour," says Binfet. "We can find ways to best structure opportunities for youth to be kind to help foster their development."

Credit: 
University of British Columbia Okanagan campus