Tech

BrainHealth's SMART methodology helps patients make more informed treatment decisions

DALLAS (September 23, 2020) - Researchers at the Center for BrainHealth, part of The University of Texas at Dallas, collaborated with scientists at the University of North Carolina at Chapel Hill to examine whether the Strategic Memory Advanced Reasoning Training (SMART) program affects people's abilities to make informed decisions about their medical treatment options. Patients with rheumatoid arthritis, in particular, are often reluctant to take antirheumatic drugs because of perceptions about the drugs' risks and benefits. The findings from this study point to an approach that helps these patients, and other people, make more informed decisions about their health.

The study, accepted for publication in Arthritis Care & Research (August 2020), was authored by lead researcher Susan J. Blalock, PhD, a now-retired professor from the University of North Carolina at Chapel Hill, and a research team that featured Sandra Bond Chapman, PhD, founder and chief director of the Center for BrainHealth, and Molly Keebler, former head of community programs at the Center for BrainHealth.

SMART teaches evidence-based strategies and techniques that improve strategic thinking, increase productivity and foster innovation. It was developed and tested by BrainHealth researchers over the past three decades.

The findings build on previous work that suggests the SMART program helps people understand the gist, or bottom-line meaning, of complex information. As part of the study, patients with rheumatoid arthritis were asked questions about how they value the risks and rewards of certain antirheumatic drugs. Six months after they participated in the SMART program, participants who initially had inadequate knowledge of the risks and rewards of treatment options made more informed decisions because they became more knowledgeable about the drugs.

The SMART protocol helps improve gist reasoning abilities - how well people can understand the overall gist or essence of complex information such as medication pamphlets and doctor's recommendations.

"Rather than getting overwhelmed by the immense number of warnings, possible risks and outcomes, participants learn to focus on the major facts rather than all the less relevant details, to reach an informed decision aligned with their personal values," said Chapman.

This research has broader implications: the SMART methodology could help people with a variety of chronic conditions strategically focus on key information and make more informed decisions about their health, consistent with their belief systems.

"We helped empower people to ask questions and focus more effectively on the most important facts within complex information by following a SMART protocol. We got them thinking about the big picture and gave them the confidence to be informed users of medical advice - actively engaged in making decisions about their healthcare," said Chapman.

Credit: 
Center for BrainHealth

Amazonia racing toward tipping point, fueled by unregulated fires

image: Majoi Nascimento, Ph.D., from Florida Tech is seen here collecting charred material for analysis in the Brazilian Amazon.

Image: 
Florida Institute of Technology

Amazonia is closer to a catastrophic ecological tipping point than at any time in the last 100,000 years, and human activity is the cause.

In a new paper published today in the Annals of the Missouri Botanical Garden, Florida Tech biology professor Mark Bush describes how the vast Amazonian rainforest could be replaced by savanna, which is a grassland with few trees, within our lifetime.

Rainforests rely on high humidity and have no adaptation to withstand fire. Bush uses fossil pollen and charcoal recovered from lake sediments dating back thousands of years to track changes in vegetation and fire frequency through time. He has found that fires were almost unknown in Amazonia before the arrival of humans.

Relatively small-scale disturbances caused by the first inhabitants of Amazonia over the last 10,000 years did not bring the system to a tipping point because it could recover from these minor events. But the modern effects of a warming climate and elevated drought risk - both the product of anthropogenic climate change - are combining with much larger-scale deforestation and burning in Amazonia to create the conditions where vast areas of rainforest could transition to savanna in a matter of decades.

"The immense biodiversity of the rainforest is at risk from fire," Bush said.

One of the key points of the paper, "New and repeating tipping points: the interplay of fire, climate change and deforestation in Neotropical ecosystems," is that while no individual government can control climate change, fire can be regulated through policy. Almost all fires in Amazonia are set deliberately by people and, because of altered policy, have become much more frequent in the last two years than over the previous decade.

Bush's data show that the tipping point is likely to be reached if temperatures rise by another 2 to 3 degrees Fahrenheit. Anthropogenic warming would bring those temperatures by the end of this century, but increased burning creates hotter, drier, less shaded landscapes that could hasten that transition.

"Warming alone could induce the tipping point by mid-century, but if the present policies that turn a blind eye to forest destruction aren't stopped, we could reach the tipping point much sooner," Bush said.

He added, "Beyond the loss of wildlife, the cascading effects of losing Amazonian rainforest would alter rainfall across the hemisphere. This is not a remote problem, but one of global importance and critical significance to food security that should concern us all."

Credit: 
Florida Institute of Technology

Berry good news -- new compound from blueberries could treat inflammatory disorders

image: A polyphenolic compound derived from blueberry shows remarkable immunosuppressive effects and can be useful in treating inflammatory bowel disease (IBD)

Image: 
Tokyo University of Science

Various plants and their products are known to contain "bioactive" ingredients that can alleviate human diseases. These "phytocompounds" often have restorative biological properties, such as anti-cancer, antioxidant, and anti-inflammatory effects. Thus, understanding how they interact with the body can lead to potential treatment strategies against major immune disorders.

A team of researchers at Tokyo University of Science, led by Prof Chiharu Nishiyama, has been working in this direction for the past several years to identify novel active components in functional foods and understand their effects on the body. Their efforts have now led to success: In their latest study, published in The FASEB Journal, the scientists identified a polyphenolic compound called "pterostilbene" (PSB) with strong immunosuppressive properties--making it a potential therapeutic option for chronic inflammatory diseases, including inflammatory bowel disease (IBD). This compound is very similar to another phytocompound known to have important medicinal effects, called "resveratrol" (RSV). Dr Takuya Yashiro, corresponding author of this report, explains the idea that prompted their research, "RSV, a polyphenol, was known to have pronounced immunomodulatory and anti-inflammatory effects on animal models of ulcerative colitis. Therefore, we investigated the possibility of other compounds structurally similar to RSV as a new type of treatment for IBD."

In patients with IBD, the gastrointestinal tract lining contains long-lasting ulcers caused by chronic inflammation due to an elevated immune response in the body. This involves the excessive production of immune system-related molecules called "cytokines." Moreover, two types of immune cells, "dendritic cells" (DCs) and "T cells," are also involved: at the onset of an immune response, DCs produce inflammatory cytokines and activate T cells to initiate a defense response. These processes together form a complex pathway that results in a "hyper" immune response. Thus, to find an effective compound that can suppress the immune system, it was crucial to test it on this population of immune cells.

To begin with, the scientists studied the effects of a range of plant-derived compounds on DC-mediated T cell proliferation. Their initial research led them to PSB, which showed stronger immunosuppressive activity than the other candidates. When they dug deeper, they found that PSB treatment prevents T cells from differentiating into Th1 and Th17 (subtypes of T cells that elevate the immune response) while increasing their differentiation into regulatory T cells (another subtype known to inhibit inflammation). They also revealed that PSB treatment inhibits inflammatory cytokine production from DCs by attenuating the DNA-binding activity of a crucial transcription factor, PU.1. When they further tested PSB in mice with IBD, they found that oral intake of PSB improved symptoms of IBD. Thus, the study confirmed that PSB is an extremely promising anti-inflammatory agent to fight IBD. Not just this--it is easily absorbed by the body, making it an ideal drug candidate!

Through these findings, the scientists have ushered in new possibilities for the treatment of not just IBD but also other inflammatory disorders. Dr Yashiro concludes, "For disease prevention, it is important to identify the beneficial components in foods and to understand the underlying mechanism by which immune responses and homeostasis are modulated in the body. Our findings showed that PSB possesses a strong immunosuppressive property, paving the way for a new, natural treatment for IBD."

Credit: 
Tokyo University of Science

Three genes predict success of naltrexone in alcohol dependence treatment

image: Personalized alcohol treatment

Image: 
Medical University of South Carolina

Considering a patient's genetics could inform clinicians which medications would be most effective in controlling cravings and treating alcohol use disorder.

Twenty million Americans currently struggle with an alcohol use disorder. Of those who seek treatment, only 20% receive medications, either alone or in addition to counseling.

Medications are not used more often, according to Charleston Alcohol Research Center scientific director Raymond Anton, M.D., in part because they do not work equally well for everyone. Many patients with alcohol use disorder would benefit from a personalized medicine approach, in which a medication is prescribed based on a patient's genetic code.

Anton and his team report in Alcoholism: Clinical and Experimental Research that doing a few relatively simple genetic tests to identify variations in just three brain genes makes it possible to predict which patients with an alcohol use disorder will benefit most from the addiction treatment medication naltrexone.

In previous studies, Anton's team showed that treating alcohol use disorder with medications that work on specific brain chemicals can reduce the relapse rate by up to a third.

"Alcohol dependence is a brain disease known to affect certain brain chemicals," said Anton, "So, it's important to use treatment methods that address not only the behavioral but also the biological/brain components of the problem."

Naltrexone, a Food and Drug Administration (FDA)-approved addiction medication, is somewhat unique in that it targets just a single protein in the brain - the mu-opioid receptor. When activated by either an internally produced or externally introduced opioid-like chemical, the mu-opioid receptor signals a positive experience. Drinking alcohol releases natural opiates in the brain that activate the mu-opioid receptor. Naltrexone blocks the mu-opioid receptor to prevent the reward and pleasure that comes from drinking alcohol and can even reduce the craving to consume it.

The gene that produces the mu-opioid receptor protein in the brain is not the same in every patient. In the current study, Anton and his team considered the influence of a small gene variation that results in a slight difference in the mu-opioid receptor protein structure.

That slight difference does not affect how people act under normal situations, but it does cause a subtle difference in how strongly the mu-opioid receptor becomes activated when alcohol is consumed, with one variation having a greater response than the other.

Anton and his team hypothesized that this subtle difference in brain chemistry might affect how well naltrexone works in any given patient.

They quickly discovered, however, that the variation in this one gene alone did not fully predict how well a patient would respond to the medication.

"There is a small indication that the difference in the mu-opioid receptor gene sequence matters, but it isn't a powerful predictor," Anton explained. "People are far more complex than one individual gene variation. Naltrexone targets this specific mu-opioid receptor, so we hypothesized that the other brain chemicals that might influence the mu-opioid receptor could also influence how the drug might work."

Dopamine is another reward and pleasure signaling system in the brain that often interacts with the opioid system. Therefore, the amount of dopamine present could influence the mu-opioid receptor and thus the effectiveness of naltrexone.

Anton and his team looked at two such genes that produce proteins controlling the amount of dopamine in the brain.

Like the mu-opioid receptor, these dopamine-processing genes can have small specific variations that result in slight differences in the strength of reward or pleasure signaling after alcohol consumption.

In a clinical trial, Anton and his team genotyped 146 treatment-seeking alcohol use disorder patients for the selected variations in the mu-opioid receptor gene and the two dopamine-processing genes. A roughly equal number of patients with each gene variation were assigned randomly to receive naltrexone or an identical-looking placebo medication.

Throughout the 16-week trial funded by the National Institutes of Health, patients reported how much they drank each day. A reduction in the number of binge-drinking days, defined as five or more drinks for men or four or more drinks for women, across the study indicated a positive effect of the medication.

Anton and his team found that only patients with certain combinations of gene variations showed consistently reduced drinking when taking naltrexone.

"To benefit most from naltrexone, you have to have the gene variations that predict you'll be low in one brain chemical response -dopamine or mu-opioid -and high in the other," Anton explained.

This finding indicates that patients can be genotyped before treatment to see if they will benefit from naltrexone. If they will not benefit, other medications that might be effective are available for them.
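The decision rule Anton describes can be made concrete with a small illustration. The Python sketch below is hypothetical: the release does not name the specific gene variants or how responses are scored, so the "low"/"high" labels and the function itself are invented placeholders for the combinatorial idea.

```python
# Hypothetical encoding of the rule described above; the release does not
# name the variants, so these labels are illustrative placeholders only.
def predicted_naltrexone_benefit(mu_opioid_response: str,
                                 dopamine_response: str) -> bool:
    """Benefit is predicted when one signaling response is 'low' and the
    other is 'high' -- not when both are low or both are high."""
    assert {mu_opioid_response, dopamine_response} <= {"low", "high"}
    return mu_opioid_response != dopamine_response

print(predicted_naltrexone_benefit("low", "high"))   # True: likely responder
print(predicted_naltrexone_benefit("high", "high"))  # False: consider other drugs
```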

Currently, there are no standard genetic screens to test for a patient's medication response in alcohol/addiction treatment. Anton and his team are taking the first steps to make genetic predictors a common clinical practice. They are currently working with the MUSC Foundation for Research Development, MUSC's technology transfer office, to secure a patent for the discovery that these three genes together predict naltrexone efficacy. In addition, they are discussing with others the potential of commercial genetic testing to improve the treatment of alcohol use disorder. This is the first step in what could be a wider range of genetic testing for other addictions.

Credit: 
Medical University of South Carolina

New brain cell-like nanodevices work together to identify mutations in viruses

image: An electron micrograph of the artificial neuron. The niobium dioxide layer (yellow) endows the device with neuron-like behavior.

Image: 
Dr. R. Stanley Williams

In the September issue of the journal Nature, scientists from Texas A&M University, Hewlett Packard Labs and Stanford University have described a new nanodevice that acts almost identically to a brain cell. Furthermore, they have shown that these synthetic brain cells can be joined together to form intricate networks that can then solve problems in a brain-like manner.

"This is the first study where we have been able to emulate a neuron with just a single nanoscale device, which would otherwise need hundreds of transistors," said Dr. R. Stanley Williams, senior author on the study and professor in the Department of Electrical and Computer Engineering. "We have also been able to successfully use networks of our artificial neurons to solve toy versions of a real-world problem that is computationally intense even for the most sophisticated digital technologies."

In particular, the researchers have demonstrated proof of concept that their brain-inspired system can identify possible mutations in a virus, which is highly relevant for ensuring the efficacy of vaccines and medications for strains exhibiting genetic diversity.

Over the past decades, digital technologies have become smaller and faster largely because of advancements in transistor technology. However, these critical circuit components are fast approaching the limit of how small they can be built, initiating a global effort to find a new type of technology that can supplement, if not replace, transistors.

In addition to this "scaling-down" problem, transistor-based digital technologies have other well-known challenges. For example, they struggle at finding optimal solutions when presented with large sets of data.

"Let's take a familiar example of finding the shortest route from your office to your home. If you have to make a single stop, it's a fairly easy problem to solve. But if for some reason you need to make 15 stops in between, you have 43 billion routes to choose from," said Dr. Suhas Kumar, lead author on the study and researcher at Hewlett Packard Labs. "This is now an optimization problem, and current computers are rather inept at solving it."

Kumar added that another arduous task for digital machines is pattern recognition, such as identifying a face as the same regardless of viewpoint or recognizing a familiar voice buried within a din of sounds.

But tasks that can send digital machines into a computational tizzy are ones at which the brain excels. In fact, brains are not just quick at recognition and optimization problems, but they also consume far less energy than digital systems. Hence, by mimicking how the brain solves these types of tasks, Williams said brain-inspired or neuromorphic systems could potentially overcome some of the computational hurdles faced by current digital technologies.

To build the fundamental building block of the brain - the neuron - the researchers assembled a synthetic nanoscale device consisting of layers of different inorganic materials, each with a unique function. However, they said the real magic happens in a thin layer made of the compound niobium dioxide.

When a small voltage is applied to this region, its temperature begins to increase. When the temperature reaches a critical value, niobium dioxide undergoes a quick change in personality, turning from an insulator to a conductor. But as it begins to conduct electric current, its temperature drops and niobium dioxide switches back to being an insulator.

These back-and-forth transitions enable the synthetic devices to generate a pulse of electrical current that closely resembles the profile of electrical spikes, or action potentials, produced by biological neurons. Further, by changing the voltage across their synthetic neurons, the researchers reproduced a rich range of neuronal behaviors observed in the brain, such as sustained, burst and chaotic firing of electrical spikes.
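This back-and-forth switching is the classic recipe for a relaxation oscillator, and a toy simulation makes the spiking behavior easy to see. In the Python sketch below, the thresholds, resistances, and time constants are invented for illustration; they are not the published device parameters.

```python
import numpy as np

# Toy relaxation-oscillator model of the threshold switching described above.
# All parameter values are illustrative, not the actual NbO2 device values.
V_SUPPLY, R_LOAD, C = 1.0, 1e3, 1e-9   # bias voltage, series resistor, capacitance
V_ON, V_OFF = 0.8, 0.3                 # insulator->metal / metal->insulator thresholds
R_INS, R_MET = 1e6, 10.0               # device resistance in each state
dt, steps = 1e-9, 20000

v, conducting, trace = 0.0, False, []
for _ in range(steps):
    r_dev = R_MET if conducting else R_INS
    # the capacitor charges through the load and discharges through the device
    i_in = (V_SUPPLY - v) / R_LOAD
    i_out = v / r_dev
    v += dt * (i_in - i_out) / C
    # hysteretic threshold switching stands in for the thermal transition
    if not conducting and v > V_ON:
        conducting = True
    elif conducting and v < V_OFF:
        conducting = False
    trace.append(v)
# 'trace' now holds a train of spike-like voltage oscillations
```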

"Capturing the dynamical behavior of neurons is a key goal for brain-inspired computers," said Kumar. "Altogether, we were able to recreate around 15 types of neuronal firing profiles, all using a single electrical component and at much lower energies compared to transistor-based circuits."

To evaluate if their synthetic neurons can solve real-world problems, the researchers first wired 24 such nanoscale devices together in a network inspired by the connections between the brain's cortex and thalamus, a well-known neural pathway involved in pattern recognition. Next, they used this system to solve a toy version of the viral quasispecies reconstruction problem, where mutant variations of a virus are identified without a reference genome.

As data inputs, the researchers fed the network short gene fragments. Then, by programming the strength of connections between the artificial neurons within the network, they established basic rules about joining these genetic fragments. The jigsaw-puzzle-like task for the network was to list mutations in the virus' genome based on these short genetic segments.

The researchers found that within a few microseconds, their network of artificial neurons settled into a state that was indicative of the genome for a mutant strain.

Williams and Kumar noted this result is proof of principle that their neuromorphic systems can quickly perform tasks in an energy-efficient way.

The researchers said the next steps in their research will be to expand the repertoire of the problems that their brain-like networks can solve by incorporating other firing patterns and some hallmark properties of the human brain like learning and memory. They also plan to address hardware challenges for implementing their technology on a commercial scale.

"Calculating the national debt or solving some large-scale simulation is not the type of task the human brain is good at and that's why we have digital computers. Alternatively, we can leverage our knowledge of neuronal connections for solving problems that the brain is exceptionally good at," said Williams. "We have demonstrated that depending on the type of problem, there are different and more efficient ways of doing computations other than the conventional methods using digital computers with transistors."

Credit: 
Texas A&M University

Broad beans versus soybeans as feedstuff for dual-purpose chickens

image: Cockerel and hens from the Vorwerkhuhn breed, which is a traditional German dual-purpose breed (ie the breed is suitable for both meat and egg-laying)

Image: 
Juliane Fellner, University of Göttingen

Current practices of the poultry industry have raised ethical and ecological concerns: ethical concerns include the culling of day-old male chicks of egg-laying breeds; ecological concerns include the import of large quantities of soybeans for feedstuff. Now a research team at the University of Göttingen has investigated alternatives such as using a regional protein crop like broad beans (also known as faba or fava beans), and dual-purpose chicken breeds (ie suitable for both meat and egg-laying). They found that using broad beans as feed and dual-purpose breeds were both suitable alternatives which did not impact the quality of chicken meat. Their results were published in Foods.

As part of the "Potentials of sustainable use of regional breeds and regional protein feed in poultry production" (PorReE) project, this study focused on adult cockerels of two local dual-purpose chicken breeds (Vorwerkhuhn and Bresse Gauloise) and one high-performing laying line (White Rock), fattened on feeds whose main protein source was either soybean meal or one of two broad bean (Vicia faba) preparations of different composition. The study examined the effect of diet on meat quality characteristics, including sensory analysis, for these particular breeds. The results of physicochemical and sensory analyses show that broad beans can be included in poultry feed without negatively impacting the quality of the product.

Broad beans' nutritional composition makes them a suitable replacement for soybeans as a protein source in poultry feed, and they have the added advantage of improving soil quality by fixing nitrogen. First author Cynthia Escobedo del Bosque at the University of Göttingen says: "Broad beans are widely cultivated legumes that would help local agricultural industries by granting them greater independence, since they would be freed from relying on soy imports and could control the price."

The use of dual-purpose breeds has only been the subject of research in recent years. These breeds cannot keep up with the laying and/or fattening performance of specialized breeds. "Our research shows that these breeds produce high quality eggs and chicken meat but at a smaller volume," explains coauthor Professor Daniel Mörlein from the University of Göttingen. He adds: "This means the cost will be higher, but if consumers would be willing to pay more, animal welfare and genetic diversity can be improved."

Four research groups at the Faculty of Agricultural Sciences, University of Göttingen, are currently examining the foundations for a more sustainable and socially-accepted poultry production system. Product perception as well as consumer acceptance studies are commonly conducted in the faculty's modern sensory laboratory.

Credit: 
University of Göttingen

This tiny device can scavenge wind energy from the breeze you make when you walk

video: The video shows the two plastic strips from the triboelectric nanogenerator flapping in sync for energy generation in slow motion.

Image: 
Chen, Ma, and Ren et al./Cell Reports Physical Science

Most of the wind available on land is too gentle to push commercial wind turbine blades, but now researchers in China have designed a kind of "tiny wind turbine" that can scavenge wind energy from breezes as light as those created by a brisk walk. The method, presented September 23 in the journal Cell Reports Physical Science, is a low-cost and efficient way of collecting light breezes as a micro-energy source.

The new device is not technically a turbine. It is a nanogenerator made of two plastic strips in a tube that flutter or clap together when there is airflow. Like rubbing a balloon on your hair, the two plastic strips become electrically charged after contact and separation, a phenomenon called the triboelectric effect. But instead of making your hair stand up like Einstein's, the electricity generated by the two plastic strips is captured and stored.

"You can collect all the breeze in your everyday life," says senior author Ya Yang of Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences. "We once placed our nanogenerator on a person's arm, and a swinging arm's airflow was enough to generate power."

A breeze as gentle as 1.6 m/s (3.6 mph) was enough to power the triboelectric nanogenerator designed by Yang and his colleagues. The nanogenerator performs at its best when wind velocity is between 4 and 8 m/s (8.9 to 17.9 mph), a speed range that allows the two plastic strips to flutter in sync. The device also has a high wind-to-energy conversion efficiency of 3.23%, a value that exceeds previously reported performances in wind energy scavenging. Currently, the research team's device can power 100 LED lights and temperature sensors.
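For a rough sense of scale, the power carried by an airflow of speed v through a cross-section A is 0.5·ρ·A·v³, of which such a device captures the reported 3.23%. The Python sketch below assumes a 10 cm² cross-section purely for illustration; the actual device area is not given in this summary.

```python
# Back-of-envelope: power available in the airflow at a given wind speed,
# and the fraction captured at the reported 3.23% efficiency.
# The 0.001 m^2 cross-section is an assumed figure for illustration only.
RHO_AIR = 1.225          # air density, kg/m^3 at sea level
AREA = 0.001             # m^2, assumed device cross-section (10 cm^2)
EFFICIENCY = 0.0323      # reported wind-to-energy conversion efficiency

def harvested_power(wind_speed_m_s: float) -> float:
    """Kinetic power in the airflow (0.5 * rho * A * v^3) times efficiency."""
    p_wind = 0.5 * RHO_AIR * AREA * wind_speed_m_s ** 3
    return p_wind * EFFICIENCY

for v in (1.6, 4.0, 8.0):
    print(f"{v:4.1f} m/s -> {harvested_power(v) * 1e3:7.3f} mW")
```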

"Our intention isn't to replace existing wind power generation technology. Our goal is to solve the issues that the traditional wind turbines can't solve," says Yang. "Unlike wind turbines that use coils and magnets, where the costs are fixed, we can pick and choose low-cost materials for our device. Our device can also be safely applied to nature reserves or cities because it doesn't have the rotating structures."

Yang says he has two visions for the project's next steps: one small and one big. In the past, Yang and his colleagues have designed a nanogenerator as small as a coin, but he wants to make it even tinier and more compact with higher efficiency. In the future, Yang and his colleagues would like to couple the device with small electronic devices, such as phones, to provide sustainable electric power.

But Yang is also looking to make the device bigger and more powerful. "I'm hoping to scale up the device to produce 1,000 watts, so it's competitive with traditional wind turbines," he says. "We can place these devices where traditional wind turbines can't reach. We can put it in the mountains or on the top of buildings for sustainable energy."

Credit: 
Cell Press

Proof-of-concept for a new ultra-low-cost hearing aid for age-related hearing loss

image: Construction and components of the LoCHAid. a. The LoCHAid is shown in its top view, with its 3D-printed polyamide (Nylon 12) case tilted. The side view of the audio jack opening and holes for attaching material for neck wear are shown below. The LoCHAid in its case has a size of 6.70 mm by 5.70 mm. The audio jack can incorporate any standard 8 mm sound transducer. b. Various types of batteries that can be used to power the device, such as AA, rechargeable AAA, lithium-ion flat pack, and lithium-ion coin cell. The device has a power requirement of between 3 and 5.5 V; the number of batteries shown denotes the number required to power the device. c. The parts required to assemble the device are shown here with group labels; specific details are given in Table 1. d. View of the custom printed circuit board (PCB) without any components. e. View of the PCB with components soldered on. f. View of the device worn on the body by an anonymous 65-year-old male, part of the intended audience of the device.

Image: 
Sinha et al, 2020 (PLOS ONE, CC BY)

A new ultra-affordable and accessible hearing aid made from open-source electronics could soon be available worldwide, according to a study published September 23, 2020 in the open-access journal PLOS ONE by Soham Sinha from the Georgia Institute of Technology, Georgia, US, and colleagues.

Hearing aids are a major tool for individuals with hearing loss--especially age-related hearing loss, which currently affects approximately 226 million adults over the age of 65 worldwide (and is projected to affect 900 million by 2050). However, hearing aid adoption remains relatively low among adults: fewer than 3 percent of adults in low- and middle-income countries (LMICs) use hearing aids, versus around 20 percent of adults in non-LMIC countries. Though various reasons contribute to this poor uptake, cost is a significant factor. While the price to manufacture hearing aids has decreased over time, the retail price for a pair of hearing aids ranges from $1,000 to $8,000 USD, with the average pair costing $4,700 in the US.

In this study, Sinha and colleagues used mass-produced open source electronics to engineer a durable, affordable, self-serviceable hearing aid that meets most of the targets set by the WHO for mild-to-moderate age-related hearing loss: "LoCHAid." When mass-produced at 10,000 units including earphones, a coin-cell battery, and holder, LoCHAid costs $0.98 (this doesn't include labor costs) and is designed to be marketed over-the-counter--or even as a DIY project. LoCHAid doesn't require specialty parts, and repairs can be completed by a minimally skilled user with access to a soldering iron and solder. Though it's not currently programmable, simulations show that the LoCHAid is well fitted to a range of age-related hearing loss profiles for men and women between the ages of 60 and 79 years.
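The core signal-processing job of a fixed, non-programmable aid like this is to apply more gain at high frequencies, where age-related hearing loss is concentrated. The Python sketch below illustrates that idea with an invented gain profile; it is not the LoCHAid's actual filter design.

```python
import numpy as np

# Sketch of a fixed frequency-gain profile of the kind a non-programmable
# hearing aid might apply: more gain at high frequencies, where age-related
# loss is concentrated. Gain values are illustrative, not the LoCHAid's.
FS = 16000  # sample rate, Hz

def apply_fixed_gain_profile(audio: np.ndarray) -> np.ndarray:
    spectrum = np.fft.rfft(audio)
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / FS)
    gain_db = np.interp(freqs,
                        [0, 500, 1000, 2000, 4000, 8000],   # Hz
                        [0,   5,   10,   20,   30,   30])   # dB boost
    return np.fft.irfft(spectrum * 10 ** (gain_db / 20.0), n=len(audio))

tone = np.sin(2 * np.pi * 3000 * np.arange(FS) / FS)  # 1 s, 3 kHz test tone
louder = apply_fixed_gain_profile(tone)               # boosted by ~25 dB
```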

Potential limitations include the device lifetime (currently 1.5 years), as well as its relatively large size, which may not appeal to all consumers. The authors are currently working on a smaller prototype, but this costs more money to produce and would likely require third-party assemblers.

Despite these limitations, LoCHAid shows great potential to benefit individuals impacted by age-related hearing loss, especially those consumers challenged by the affordability and accessibility of current hearing aids available on the market.

The authors add: "In this work, we describe the development and rigorous audiological testing of a minimal, 3D-printed, and ultra-low-cost ($1 in parts) hearing aid. The vision of the device is to make hearing aids accessible and affordable for elderly individuals with age-related hearing loss in low- and middle-income countries."

Credit: 
PLOS

Scientists identify dozens of genes allowing cancer cells to evade the immune system

image: A cancer cell surrounded by immune T killer cells

Image: 
National Institutes of Health (NIH)

Toronto scientists have mapped the genes allowing cancer cells to avoid getting killed by the immune system in a finding that paves the way for the development of immunotherapies that would be effective for larger patient populations and across different tumour types.

"Over the last decade, different forms of immunotherapy have emerged as really potent cancer treatments but the reality is that they only generate durable responses in a fraction of patients and not for all tumour types," says Jason Moffat, a professor of molecular genetics in the Donnelly Centre for Cellular and Biomolecular Research at the University of Toronto who led the work.

The study also revealed that new therapies need to take into account the genetic composition of tumours, because mutations in the cancer cells - often referred to as cancer resistance mutations - can potentially make the disease worse in response to treatment.

"It's very important to understand at the molecular level how cancer develops resistance to immunotherapies in order to make them more broadly available. Advances in systematic genetic approaches have let us key in on genes and molecular pathways that are commonly involved in resistance to therapy," says Moffat, who holds Canada Research Chair in Functional Genomics of Cancer.

In immunotherapy, a patient's own immune cells, known as T killer cells, are engineered to find and destroy cancer. But treatment resistance has precluded its use in most patients, especially those with solid tumours.

"It's an ongoing battle between the immune system and cancer, where the immune system is trying to find and kill the cancer whereas the cancer's job is to evade that killing," says Keith Lawson, a co-lead author completing a PhD in Moffat's lab as part of his medical training in the Surgeon-Scientist Program at U of T's Faculty of Medicine.

Tumour heterogeneity--genetic variation in tumour cells within and across individuals that can impact therapy response--further complicates things.

"It's important to not just find genes that can regulate immune evasion in one model of cancer, but what you really want are to find those genes that you can manipulate in cancer cells across many models because those are going to make the best therapeutic targets," says Lawson.

The team, including collaborators from Agios Pharmaceuticals in Cambridge, Massachusetts, looked for genes that regulate immune evasion across six genetically diverse tumor models derived from breast, colon, kidney and skin cancer. The cancer cells were placed in a dish alongside the T cells engineered to kill them, where the ensuing onslaught served as a baseline. The researchers next deployed the gene editing tool CRISPR to switch off one-by-one every gene in the cancer cells and measured the resulting deviations from the killing baseline.
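The release does not detail the team's scoring pipeline, but screens of this kind are typically quantified by comparing guide abundance with and without selection. The Python sketch below, with invented counts and column names, shows the usual log-fold-change calculation that would flag a knockout as sensitizing or resistance-conferring.

```python
import numpy as np
import pandas as pd

# Hypothetical guide counts: cancer cells grown with vs without T cells.
# In a screen like the one described, genes whose knockout shifts survival
# away from the killing baseline score as 'sensitizing' or 'resistance' hits.
counts = pd.DataFrame({
    "gene":    ["GENE_A", "GENE_A", "GENE_B", "GENE_B"],
    "control": [520, 480, 610, 590],    # reads without T-cell attack
    "t_cell":  [1900, 2100, 55, 60],    # reads after T-cell attack
})

# normalize to library size, then take log2 fold change per guide
for col in ("control", "t_cell"):
    counts[col + "_cpm"] = counts[col] / counts[col].sum() * 1e6
counts["lfc"] = np.log2((counts["t_cell_cpm"] + 1) / (counts["control_cpm"] + 1))

# average over guides: positive -> knockout resists killing, negative -> sensitizes
gene_scores = counts.groupby("gene")["lfc"].mean()
print(gene_scores)
```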

They identified 182 "core cancer intrinsic immune evasion genes" whose deletion makes the cells either more sensitive or more resistant to T cell attack. Among the resisters were all the genes known to develop mutations in patients who stopped responding to immunotherapy, giving the researchers confidence that their approach worked.

Many of the genes identified had no previous links to immune evasion.

"That was really exciting to see, because it means that our dataset was very rich in new biological information", says Lawson.

Genes involved in autophagy, a process in which cells ramp up recycling of their components to mitigate damage following stress, came up as key for immune evasion. This raises the possibility that cancer's susceptibility to immunotherapy could be boosted by targeting its autophagy genes.

But as the researchers delved deeper, they found that deleting certain autophagy genes in pairs rendered the cells resistant to T cell killing. It means that if a tumour already harbors a mutation in one autophagy gene, a treatment that combines immunotherapy with a drug targeting another autophagy gene could make the disease worse in that patient.

"We found this complete inversion of gene dependency", says Moffat. "We did not anticipate this at all. What it shows us is that genetic context, what mutations are present, very much dictates whether the introduction of the second mutations will cause no effect, resistance or sensitivity to therapy".

As more research explores combinatorial effects of mutations across different types of cancer cells, it should become possible to predict from a tumour's DNA what type of therapy will be most effective.

Credit: 
University of Toronto

The biomimetic hand prosthesis Hannes is uniquely similar to a human hand

video: The robotic hand Hannes was developed in Italy at Istituto Italiano di Tecnologia and Centro Protesi INAIL. Hannes is able to replicate the key biological properties of the human hand and to restore over 90% of functionality to people with upper-limb amputations.

Image: 
IIT-Inail

Genoa/Bologna (Italy), 23 September 2020 - The biomimetic prosthetic hand Hannes is featured on Science Robotics' cover today; in the current issue, researchers from Istituto Italiano di Tecnologia (IIT - Italian Institute of Technology) and Centro Protesi INAIL (the prosthetic unit of the National Institute for Insurance against Accidents at Work) in Italy report on its ability to replicate the key biological properties of the human hand: natural synergistic and adaptable movement; biomimetic levels of force and speed; high anthropomorphism and grasp robustness. Developed by researchers, orthopaedists and industrial designers together with patients, Hannes is able to restore over 90% of functionality to people with upper-limb amputations. It has obtained CE marking and is ready to enter the international medical market, but its future commercialization will be possible only once researchers identify investors and industrial partners.

The robotic system Hannes was born out of IIT's Rehab Technologies Lab, the joint lab between Istituto Italiano di Tecnologia in Genova and Centro Protesi INAIL in Budrio (Bologna), coordinated by Lorenzo De Michieli; the collaboration started at the end of 2013 with the aim of creating innovative, high-tech, cost-effective solutions for patients with physical impairments. Among the solutions developed so far, Hannes is the most recent result; its name is a tribute to Professor Johannes "Hannes" Schmidl, technical director of the Centro Protesi INAIL in the 1960s and a pioneer in upper-limb prosthetics. On September 9th 2020, the hand Hannes was awarded the international industrial design prize Compasso d'Oro for its original and highly innovative concept.

Hannes is an anthropomorphic, poly-articulated upper-limb prosthetic system including hand and wrist, whose main characteristics are its softness and its ability to dynamically adapt to the shape of grasped objects. It is uniquely similar to a human hand and, having been developed directly with patients, it is of practical use. To evaluate the effectiveness and usability of Hannes, pilot trials on amputees were performed at Centro Protesi INAIL, and researchers found that, after a training period of less than one week, patients could autonomously use Hannes at home to perform activities of daily living.

The prosthesis is a myoelectric system that can be worn all day long and adjusted to different upper-limb impairments. An array of surface electromyographic sensors, placed within a custom socket, detects the activity of the residual limb muscles - in the lower or upper part of the arm - which are actively contracted by the user to perform multiple movements. Moreover, through specially developed software and a Bluetooth connection, it is possible to customize the operating parameters of the hand, such as the precision and speed of movements, to ensure the most optimized experience for each user.
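As a rough illustration of how surface EMG drives a myoelectric hand, the Python sketch below rectifies and smooths two EMG channels and maps the stronger one to a signed open/close command. This is a generic proportional-control scheme with invented thresholds and gains, not the Hannes firmware.

```python
import numpy as np

# Generic proportional myoelectric control loop -- a sketch of the scheme
# described above, not the Hannes firmware. Parameter values are invented.
FS = 1000          # EMG sample rate, Hz
ALPHA = 0.05       # smoothing factor for the envelope (one-pole low-pass)
THRESHOLD = 0.1    # activation level below which the hand holds position

def envelope(emg: np.ndarray) -> np.ndarray:
    """Rectify the raw EMG and smooth it with a one-pole low-pass filter."""
    env = np.zeros_like(emg)
    level = 0.0
    for i, sample in enumerate(np.abs(emg)):
        level += ALPHA * (sample - level)
        env[i] = level
    return env

def motor_command(flexor_env: float, extensor_env: float) -> float:
    """Map the stronger channel to a signed closing(+)/opening(-) speed."""
    drive = flexor_env - extensor_env
    return 0.0 if abs(drive) < THRESHOLD else float(np.clip(drive, -1.0, 1.0))

rng = np.random.default_rng(0)
flex = envelope(rng.standard_normal(FS) * 0.5)   # 1 s of synthetic flexor EMG
ext = envelope(rng.standard_normal(FS) * 0.1)    # weaker extensor activity
print(motor_command(flex[-1], ext[-1]))          # positive -> close the hand
```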

The Hannes hand has been tested for durability and robustness in a setting that simulated more than one year of usage by a so-called "pro-user" (almost 500,000 life cycles).

The true intelligence of Hannes lies in its mechanical design, which is completely unique in its market sector and gives the prosthesis the versatility and movement of a natural hand. The underlying mechanism of the hand is a mechanical differential system that allows Hannes to adapt to the object being grasped using just a single motor. This efficiency also dramatically enhances performance while keeping the hand consistent with the size of a 50th-percentile human hand. Hannes is provided in two different sizes, 7 ¾ and 8 ¼, for right and left hands, and is suitable for both female and male users. Its weight is 450 grams.

Fingers can flex and be positioned in a natural manner, even at rest. In particular, the thumb can be oriented in three different positions to replicate a wide variety of grips, including a fine grip for picking up small objects, a lateral grip for grasping thin objects, and finally a power grip capable of grasping and moving even heavy loads. The overall grasp is efficient, robust against external conditions and natural. The system also permits pronation and supination of the wrist (the 'key-turning' movement), allowing grasps in different orientations without relying on harmful patient compensation.

Hannes can perform a full closed grasp in less than one second and, at the same time, can exert a maximum grasp force of 150 N, which is well beyond that of other commercial and research poly-articulated hands. It also has the autonomy for a whole day of standard use (battery life of one day: 12 V power supply with a battery capacity of 1300 mAh).

Researchers conducted experiments to validate Hannes's performance and the human-likeness of its grasping behaviour, demonstrating improved performance compared with existing research and commercial devices.

The fundamental principles and design of Hannes are covered by IIT-INAIL patent applications. Moreover, the prosthetic hand obtained CE marking, which is fundamental for future commercialization in the European market and a precondition for international sale. Researchers are looking for investors and companies to industrialize and produce Hannes on a large scale, benefiting patients with physical impairment.

Credit: 
Istituto Italiano di Tecnologia - IIT

Combined droughts and heatwaves are occurring more frequently in several regions across the US

The frequency of combined droughts and heatwaves - which are more devastating when they occur in unison - has substantially increased across the western U.S. and in parts of the Northeast and Southeast over the past 50 years, according to a new study. The findings also suggest that areas experiencing compound dry-hot extremes are growing less scattered and more connected, resulting in larger impacted regions that place enormous strain on regional and national relief efforts. "Episodes of extreme dryness and heat are the recipe for large forest fires," said Mojtaba Sadegh, the senior author of the study. "These extremes are intensifying and extending at unprecedented spatial scales, allowing current wildfires to burn across the entire U.S. west coast." Climate risk analyses have typically focused on shifts in one climate parameter at a time, such as changes in heatwave magnitude or trends in aridity. But while multiple extreme events rarely occurred at the same time in the past, they have begun to coincide more often as climate change progresses. To better understand how the frequency of concurrent droughts and heatwaves has changed over time in the contiguous U.S., Mohammad Reza Alizadeh and colleagues analyzed combined dry-hot extremes using 122 years of climate data based on ground observations. While most prior analyses of these concurrent events rely on post-1950s data, the researchers extended their analysis to cover the years 1896-2017, incorporating the 1930s megadrought that, combined with inappropriate farming practices, led to the Dust Bowl phenomenon. The findings indicate that the climate factors driving concurrent droughts and heatwaves have shifted from lack of precipitation in the 1930s to excess heat in recent decades. The authors suggest their findings may be used to bolster risk assessment frameworks and inform climate adaptation and mitigation efforts.
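One common way to flag such compound events from station records is to mark a period as compound dry-hot when precipitation falls below a low percentile and temperature exceeds a high percentile of the historical distribution. The Python sketch below uses synthetic data and illustrative 10th/90th percentile thresholds; the study's exact definition may differ.

```python
import numpy as np

# Flag compound dry-hot years from co-located records: dryness below the
# 10th percentile AND heat above the 90th percentile of the full record.
# Data are synthetic and thresholds illustrative, for demonstration only.
rng = np.random.default_rng(0)
years = np.arange(1896, 2018)
precip = rng.gamma(shape=4.0, scale=50.0, size=years.size)  # annual totals, mm
temp = rng.normal(loc=25.0, scale=1.5, size=years.size)     # annual means, deg C

dry = precip < np.percentile(precip, 10)
hot = temp > np.percentile(temp, 90)
compound = dry & hot

print("compound dry-hot years:", years[compound])
```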

Credit: 
American Association for the Advancement of Science (AAAS)

Nanostructures with a unique property

image: Skyrmions are nanoscale vortices in the magnetic alignment of atoms. For the first time, PSI researchers have now created antiferromagnetic skyrmions in which critical spins are arranged in opposing directions. This state is shown in the artist's impression above.

Image: 
Paul Scherrer Institute/Diego Rosales

Nanoscale vortices known as skyrmions can be created in many magnetic materials. For the first time, researchers at PSI have managed to create and identify antiferromagnetic skyrmions with a unique property: critical elements inside them are arranged in opposing directions. Scientists have succeeded in visualising this phenomenon using neutron scattering. Their discovery is a major step towards developing potential new applications, such as more efficient computers. The results of the research are published today in the journal Nature.

Whether a material is magnetic depends on the spins of its atoms. The best way to think of spins is as minute bar magnets. In a crystal structure where the atoms have fixed positions in a lattice, these spins can be arranged in criss-cross fashion or aligned all in parallel like the spears of a Roman legion, depending on the individual material and its state.

Under certain conditions it is possible to generate tiny vortices within the corps of spins. These are known as skyrmions. Scientists are particularly interested in skyrmions as a key component in future technologies, such as more efficient data storage and transfer. For example, they could be used as memory bits: a skyrmion could represent the digital one, and its absence a digital zero. As skyrmions are significantly smaller than the bits used in conventional storage media, data density is much higher and storage potentially more energy-efficient, while read and write operations would be faster as well. Skyrmions could therefore be useful both in classical data processing and in cutting-edge quantum computing.

Another interesting aspect for applications is that skyrmions can be created and controlled in many materials by applying an electrical current. "With existing skyrmions, however, it is tricky to move them systematically from A to B, as they tend to deviate from a straight path due to their inherent properties," explains Oksana Zaharko, research group leader at PSI.

Working with researchers from other institutions, Dr Zaharko and her team have now created a new type of skyrmion and demonstrated a unique characteristic: in their interior, critical spins are arranged in opposite directions to one another. The researchers therefore describe their skyrmions as antiferromagnetic.

In a straight line from A to B

"One of the key advantages of antiferromagnetic skyrmions is that they are much simpler to control: if an electrical current is applied, they move in a simple straight line," Zaharko comments. This is a major advantage: for skyrmions to be suitable for practical applications, it must be possible to selectively manipulate and position them.

The scientists created their new type of skyrmion by fabricating them in a customised antiferromagnetic crystal. Zaharko explains: "Antiferromagnetic means that adjacent spins are in an antiparallel arrangement, in other words one pointing upwards and the next pointing downwards. So what was initially observed as a property of the material we subsequently identified within the individual skyrmions as well."

Several steps are still needed before antiferromagnetic skyrmions are mature enough for a technological application: PSI researchers had to cool the crystal down to around minus 272 degrees Celsius and apply an extremely strong magnetic field of three tesla - roughly 100,000 times the strength of the Earth's magnetic field.

Neutron scattering to visualise the skyrmions

And the researchers have yet to create individual antiferromagnetic skyrmions. To verify the tiny vortices, the scientists are using the Swiss Spallation Neutron Source SINQ at PSI. "Here we can visualise skyrmions using neutron scattering if we have a lot of them in a regular pattern in a particular material", Zaharko explains.

But the scientist is optimistic: "In my experience, if we manage to create skyrmions in a regular alignment, someone will soon manage to create such skyrmions individually."

The general consensus in the research community is that once individual antiferromagnetic skyrmions can be created at room temperature, a practical application will not be far off.

Credit: 
Paul Scherrer Institute

Meditation for mind-control

image: Imaging shows the difference in alpha power between the meditation and control groups.

Image: 
Carnegie Mellon University College of Engineering

A brain-computer interface (BCI) is an apparatus that allows an individual to control a machine or computer directly from their brain. Non-invasive means of control, like electroencephalogram (EEG) readings taken through the skull, are safe and convenient compared to riskier invasive methods using a brain implant, but they take longer to learn and users ultimately vary in proficiency.

Bin He, professor and head of the Department of Biomedical Engineering, and collaborators conducted a large-scale human study enrolling subjects in an eight-week course of weekly sessions in simple, widely practiced meditation techniques to test their effect as a potential training tool for BCI control. The work was published in Cerebral Cortex. A total of 76 people participated in this study, each randomly assigned to the meditation group or the control group, which had no preparation during these eight weeks. Up to 10 sessions of BCI study were conducted with each subject. He's work shows that humans with just eight lessons in mindfulness-based attention and training (MBAT) demonstrated significant advantages compared to those with no prior meditation training, both in their initial ability to control BCIs and in the time it took for them to achieve full proficiency.

After subjects in the MBAT group completed their training course, they, along with the control group, were charged with learning to control a simple BCI system by navigating a cursor across a computer screen using their thoughts. This required them to concentrate their focus and visualize the movement of the cursor in their heads. Throughout the process, He's team monitored their performance and brain activity via EEG.

As noted above, the team found that those with training in MBAT were more successful in controlling the BCI, both initially and over time. Interestingly, the researchers found that differences in brain activity between the two sample groups corresponded directly with their success. The meditation group showed a significantly enhanced capability of modulating their alpha rhythm, the activity pattern monitored by the BCI system to mentally control the movement of a computer cursor.
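The alpha rhythm the BCI monitors is typically quantified as spectral power in the 8-12 Hz band of a short EEG window, which the user learns to raise or lower to steer the cursor. The Python sketch below shows that calculation with typical but assumed parameter choices, not the study's exact pipeline.

```python
import numpy as np

# Estimate alpha-band (8-12 Hz) power from a short EEG window -- the signal
# feature a BCI of this kind maps to cursor movement. Window length and band
# edges are typical choices, not necessarily the study's exact ones.
FS = 250  # EEG sample rate, Hz

def alpha_power(eeg_window: np.ndarray) -> float:
    spectrum = np.abs(np.fft.rfft(eeg_window * np.hanning(len(eeg_window)))) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / FS)
    band = (freqs >= 8) & (freqs <= 12)
    return float(spectrum[band].mean())

# one second of synthetic EEG with a strong 10 Hz alpha component
t = np.arange(FS) / FS
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(1).standard_normal(FS)
print(f"alpha power: {alpha_power(eeg):.1f}")
```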

These findings are very important for the process of BCI training and the overall feasibility of non-invasive BCI control via EEG. While prior work from his group has shown that long-term meditators were better able to overcome the difficulty of learning non-invasive mind control, this work shows that just a short period of MBAT training can significantly improve a subject's skill with a BCI. This suggests that education in MBAT could provide a significant addition to BCI training. "Meditation has been widely practiced for well-being and improving health," said He. "Our work demonstrates that it can also enhance a person's mental power for mind control, and may facilitate broad use of noninvasive brain-computer interface technology."

It could also inform neuroscientists and clinicians working in BCI design and maintenance. A thorough understanding of the brain is crucial for creating the machine learning algorithms BCIs use to interpret brain signals. This knowledge is especially important in BCI recalibration, which can be time-consuming and is frequently necessary for non-invasive BCIs.

The work of He and his team presents a new application for a well-known and widely practiced form of meditation, and may even offer insights into the neurological effects of meditation and how it may be adapted for better BCI training. This study offers novel information for BCI researchers and presents a new tool for both understanding the brain and preparing subjects to use a BCI.

Credit: 
College of Engineering, Carnegie Mellon University

Wobbling shadow of the M87 black hole

video: An animation representing one year of M87* image evolution according to numerical simulations. The measured position angle of the bright side of the crescent is shown, along with a 42-microarcsecond ring. For part of the animation, the image blurred to the EHT resolution is shown.

Image: 
G. Wong, B. Prather, C. Gammie, M. Wielgus & the EHT Collaboration

In 2019, the Event Horizon Telescope (EHT) Collaboration delivered the first image of a black hole, revealing M87*--the supermassive object in the center of the M87 galaxy. The EHT team has now used the lessons learned last year to analyze the archival data sets from 2009-2013, some of them not published before. The analysis reveals the behavior of the black hole image across multiple years, indicating persistence of the crescent-like shadow feature, but also variation of its orientation--the crescent appears to be wobbling. The full results appeared today in The Astrophysical Journal.

The EHT is a global array of telescopes, performing synchronized observations using the technique of Very Long Baseline Interferometry (VLBI). Together they form a virtual Earth-sized radio dish, providing a uniquely high image resolution. "With the incredible angular resolution of the EHT we could observe a billiard game being played on the Moon and not lose track of the score!" said Maciek Wielgus, an astronomer at the Center for Astrophysics | Harvard & Smithsonian, Black Hole Initiative Fellow, and lead author of the paper. In 2009-2013 M87* was observed by early-EHT prototype arrays, with telescopes located at three geographical sites in 2009-2012, and four sites in 2013. In 2017 the EHT reached maturity with telescopes located at five distinct geographical sites across the globe.

"Last year we saw an image of the shadow of a black hole, consisting of a bright crescent formed by hot plasma swirling around M87*, and a dark central part, where we expect the event horizon of the black hole to be," said Wielgus. "But those results were based only on observations performed throughout a one-week window in April 2017, which is far too short to see a lot of changes. Based on last year's results we asked the following questions: is this crescent-like morphology consistent with the archival data? Would the archival data indicate a similar size and orientation of the crescent?"

The 2009-2013 observations consist of far less data than the ones performed in 2017, making it impossible to create an image. Instead, the EHT team used statistical modeling to look at changes in the appearance of M87* over time. While no assumptions about the source morphology are made in the imaging approach, in the modeling approach the data are compared to a family of geometric templates, in this case rings of non-uniform brightness. A statistical framework is then employed to determine if the data are consistent with such models and to find the best-fitting model parameters.
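A toy version of that template-fitting idea can be written in a few lines of Python: parameterize a thin ring whose brightness varies with azimuth, then find the mean level, asymmetry, and position angle that best match noisy samples by least squares. This is a conceptual sketch, not the EHT analysis pipeline.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy version of the geometric-template idea: a ring whose brightness varies
# with azimuth, fit by least squares. A conceptual sketch, not the EHT code.
def crescent_profile(azimuth, params):
    """Brightness around a thin ring: mean level plus a cosine asymmetry."""
    mean, asym, position_angle = params
    return mean * (1 + asym * np.cos(azimuth - position_angle))

# synthetic 'observations' of brightness versus azimuth, with noise added
rng = np.random.default_rng(2)
az = np.linspace(0, 2 * np.pi, 64, endpoint=False)
true = crescent_profile(az, (1.0, 0.6, np.deg2rad(200)))
data = true + 0.05 * rng.standard_normal(az.size)

fit = least_squares(lambda p: crescent_profile(az, p) - data, x0=(1.0, 0.3, 0.0))
mean, asym, pa = fit.x
print(f"position angle of bright side: {np.degrees(pa) % 360:.1f} deg")
```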

Expanding the analysis to the 2009-2017 observations, scientists have shown that M87* adheres to theoretical expectations. The black hole's shadow diameter has remained consistent with the prediction of Einstein's theory of general relativity for a black hole of 6.5 billion solar masses. "In this study, we show that the general morphology, or presence of an asymmetric ring, most likely persists on timescales of several years," said Kazu Akiyama, a Jansky Fellow of the National Radio Astronomy Observatory (NRAO) at MIT Haystack Observatory, and a contributor to the project. "The consistency throughout multiple observational epochs gives us more confidence than ever about the nature of M87* and the origin of the shadow."

But while the crescent diameter remained consistent, the EHT team found that the data were hiding a surprise: the ring wobbles, and that means big news for scientists. For the first time, they can get a glimpse of the dynamical structure of the accretion flow so close to the black hole's event horizon, in extreme gravity conditions. Studying this region holds the key to understanding phenomena such as relativistic jet launching, and will allow scientists to formulate new tests of the theory of General Relativity.

The gas falling onto a black hole heats up to billions of degrees, ionizes and becomes turbulent in the presence of magnetic fields. "Because the flow of matter is turbulent, the crescent appears to wobble with time," said Wielgus. "Actually, we see quite a lot of variation there, and not all theoretical models of accretion allow for so much wobbling. What it means is that we can start ruling out some of the models based on the observed source dynamics."

"These early-EHT experiments provide us with a treasure trove of long-term observations that the current EHT, even with its remarkable imaging capability, cannot match," said Shep Doeleman, Founding Director, EHT. "When we first measured the size of M87* in 2009, we couldn't have foreseen that it would give us the first glimpse of black hole dynamics. If you want to see a black hole evolve over a decade, there is no substitute for having a decade of data."

EHT Project Scientist Geoffrey Bower, Research Scientist of the Academia Sinica, Institute of Astronomy and Astrophysics (ASIAA), added, "Monitoring M87* with an expanded EHT array will provide new images and much richer data sets to study the turbulent dynamics. We are already working on analyzing the data from 2018 observations, obtained with an additional telescope located in Greenland. In 2021 we are planning observations with two more sites, providing extraordinary imaging quality. This is a really exciting time to study black holes!"

Credit: 
U.S. National Science Foundation

First evidence that air pollution particles and metals are reaching the placenta

image: Fig. 1. Light microscopy images: (A) macrophage-enriched placental cell isolates from different participants, showing black inclusions (red arrows) compatible with phagocytosed inhaled particulate matter; (B) phagocytosed particulate matter in an airway macrophage obtained by sputum induction from a healthy child in London (Liu et al., 2018). Cell nuclei are indicated with white arrows. Brightness and contrast of images were adjusted for optimal visualisation of particulate matter.

Image: 
QMUL

Pollution particles, including metals, have been found in the placentas of fifteen women in London, according to research led by Queen Mary University of London.

The study, funded by Barts Charity and published in the journal Science of The Total Environment, demonstrates that inhaled particulate matter from air pollution can move from the lungs to distant organs, and that it is taken up by certain cells in the human placenta, and potentially the foetus.

The researchers say that further research is needed to fully define the direct effect that pollution particles may have on the developing foetus.

Lead author Professor Jonathan Grigg from Queen Mary University of London said: "Our study for the first time shows that inhaled carbon particulate matter in air pollution travels in the bloodstream and is taken up by important cells in the placenta. We hope that this information will encourage policy makers to reduce road traffic emissions in this post-lockdown period."

Dr Norrice Liu from Queen Mary University of London added: "Pollution levels in London often exceed annual limits and we know that there is a link between maternal exposure to high pollution levels and problems with the foetus, including risk of low birthweight. However, until now we had limited insight into how that might occur in the body."

Placentas from 15 consenting healthy women were donated to the study following the birth of their children at The Royal London Hospital. Pollution exposure was determined in 13 of the women, all of whom had exposure above the annual mean WHO limit for particulate matter. The cells in the placentas were analysed using a range of techniques including light and electron microscopy, x-rays and magnetic analyses.

Black particles that closely resembled particulate matter from pollution were found in placental cells from all fifteen women, appearing in an average of 1 per cent of the cells analysed.

The majority of particles found in the placental cells were carbon-based, but researchers also found trace amounts of metals and other inorganic material, including silica, phosphorus, calcium, iron and chromium, and, more rarely, titanium, cobalt, zinc and cerium.

Analysis of these nanoparticles strongly suggests that they predominantly originated from traffic-related sources. Many of these metals are associated with fossil fuel combustion, arising from fuel and oil additives, and vehicle brake-wear.

Dr Lisa Miyashita from Queen Mary University of London said: "We have thought for a while that maternal inhalation could potentially result in pollution particles travelling to the placenta once inhaled. However, there are many defence mechanisms in the lung that prevent foreign particles from travelling elsewhere, so it was surprising to identify these particles in the placental cells from all 15 of our participants."

Fiona Miller Smith, Chief Executive of Barts Charity said: "This is an incredibly important study and immensely relevant to mums-to-be in our local community, indeed in any urban community anywhere in the world.

"In the current climate it can be hard to see beyond COVID and so we are particularly proud to have funded this vital work and truly hope that it will lead to greater awareness of the risks of pollution to the unborn child."

The study involved researchers from University of Lancaster, Barts Health NHS Trust, University of Manchester, Central Manchester University Hospital NHS Foundation Trust, King's College London, University of Birmingham, University of Oxford and University of Leeds.

Credit: 
Queen Mary University of London