Tech

Research pinpoints major drivers of tobacco epidemic among teens in South Asia

Advertising on TV and online, being offered free tobacco products and exposure to smoking in public places are the biggest drivers of tobacco use among teens in South Asia, a new study suggests.

The research, led by the University of York, looked at data from the Global Youth Tobacco Survey on the tobacco use of just under 24,000 adolescents in Bangladesh, India, Pakistan and Sri Lanka.

The researchers found that while bans on the sale of cigarettes to teens were linked to reduced tobacco use, anti-tobacco mass media messages were an ineffective method. Teaching adolescents about the harmful effects of tobacco at school was effective at reducing their use of smokeless tobacco, but not smoking.

The findings will make an important contribution towards informing tobacco control policies in South Asia, the authors of the study say.

Lead author of the study, Dr. Masuma Mishu from the Global Health research team in the Department of Health Sciences at the University of York, said: "Our study provides important insights on several wider environmental factors that are associated with tobacco use among adolescents in South Asia, which is backed up by robust analysis.

"The study provides a vital message for policy makers that the current form of anti-tobacco media campaigns are unlikely to work on young people in South Asia and suggests evidence on the effectiveness of being taught at school about the harmful effects of tobacco is also inconsistent."

Both smoking and smokeless tobacco use are common among adolescents in South Asia, where tobacco control policies are weak. Only a very limited number of studies have previously investigated the wider environmental factors behind tobacco use among adolescents.

Of the young people who participated in the survey, 2% were smokers, 6.5% used smokeless tobacco and 1.1% were both smokers and smokeless tobacco users.

This study was conducted under the "Addressing Smokeless Tobacco and building Research capacity in South Asia" (ASTRA) project, funded by the NIHR. ASTRA has created a platform for an interdisciplinary group of researchers working on smokeless tobacco control and research capacity building in South Asia.

The Principal Investigator of ASTRA and co-author of the paper, Professor Kamran Siddiqi, from the Department of Health Sciences at the University of York, added: "The findings of this study reveal that there is a need to strengthen and enforce bans on smoking in public places, tobacco advertising (including in electronic media), sponsorship and promotions and the sale of tobacco to and by minors, in order to eliminate pro-tobacco influences on youth and curb the tobacco epidemic in these countries."

Credit: 
University of York

Software spots and fixes hang bugs in seconds, rather than weeks

Hang bugs - when software gets stuck, but doesn't crash - can frustrate both users and programmers, taking weeks for companies to identify and fix. Now researchers from North Carolina State University have developed software that can spot and fix the problems in seconds.

"Many of us have experience with hang bugs - think of a time when you were on a website and the wheel just kept spinning and spinning," says Helen Gu, co-author of a paper on the work and a professor of computer science at NC State. "Because these bugs don't crash the program, they're hard to detect. But they can frustrate or drive away customers and hurt a company's bottom line."

With that in mind, Gu and her collaborators developed an automated program, called HangFix, that can detect hang bugs, diagnose the relevant problem, and apply a patch that corrects the root cause of the error. Video of Gu discussing the program can be found here.

The researchers tested a prototype of HangFix against 42 real-world hang bugs in 10 commonly used cloud server applications. The bugs were drawn from a database of hang bugs that programmers discovered affecting various websites. HangFix fixed 40 of the bugs in seconds.

"The remaining two bugs were identified and partially fixed, but required additional input from programmers who had relevant domain knowledge of the application," Gu says.

For comparison, it took weeks or months to detect, diagnose and fix those hang bugs when they were first discovered.

"We're optimistic that this tool will make hang bugs less common - and websites less frustrating for many users," Gu says. "We are working to integrate HangFix into InsightFinder." InsightFinder is the AI-based IT operations and analytics startup founded by Gu.

The paper, "HangFix: Automatically Fixing Software Hang Bugs for Production Cloud Systems," is being presented at the ACM Symposium on Cloud Computing (SoCC'20), being held online Oct. 19-21. The paper was co-authored by Jingzhu He, a Ph.D. student at NC State who is nearing graduation; Ting Dai, a Ph.D. graduate of NC State who is now at IBM Research; and Guoliang Jin, an assistant professor of computer science at NC State.

The work was done with support from the National Science Foundation under grants 1513942 and 1149445.

HangFix is the latest in a long line of tools Gu's team has developed to address cloud computing challenges. Her 2011 paper, "CloudScale: Elastic Resource Scaling for Multi-tenant Cloud Systems," was selected as the winner of the 2020 SoCC 10-Year Award at this year's conference.

Credit: 
North Carolina State University

EPFL scientist gains fresh insight into the origins of earthquakes

Sometimes barely noticeable, and at other times devastating, earthquakes are a major geological phenomenon and a stark reminder that our planet is constantly evolving. Scientists have made significant progress in understanding these events over the past 50 years thanks to sensors set up around the world. And while we know that earthquakes are caused by shifts in tectonic plates, a lot remains to be learned about how and why they occur.

François Passelègue, a scientist at ENAC's Laboratory of Experimental Rock Mechanics (LEMR), has been studying the dynamics of faults - the areas between tectonic plates where most earthquakes occur - for the past ten years. He recently made a breakthrough in understanding the rupture mechanisms that eventually lead to seismic shifts along fault lines. His findings were published in the journal Nature Communications on 12 October 2020.

"We know that rupture speeds can vary from a few millimeters per second to a few kilometers per second once nucleation occurs [the process by which a slip expands exponentially]. But we don't know why some ruptures propagate very slowly and others move quickly," says Passelègue. "However, that's important to know because the faster the propagation, the quicker the energy that accumulates along the fault is released."

An earthquake will generally release the same amount of energy whether it moves slowly or quickly. The difference is that if it moves slowly, its seismic waves can be absorbed by the surrounding earth. These types of slow earthquakes are just as frequent as regular ones; it's just that we can't feel them. In extremely fast earthquakes - which occur much less often - the energy is released in just a few seconds through potentially devastating high-frequency waves. That's what sometimes occurs in Italy, for example. The country is located in a friction zone between two tectonic plates. While most of its earthquakes aren't (or are barely) noticeable, some of them can be deadly - like the one on 24 August 2016 that left 298 people dead.

In his study, Passelègue developed an experimental fault with the same temperature and pressure conditions as an actual fault running 8 km deep. He installed sensors along the fault to identify the factors causing slow vs. fast rupture propagation. "There are lots of hypotheses out there - most scientists think it's related to the kind of rock. They believe that limestone and clay tend to result in slow propagation, whereas harder rocks like granite are conducive to fast propagation," he says. Passelègue's model uses a complex rock similar to granite. He was able to replicate various types of slip on his test device, and found that "the difference isn't necessarily due to the properties of the surrounding rock. A single fault can demonstrate all kinds of seismic mechanisms."

Passelègue's experiments showed that the amount of energy released during a slip, and the length of time over which it's released, depend on the initial strain exerted along the fault; that is, the force applied on the fault line, generally from shifting tectonic plates. By applying forces of different magnitudes to his model, he found that higher strains triggered faster ruptures and lower strains triggered slower ruptures. "We believe that what we observed in the lab would apply under real-world conditions too," he says.

Using the results of his model, Passelègue developed equations that factor in the initial strain on a fault and not just the amount of energy accumulated immediately before a slip, which was the approach used in other equations until now. "François is one of the first scientists to measure rupture speeds in rocks under the same temperature and pressure conditions that you find out in nature. He developed a way to model the mechanisms physically - something that had never been done before. And he showed that all earthquakes follow the same laws of physics," says Marie Violay, head of LEMR.

Passelègue warns that his model cannot be used to determine when or where an earthquake will occur. Since faults run too deep, scientists still aren't able to continually measure the strain on rock right along a fault. "We can identify how much strain there needs to be to cause a rupture, but since we don't know how much a fault is 'loaded up' with energy deep underground, we can't predict the rupture speed."

One implication of Passelègue's research is that earthquakes may not be as random as we thought. "Most people think that faults that have been stable for a long time will never cause a serious earthquake. But we found that any kind of fault can trigger many different types of seismic events. That means a seemingly benign fault could suddenly rupture, resulting in a fast and dangerous wave propagation."

Credit: 
Ecole Polytechnique Fédérale de Lausanne

New therapy improves treatment for multiple sclerosis

Multiple sclerosis, an autoimmune disease of the central nervous system that affects millions worldwide, can cause debilitating symptoms for those who suffer from it.

Though treatments exist, researchers are still searching for therapies that could more effectively treat the disease, or even prevent it altogether.

Researchers at the Pritzker School of Molecular Engineering (PME) at the University of Chicago have designed a new therapy for multiple sclerosis (MS) by fusing a cytokine to a blood protein. In mice, this combination prevented destructive immune cells from infiltrating the central nervous system and decreased the number of cells that play a role in MS development, leading to fewer symptoms and even disease prevention.

Their results, published October 12 in the journal Nature Biomedical Engineering, could eventually lead to a new therapy for the disease.

"The exciting result is that we can suppress MS symptoms in a way that is more effective than current treatments," said Jeffrey Hubbell, Eugene Bell Professor in Tissue Engineering and co-author of the paper.

Binding therapy to a blood protein

While most immune cells help protect the body from disease, in patients with MS, autoreactive immune cells infiltrate the central nervous system and cause damage. Recent studies have shown that Th17 cells, immune cells that are activated in the body's secondary lymphoid organs, migrate to the brain and play a role in the severity of the disease. Several drugs to treat MS work by sequestering these cells in the lymph nodes and preventing them from targeting tissue, but these drugs can have adverse side effects.

Interleukin-4 (IL-4), an anti-inflammatory cytokine, is known to suppress the genes that cause MS and has been found to suppress the reactivation of Th17 cells. To use it as a potential therapy, researchers needed to find a way to keep the IL-4 in the secondary lymphoid organs to ensure that Th17 cells were suppressed and did not migrate.

To do this, they bound IL-4 to a blood protein and injected it into mice that had experimental autoimmune encephalomyelitis (the mouse model of MS) and found that it caused the IL-4 to stay within the secondary lymphoid organs. The result was reduced infiltration of Th17 cells into the spinal cord. That suppressed the disease and resulted in fewer symptoms.

A potential new way to prevent MS

Researchers also found that the therapy even prevented MS from developing in the majority of mice they treated with it.

"This is the first time anyone has shown how the fusion of this protein to immunosuppressive cytokines can treat and prevent multiple sclerosis," said Jun Ishihara, a former postdoctoral researcher in Hubbell's group and co-corresponding author of the paper.

Though the therapy showed few negative side effects, the researchers will next formally study the toxicity of the therapy in hopes of eventually moving it to human clinical trials.

"This treatment could potentially be self-administered by MS patients at home with an injector pen," Hubbell said. "We think this is imminently translatable and could lead to better quality of life, with fewer symptoms, for those with the disease."

Credit: 
University of Chicago

Earphone tracks facial expressions, even with a face mask

ITHACA, N.Y. - Cornell University researchers have invented an earphone that can continuously track full facial expressions by observing the contour of the cheeks - and can then translate expressions into emojis or silent speech commands.

With the ear-mounted device, called C-Face, users could express emotions to online collaborators without holding cameras in front of their faces - an especially useful communication tool as much of the world engages in remote work or learning.

With C-Face, avatars in virtual reality environments could express how their users are actually feeling, and instructors could get valuable information about student engagement during online lessons. It could also be used to direct a computer system, such as a music player, using only facial cues.

"This device is simpler, less obtrusive and more capable than any existing ear-mounted wearable technologies for tracking facial expressions," said Cheng Zhang, assistant professor of information science and senior author of "C-Face: Continuously Reconstructing Facial Expressions by Deep Learning Contours of the Face With Ear-Mounted Miniature Cameras."

The paper will be presented at the Association for Computing Machinery Symposium on User Interface Software and Technology, to be held virtually Oct. 20-23.

"In previous wearable technology aiming to recognize facial expressions, most solutions needed to attach sensors on the face," said Zhang, director of Cornell's SciFi Lab, "and even with so much instrumentation, they could only recognize a limited set of discrete facial expressions."

Because it works by detecting muscle movement, C-Face can capture facial expressions even when users are wearing masks, Zhang said.

The device consists of two miniature RGB cameras - digital cameras that capture red, green and blue bands of light - positioned below each ear with headphones or earphones. The cameras record changes in facial contours caused when facial muscles move.

Once the images are captured, they're reconstructed using computer vision and a deep learning model. Since the raw data is in 2D, a convolutional neural network - a kind of artificial intelligence model that is good at classifying, detecting and retrieving images - helps reconstruct the contours into expressions.

The model translates the images of cheeks to 42 facial feature points, or landmarks, representing the shapes and positions of the mouth, eyes and eyebrows, since those features are the most affected by changes in expression.

These reconstructed facial expressions represented by 42 feature points can also be translated to eight emojis, including "natural," "angry" and "kissy-face," as well as eight silent speech commands designed to control a music device, such as "play," "next song" and "volume up."
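One way to picture the final translation step - matching a reconstructed 42-point landmark vector to one of a small set of expressions - is a nearest-template lookup. The sketch below is a hypothetical stand-in (random templates, plain Euclidean distance), not the paper's actual learned model:

```python
import numpy as np

# Hypothetical expression templates: each is a flattened vector of 42 (x, y)
# facial landmarks (84 values). Real templates would come from training data.
rng = np.random.default_rng(0)
templates = {name: rng.random(84) for name in ["natural", "angry", "kissy-face"]}

def closest_expression(landmarks, templates):
    """Return the expression whose landmark template is nearest to the
    reconstructed landmarks, by Euclidean distance."""
    return min(templates, key=lambda name: np.linalg.norm(landmarks - templates[name]))

# A query identical to the "angry" template matches "angry" (distance zero).
print(closest_expression(templates["angry"], templates))  # angry
```

In the actual system this mapping is learned by the deep model rather than hand-built, but the idea is the same: collapse a continuous landmark reconstruction onto a small discrete vocabulary of emojis or commands.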

The ability to direct devices using facial expressions could be useful for working in libraries or other shared workspaces, for example, where people might not want to disturb others by speaking out loud. Translating expressions into emojis could help those in virtual reality collaborations communicate more seamlessly, said Francois Guimbretière, professor of information science and a co-author of the C-Face paper.

One limitation to C-Face is the earphones' limited battery capacity, Zhang said. As its next step, the team plans to work on a sensing technology that uses less power.

Credit: 
Cornell University

One-two punch

Drought is endemic to the American West along with heatwaves and intense wildfires. But scientists are only beginning to understand how the effects of multiple droughts can compound to affect forests differently than a single drought alone.

UC Santa Barbara forest ecologist Anna Trugman -- along with her colleagues at the University of Utah, Stanford University and the U.S. Forest Service -- investigated the effects of repeated, extreme droughts on various types of forests across the globe. They found that a variety of factors can increase and decrease a forest's resilience to subsequent droughts. However, the study, published in Nature Climate Change, concluded that successive droughts are generally increasingly detrimental to forests, even when each drought was no more extreme than the initial one.

Droughts usually leave individual trees more vulnerable to subsequent droughts. "Compounding extreme events can be really stressful on forests and trees," said Trugman, an assistant professor in the Department of Geography. She compares the experience to a person battling an illness: You'll be harder hit if you get sick again while you're still recovering.

That said, the case is not quite so clear cut. "Theoretically, responses to subsequent droughts could be quite varied depending on a wide range of tree-level and ecosystem-level factors," said lead author William Anderegg, an assistant professor at the University of Utah. So, while a drought may place a tree under considerable stress, it could also kill off some of its neighbors, leaving the survivors with less competition for water should arid conditions return.

Trugman and her colleagues used a variety of data sources to investigate this effect on a broad scale. Tree ring data spanning over 100 years enabled them to see how trees that survived an initial drought grew afterward. Data from the U.S. Forest Inventory and Analysis gave them access to metrics on tree mortality for more than 100,000 forest plots from 2000 through 2018. They combined these sources with satellite measurements of the water content in forest canopies.

Two clear trends emerged. "We found that generally trees seem to become more vulnerable to stress after multiple droughts, especially conifers," Anderegg said.

The second finding, the researchers believe, comes down to basic physiology. Conifers and their kin have different vascular systems than broadleaf trees, or "angiosperms." As a result, they may sustain more damage in an initial drought and be at a disadvantage compared to angiosperms during subsequent periods of drought stress. The tree ring data bears this out, showing that conifers that survived a drought grew much more slowly, especially if another drought settled in.

"By contrast, angiosperms have much more flexible anatomy and physiology, and this seems to help them recover faster and more fully after initial droughts," Anderegg said.

Anderegg was particularly surprised by the impact repeated drought had on the Amazon Rainforest. "We tend to think of these forests as not very impacted by drought and, due to their high tree diversity, able to recover quickly," he said. "But our results indicate the Amazon has been hit hard by three very severe droughts in the past 15 years."

Forests are complex systems, and a variety of factors ultimately dictate how they respond to extreme events. "In terms of damage you need to not only think about it at the individual level, but at the forest level as well," said Trugman. So, although they will need time to recover from an extreme drought, surviving trees will face less competition for water resources than they had before. This could leave them in a better situation if drought returns to the area.

What's more, natural selection will drive the forest as a whole to transition toward more resilient individuals, or even to more drought tolerant species overall. Repeated droughts affect forest pests and pathogens as well, and their response to these conditions will also influence how forests behave.

Scientists are still working to untangle the conditions under which each of these factors rises to the top. "This [study] provides a lot of motivation," said Trugman, "but I think the next pressing step is to get at the underlying mechanisms at a physiological level and ecological level."

Researchers can use these insights to improve computer models and make more accurate forecasts about the future of forests in a changing climate. "Climate change is going to bring more frequent droughts," Anderegg said, "so we have to understand and be able to forecast how forests will respond to multiple droughts.

"These results are especially crucial in the western U.S.," he added, "where we've had a number of major droughts in the past 20 years."

Credit: 
University of California - Santa Barbara

Fuels, not fire weather, control carbon emissions in boreal forest

image: Rockets represent carbon stored in wood, trees, and soil in four main boreal forest regions. Though fire weather helps "ignite" the rockets, the amount of emissions each forest can produce is determined by fuel load (soil layers) and flammability (soil moisture).

Image: 
Victor Leshyk, Center for Ecosystem Science and Society

As climate warming stokes longer fire seasons and more severe fires in the North American boreal forest, being able to calculate how much carbon each fire burns grows more urgent. New research led by Northern Arizona University and published this week in Nature Climate Change suggests that how much carbon burns depends more on available fuels than on fire weather such as drought conditions, temperature, or rain. In a large retrospective study that stretched across Canada and Alaska, the international team of researchers found that the carbon stored belowground in soil organic matter was the most important predictor of how much carbon a fire will release.

The team surveyed the vast Western Boreal's diverse forest conditions by analyzing field data collected from 417 burn sites in six ecoregions in Canada and Alaska between 2004 and 2015. They found that the amount of carbon stored in soils was the biggest predictor of how much carbon would combust, and that soil moisture was also significant in predicting carbon release.

"In these northern forests, soil, not trees, can account for up to 90 percent of carbon emissions, so we expected that these organic soils would be a significant driver," said lead author Xanthe Walker of the Center for Ecosystem Science and Society at Northern Arizona University. "But we were surprised that fire weather and the time of year a fire starts proved to be poor indicators of carbon combustion. It's really about the fuels that are there when a fire starts."

That's a pivotal finding, since fire weather, as measured by a Fire Weather Index, is one of the main tools scientists and fire managers currently use to model carbon emissions in these boreal forests. This study suggests fuels should be a bigger component of those models. "When we think of climate change and wildfires, we often instinctively think of extreme weather conditions," said Marc-André Parisien, a research scientist with the Canadian Forest Service and co-author of the study. "But our study shows that vegetation also matters--a lot! Predicting future vegetation is a tough nut to crack, but this study emphasizes the need to keep chipping away at it."

The vegetation patterns they uncovered were complex--soil moisture, tree species composition, and stand age at the time of fire all interacted to predict combustion amounts. For instance, highly flammable black spruce was generally a predictor of carbon combustion, and the presence of this species increased with site moisture and stand age at the time of fire. But such interactions are likely to change with the climate. For example, as the climate warms and fire intervals shorten, black spruce stands are being replaced by deciduous trees and jack pine, which grow in shallower soils that release less carbon during fires. The site-level resolution of the study allowed the researchers to capture such dynamism in carbon combustion patterns, and offers clues about the way they may shift in the future.

"We really need to move beyond the misconception of the boreal forest as a monotonous stretch of forest," said Sander Veraverbeke, assistant professor at Vrije Universiteit Amsterdam and co-author of the study. "While only a few tree species occur in the boreal forest, its diversity in ecosystem structure, forest age, topography, peatland occurrence and permafrost conditions is enormous, and our paper shows that these features dictate the carbon emissions from boreal fires. The good news is that we can map aspects of this fine-scale ecosystem variation with current tools from NASA and other space agencies. Now we need to do this at the continental scale."

The level of detail this study captured offers modelers a framework for asking more questions about carbon, said Michelle Mack, senior author on the study and professor of biology at Northern Arizona University. "In the past, fire models have focused on fire behavior, not carbon emissions," Mack said. "It's only been in the last decade or so that we've seen a global effort to quantify how much carbon these fires are releasing. We hope that our observations about fuels will inform the models as we work to better understand the boreal forest's emission trajectory."

Parisien agreed. "We are figuring out that fire-vegetation feedbacks are a lot stronger than we thought they were just a few years ago," he said. "Of course, we'll never be able to manage all of the vast boreal biome--nor should we want to--but this helps us know what targeted actions, such as fire management or modifying forest vegetation, we can take to limit carbon loss."

Credit: 
Northern Arizona University

Carnivores living near people feast on human food, threatening ecosystems

image: Researchers collected bone and fur samples from almost 700 carnivores across four Great Lakes states (top) to compare their diets to the extent of human development, which varied from minimal to urban sprawl (bottom).

Image: 
Phil Manlick. Photo credits (Left to Right): Flickr/Tambako the Jaguar/Renee Grayson and Wikimedia Commons/United States National Park Service/United States Fish and Wildlife Service.

MADISON - Ecologists at the University of Wisconsin-Madison have found that carnivores living near people can get more than half of their diets from human food sources, a major lifestyle disruption that could put North America's carnivore-dominated ecosystems at risk.

The researchers studied the diets of seven predator species across the Great Lakes region of the U.S. They gathered bone and fur samples for chemical analysis from areas as remote as national parks to major metropolitan regions like Albany, New York. They found that the closer carnivores lived to cities and farms, the more human food they ate.

While evolution has shaped these species to compete for different resources, their newfound reliance on a common food source could put them in conflict with one another. That conflict could be reordering the relationship between different carnivores and between predators and prey, with an unknown but likely detrimental impact on ecosystems that evolved under significant influence of strong predators.

Jon Pauli, a UW-Madison professor of forest and wildlife ecology, and his former graduate student Phil Manlick, published their findings this week in the Proceedings of the National Academy of Sciences. The study is the most comprehensive look yet at how most of the region's major carnivores -- like gray wolves, coyotes, and bobcats -- have changed their diets in response to people.

How much human food they ate varied considerably by location. On average, more than 25 percent of the carnivores' diets came from human sources in the most human-altered habitats.

It also varied by species. For instance, committed carnivores like bobcats ate a relatively small amount of human food. "But what you see is that the sort of generalist species that you might expect -- coyotes, foxes, fishers, martens -- in human-dominated landscapes, they're getting upwards of 50 percent of their diet from human foods," says Manlick, the lead author of the study who is now a postdoctoral researcher at the University of New Mexico. "That's a relatively shocking number, I think."

Pauli and Manlick found that relying on human food sources increased how much carnivores overlapped one another in their competition for food. Compared to when these predators vie for distinct prey, this increased competition could lead to more conflicts between animals. Their reliance on human food could also make the carnivores vulnerable to human attacks near towns, or even change how and when they hunt traditional prey, with potentially harmful ecological consequences.

The researchers studied the diets of almost 700 carnivores, including red and gray foxes, fishers, and American martens. They gathered bone and fur samples from Minnesota, Wisconsin, New York and the Upper Peninsula of Michigan with the help of state and federal researchers and citizen-science trappers. The researchers compared the carnivores' diets to the extent of human development in the region, which varied from essentially pristine wilderness to urban sprawl.

Thanks to quirks in how plants incorporate carbon as they grow, a sample of bone or fur is enough to get a snapshot of an animal's diet. Different weights, or isotopes, of carbon are common in different plants -- and in the animals who ultimately eat them.

"Isotopes are relatively intuitive: You are what you eat," says Manlick. "If you look at humans, we look like corn."

Human foods, heavy in corn and sugar, carry distinctive carbon signatures. In contrast, the diets of prey species in the wild confer their own carbon signatures. The ratio of these two isotope fingerprints in a predator's bone can tell scientists what proportion of their diet came from human sources, either directly or from their prey that ate human food first.
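At its simplest, the proportion estimate described here is a two-endmember linear mixing model. The sketch below illustrates the arithmetic with hypothetical delta-13C endmember values; the study itself relies on more involved statistical mixing analyses:

```python
def human_food_fraction(sample_d13c, wild_d13c, human_d13c):
    """Two-endmember linear mixing model: estimate the fraction of a
    predator's diet derived from human food, given carbon isotope
    ratios (delta-13C, per mil) of the sample and the two endmembers."""
    if human_d13c == wild_d13c:
        raise ValueError("endmember signatures must differ")
    frac = (sample_d13c - wild_d13c) / (human_d13c - wild_d13c)
    # Clamp to [0, 1]: measurement noise can push estimates slightly outside.
    return max(0.0, min(1.0, frac))

# Hypothetical values: corn-heavy human food is enriched in carbon-13
# (less negative delta-13C) relative to wild prey.
print(human_food_fraction(sample_d13c=-21.0, wild_d13c=-26.0, human_d13c=-16.0))  # 0.5
```

A sample whose signature falls halfway between the wild-prey and human-food endmembers yields an estimate of 50 percent human food, as in the example call above.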

The geographic extent of the study and the large number of species the ecologists examined demonstrate that the trend of human food subsidies in carnivore diets is not limited to a single location or species. The ultimate outcome of such widespread disruptions remains unclear.

"When you change the landscape so dramatically in terms of one of the most important attributes of a species -- their food -- that has unknown consequences for the overall community structure," says Pauli. "And so I think the onus is now on us as ecologists and conservation biologists to begin to understand these novel ecosystems and begin to predict who are the winners and who are the losers."

Credit: 
University of Wisconsin-Madison

Mosquitoes' taste for blood traced to four types of neurons

image: A female mosquito has finely tuned senses that help her find the blood meal she needs in order to reproduce. New research reveals how the insects experience the taste of blood.

Image: 
©Alex Wild, used by permission

It's one of the world's deadliest animals, and it has a taste for human blood: the mosquito.

Mosquitoes spread diseases like malaria, dengue, and yellow fever that kill at least half a million people each year. Now researchers are learning what humans taste like to mosquitoes, down to the individual neurons that sense blood's distinctive, delectable flavor.

Female mosquitoes have a sense of taste that is specially tuned to detect a combination of at least four different substances in blood, Howard Hughes Medical Institute Investigator Leslie Vosshall's team at The Rockefeller University and colleagues report October 12, 2020, in the journal Neuron. The team genetically modified mosquitoes so that researchers could see which neurons fire when a mosquito tastes blood.

"This is definitely a technical tour de force," says neuroscientist Chris Potter of the Johns Hopkins University School of Medicine, who studies mosquito repellents. Identifying the specific taste neurons associated with blood might be something "we could use against the mosquito," he says.

Vosshall and her team already knew a great deal about the insect's other finely tuned senses. In previous work, for instance, they've found that mosquitoes can detect the repellent DEET with their legs and have identified an odorant receptor that mosquitoes use to distinguish between humans and non-humans. But little is known about mosquitoes' sense of taste, even though it is key to spreading illness. "If mosquitoes weren't able to detect the taste of blood, in theory they couldn't transmit disease," says Veronica Jové, an HHMI Gilliam Fellow at Rockefeller who led the work in Vosshall's laboratory.

Only female mosquitoes feed on blood, which they need for their eggs to develop. That puts females in a unique position. They need to distinguish between the sweet nectar they eat for most of their meals and the blood they gorge on before laying eggs.

Jové suspected that female Aedes aegypti mosquitoes, unlike males, would be able to distinguish between the two substances by taste. Indeed, in behavioral experiments she found that female mosquitoes have two feeding modes that use different mouthparts and detect different flavors. A nectar-feeding mode detects sugars and a blood-feeding mode uses a syringe-like "stylet" to pierce the skin and taste blood. Jové tricked the mosquitoes into the blood-feeding mode by offering them a mix of four compounds: glucose (a sugar), sodium chloride (salt), sodium bicarbonate (found in both blood and baking soda), and adenosine triphosphate, or ATP, a compound that provides energy to cells.

Vosshall was curious, so she asked Jové to whip up an ATP solution in the lab and then took a sip. "It doesn't have a taste at all," she says. "ATP is this special mystery stuff that tastes like nothing to humans. But it's got to be incredibly exciting and rewarding for the mosquito."

Just as a human has taste buds that differentiate between salty, sweet, bitter, sour, and umami flavors, a mosquito's stylet has neurons specialized to respond to particular flavors. To see these taste neurons in action, the researchers genetically modified mosquitoes with a fluorescent tag that glowed when a nerve cell was activated. Then they watched which cells in the stylet lit up in response to different meals. Only a subset were activated by blood, including both real blood and the researchers' artificial mix.

So just what does human blood taste like to a mosquito? Perhaps the closest we can say is that it's a little salty and a little sweet. It's a bit like trying to describe the way a honeybee sees a flower in ultraviolet hues invisible to the human eye, or how a bat eavesdrops on sonar waves we can't hear, Vosshall says. Likewise, a female mosquito can taste things we can't. "There is nothing like this in the human experience," she says.

The findings shed light on just how specially adapted the female mosquito is to find blood. Jové and Vosshall say they hope that a better understanding of mosquitoes' senses will ultimately lead to new ways to stop them from biting us and spreading disease.

One possibility might sound like science fiction, Vosshall says, but there is precedent. "I just gave my dogs their monthly flea and tick medication, which is oral," she says. Perhaps something similar could eventually be done for mosquitoes - a drug that humans could take before going to a mosquito-infested area that would interfere with mosquitoes' taste for blood.

That idea, which boils down to making humans less delicious, raises one last question. Are some people really "tastier" to mosquitoes than others? "We're all tasty enough for a mosquito," Jové says. Once they detect blood, she says, "we don't have a sense they're very picky."

Credit: 
Howard Hughes Medical Institute

New bioengineering approach to fix fetal membranes

New research led by Queen Mary University of London and UCL has shown that small bioengineered molecules can be used to repair defects in the fetal membranes that surround and protect babies developing in the womb.

The study, published in the journal Prenatal Diagnosis, found that these molecules, known as peptide amphiphiles (PAs), self-assemble to form a 'plug' that seals holes within the fetal membranes, and could potentially help repair any damage.

The integrity of the fetal membranes during pregnancy is vital for normal development. The premature rupture of fetal membranes, known as preterm prelabour rupture of the membranes (PPROM), is a major cause of preterm birth, accounting for around 40 per cent of early infant deaths.

Currently there are no clinical approaches available to repair or improve healing in the fetal membranes.

For the study, the researchers established a fetal membrane defect model, which mimics the creation of a defect site within the fetal membrane as a result of keyhole surgery. They assembled donated human fetal membrane tissue onto cell culture dishes and injected human amniotic fluid, the protective fluid which surrounds the developing fetus, underneath. The injection creates a small hole within the membrane to which the PAs were then added.

The research team found that PAs spontaneously assembled together seconds after coming into contact with the human amniotic fluid. Within two minutes, a dense mesh of fibres formed a plug, which sealed the hole created at the injection site after 24 hours.

When they combined this novel use of PAs with protein sequences known to promote adhesion and healing, the researchers observed an enhanced sealing and regenerative effect.

The findings follow on from previous work from the research team, which showed that reducing the levels of a protein called connexin 43 (Cx43) encouraged rebuilding of the fetal membranes and enhanced the processes of tissue healing and repair.

Whilst PPROM can occur spontaneously, it can also result from fetal surgery and prenatal diagnosis procedures such as amniocentesis that require doctors to make a hole in the fetal membrane sac.

Fetal medicine specialists are increasingly offering surgery to babies in the womb before birth, to treat abnormalities of the spine, diaphragm or placenta. PPROM complicates around one third of these cases, reducing the clinical effectiveness of fetal surgery.

Dr Tina Chowdhury, Senior Lecturer in Regenerative Medicine at Queen Mary, said: "The next step in this research is to understand whether these molecular 'plugs' are able to withstand the mechanical forces such as tension or pressure induced by the developing fetus and the amniotic fluid. We also need to explore the wound healing mechanisms in more detail, and the safety of using the peptides for expectant mothers and babies during pregnancy."

Professor Anna David, Professor of Obstetrics and Fetal Medicine at UCL Elizabeth Garrett Anderson Institute for Women's Health and a co-author of the study, said: "Finding a method to heal the amniotic membranes and prevent preterm birth after PPROM is a vital step to improving the outcomes of babies where the membranes rupture. The sealing and regeneration that we saw is very encouraging for our bioengineering approach."

Credit: 
Queen Mary University of London

Penn Medicine scientists engineer bacteria-killing molecules from wasp venom

PHILADELPHIA--A team led by scientists in the Perelman School of Medicine at the University of Pennsylvania has engineered powerful new antimicrobial molecules from toxic proteins found in wasp venom. The team hopes to develop the molecules into new bacteria-killing drugs, an important advance given the growing number of antibiotic-resistant bacteria, which can cause illnesses such as sepsis and tuberculosis.

In the study, published today in the Proceedings of the National Academy of Sciences, the researchers altered a highly toxic small protein from a common Asian wasp species, Vespula lewisii, the Korean yellow-jacket wasp. The alterations enhanced the molecule's ability to kill bacterial cells while greatly reducing its ability to harm human cells. In animal models, the scientists showed that this family of new antimicrobial molecules made with these alterations could protect mice from otherwise lethal bacterial infections.

There is an urgent need for new drug treatments for bacterial infections, as many circulating bacterial species have developed a resistance to older drugs. The U.S. Centers for Disease Control & Prevention has estimated that each year nearly three million Americans are infected with antibiotic-resistant microbes and more than 35,000 die as a result. Globally the problem is even worse: Sepsis, an often-fatal inflammatory syndrome triggered by extensive bacterial infection, is thought to have accounted for about one in five deaths around the world as recently as 2017.

"New antibiotics are urgently needed to treat the ever-increasing number of drug-resistant infections, and venoms are an untapped source of novel potential drugs. We think that venom-derived molecules such as the ones we engineered in this study are going to be a valuable source of new antibiotics," said study senior author César de la Fuente, PhD, a Presidential Assistant Professor in Psychiatry, Microbiology, and Bioengineering at Penn.

De la Fuente and his team started with a small protein, or "peptide," called mastoparan-L, a key ingredient in the venom of Vespula lewisii wasps. Venom containing mastoparan-L is usually not dangerous to humans in the small doses delivered by wasp stings, but the peptide itself is quite toxic: it destroys red blood cells and triggers a type of allergic/inflammatory reaction that, in susceptible individuals, can lead to a fatal syndrome called anaphylaxis--in which blood pressure drops and breathing becomes difficult or impossible.

Mastoparan-L (mast-L) also is known for its moderate toxicity to bacterial species, making it a potential starting point for engineering new antibiotics. But there are still some unknowns, including how to enhance its anti-bacterial properties, and how to make it safe for humans.

The team searched a database of hundreds of known antimicrobial peptides and found a small region, the so-called pentapeptide motif, that was associated with strong activity against bacteria. The researchers then used this motif to replace a section at one end of mast-L that is thought to be the chief source of toxicity to human cells.

In a key set of experiments, the researchers treated mice with the engineered peptide, dubbed mast-MO, several hours after infecting them with otherwise lethal, sepsis-inducing strains of the bacteria E. coli or Staphylococcus aureus. In each test the antimicrobial peptide kept 80 percent of treated mice alive. By contrast, mice treated with mast-L were less likely to survive, and showed severe toxic side-effects when treated with higher doses--doses at which mast-MO caused no evident toxicity.

The potency of mast-MO in these tests also appeared to be comparable to existing antibiotics such as gentamicin and imipenem--for which alternatives are needed due to the spread of resistant bacterial strains.

De la Fuente and his colleagues found evidence in the study that mast-MO kills bacterial cells by making their outer membranes more porous--which can also improve the ability of co-administered antibiotics to penetrate the cells--and by summoning antimicrobial white blood cells. At the same time, mast-MO appears to damp down the kind of harmful immune-overreaction that can lead to severe disease in some bacterial infections.

The researchers created dozens of variants of mast-MO and found several that appeared to have significantly enhanced antimicrobial potency with no toxicity to human cells. They hope to develop one or more of these molecules into new antibiotics--and they expect to take a similar approach in future to turn other venom toxins into promising antibiotic candidates.

"The principles and approaches we used in this study can be applied more broadly to better understand the antimicrobial and immune-modulating properties of peptide molecules, and to harness that understanding to make valuable new treatments," de la Fuente said.

Credit: 
University of Pennsylvania School of Medicine

Using robotic assistance to make colonoscopy kinder and easier

image: The robotic arm houses a magnet that interacts with magnets on a small capsule inside the patient and is able to navigate the capsule to the correct spot inside the colon.

Image: 
University of Leeds

Scientists have made a breakthrough in their work to develop semi-autonomous colonoscopy, using a robot to guide a medical device into the body.

The milestone brings closer the prospect of an intelligent robotic system being able to guide instruments to precise locations in the body to take biopsies or allow internal tissues to be examined.

A doctor or nurse would still be on hand to make clinical decisions but the demanding task of manipulating the device is offloaded to a robotic system.

The latest findings - 'Enabling the future of colonoscopy with intelligent and autonomous magnetic manipulation' - are the culmination of 12 years of research by an international team of scientists led by the University of Leeds.

The research is published today (Monday, 12 October) in the scientific journal Nature Machine Intelligence. The paper can be downloaded from https://www.nature.com/articles/s42256-020-00231-9

Patient trials using the system could begin next year or in early 2022.

Pietro Valdastri, Professor of Robotics and Autonomous Systems at Leeds, is supervising the research. He said: "Colonoscopy gives doctors a window into the world hidden deep inside the human body and it provides a vital role in the screening of diseases such as colorectal cancer. But the technology has remained relatively unchanged for decades.

"What we have developed is a system that is easier for doctors or nurses to operate and is less painful for patients. It marks an important step in the move to make colonoscopy much more widely available - essential if colorectal cancer is to be identified early."

Because the system is easier to use, the scientists hope this can increase the number of providers who can perform the procedure and allow for greater patient access to colonoscopy.

A colonoscopy is a procedure to examine the rectum and colon. Conventional colonoscopy is carried out using a semi-flexible tube which is inserted into the anus, a process some patients find so painful they require an anaesthetic.

Magnetic flexible colonoscope

The research team has developed a smaller, capsule-shaped device which is tethered to a narrow cable and is inserted into the anus and then guided into place - not by the doctor or nurse pushing the colonoscope but by a magnet on a robotic arm positioned over the patient.

The robotic arm moves around the patient as it manoeuvres the capsule. The system is based on the principle that magnetic forces attract and repel.

The magnet on the outside of the patient interacts with tiny magnets in the capsule inside the body, navigating it through the colon. The researchers say it will be less painful than having a conventional colonoscopy.

Guiding the robotic arm can be done manually but it is a technique that is difficult to master. In response, the researchers have developed different levels of robotic assistance. This latest research evaluated how effective the different levels of robotic assistance were in aiding non-specialist staff to carry out the procedure.

Levels of robotic assistance

Direct robot control. This is where the operator has direct control of the robot via a joystick. In this case, there is no assistance.

Intelligent endoscope teleoperation. The operator focuses on where they want the capsule to be located in the colon, leaving the robotic system to calculate the movements of the robotic arm necessary to get the capsule into place.

Semi-autonomous navigation. The robotic system autonomously navigates the capsule through the colon, using computer vision - although this can be overridden by the operator.

During a laboratory simulation, 10 non-expert staff were asked to get the capsule to a point within the colon within 20 minutes. They did that five times, using the three different levels of assistance.

Using direct robot control, the participants had a 58% success rate. That increased to 96% using intelligent endoscope teleoperation - and 100% using semi-autonomous navigation.

In the next stage of the experiment, two participants were asked to navigate a conventional colonoscope into the colon of two anaesthetised pigs - and then to repeat the task with the magnet-controlled robotic system using the different levels of assistance. A vet was in attendance to ensure the animals were not harmed.

The participants were scored on the NASA Task Load Index, a measure of how taxing a task was, both physically and mentally.

The NASA Task Load Index revealed that participants found it easier to operate the colonoscope with robotic assistance. Frustration was a major factor both when operating the conventional colonoscope and when participants had direct control of the robot.

James Martin, a PhD researcher from the University of Leeds who co-led the study, said: "Operating the robotic arm is challenging. It is not very intuitive and that has put a brake on the development of magnetic flexible colonoscopes.

"But we have demonstrated for the first time that it is possible to offload that function to the robotic system, leaving the operator to think about the clinical task they are undertaking - and it is making a measurable difference in human performance."

The techniques developed to conduct colonoscopy examinations could be applied to other endoscopic devices, such as those used to inspect the upper digestive tract or lungs.

Dr Bruno Scaglioni, a Postdoctoral Research Fellow at Leeds and co-leader of the study, added: "Robot-assisted colonoscopy has the potential to revolutionize the way the procedure is carried out. It means people conducting the examination do not need to be experts in manipulating the device.

"That will hopefully make the technique more widely available, where it could be offered in clinics and health centres rather than hospitals."

Credit: 
University of Leeds

New virtual reality software allows scientists to 'walk' inside cells

image: DBSCAN analysis being performed on a mature neuron in a typical vLUME workspace.

Image: 
Alexandre Kitching

Virtual reality software which allows researchers to 'walk' inside and analyse individual cells could be used to understand fundamental problems in biology and develop new treatments for disease.

The software, called vLUME, was created by scientists at the University of Cambridge and 3D image analysis software company Lume VR Ltd. It allows super-resolution microscopy data to be visualised and analysed in virtual reality, and can be used to study everything from individual proteins to entire cells. Details are published in the journal Nature Methods.

Super-resolution microscopy, which was awarded the Nobel Prize for Chemistry in 2014, makes it possible to obtain images at the nanoscale by using clever tricks of physics to get around the limits imposed by light diffraction. This has allowed researchers to observe molecular processes as they happen. However, a problem has been the lack of ways to visualise and analyse this data in three dimensions.

"Biology occurs in 3D, but up until now it has been difficult to interact with the data on a 2D computer screen in an intuitive and immersive way," said Dr Steven F. Lee from Cambridge's Department of Chemistry, who led the research. "It wasn't until we started seeing our data in virtual reality that everything clicked into place."

The vLUME project started when Lee and his group met with the Lume VR founders at a public engagement event at the Science Museum in London. While Lee's group had expertise in super-resolution microscopy, the team from Lume specialised in spatial computing and data analysis, and together they were able to develop vLUME into a powerful new tool for exploring complex datasets in virtual reality.

"vLUME is revolutionary imaging software that brings humans into the nanoscale," said Alexandre Kitching, CEO of Lume. "It allows scientists to visualise, question and interact with 3D biological data, in real time all within a virtual reality environment, to find answers to biological questions faster. It's a new tool for new discoveries."

Viewing data in this way can stimulate new initiatives and ideas. For example, Anoushka Handa - a PhD student from Lee's group - used the software to image an immune cell taken from her own blood, and then stood inside her own cell in virtual reality. "It's incredible - it gives you an entirely different perspective on your work," she said.

The software allows multiple datasets with millions of data points to be loaded in and finds patterns in the complex data using in-built clustering algorithms. These findings can then be shared with collaborators worldwide using image and video features in the software.
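The kind of in-built clustering described above can be sketched with a standard density-based algorithm; the vLUME figure caption mentions DBSCAN, so the toy example below uses scikit-learn's implementation on a synthetic 3D point cloud standing in for super-resolution localizations. The coordinates, `eps` and `min_samples` values are illustrative assumptions, not vLUME's actual parameters.

```python
# Density-based clustering of a synthetic 3D "localization" dataset,
# illustrating the kind of analysis applied to super-resolution data.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Two tight clusters of points plus sparse background noise.
cluster_a = rng.normal(loc=[0, 0, 0], scale=5, size=(200, 3))
cluster_b = rng.normal(loc=[100, 100, 100], scale=5, size=(200, 3))
noise = rng.uniform(-50, 150, size=(20, 3))
points = np.vstack([cluster_a, cluster_b, noise])

# eps: neighborhood radius; min_samples: density threshold.
labels = DBSCAN(eps=10, min_samples=10).fit_predict(points)

# Label -1 marks noise; the rest index the clusters found.
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
print(n_clusters)  # the two dense clusters should be recovered
```

Real localization datasets hold millions of points, so production tools use spatially indexed or GPU-accelerated variants of this idea, but the clustering logic is the same.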

"Data generated from super-resolution microscopy is extremely complex," said Kitching. "For scientists, running analysis on this data can be very time consuming. With vLUME, we have managed to vastly reduce that wait time allowing for more rapid testing and analysis."

The team are mostly using vLUME with biological datasets, such as neurons, immune cells or cancer cells. For example, Lee's group has been studying how antigen cells trigger an immune response in the body. "Through segmenting and viewing the data in vLUME, we've quickly been able to rule out certain hypotheses and propose new ones," said Lee. "This software allows researchers to explore, analyse, segment and share their data in new ways. All you need is a VR headset."

Credit: 
University of Cambridge

Total deaths recorded during the pandemic far exceed those attributed to COVID-19

image: This graph shows the number of weekly excess deaths for the 10 states with the largest number of excess deaths during March-July 2020. The dates on the graph indicate when broad COVID-19 restrictions were lifted in each state using data from reports by The New York Times.

Image: 
Courtesy of JAMA Network

RICHMOND, Va. (Oct. 12, 2020) -- For every two deaths attributed to COVID-19 in the U.S., a third American dies as a result of the pandemic, according to new data publishing Oct. 12 in the Journal of the American Medical Association.

The study, led by researchers at Virginia Commonwealth University, shows that deaths between March 1 and Aug. 1 increased 20% compared to previous years -- maybe not surprising in a pandemic. But deaths attributed to COVID-19 only accounted for 67% of those deaths.

"Contrary to skeptics who claim that COVID-19 deaths are fake or that the numbers are much smaller than we hear on the news, our research and many other studies on the same subject show quite the opposite," said lead author Steven Woolf, M.D., director emeritus of VCU's Center on Society and Health.

The study also contains suggestive evidence that state policies on reopening early in April and May may have fueled the surges experienced in June and July.

"The high death counts in Sun Belt states show us the grave consequences of how some states responded to the pandemic and sound the alarm not to repeat this mistake going forward," said Woolf, a professor in the Department of Family Medicine and Population Health at the VCU School of Medicine.

Total death counts in the U.S. are remarkably consistent from year to year, as the study notes. The study authors pulled data from the Centers for Disease Control and Prevention for 2014 to 2020, using regression models to predict expected deaths for 2020.
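The expected-death calculation described above can be illustrated with a minimal regression sketch: fit a trend to prior years' death counts, project the expected 2020 total, and subtract it from the observed total. All numbers below are hypothetical, chosen only to show the arithmetic; they are not the study's data, and the study's actual regression models were more sophisticated.

```python
# Illustrative excess-death calculation with made-up numbers.
import numpy as np

years = np.array([2014, 2015, 2016, 2017, 2018, 2019])
deaths = np.array([980, 990, 1001, 1010, 1022, 1030])  # thousands (hypothetical)

# Ordinary least-squares linear trend over the baseline years.
slope, intercept = np.polyfit(years, deaths, 1)
expected_2020 = slope * 2020 + intercept

observed_2020 = 1230.0              # hypothetical observed total
excess = observed_2020 - expected_2020
covid_coded = 0.67 * excess         # share directly coded as COVID-19

print(round(expected_2020, 1), round(excess, 1))
```

The gap between `excess` and `covid_coded` is the study's point: deaths recorded under other causes that nonetheless occurred because of the pandemic.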

The gap between reported COVID-19 deaths and all unexpected deaths can be partially explained by delays in reporting COVID-19 deaths, miscoding or other data limitations, Woolf said. But the pandemic's other ripple effects could explain more.

"Some people who never had the virus may have died because of disruptions caused by the pandemic," said Woolf, VCU's C. Kenneth and Dianne Wright Distinguished Chair in Population Health and Health Equity. "These include people with acute emergencies, chronic diseases like diabetes that were not properly cared for, or emotional crises that led to overdoses or suicides."

For example, the study specifically showed that the entire nation experienced significant increases in deaths from dementia and heart disease. Woolf said deaths from Alzheimer's disease and dementia increased not only in March and April, when the pandemic began, but again in June and July when the COVID-19 surge in the Sun Belt occurred.

This study, with data from March to Aug. 1, builds on a previously published JAMA article by the same authors from VCU and Yale University that focused on data from March to May 1. And it brings in new data about the timing of when states lifted restrictions on social distancing.

States like New York and New Jersey, which were hit hard early, were able to bend the curve and bring death rates down in less than 10 weeks. Meanwhile, states such as Texas, Florida and Arizona that escaped the pandemic at first but reopened early showed a protracted summer surge that lasted 16-17 weeks -- and was still underway when the study ended.

"We can't prove causally that the early reopening of those states led to the summer surges. But it seems quite likely," said Woolf. "And most models predict our country will have more excess deaths if states don't take more assertive approaches in dealing with community spread. The enforcement of mask mandates and social distancing is really important if we are to avoid these surges and major loss of life."

Woolf paints a grim picture, warning that long-term data may show a broader impact of the pandemic on mortality rates. Cancer patients who have had their chemotherapy disrupted, women who have had their mammograms delayed -- preventable, early deaths may increase in the coming years, he said.

"And death is only one measure of health," Woolf said. "Many people who survive this pandemic will live with lifelong chronic disease complications. Imagine someone who developed the warning signs of a stroke but was scared to call 9-1-1 for fear of getting the virus. That person may end up with a stroke that leaves them with permanent neurological deficits for the rest of their life."

Diabetes complications that aren't being managed properly could lead to kidney failure and dialysis. And behavioral health issues, like emotional trauma, are going untreated. Woolf worries most about the lasting effects on children -- long-term, generational outcomes.

"This isn't a pandemic involving a single virus," said Peter Buckley, M.D., dean of the VCU School of Medicine. "This is a public health crisis with broad and lasting ripple effects. VCU researchers have been diligent in their investigations into both treatment of COVID-19 and in understanding the long-term repercussions of the pandemic, so that fellow doctors, policymakers and community members can fight these battles on multiple fronts."

Co-authors on Woolf's paper include: Derek Chapman, Ph.D., Latoya Hill, DaShaunda Taylor and Roy Sabo, Ph.D., of VCU; and Daniel Weinberger, Ph.D., of Yale University.

The study complements another VCU researcher's recent data showing an alarming surge in opioid overdoses at VCU Medical Center during the pandemic. Taylor Ochalek, Ph.D., a postdoctoral research fellow at the Wright Center, found a 123% increase in nonfatal overdoses between March and June this year, as compared to last, in a study also published in JAMA.

Woolf notes that the CDC has released provisional overdose deaths under a broad label called "external causes," which also includes car crashes and homicides, making research like Ochalek's all the more important.

"Car crashes decreased because fewer people were driving during the lockdowns," Woolf said. "We worry that the broad umbrella category of 'external causes' may hide an increase in deaths from overdoses, because the opioid epidemic didn't go away."

The CDC, Woolf added, has rushed out provisional mortality data this year because of the pandemic. More reliable, granular detail will come out later and allow researchers to unpack the detailed contributors to excess deaths and secondary health impacts of the pandemic.

Researchers across multiple disciplines at VCU are studying the secondary health impacts of the pandemic -- from substance use disorders and intimate partner violence to diminished access to regular medical care -- all of which could contribute to loss of life, according to Woolf's study.

Credit: 
Virginia Commonwealth University

Enzyme SSH1 impairs disposal of accumulating cellular garbage, leading to brain cell death

image: David Kang, PhD, professor of molecular medicine at the University of South Florida Health (USF Health) Byrd Alzheimer's Center, led the research team that discovered a defect early in the dynamic cellular waste clearance process known as autophagy.

Image: 
© USF Health

TAMPA, Fla (Oct. 12, 2020) -- In a healthy brain, the multistep waste clearance process known as autophagy routinely removes and degrades damaged cell components - including malformed proteins like tau and toxic mitochondria. This cellular debris would otherwise pile up like uncollected trash to drive the death of brain cells (neurons), ultimately destroying cognitive abilities like thinking, remembering and reasoning in patients with Alzheimer's and certain other neurodegenerative diseases.

The protein p62, a selective autophagy cargo receptor, plays a major role in clearing misfolded tau proteins and dysfunctional mitochondria, the energy powerhouse in all cells including neurons. Through autophagy (meaning "self-eating" in Greek) old or broken cellular material is ultimately digested and recycled in lysosomes, membrane-bound structures that work like mini-waste management plants.

Now, neuroscientists at the University of South Florida Health (USF Health) Byrd Alzheimer's Center report for the first time that the protein phosphatase Slingshot-1, or SSH1 for short, disrupts p62's ability to function as an efficient "garbage collector" and thereby impairs the disposal of both damaged tau and mitochondria leaking toxins. In a preclinical study, the researchers showed that SSH1's influence in halting p62-mediated protective clearance of tau was separate from SSH1's role in activating cofilin, an enzyme that plays an essential part in worsening tau pathology.

Their findings were published Oct. 12 in Autophagy.

"Slingshot-1 is an important player in regulating the levels of tau and neurotoxic mitochondria, so it's important to understand exactly what's going wrong when they accumulate in the brain," said the paper's senior author David Kang, PhD, professor of molecular medicine at the USF Health Morsani College of Medicine, who holds the Fleming Endowed Chair in Alzheimer's Disease and serves as the director of basic research at the Byrd Alzheimer's Center. "This study provides more insight into a defect stemming from the p62 pathway, which will help us develop SSH1 inhibitors (drugs) to stop or slow Alzheimer's disease and related neurodegenerative disorders."

At the start of their study, Dr. Kang's team, including first author and doctoral student Cenxiao (Catherine) Fang, MD, already knew that, in the case of clearing bad mitochondria (a process known as mitophagy), the enzyme TBK1 transiently adds a phosphate to p62, specifically at the site of amino acid 403 (SER403), which activates p62. However, no scientist had yet discovered which enzyme removes that phosphate from p62, a step known as dephosphorylation.

Tightly controlled phosphorylation is needed to strike a balance in p62 activation, a key early step in priming the cargo receptor's ability to recognize and collect chunks of cellular waste labeled as "garbage" by a ubiquitin tag. Put simply, when autophagy works well, ubiquitinated tau and ubiquitinated mitochondria are selectively targeted for collection and then delivered for destruction and recycling by autophagosomes (the garbage trucks in this dynamic process). But garbage collector p62 doesn't touch the cell's healthy (untagged) proteins and organelles.

In a series of gene deactivation and overexpression experiments using human cell lines, primary neurons, and a mouse model of tauopathy, Dr. Kang's team identified SSH1, acting specifically on SER403, as the first known enzyme to remove this key phosphate from p62, thereby deactivating it.

"When something shifts out of balance, like overactivation of Slingshot-1 by Alzheimer's-related protein Aβ for example, then SSH1 starts to remove the phosphate off the garbage collector p62, essentially relaying the message 'stop, don't do your job.' That leads to bad consequences like accumulation of damaged tau proteins and toxic mitochondria," Dr. Kang said.

"If we can bring phosphorylation regulation back into balance through inhibitors that dampen overactive Slingshot-1, we can increase p62's normal activity in removing the toxic garbage."

This latest study builds upon previous USF Health research showing that Aβ-activated cofilin, which acts through SSH1, essentially kicks tau off the microtubules that provide structural support to neurons, thereby boosting the build-up of tau tangles inside dying nerve cells. In the displacement process, cofilin is transported to the mitochondria, and damage to these energy-producing organelles ensues. Following up on that collateral cofilin-triggered damage, Dr. Kang's team expected to find widespread mitophagy removing the sick mitochondria.

"We got exactly the opposite result, which meant there was another mechanism affecting how Slingshot-1 regulated mitochondria," Dr. Kang said, "and it turned out to encompass the key autophagy machinery of p62."

The researchers also showed that two major and entirely separate signaling pathways implicated in tau pathology - one for p62 and another for cofilin - are regulated by the same enzyme, SSH1.

"In addition to the SSH1-cofilin activation pathway in promoting tau displacement from microtubules, this study highlights the divergent SSH1-p62 inhibitory pathway in impairing autophagic clearance of misfolded tau," the study authors report.

Credit: 
University of South Florida (USF Health)