Culture

Using a cappella to explain speech and music specialization

image: Figure shows original song (bottom left) and its spectrogram (above it, in blue). This spectrogram can be decomposed according to the amount of energy contained in spectral and temporal modulation rates (central panel). Auditory cortex on the right and left sides of the brain (right side of figure) decode melody and speech, respectively, because the melody depends more on spectral modulations and the speech depends more on temporal modulations.

Image: 
Robert Zatorre

Speech and music are two fundamentally human activities that are decoded in different brain hemispheres. A new study used a unique approach to reveal why this specialization exists.

Researchers at The Neuro (Montreal Neurological Institute-Hospital) of McGill University created 100 a cappella recordings, each of a soprano singing a sentence. They then distorted the recordings along two fundamental auditory dimensions, spectral and temporal dynamics, and had 49 participants distinguish either the words or the melodies of each song. The experiment was conducted in two groups, one of English speakers and one of French speakers, to enhance reproducibility and generalizability. The experiment is demonstrated here: https://www.zlab.mcgill.ca/spectro_temporal_modulations/

They found that for both languages, when the temporal information was distorted, participants had trouble distinguishing the speech content, but not the melody. Conversely, when spectral information was distorted, they had trouble distinguishing the melody, but not the speech. This shows that speech and melody depend on different acoustical features.
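The two dimensions can be illustrated with a small numerical sketch: a spectrogram's "modulation spectrum" is obtained by a two-dimensional Fourier transform, and filtering along one axis degrades temporal or spectral modulations selectively. The parameters, filter shape, and data below are hypothetical, a minimal illustration rather than the study's actual processing pipeline.

```python
import numpy as np

def modulation_spectrum(spectrogram, dt, df):
    """2-D Fourier transform of a (frequency x time) spectrogram.

    Rows index frequency channels, columns index time frames.
    Returns temporal modulation rates (Hz), spectral modulation
    rates (cycles per frequency unit), and modulation power.
    """
    mod = np.fft.fftshift(np.fft.fft2(spectrogram))
    temporal_rates = np.fft.fftshift(np.fft.fftfreq(spectrogram.shape[1], d=dt))
    spectral_rates = np.fft.fftshift(np.fft.fftfreq(spectrogram.shape[0], d=df))
    return temporal_rates, spectral_rates, np.abs(mod) ** 2

def degrade_temporal(spectrogram, dt, cutoff_hz):
    """Low-pass filter temporal modulations: removes fast modulation
    rates (which carry speech information) while sparing the slower
    spectral structure that carries melody."""
    mod = np.fft.fft2(spectrogram)
    rates = np.fft.fftfreq(spectrogram.shape[1], d=dt)
    mod[:, np.abs(rates) > cutoff_hz] = 0          # zero out fast temporal rates
    return np.real(np.fft.ifft2(mod))

# Toy example: 128 frequency channels x 200 time frames, 10 ms per frame
rng = np.random.default_rng(0)
spec = rng.random((128, 200))
degraded = degrade_temporal(spec, dt=0.01, cutoff_hz=4.0)
```

An analogous filter applied along the rows (frequency axis) would degrade spectral modulations instead, the manipulation that disrupted melody recognition.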

To test how the brain responds to these different sound features, the participants were then scanned with functional magnetic resonance imaging (fMRI) while they distinguished the sounds. The researchers found that speech processing occurred in the left auditory cortex, while melodic processing occurred in the right auditory cortex.

Music and speech exploit different ends of the spectro-temporal continuum

Next, they set out to test how degradation in each acoustic dimension would affect brain activity. They found that degradation of the spectral dimension only affected activity in the right auditory cortex, and only during melody perception, while degradation of the temporal dimension affected only the left auditory cortex, and only during speech perception. This shows that the differential response in each hemisphere depends on the type of acoustical information in the stimulus.

Previous studies in animals have found that neurons in the auditory cortex respond to particular combinations of spectral and temporal energy, and are highly tuned to sounds that are relevant to the animal in its natural environment, such as communication sounds. For humans, both speech and music are important means of communication. This study shows that music and speech exploit different ends of the spectro-temporal continuum, and that hemispheric specialization may be the nervous system's way of optimizing the processing of these two communication methods.

Solving the mystery of hemispheric specialization

"It has been known for decades that the two hemispheres respond to speech and music differently, but the physiological basis for this difference remained a mystery," says Philippe Albouy, the study's first author. "Here we show that this hemispheric specialization is linked to basic acoustical features that are relevant for speech and music, thus tying the finding to basic knowledge of neural organization."

Credit: 
McGill University

Study: The opioid crisis may be far worse than we thought

image: New research in the journal Addiction shows that the number of deaths from opioid-related overdoses may be 28% higher than previously reported. This discrepancy is more pronounced in several states, including Alabama, Mississippi, Pennsylvania, Louisiana, and Indiana, where the estimated number of deaths more than doubles.

Image: 
University of Rochester Medical Center

New research appearing in the journal Addiction shows that the number of deaths attributed to opioid-related overdoses could be 28 percent higher than reported due to incomplete death records. This discrepancy is more pronounced in several states, including Alabama, Mississippi, Pennsylvania, Louisiana, and Indiana, where the estimated number of deaths more than doubles - obscuring the scope of the opioid crisis and potentially affecting programs and funding intended to confront the epidemic.

"A substantial share of fatal drug overdoses is missing information on specific drug involvement, leading to underreporting of opioid-related death rates and a misrepresentation of the extent of the opioid crisis," said Elaine Hill, Ph.D., an economist and assistant professor in the University of Rochester Medical Center (URMC) Department of Public Health Sciences and senior author of the study. "The corrected estimates of opioid-related deaths in this study are not trivial and show that the human toll has been substantially higher than reported, by several thousand lives taken each year."

Hill and her team - including co-authors Andrew Boslett, Ph.D., and Alina Denham, M.S., of URMC - found that almost 72 percent of unclassified drug overdoses that occurred between 1999 and 2016 involved prescription opioids, heroin, or fentanyl - translating into an estimated 99,160 additional opioid-related deaths.

Gaps in Death Records

Hill and Boslett first stumbled upon the discrepancy while studying the economic, environmental, and health impacts of natural resource extraction. Many regions of the country hit the hardest by the opioid crisis overlap with areas associated with shale gas development and coal mining. As part of her research, Hill was attempting to determine whether the shale boom improved or exacerbated the opioid crisis. However, as the team started collecting data, they discovered that close to 22 percent of all drug-related overdoses were unclassified, meaning the drugs involved in the cause of death were not indicated.

A medical examiner or coroner becomes involved during any sudden and unexpected death of an otherwise healthy person and anyone suspected to have died from an unnatural cause. Under ideal circumstances, the cause of death is identified through a combination of evidence collected at the scene, a toxicological analysis of blood or tissue, and an autopsy. If the cause is determined to be drug-related, either accidental or a suicide, then the specific drugs identified in the person's system are recorded on the death certificate.

However, in practice, this process is expensive and time-consuming, dependent upon the resources and staffing available to the specific medical examiner's office, and potentially influenced by family members due to the stigma associated with opioid use. Additionally, the requirements to serve as a medical examiner or coroner vary nationally. In some states, the office is an elected position with no prerequisite for professional experience or training in forensic pathology.

Underreporting Concentrated in Several States

In the study, Hill and her colleagues obtained death records of individuals identified as having died from drug overdoses from the National Center for Health Statistics, part of the Centers for Disease Control and Prevention. In addition to the cause, the records also include any additional medical issues that might have contributed to the death. Employing a statistical analysis, the researchers were able to correlate the information in the death records of unclassified overdose deaths with contributing causes associated with known opioid-related deaths, such as previous opioid use and chronic pain conditions.
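The imputation logic can be sketched in a few lines: fit a classifier on overdose deaths whose drug involvement is known, then apply it to the contributing-cause information on unclassified death records and sum the predicted probabilities. The sketch below uses synthetic data and a plain logistic regression purely to convey the idea; the study's actual statistical model, features, and data are more elaborate.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=5000):
    """Plain-numpy logistic regression fit by gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

rng = np.random.default_rng(1)
n = 2000
# Hypothetical binary features: [intercept, prior opioid use, chronic pain]
X = np.column_stack([np.ones(n), rng.integers(0, 2, n), rng.integers(0, 2, n)])
true_w = np.array([-1.0, 2.0, 1.0])               # made-up "ground truth"
y = (rng.random(n) < 1 / (1 + np.exp(-X @ true_w))).astype(float)

# Step 1: learn the association on classified overdose deaths
w = fit_logistic(X, y)

# Step 2: score unclassified records; the summed probabilities estimate
# how many additional deaths were opioid-related
X_unclassified = np.column_stack(
    [np.ones(500), rng.integers(0, 2, 500), rng.integers(0, 2, 500)])
extra_opioid_deaths = (1 / (1 + np.exp(-X_unclassified @ w))).sum()
```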

While the overall percentage of unclassified deaths declined over time - a trend the researchers speculate reflects a more focused effort by federal, state, and local officials to understand the scope of the crisis - the number remained high in several states. The new estimates of actual opioid-related deaths show a pronounced increase in states like Alabama, Mississippi, Pennsylvania, Louisiana, and Indiana. In fact, in each of these states, the number of opioid-related deaths increased by more than 100 percent.

In Pennsylvania, for example, the number of reported opioid-related deaths was 12,374. The study estimates the actual number of deaths was 26,586. Consequently, the state's total number of deaths places it behind only California and Florida, states with significantly higher populations, and moves Pennsylvania from fifteenth to sixth in terms of highest per capita death rates in 2016.

"The underreporting of opioid-related deaths is very dependent upon location and this new data alters our perception of the intensity of the problem," said Hill. "Understanding the true extent and geography of the opioid crisis is a critical factor in the national response to the epidemic and the allocation of federal and state resources to prevent overdoses, treat opioid use disorders, regulate the prescription of opioid medications, and curb the illegal trafficking of drugs."

Credit: 
University of Rochester Medical Center

Mom's gut microbes affect newborn's metabolism, mouse models suggest

Using mouse models, scientists have discovered a mother's gut microbiota may shape the metabolism of her offspring by providing environmental cues during pregnancy that fine-tune energy homeostasis in the newborn. These findings suggest targeting the maternal microbiota - for example, by recommending dietary changes - could offer a preemptive strategy to protect offspring from future metabolic disease.

A balanced microbiome has been linked to good health, while disruptions or changes to the microbiome have been linked to several diseases and disorders, including obesity, heart disease and diabetes. Though the effect of maternal microbiota on an infant's health has been well-documented, much less is known about how the mother's gut microbes impact the offspring at the embryonic stage.

Ikuo Kimura and colleagues explored this question in a mouse model, specifically focusing on short-chain fatty acids (SCFAs), microbiota-derived metabolites that fuel cells and mediate communication between gut microbes and organs. They found that SCFAs produced by the pregnant mother's gut microbes guided the differentiation of the offspring's neural, intestinal and pancreatic cells through signaling via GPR41 and GPR43, cell-surface protein receptors. This developmental process helped offspring maintain balanced energy levels, whereas offspring from mothers that lacked microbes altogether were highly susceptible to metabolic syndromes like obesity and glucose intolerance. One particular SCFA - propionate - played a vital role in preventing the development of metabolic disorder in the offspring, the researchers found. Propionate supplementation may thus be a possible route for therapy, but the safety and efficacy of this application during pregnancy remain to be determined, says Jane Ferguson in a related Perspective that further details Kimura et al.'s study.

Credit: 
American Association for the Advancement of Science (AAAS)

Newly discovered driver of plant cell growth contradicts current theories

The shape and growth of plant cells may not rely on increased fluidic pressure, or turgor, inside the cell as previously believed. Rather, a new study shows the swelling of tiny pectin filaments within the cell wall propels these morphological changes. If true, this discovery could overturn the current textbook model for plant cell expansion, and it suggests similar biochemical processes could underlie cell growth in other organisms as well, including animals. The authors also hope their observations inspire the development of new smart materials mimicking the unique expansion of plant cell walls.

Composed of a network of puzzle-like pieces, called pavement cells, the outermost layer of plants protects the structure and integrity of the specialized cells within. The walls of pavement cells are composed of polysaccharides, proteins and pectins and can shift between different states in response to chemical cues to support cell shape, size and division. But just how cell wall components contribute to the shaping and expansion of the puzzle-like cells has remained unclear.

Kalina Haas and colleagues studied the morphogenesis of pavement cells in Arabidopsis cotyledons (the first leaves to emerge from a germinating seed). They employed data sonification methods to perceptualize the wide variety of pavement cell shapes with sound. Using super-resolution imaging techniques to home in on homogalacturonan (HG) polysaccharides, a kind of pectin in the cell wall, the researchers found that these polysaccharides assemble into discrete nanofilaments rather than a cross-linked network bound to structural proteins. Though the microscopy methods could not provide a closer look at these structures, Haas et al. postulated that HG nanofilaments are multi-subunit structures that, when demethylated, shift from a crystalline state into a swelling state that leads to wall expansion and growth of "lobes" on the pavement cells.
They validated their hypothesis in models in which they simulated lobe development in cotyledons and induced demethylation of pectin components in the cell wall. This altered the plant cell shape even in the absence of hydration and turgor pressure.

Credit: 
American Association for the Advancement of Science (AAAS)

Discovery of expanding pectin nanofilaments that manipulate plant cell shapes

Scientists have discovered new filamentous structures within plant cell walls that influence cell growth and help build complex three-dimensional cell shapes.

Combining two types of high-performance microscopes, the researchers identified pectin nanofilaments aligned in columns along the edge of the cell walls of plants. The filaments, which are 1,000 times thinner than a human hair, had previously only been synthesised in a lab and had never been observed in nature.

These revelations about the cell wall structure are crucial for understanding how plants form their complex shapes and will help increase understanding of plant immunity and adaptation to changing environments, and possibly inspire future development of biofuels, agriculture, and even building smart, self-expanding materials.

It might look like a uniform surface of green, but place a typical leaf under a microscope and an intricate patchwork of irregular-shaped cells fitting together perfectly like a jigsaw puzzle is revealed. Each of these cells on the surface of a leaf, called pavement cells, has its own unique shape and continues to expand and change shape as the leaf grows.

The current "textbook" thinking about how these unusual wavy-shaped cells are formed is that the internal pressure within the cell (turgor) pushes against the rigid cell wall that surrounds each cell to define its final shape. Weaker parts of the wall expand further, like air pressure forcing weaker areas of a balloon to expand more.

Published today in the journal Science, researchers from the French National Research Institute for Agriculture, Food and Environment (INRAE) together with scientists from the University of Cambridge and Caltech/Howard Hughes Medical Institute are the first to show the presence of pectin nanofilament structures. Not only did they discover these new structures, they also demonstrated that they actively drive cell shape - and even cell growth - independent of pressure within the cell.

Before the team's discovery, pectin was considered a disorganised gel-like filling material sitting between the long cellulose fibres in the cell wall. Dr Kalina T. Haas, first author of the paper, who was working at the University of Cambridge at the time and is now an INRAE researcher, explains: "Biochemistry is typically used to study the components of the cell wall, but biochemical analysis disintegrates the cell wall to extract molecules for further study and so we do not get a chance to examine the original structure. Conventional fluorescence microscopes with a resolution of 200 nm aren't any help either as the cell wall is only 50-100 nm in width and too small for this type of microscope to see its detailed structure. To overcome this, we used two types of cutting-edge microscopy, dSTORM and cryoSEM, which allowed us to keep the cell wall intact. Together, these microscopes revealed that pectins do not form a 'jelly', but create a well organised nanoscaled colonnade (sequence of columns) along the edge of the cell wall."

The cryo (very low temperature) Scanning Electron Microscope (cryoSEM) developed at the Sainsbury Laboratory at the University of Cambridge captured the very first images of these pectin filaments. Dr Raymond Wightman, Imaging Core Facility Manager at the Sainsbury Laboratory, said: "It was in a lab 40 years ago that chemists first demonstrated that pectin might form filaments, but these had never been observed in nature. The cryoSEM provided us with the very first images of pectin as filamentous structures and the super-high resolution light microscope called dSTORM confirmed that what we were seeing was actually pectin structures. No single microscope by itself could have confirmed these results."

Dr Haas and Dr Alexis Peaucelle at INRAE adapted the MRC/LMB's dSTORM microscope to analyse the leaf cells of Arabidopsis thaliana (thale cress) at a high resolution of 20-40 nm. They found that a single type of chemical change (methyl group removal) in the pectin nanofilament triggers the filaments to swell and expand radially by around 40%. This swelling causes buckling of the cell wall, which then initiates the growth and formation of the unusual wavy-shaped cells.

Dr Peaucelle explained: "This is related to a change in the packaging of pectin polymers inside the nanofilament from a compact to a loose lattice. Such self-expansion of the cell wall components, coupled with their local orientation, can drive the emergence of complex shapes. A computer model found the small change in size that accompanies a modified nanofilament is enough to make the jigsaw puzzle cell shape. Furthermore, these shape changes did not need the force of turgor within the modelled cells."

Further research will be required to determine what role turgor pressure and the cellulose in the cell wall play in determining cell shape. The team think it likely that turgor pressure and cellulose fibres work alongside pectin nanofilaments to help maintain shape.

The team also wanted to see how random or ordered the plant cell wavy shapes were. Instead of only analysing the cell shapes visually in a graph, they used data sonification to "hear the shape of cells".

Dr Peaucelle explained: "We found that the waved edge of the puzzle-shape of plant epidermal cells is very ordered. When we sonified the images, we observed that their shapes are organised in waves similar to that produced by a musical instrument. As an example, we used different cells to create notes from a chromatic scale and then play 'The Blue Danube' by J. Strauss with them. It is extraordinary that through increasing our understanding of how the epidermal cells form their wavy pattern, we also confirmed that pectin is involved in the growth process. This highlights how little we know about something so vital for sustaining our society as plant growth. I envisage further discoveries in plant and human health will come as more attention is given to the extracellular matrix surrounding cells, thanks to the new generation of high-resolution microscopes. Although animal cells are not surrounded by cell walls, they are surrounded by an extracellular matrix of proteins and sugars, which may similarly guide cell shape."

The authors conclude that related functions for filament self-expansion may be present in other kingdoms: extracellular matrix polysaccharides such as carrageenan in red algae, alginate in brown algae, or even hyaluronic acid in animals may play a similar role.
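One simple way to "hear the shape of cells" is to decompose a closed cell outline into boundary waves with a Fourier transform of its radial profile, then map the dominant wave number to a note on an equal-tempered chromatic scale. The sketch below is entirely illustrative, with a made-up five-lobed outline; it is not the study's actual sonification pipeline.

```python
import numpy as np

def dominant_wave(radii):
    """Number of lobes of the strongest non-constant boundary wave,
    found as the peak of the Fourier spectrum of the radial profile."""
    spectrum = np.abs(np.fft.rfft(radii - radii.mean()))
    return int(np.argmax(spectrum[1:]) + 1)   # skip the constant (DC) term

def note_frequency(semitone, base=261.63):
    """Equal-tempered chromatic scale; base defaults to middle C (Hz)."""
    return base * 2 ** (semitone / 12)

# A hypothetical "pavement cell" outline: radius as a function of angle,
# with five regular lobes superimposed on a circle
theta = np.linspace(0, 2 * np.pi, 256, endpoint=False)
radii = 10 + 2 * np.cos(5 * theta)

lobes = dominant_wave(radii)    # dominant boundary wave: 5 lobes
freq = note_frequency(lobes)    # map lobe count to a chromatic note
```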

Credit: 
University of Cambridge

Illinois study shows universally positive effect of cover crops on soil microbiome

image: Cover crops such as ryegrass can boost soil microbial abundance by 27%, according to a University of Illinois meta-analysis.

Image: 
Maria Villamil, University of Illinois

URBANA, Ill. - Only a fraction of conventional row crop farmers grow cover crops after harvest, but a new global analysis from the University of Illinois shows the practice can boost soil microbial abundance by 27%.

The result adds to cover crops' reputation for nitrogen loss reduction, weed suppression, erosion control, and more. Although soil microbial abundance is less easily observed, it is a hugely important metric in estimating soil health.

"A lot of ecological services are done by the soil microbiome, including nutrient cycling. It's really important to understand how it functions and how agriculture can form a healthier soil microbiome," says Nakian Kim, doctoral student in the Department of Crop Sciences at the University of Illinois and lead author on a new paper in Soil Biology and Biochemistry.

Other studies have shown benefits of cover cropping on the soil microbial community, but most of them have been one-offs influenced by specific site conditions, unique seasonal effects, idiosyncratic management regimes, and the researchers' chosen analysis methods. Kim's work is different in that he looked for universal patterns among dozens of these one-off studies.

"Our analysis shows that across 60 field studies, there was a consistent 27% increase in microbial abundance in fields with cover crops versus no cover crops. It's across all these studies from around the world," says Maria Villamil, associate professor in crop sciences and co-author on the paper.

The research team performed a search of the existing studies on cover crops, and wound up with some 985 scientific articles. Of these, they only kept studies that directly compared cover crops and bare fallow soils, and omitted studies conducted in greenhouses or that treated crop residues as cover crops. They also ensured that the studies were statistically sound, with reasonably large sample sizes. In the end, they mined and reanalyzed data from 60 studies reporting on 13 soil microbial parameters.

"That's why the criteria of selection had to be so strict. We wanted to compare studies that were solid, and with enough replications that we could make valid claims about global patterns," Villamil says.

The research team divided the 13 microbial parameters into three categories: microbial abundance, activity, and diversity. Microbial abundance wasn't the only category to show a significant increase with cover cropping compared to bare fallow soils. Microbial activity was also up 22%, and diversity increased 2.5%.
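Meta-analytic percent changes of this kind are commonly computed by pooling per-study log response ratios (treatment mean over control mean) and back-transforming. The sketch below, with made-up study values, illustrates that calculation; it is not the paper's actual dataset or weighting scheme.

```python
import numpy as np

def pooled_percent_change(treatment_means, control_means, weights=None):
    """Weighted mean of per-study log response ratios, expressed as a
    percent change of treatment (e.g. cover crop) over control (fallow)."""
    lrr = np.log(np.asarray(treatment_means, float) /
                 np.asarray(control_means, float))
    pooled = np.average(lrr, weights=weights)
    return (np.exp(pooled) - 1) * 100

# Three hypothetical studies of microbial abundance, each reporting a
# cover-crop mean and a bare-fallow mean (arbitrary units)
change = pooled_percent_change([130, 125, 127], [100, 100, 100])
```

Log ratios are used rather than raw differences so that a doubling and a halving cancel, and so studies reporting in different units can be pooled.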

"All the categories are important, but especially diversity, because a diverse microbiome is more resilient. Considering the close linkage between microbial diversity and the provision of ecosystem services, small impacts could go a long way to increase sustainability. In that sense, I think the cover crops are really helping," Kim says.

The researchers were also able to tease out several factors that layered on top of the main effect of cover crops. For example, how did climate, cover crop termination method, or tillage regime affect the ability of the cover crops to benefit the soil microbial community?

Kim says the use of burndown herbicides as a cover crop termination method had a strong moderating effect on the microbial community. "The results were very interesting. With chemical termination, the effect sizes were consistently smaller compared to mechanical termination. In other words, the benefits from the cover crops are diminished somehow from the herbicides. I think that's one big takeaway."

Tillage also made a difference, according to Kim. He expected conventional tillage to reduce the effect of cover crops on the soil microbes, but instead, conservation tillage did that. "My guess is that because conservation tillage included not tilling at all, that allowed weeds to grow on the land. The weeds could have mimicked what the cover crops do. So the difference between the control treatment and the cover crop may decrease because of the weeds."

Because their effects were indirect, these secondary factors need more research before real claims can be made. Villamil's research team already has studies in the works to get more definitive answers. But in the meantime, she's heartened by the results of the analysis as a whole.

"For me, it was surprising to see the consistent, positive effect of cover crops - surprising but good. Finally! I've been researching cover crops in our typical corn-soybean rotations in Illinois since 2001, yet in these high-fertility environments, it has proven difficult to show any effects beyond cereal and annual rye capturing nitrogen (weather permitting). Changes in chemical and physical properties related to cover crop use are difficult to see," Villamil says. "But the microbiome, that's where it's at. That's how everything is related. Thanks to this work, I have something to look forward to when I put in cover crops, and have generated many more questions in need of research."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Getting off of the blood sugar roller coaster

For the 250,000 Canadians living with type 1 diabetes, the days of desperately trying to keep their blood sugar stable are coming to an end. A team of researchers at McGill University's Faculty of Medicine is working to optimize an artificial pancreas with the ability to minimize the glucose highs and lows that diminish quality of life and contribute to long-term health complications.

Creating smart insulin pumps

Insulin pumps have been around for more than 30 years. Using these portable devices, people living with diabetes manually select the amount of insulin released into their bloodstream. While the majority still prick their finger to test their blood glucose level in order to determine the right amount of insulin, a growing number use an automatic glucose sensor. Even using the automatic sensor, however, the average person hits their glucose target less than 50 percent of the time. As a result, they spend most of their time in a state of hyperglycemia, which produces headaches and weakness, or hypoglycemia, which causes dizziness, confusion, and difficulty speaking.

Dr. Ahmad Haidar began his PhD studies at McGill just as the automatic glucose sensor became commercially available. "It was the best coincidence of my life," he claims, "because the automatic sensor made it possible to create an artificial pancreas system." Drawing upon his background in control engineering, Dr. Haidar devised an algorithm that tells the insulin pump how much insulin to release based on the sensor reading. He then teamed up with three clinicians in the McGill Faculty of Medicine - Drs. Laurent Legault, Michael Tsoukas, and Jean-Francois Yale - to form the McGill Artificial Pancreas Lab. Their team of 12 full-time and 45 part-time researchers has become the only group in Canada to develop artificial pancreas systems.
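The article does not describe the McGill algorithm itself, but a standard way to close the loop between a glucose sensor and an insulin pump is a feedback controller. The PID-style sketch below, with made-up gains, target, and safety limits, conveys the general idea of computing an insulin delivery rate from a sensor reading; it is not Dr. Haidar's algorithm.

```python
class GlucoseController:
    """Toy PID-style controller mapping a glucose reading (mmol/L)
    to a suggested basal insulin rate (units/hour). All constants
    are illustrative, not clinically validated values."""

    def __init__(self, target=6.0, kp=0.05, ki=0.001, kd=0.1, max_rate=3.0):
        self.target = target            # desired glucose, mmol/L
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0             # accumulated error over time
        self.prev_error = None
        self.max_rate = max_rate        # safety cap, units/hour

    def insulin_rate(self, glucose, dt_hours=1 / 12):
        """Suggested rate from one sensor reading (default: 5-min samples)."""
        error = glucose - self.target
        self.integral += error * dt_hours
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt_hours
        self.prev_error = error
        rate = self.kp * error + self.ki * self.integral + self.kd * deriv
        return min(max(rate, 0.0), self.max_rate)   # never negative, capped

ctrl = GlucoseController()
rate_high = ctrl.insulin_rate(12.0)     # hyperglycemic reading -> deliver insulin
```

Note the asymmetry built into the clamp: the pump can only add insulin, never remove it, which is one reason avoiding hypoglycemia is the hard part of the control problem.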

Advances through the artificial pancreas systems can improve quality of life

A study recently published in Diabetes Care by the McGill Artificial Pancreas Lab represents a breakthrough in the understanding of what makes an artificial pancreas system effective. With funding from the Juvenile Diabetes Research Foundation, the group ran an experiment to deliver a second hormone, pramlintide, in addition to insulin in hopes that the combination would be superior to insulin alone. In the end, the study found that the combination of drugs significantly improved the percentage of time that a person's blood glucose level stayed within a target range. By slowing down meal absorption, pramlintide gave the insulin more time to work.

"I was surprised at the results," Dr. Haidar admits. "I didn't expect the experiment to be this successful." The patients who received both insulin and pramlintide during Dr. Haidar's study reported a high level of satisfaction with the new treatment regime. "By improving their glucose control, we can greatly improve their quality of life," Dr. Haidar explains.

For the McGill Artificial Pancreas Lab, the next frontier is creating a fully automated artificial pancreas that eliminates the burden of having to manually enter carbohydrate numbers and activate the insulin pump at mealtimes. "There has been an enormous amount of interest in the patient community as we develop this second-generation technology," Dr. Haidar shares. As he works to improve the artificial pancreas system, Dr. Haidar thinks of his colleagues living with type 1 diabetes and the patients he encounters every day. "I'm feeling optimistic about what's ahead for them," he reflects. "We're working to make an impact that goes far beyond our lab."

Credit: 
McGill University

Sugar-poor diets wreak havoc on bumblebee queens' health

image: B. impatiens foraging

Image: 
Jim Whitfield

Without enough sugar in their diets, bumblebee queens can experience difficulty reproducing and shorter lifespans.

Hollis Woodard, assistant professor of entomology at the University of California, Riverside (UCR), has conducted multiple studies showing how loss of plant availability negatively affects the prolific pollinators. Previous research indicates a queen's diet can impact how quickly her brood develops, or whether she's able to live through hibernation.

In a study published today, Woodard and her team demonstrate that without adequate sugar, the queen's fat body, which functions like a human liver, does not correctly produce enzymes required for healthy metabolism and detoxification from pesticides.

"The fitness consequences of not producing enough of these enzymes, P450s, could ultimately be lethal, preventing reproduction or shortening lifespans," Woodard said. "If you don't outright kill a queen, but she can't lay eggs, from a population standpoint you might as well have killed her."

Woodard's team wanted to focus its research on bumblebee queens because they undergo an event called diapause, which is like hibernation. Queens gather an abundance of nutrients just before diapause, as it is likely one of the most vulnerable times in their lives.

A queen's ability to survive this period is not guaranteed without adequate nutrition. However, the team also wanted to know about the sublethal effects.

"We wanted to ask if other signs of distress might be detectable in advance, and whether there are molecular signs that queens might not make it through diapause," Woodard said.

The research team collected queen bumblebees of varying ages and fed them one of four diets ranging from no sugar to extremely concentrated sugar.

They found that queens in the oldest age group, roughly 12 days old, that were fed the most sugar began to exhibit gene expression patterns consistent with the beginning of diapause. This suggests queens may have an internal mechanism to detect the amount of stored sugar in their bodies, Woodard said. By extension, it means there may be a way to detect in advance when queens do not have enough sugar to survive diapause, or for their fat bodies to produce enough enzymes to protect them from damaging pesticides.

This research underscores the importance of bumblebee food resources, which have been dwindling in recent years due to habitat loss.

"There are studies showing gaps in floral availability late in the season before diapause, when queens most need food," Woodard said.

In addition to habitat loss reducing bumblebees' preferred plants, climate change may also be taking a toll. As winters become warmer, queens may emerge earlier from diapause in winter when fewer flowers are available to them. Their physiology becomes uncoupled from the seasons, making it less likely that they survive, or that their nests will be successful.

Many bumblebee species are declining, which could eventually have consequences for the human diet. Bumblebees, unlike honeybees, are native to the U.S. and perform a type of pollination essential for some of humans' favorite foods, such as tomatoes, blueberries, and potatoes.

Therefore, Woodard believes every effort must be made to ensure bumblebees, especially queens, have enough flowers to feed on.

"It's important for everyone from land managers to people with gardens to plant things bumblebees can feed on," she said.

Credit: 
University of California - Riverside

Revving habits up and down, new insight into how the brain forms habits

image: An illustration of the dorsolateral striatum shown as a cross-section. The blue shading shows the area of the brain that had the optogenetic receptors, and the black dots show where the placements were for light delivery.

Image: 
Image provided by study co-authors

Each day, humans and animals rely on habits to complete routine tasks such as eating and sleeping. Forming habits enables us to do things automatically, without thinking. As the brain starts to develop a new habit, one region of the brain, the dorsolateral striatum, experiences a short burst in activity lasting as little as half a second. This activity burst increases as the habit becomes stronger. A Dartmouth study demonstrates how habits can be controlled depending on how active the dorsolateral striatum is. The results are published in the Journal of Neuroscience.

In prior research at MIT, the senior author found that this burst in brain activity in the dorsolateral striatum correlated with how habitual a running maze task was for rats. The activity was found to be accentuated at the beginning and end of the maze run.

For this study, the researchers sought to manipulate this burst in brain activity in rats using a method called optogenetics. With this painless method, the neurons (brain cells) in the dorsolateral striatum, which have been found to be associated with forming habits, are made to express a light-sensitive receptor so that they can be excited or inhibited using light. A flashing blue light excites the brain cells, while a flashing yellow light inhibits the cells and shuts them down.

Rats were trained to run in a cross-shaped maze, with only one rat in the maze at a time. Each rat began in one of two starting arms and ran from the end of that arm to the decision point at the center of the cross. There it was trained to turn either left or right and run to the end of the arm, where a sugar pellet reward was waiting; only one arm of the cross was baited with the reward, and rats received it as soon as they turned in the correct direction.

After the rats had learned the maze training runs, the researchers incorporated the optogenetics component, using the flashing colored lights to manipulate dorsolateral striatum activity. When the cells in the dorsolateral striatum were excited for just half a second as the rats initiated their runs, the rats ran more vigorously and habitually through the entire maze. The habit had formed: once the rats ran to the center of the cross-shaped maze, they turned immediately toward the reward. The animals no longer stopped at the center to look around once they knew where to go.

In contrast, when the cells were inhibited, the rats were slow and appeared to lose their habit altogether. Once they reached the center of the cross-shaped maze, they paused and turned around repeatedly, as though deliberating, before ultimately making their choice. Even more striking, the researchers also tested how habitual the animals were by changing the tasty reward to something unpalatable. In this case, excitation made the rats continue running by habit toward the now unpleasant outcome, while inhibition made the rats essentially refuse to run when there was no reward to gain from it.

When the researchers applied the light manipulations during the middle of the runs on another day, there was little effect. Once the rats had set the full sequence of behavior in motion (run, turn, stop), this habit appeared to dictate their actions, as if they were on autopilot.

"Our findings illustrate how habits can be controlled in a tiny time window when they are first set in motion. The strength of the brain activity in this window determines whether the full behavior becomes a habit or not," explained senior author, Kyle S. Smith, an associate professor and director of graduate studies in the department of psychological and brain sciences at Dartmouth, whose lab focuses on the neuroscience of reward and action. "The results demonstrate how activity in the dorsolateral striatum when habits are formed really does control how habitual animals are, providing evidence of a causal relationship," he added.

Gaining a better understanding of the specific role that the dorsolateral striatum plays in habit memory and other behaviors is critical. Damage to this brain area has been found to be associated with Parkinson's disease, a neurodegenerative disorder that often affects body movement. In the study, the researchers explain how targeting the time window as to when habits are formed could be leveraged in "designing intervention strategies for humans with otherwise treatment-resistant compulsive behaviors."

Kyle S. Smith is available for comment at: kyle.s.smith@dartmouth.edu. The study was conducted by Adam C.G. Crego, a graduate student in the Smith lab at the time of the study. Fabián Štoček, Alec G. Marchuk, James E. Carmichael, and Matthijs A.A. van der Meer at Dartmouth also served as co-authors of the study.

Credit: 
Dartmouth College

Thinning, prescribed burns protected forests during the massive Carlton Complex wildfire

image: A view near Loup Loup Pass showing the impacts of the Carlton Complex wildfire in 2014.

Image: 
Susan Prichard/University of Washington

The 2014 Carlton Complex wildfire in north central Washington was the largest contiguous fire in state history. In just a single day, flames spread over 160,000 acres of forest and rangeland and ultimately burned more than 250,000 acres in the midst of a particularly hot, dry summer.

The wildfire, driven by strong winds and explosive growth, was unprecedented in how it burned the landscape, destroying more than 300 homes in Washington's Methow Valley. But "megafires" like the Carlton Complex are becoming more common in western U.S. forests as the climate warms and forests are crowded with trees after years of fire exclusion.

In the first major study following the devastating Carlton Complex fire, researchers from the University of Washington and U.S. Forest Service found that previous tree thinning and prescribed burns helped forests survive the fire. The study, published Feb. 22 in the journal Ecological Applications, shows that even in extreme wildfires, reducing built-up fuels such as small trees and shrubs pays off.

"Our study suggests that the fuel treatments were worth the investment, yielding a more desirable post-fire outcome than if they hadn't been implemented," said lead author Susan Prichard, a research scientist at the UW School of Environmental and Forest Sciences. "There are a lot of benefits to creating more resilient landscapes, and this study suggests that even in the worst-case scenario wildfires, it can be worth it."

On July 17, 2014, sustained winds of 35 mph raced through the Methow Valley, blasting oxygen into several fires that had started earlier that week. Before long, the fires joined and spread. A column of smoke stretched 30,000 feet high, dropping embers and igniting dry grasses, trees and other fuels on the ground over thousands of acres.

Propelled by the fast-moving column of smoke and wind, the flames traveled all the way to the Columbia River.

"It was the kind of fire that even experts who have studied them for decades had not seen before," said Prichard, who lives in the Methow Valley. "It was a devastating fire for our community. I truly thought people would be killed by that fire, and it still feels miraculous that everyone survived."

In the aftermath, Prichard and collaborators wondered if actions such as controlled burns and thinning were helpful in the case of a megafire like the Carlton Complex. State and federal agencies sink ample resources into these efforts to increase the resiliency of forests to wildfires by removing small- and medium-sized trees, leaving larger, mature trees that are more likely to survive future drought and wildfires.

After a forest is thinned, prescribed burning is used to reduce leftover woody debris on logging sites or across landscapes that have built up downed trees, pine needles, grasses and shrubs.

The Carlton Complex fire burned across hundreds of sites that were previously thinned or burned, offering a testbed for the researchers to analyze whether the work helped reduce fire impacts in those areas during the megafire. The researchers used satellite images of burn severity to examine how past fuel treatments performed in the context of this extreme wildfire event.
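
The article does not detail the researchers' satellite methodology, but a standard index for mapping burn severity from satellite imagery is the differenced Normalized Burn Ratio (dNBR), computed from near-infrared (NIR) and shortwave-infrared (SWIR) reflectance before and after a fire. The sketch below is a hypothetical illustration with made-up pixel values, not the paper's actual pipeline:

```python
# Illustrative sketch only: dNBR is a common satellite burn-severity index,
# not necessarily the exact metric used in this study. All reflectance
# values below are invented for demonstration.

def nbr(nir, swir):
    """Normalized Burn Ratio for one pixel from NIR and SWIR reflectance."""
    return (nir - swir) / (nir + swir)

def dnbr(pre, post):
    """Pre-fire NBR minus post-fire NBR; higher values mean a more severe burn."""
    return nbr(*pre) - nbr(*post)

# (NIR, SWIR) reflectance pairs for a few hypothetical pixels, pre- and post-fire
treated   = [dnbr((0.45, 0.20), (0.38, 0.22)),
             dnbr((0.50, 0.18), (0.40, 0.24))]
untreated = [dnbr((0.48, 0.19), (0.20, 0.35)),
             dnbr((0.46, 0.21), (0.18, 0.36))]

mean_treated = sum(treated) / len(treated)
mean_untreated = sum(untreated) / len(untreated)
# A lower mean dNBR in previously thinned-and-burned stands would indicate
# reduced fire severity there, as the study reports.
```

Healthy vegetation reflects strongly in NIR and weakly in SWIR; severe burning reverses that, so a large drop in NBR (a high dNBR) flags severely burned pixels.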

They found that even during the first explosive days of the Carlton Complex, areas that were thinned and prescribed burned had more trees survive than areas that didn't receive those fuel treatments.

"Some of the treatments measurably reduced fire impacts even under very hot, dry and windy conditions," said co-author David W. Peterson, a research scientist at U.S. Forest Service Wenatchee Forestry Sciences Lab. "Our results suggest that as we increase our 'restoration footprint' -- the proportion of forest area treated to reduce fuels -- forests may become increasingly resilient to wildfires under a broad range of conditions."

They also found that thinning and burning on slopes that were protected from prevailing winds was more effective in reducing the wildfire's impacts on the landscape than similar treatments in areas directly exposed to the wind. Because winds often move through the Methow Valley in a predictable pattern, fire managers could thin and burn strategically in areas where these activities would be most effective.

Those efforts helped mature, naturally fire-resistant ponderosa pines survive. These mature trees, in turn, will provide seeds that produce younger trees across the landscape.

While this study focused on the Carlton Complex fire, its results have broader implications for forests around the world that are prone to wildfires. This work adds to the growing library of studies showing how thinning trees and controlled burns can reduce the severity of the next fire on the landscape.

"I'm hopeful our study will encourage policymakers as well as managers to invest in restoration," Prichard said. "It's our best hope for protecting our streams, rivers and landscapes from catastrophic fires."

Credit: 
University of Washington

How does the brain put decisions in context? Study finds unexpected brain region at work

video: Calcium responses of neurons in ALM (anterolateral motor cortex) while mice perform the olfactory delayed match to sample task.

Image: 
Zheng (Herbert) Wu/Axel and Shadlen labs/Columbia University's Zuckerman Institute

NEW YORK -- When crossing the street, which way do you first turn your head to check for oncoming traffic? This decision depends on the context of where you are. A pedestrian in the United States looks to the left for cars, but one in the United Kingdom looks right. A group of scientists at Columbia's Zuckerman Institute has been studying how animals use context when making decisions. And now, their latest research findings have tied this ability to an unexpected brain region in mice: an area called the anterior lateral motor cortex, or ALM, previously thought to primarily guide and plan movement.

This discovery, published today in Neuron, lends new insight into the brain's remarkable ability to make decisions. Flexible decision making is a critical tool for making sense of our surroundings; it allows us to have different reactions to the same information by taking context into account.

"Context-dependent decision-making is a building block of higher cognitive function in humans," said neuroscientist Michael Shadlen, MD, PhD, the paper's co-senior author with Richard Axel, MD. "Observing this process in a motor area of the mouse brain, as we did with today's study, puts us a step closer to understanding cognitive function at the level of brain cells and circuits."

"If someone is standing uncomfortably close to me on a deserted street, I may try to run away, but if the same event occurred on a crowded subway car, I would feel no such danger," said neuroscientist and first author Zheng (Herbert) Wu, PhD. "My decision to move or not move is dependent on the context of where I am; thus giving a reason behind the choices I make."

To investigate how the brain achieves this context-dependent flexibility, the team surveyed several brain areas dedicated to processing and integrating sensory information, but found the critical area to be a part of the motor cortex called the ALM. Previous experiments suggested that the ALM has a relatively simple job: It guides movements of a mouse's tongue and facial muscles.

Building on this understanding, the researchers designed a new experiment that required mice to make flexible decisions using their tongues and their olfactory system, which guides their sense of smell. In the experiment, a mouse first encountered a single odor. The mouse had to remember this odor, because after a brief pause, the researchers then puffed a second odor over the nostrils of the mouse. If both odors were the same, the mouse had to lick a tube to the left to get water. If the two odors were different, it had to lick a tube to the right.

Previous work on this type of "delayed match to sample" test would lead one to expect that the mouse would use brain areas dedicated to odor perception to make the decision about which way to lick. Recordings of brain activity from these areas seemed to confirm this mechanism.

"Based on these recordings, one could imagine that these brain areas have the answer when the mouse receives the second odor," said Dr. Shadlen. "All that's left to do is pass that answer to the brain's motor system to produce the appropriate lick response to the left or right."

If this were so, then the motor area should not play a role until the second odor is provided and the mouse decides whether the two odors are the same or different. Dr. Wu devised a clever way to test this prediction. He switched off the animals' ALM until just before the second odor was given, turning ALM back on in time for the mice to receive the answer.

"According to the standard view, the mice should have been unfazed by this manipulation, as their olfactory system remained intact," said Dr. Shadlen. "Instead they were impaired on the task."

"Our results suggest that the ALM was required to solve the question of whether the two odors were a match and then to decide where to lick, prompting us to significantly rethink what the brain was doing to make these decisions," said Dr. Wu.

ALM was not known to be involved in odor perception. Dr. Wu therefore took a closer look at the brain cells in ALM. He discovered a new type of neuron in ALM very near the surface of the brain that responds to the first odor. It keeps that information handy until the second odor is received.

To explore this unexpected result, the research team turned to theoretical neuroscientist Ashok Litwin-Kumar, PhD, to investigate a variety of potential mechanisms that could account for ALM's role.

"Conventional wisdom held that the animals' olfactory brain region should handle scent processing on its own, and then feed information to the ALM, which would then guide the tongue," said Dr. Litwin-Kumar. "But the data told us a different story; the first odor acts as a contextual clue, priming the ALM to then indicate that relationship by deciding which way to lick in response to the second odor."

Today's findings, while focused on the ALM, are important for how they can inform scientists' larger understanding of brain function as a whole.

"Ultimately, we want to elucidate fundamental principles that explain simple behaviors, but that writ large provide insight into higher cognitive function in humans," said Dr. Shadlen. "An essential step toward that goal is to knit together knowledge about neurons, circuits and behavior using the languages of biology and mathematics. This collaborative project highlights the promise of this strategy."

Credit: 
The Zuckerman Institute at Columbia University

Study reveals how drug meant for Ebola may also work against coronaviruses

A group of University of Alberta researchers who have discovered why the drug remdesivir is effective in treating the coronaviruses that cause Middle East respiratory syndrome (MERS) and severe acute respiratory syndrome (SARS) expect it might also be effective for treating patients infected with the new COVID-19 strain.

"Even if you know a drug works, it can be a red flag if you don't know how it works," said virologist Matthias Götte. "It is reassuring if you know exactly how it works against the target.

"We know the drug works against different coronaviruses, like MERS and SARS, and we know the novel coronavirus is very similar to SARS. So I would say I'm cautiously optimistic that the results our team found with remdesivir and MERS will be similar with COVID-19."

The study, published in the Journal of Biological Chemistry this week, is among the first in Canada to discuss the COVID-19 strain.

Until now, there has not been a published explanation of why remdesivir may work against coronaviruses, said Götte, who added his study is an important step in answering that question.

Developed by Gilead Sciences as a response to the 2014 West African Ebola virus epidemic, remdesivir was first used on a patient with the novel coronavirus earlier this year in the United States.

As reported in the New England Journal of Medicine, the patient was given the drug on the seventh day of illness and showed marked improvement the following day, with symptoms eventually disappearing altogether. And at a recent press conference in Beijing, the assistant director-general of the World Health Organization, Bruce Aylward, said remdesivir is the only drug available that may have real efficacy against COVID-19.

"What our study showed was that remdesivir essentially mimics one of the natural building blocks for RNA synthesis necessary for genome replication of the virus. Enzymes within the virus are synthesizing the viral RNA genome with these building blocks, but they mix up the bits they need with the drug. Once the drug is incorporated into the growing RNA chain, the virus can no longer replicate," explained Götte.

He said the next step is to wait for results from ongoing clinical trials with remdesivir, which are expected by the end of April. Even then, that won't be the end of the story, he cautioned.

"It's likely we'll need more than one drug to properly fight emerging diseases like COVID-19, as we have with HIV and hepatitis C virus infections," Götte said.

"Ideally, we will have a couple of drugs because certain strains could be resistant to certain treatments."

Credit: 
University of Alberta Faculty of Medicine & Dentistry

Study sheds light on how a drug being tested in COVID-19 patients works

A team of academic and industry researchers is reporting new findings about how exactly an investigational antiviral drug stops coronaviruses. Their paper was published the same day that the National Institutes of Health announced that the drug in question, remdesivir, is being used in the nation's first clinical trial of an experimental treatment for COVID-19, the illness caused by the SARS-CoV-2 virus.

Previous research in cell cultures and animal models has shown that remdesivir can block replication of a variety of coronaviruses, but until now it hasn't been clear how it does so. The research team, which studied the drug's effects on the coronavirus that causes Middle East Respiratory Syndrome, reports that remdesivir blocks a particular enzyme that is required for viral replication. Their work was published in the Journal of Biological Chemistry.

All viruses have molecular machinery that copies their genetic material so that they can replicate. Coronaviruses replicate by copying their genetic material using an enzyme known as the RNA-dependent RNA polymerase. Until now, it has been difficult to get the polymerase complex, which contains multiple proteins, to work in a test tube.

"It hasn't been easy to work with these viral polymerases," said Matthias Götte, a virologist and professor at the University of Alberta, Edmonton, who led the JBC study. That difficulty has slowed research into how new drugs work.

Using polymerase enzymes from the coronavirus that causes MERS, scientists in Götte's lab, including graduate student Calvin Gordon, found that the enzymes can incorporate remdesivir, which resembles an RNA building block, into new RNA strands. Shortly after adding remdesivir, the enzyme stops being able to add more RNA subunits. This puts a stop to genome replication.
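
Both write-ups describe the same mechanism: the polymerase accepts the drug as if it were a normal RNA building block, adds a few more subunits, and then stalls, a behavior known as delayed chain termination. The toy model below is only a cartoon of that idea, with invented sequences; it is not the study's biochemical assay:

```python
# Toy model for illustration only; sequences and the 3-base delay are
# invented, not measured values from the study.

def replicate(template, analog_for=None, delay=3):
    """Toy RNA copier illustrating delayed chain termination.

    Copies `template` base by base. If the drug analog 'X' substitutes for
    the base `analog_for`, it is incorporated into the growing strand and
    elongation stalls `delay` bases later, so the genome copy comes out
    truncated and the virus cannot replicate.
    """
    strand, stall_at = [], None
    for i, base in enumerate(template):
        if base == analog_for and stall_at is None:
            strand.append("X")       # analog incorporated in place of the base
            stall_at = i + delay     # polymerase adds a few more, then stops
        elif stall_at is not None and i > stall_at:
            break                    # elongation halts: truncated copy
        else:
            strand.append(base)
    return "".join(strand)

full_copy = replicate("GGAUCCUUAG")                  # no drug: full-length copy
truncated = replicate("GGAUCCUUAG", analog_for="A")  # 'X' mimics one base
```

Here `truncated` comes out as "GGXUCC": the analog plus a few more bases, then a stall, mirroring the "shortly after adding remdesivir" behavior described in the text.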

The scientists hypothesize that this might happen because RNA containing remdesivir takes on a strange shape that doesn't fit into the enzyme. To find out for certain, they would need to collect structural data on the enzyme and newly synthesized RNA. Such data could also help researchers design future drugs to have even greater activity against the polymerase.

Götte's lab previously showed that remdesivir can stop the polymerases of other viruses with RNA genomes, such as Ebola. But, said Götte, the molecules that remdesivir and related drugs mimic are used for many different functions in the cell. This paper supports the viral RNA polymerase of coronaviruses as the drug's target.

Credit: 
American Society for Biochemistry and Molecular Biology

Naked mole rats migrate above ground with no help from the moon

A full moon conjures an image of a person transforming into a werewolf -- a mythical story of moonlight explaining the unexplainable. While werewolves may only exist in the movies, unusual animal and human behaviors noticed under a full moon are real. Could moonlight be responsible?

A new study published in the African Journal of Ecology considers the role of the moon in driving a particularly rare occurrence: the solo journey of a naked mole rat from one underground colony to start a new one.

Stan Braude, professor of the practice of biology at Washington University in St. Louis, has been studying naked mole rats in the wild in eastern Africa for more than 30 years. Braude previously discovered how naked mole rats disperse -- or leave their underground colony to mate with an outsider and form a new colony -- by migrating above ground.

The factors driving this rare behavior are not well understood. In the new study, Braude and his collaborators asked for the first time whether naked mole rats are using moon phases to time their dispersal.

A number of publications on other rodent species suggested that if the animals depend on vision to avoid predators, they might not be active at new moons when the sky is the darkest. But if animals, such as the naked mole rats, instead depend on hearing and smell, they would be more likely to be active at new moons, when they can avoid detection.

"We were so sure that naked mole rats would only be above ground on nights without moonlight, but the data proved us wrong," Braude said. "They don't appear to entrain to moonlight at all. Our next hypothesis to test is that cues from the social environment are triggering this behavior."

Capturing naked mole rats

Naked mole rats are almost blind. They spend the majority of their lives underground in tunnels that they dig with their teeth. Naked mole rats live most of their lives in their natal colony as workers, helping out their mothers.

And when their time comes to leave the colony to breed, they switch from being active at any time during a 24-hour cycle to being nocturnal. They put on fat and become less active in preparation for the journey they will undergo to start a new colony, a journey that requires going above ground.

The safest time for them to go above ground would be during a new moon when the night is darkest -- or, at least, that was the expectation.

To find out, Braude and his team captured dispersing naked mole rats in Meru National Park, Kenya.

They used a collection approach called a drift fence to help guide the movement of animals walking in the open field into a specific location. In this case, the drift fences led to pitfall traps containing buckets.

The idea came to Braude from work that Owen Sexton did 30 years ago to study the migration of salamanders at Tyson Research Center, the environmental field station for Washington University in St. Louis. Sexton's group put the drift fences in the woods to study when the salamanders were migrating to the pond.

"Salamanders cannot jump. Neither can naked mole rats. If you put a small barrier in their way they have to go around it," Braude said.

And they did. Braude and his team successfully collected nine dispersing naked mole rats using this method -- an astonishing accomplishment, given the rarity of these walkabouts.

Migrating with or without the moon

Capturing naked mole rats in the wild allowed Braude and his team to definitively conclude that naked mole rats do in fact migrate away from their natal colonies by walking above ground.

They also knew the exact dates when the nine naked mole rats were above ground. Now they could ask whether there is a correlation between moon phase and their presence above the ground.

The short answer was no.

"There truly is no association in either direction with the amount of light or lack of light," Braude said.

Dispersing naked mole rats did not avoid a full moon and did not prefer a new moon. They do not disperse according to the phases of the moon, the co-authors wrote.
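
A common way to test for lunar timing in event data like these is a circular-statistics test such as the Rayleigh test, which asks whether dates cluster at any point in the lunar cycle. The sketch below is purely illustrative: the capture dates are invented and the study's actual analysis may have differed.

```python
# Hedged sketch of a Rayleigh test for lunar-phase clustering; the dates
# below are made up, not the study's nine real capture dates.
import math

SYNODIC_MONTH = 29.53  # mean length of a lunar cycle, in days

def rayleigh_test(days_since_new_moon):
    """Rayleigh test for circular uniformity.

    Maps each date to an angle on the lunar cycle and returns (R, p), where
    R is the mean resultant length (0 = evenly spread, 1 = all at the same
    phase) and p approximates the chance of that much clustering arising
    from uniformly scattered dates.
    """
    angles = [2 * math.pi * (d % SYNODIC_MONTH) / SYNODIC_MONTH
              for d in days_since_new_moon]
    n = len(angles)
    c = sum(math.cos(a) for a in angles) / n
    s = sum(math.sin(a) for a in angles) / n
    R = math.hypot(c, s)
    z = n * R * R
    # Standard small-sample approximation to the Rayleigh p-value
    p = math.exp(-z) * (1 + (2 * z - z * z) / (4 * n))
    return R, min(max(p, 0.0), 1.0)

# Nine invented capture dates (days since new moon), spread across the cycle
captures = [1.2, 5.0, 9.8, 12.4, 16.1, 19.7, 22.3, 26.0, 28.5]
R, p = rayleigh_test(captures)
```

A small p would have pointed to dispersal timed to a particular moon phase; a large p, as with these evenly spread dates, is consistent with the study's finding of no lunar association in either direction.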

Births also not tied to moon phase

Braude and his collaborators also used data from captive naked mole rats at the Leibniz Institute for Zoo and Wildlife Research in Berlin to consider the association between moon phase and litters born. In this facility, the animals live in a room with windows that allow in sun and moonlight.

"They had dozens of successful births and had a wealth of birthdays," Braude continued.

Some 173 litters in 23 colonies were born across every day of the lunar cycle during an 11-year period. They found no association.

"These results, although negative, give us better insight into the dispersal. And they suggest perhaps the dispersal has more to do with the social environment that they are leaving than with the physical environment that they are entering," Braude said.

Credit: 
Washington University in St. Louis

Telecommuting found to have little impact on corporate careers

TROY, N.Y. -- Working from home is known to support a strong work-life balance, to boost employee productivity, and even to benefit the environment. However, telecommuting has also carried a stigma -- despite a lack of data to back it up -- that employees who work remotely have difficulty rising in their careers.

New research from the Lally School of Management at Rensselaer Polytechnic Institute finds that the reality is more positive than previously feared. In a study recently published in the Journal of Vocational Behavior, Timothy D. Golden, a professor and area head of enterprise management and organization in Lally, found that rather than suffering career consequences, telecommuters and non-telecommuters receive an equal number of promotions.

"Although telecommuting has experienced rapid growth, some workers are reluctant to try telecommuting for fear that it will hurt their career," Golden said. "This research helps answer that critical question: Does it hurt your career if you telecommute? My study shows that it depends heavily on the employee's work context."

Golden found that a key determinant in the success of telecommuters receiving promotions was the prevalence of telecommuting in their workplace. Telecommuters were promoted more when they worked in offices where working from home was widely accepted, yet in offices where few people telecommuted, those employees received fewer promotions.

While telecommuters may rise in the ranks at the same rate as their office-bound counterparts, Golden observed that employees working from home don't earn the same bump in pay. However, if telecommuters signaled a "devotion" to the workplace by working additional hours outside of normal working hours, his analysis indicated that they benefited in terms of both promotions and salary growth.

Golden also determined that it was not simply the fact that an employee telecommuted that mattered; the amount of telecommuting per week is also a key component of an employee's advancement. Moreover, he found that face time matters: even among employees who telecommuted for a large percentage of the work week, those with more in-person contact with supervisors received higher pay increases.

Golden used a sample of more than 400 employees matched with corporate data on promotion and salary growth.

"In this study, I wanted to use objective data -- actual promotions and salary increases -- rather than simply rely on survey responses, as had been done in previous research," Golden said. "In this way, we can begin to uncover the true impact of telecommuting on fundamental career outcomes, such as promotions and salary growth over time."

Golden is an expert in the field of telework and telecommuting, studying this field for more than 20 years.

"Previous research has tended to treat all telecommuters as one homogeneous group, and my research suggests that telecommuting is not a one-size-fits-all work arrangement," Golden said. "Telecommuting arrangements are often unique, and differences in these arrangements must be understood and taken into account when determining how best to be successful. This study suggests contextual factors are especially important to consider when determining telecommuting's effect on promotions and salary growth."

Credit: 
Rensselaer Polytechnic Institute