
By imaging the brain, scientists can predict a person's aptitude for training

Image: U. of I. psychology professor Aron Barbey and his colleagues found that the relative size of specific brain regions predicted how much a person would benefit from interventions designed to boost fluid intelligence. (Photo by L. Brian Stauffer)

CHAMPAIGN, Ill. -- People with specific brain attributes are more likely than others to benefit from targeted cognitive interventions designed to enhance fluid intelligence, scientists report in a new study. Fluid intelligence is a measure of one's ability to adapt to new situations and solve never-before-seen problems.

Reported in the journal Trends in Neuroscience and Education, the study is the first to link the size of specific brain structures to a person's response to interventions, the researchers said. The work also identifies brain regions that appear to play a more central role in supporting fluid intelligence than was previously understood.

The study included 424 people, all of whom scored within the normal range on tests of intelligence. Participants were randomly assigned to one of three intervention groups or an active control group. One intervention involved aerobic exercise; another combined exercise and cognitive training; and a third consisted of mindfulness training, exercise and cognitive training. People in the active control group engaged in visual search tasks over the course of the 16-week study.

At the beginning and end of the study, the scientists tested participants' fluid intelligence, the ability to solve unfamiliar tests of logic and spatial reasoning. A subset of randomly selected participants also underwent MRI brain imaging.

"We wanted to know whether structural brain attributes predicted an individual's response to intervention, as measured by improvement on tests of fluid intelligence," said University of Illinois psychology professor Aron Barbey, who led the research with Wayne State University psychology professor Ana Daugherty.

The researchers used scientifically validated methods to determine the relative volumes of several brain structures that previous studies have implicated in fluid intelligence.

Their analyses revealed that, in each intervention group, some individuals scored higher than others on the fluid intelligence tests at both the beginning and the end of the intervention and benefited more from the training than the rest.

This group of 71 individuals also shared specific brain attributes that distinguished them from the other participants.

"We had enough people who showed this pattern that the statistical analysis actually identified them as a group, regardless of the intervention group they were in," Daugherty said.

The size of several brain structures - some of which are known to be closely associated with fluid intelligence - was larger in these participants than for everyone else in the study, with the exception of two regions that were smaller: the middle frontal insula and parahippocampal cortex.

"We noticed that, independent of the activity they were engaged in, some people were improving and some were not improving on measures of fluid intelligence," Daugherty said. "And then there were some people who were actually getting worse over the course of the intervention."

The findings address an ongoing problem in cognitive intervention studies, Barbey said.

"Historically, research in the psychological and brain sciences has sought to develop interventions to enhance cognitive performance and to promote brain health," Barbey said. "But these efforts have largely been unsuccessful."

This failure might be the result of scientists' one-size-fits-all approach, he said.

"If we assume that we all solve problems in a similar way, then it's reasonable to expect that everyone will benefit from the same types of instruction or cognitive training," Barbey said. "Research suggests otherwise, however. We each have unique characteristics that shape our cognitive abilities and influence whether we will benefit from specific forms of training."

The researchers focused on brain regions that are known to be important for fluid intelligence. But they were surprised that two structures in the brain - the parahippocampal cortex and the caudate nucleus - were so strongly linked to improvement on tests of fluid intelligence.

"We found evidence that these regions, in particular, distinguished people who had the highest response following our intervention from the other participants," Daugherty said. "These regions are responsible for visual-spatial abilities and the ability to use memory in the process of reasoning."

The findings suggest these abilities may be more closely related to fluid intelligence than previously thought, she said.

The researchers began the study with a central question, Barbey said.

"Can we measure differences in the brain to predict who is most likely to benefit from an intervention? And the answer is: Yes, we can," he said. "The question now becomes whether we can use this knowledge to design more effective interventions that are tailored to the needs of the individual across the lifespan."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

New study provides insight into chronic kidney disease

(Boston) - Researchers have further analyzed a known signaling pathway in work they believe brings them one step closer to understanding the complex physiology of patients with chronic kidney disease (CKD), and that might open a path to new treatment options.

CKD is a complex and unique condition that involves impaired removal of many different toxins from the blood. These toxins, often referred to as uremic solutes, can then damage multiple organ systems, not just the kidneys. Patients with CKD are at heightened risk for complications to the heart and brain, which increases morbidity and mortality.

To understand how uremic solutes cause damage in the kidney and other organs, researchers from Boston University School of Medicine (BUSM) tested a pathway involving the Aryl Hydrocarbon Receptor (AHR). In previous studies, these same researchers showed that molecules that activate AHR have a toxic effect on blood vessels.

The researchers created an experimental model in which they monitored blood levels of different uremic solutes and the levels of AHR in different organ systems. They discovered that, as the levels of uremic solutes increased in the blood, so did the levels of AHR, suggesting that AHR may be at least partly responsible for the damage to blood vessels seen in CKD. The researchers noted that AHR also increased in the heart, liver and brain. This is the first study to directly demonstrate the activation of AHR in different organs and within specific tissues of those organs.
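As a rough illustration of the kind of relationship described here, the sketch below computes a rank correlation between blood solute levels and AHR readouts in several organs. All numbers, variable names and units are invented for illustration; this is not the study's data or analysis code.

```python
# Hypothetical sketch: quantifying the relationship between blood uremic-solute
# levels and AHR levels across organs. Values are illustrative, not study data.
import numpy as np
from scipy.stats import spearmanr

# Illustrative measurements from an experimental model (arbitrary units)
indoxyl_sulfate = np.array([0.5, 1.2, 2.4, 3.1, 4.8, 6.0])   # blood solute level
ahr_levels = {
    "kidney": np.array([1.1, 1.9, 3.0, 3.8, 5.2, 6.5]),
    "heart":  np.array([0.9, 1.5, 2.2, 2.9, 4.0, 4.9]),
    "brain":  np.array([0.7, 1.1, 1.8, 2.3, 3.1, 3.8]),
}

# A rank correlation per organ captures "as solutes rise, so does AHR"
for organ, levels in ahr_levels.items():
    rho, p = spearmanr(indoxyl_sulfate, levels)
    print(f"{organ}: rho={rho:.2f}, p={p:.3f}")
```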

The researchers believe that these uremic solutes (indoxyl sulfate and kynurenine) could potentially be translated into clinical blood tests that give a more complete picture of the severity and progression of a patient's kidney disease. Moreover, AHR may be a target for new medication development in the treatment of CKD. "From a clinical standpoint," explained corresponding author Vipul Chitalia, MD, PhD, associate professor of medicine at BUSM, "it further adds evidence to the use of AHR inhibitors as effective therapeutics to combat complications associated with CKD and uremic solutes."

These findings appear online in the journal Kidney International.

Credit: 
Boston University School of Medicine

Online therapy helped cardiovascular disease patients with depression

Researchers at Linköping University have developed a treatment for depression among people with cardiovascular disease. The results, recently published in JMIR Mental Health, show that cardiovascular disease patients who underwent internet-based therapy for their depression became less depressed and gained a better quality of life.

Depression is common among cardiovascular disease (CVD) patients. It can lead to a vicious circle in which the CVD is affected negatively, and for this reason it is important that the depression is treated.

"Our study shows that internet-based therapy can reduce depression and improve quality of life among CVD patients. Because of insufficient resources, all CVD patients don't get the required care against depression, and so internet-based therapy can play an important role. Also, the patients can undergo therapy at home, at a time that suits them", says Peter Johansson, professor at the Department of Social and Welfare Studies at Linköping University.

A number of previous studies have shown that internet-based cognitive behavioural therapy is effective against depression, but this study is the first to be designed specifically for CVD patients with depression. It was a randomised controlled trial, in which the participants were randomly assigned to different groups to enable comparison. The participants included 144 CVD patients with depression. Of these, 72 underwent nine weeks of internet-based cognitive behavioural therapy, and during this time had access to a nurse. The remaining 72 discussed health with each other in an internet forum for the same period of time.

The results show that after nine weeks of online therapy, one in five patients had a clinically significant improvement in their depression, compared to the internet forum group. The online-therapy patients also reported an increase in quality of life after the completed treatment.

"The strength of our study is that the patients had access to nurses via the web - a contact that was crucial to the good result", says Peter Johansson.

The patients who underwent online CBT had an average of 15 minutes of feedback time every week with one of the nurses; this was not available to the patients in the online forum. The feedback time enabled the patients to get answers to their questions, but was also aimed at supporting and encouraging them.

Credit: 
Linköping University

Controlling attention with brain waves

Having trouble paying attention? MIT neuroscientists may have a solution for you: Turn down your alpha brain waves. In a new study, the researchers found that people can enhance their attention by controlling their own alpha brain waves based on neurofeedback they receive as they perform a particular task.

The study found that when subjects learned to suppress alpha waves in one hemisphere of their parietal cortex, they were able to pay better attention to objects that appeared on the opposite side of their visual field. This is the first time that this cause-and-effect relationship has been seen, and it suggests that it may be possible for people to learn to improve their attention through neurofeedback.

"There's a lot of interest in using neurofeedback to try to help people with various brain disorders and behavioral problems," says Robert Desimone, director of MIT's McGovern Institute for Brain Research. "It's a completely noninvasive way of controlling and testing the role of different types of brain activity."

It's unknown how long these effects might last and whether this kind of control could be achieved with other types of brain waves, such as beta waves, which are linked to Parkinson's disease. The researchers are now planning additional studies of whether this type of neurofeedback training might help people suffering from attentional or other neurological disorders.

Desimone is the senior author of the paper, which appears in Neuron on Dec. 4. McGovern Institute postdoc Yasaman Bagherzadeh is the lead author of the study. Daniel Baldauf, a former McGovern Institute research scientist, and Dimitrios Pantazis, a McGovern Institute principal research scientist, are also authors of the paper.

Alpha and attention

There are millions of neurons in the brain, and their combined electrical signals generate oscillations known as brain waves. Alpha waves, which oscillate at frequencies of 8 to 12 hertz, are believed to play a role in filtering out distracting sensory information.

Previous studies have shown a strong correlation between attention and alpha brain waves, particularly in the parietal cortex. In humans and in animal studies, a decrease in alpha waves has been linked to enhanced attention. However, it was unclear if alpha waves control attention or are just a byproduct of some other process that governs attention, Desimone says.

To test whether alpha waves actually regulate attention, the researchers designed an experiment in which people were given real-time feedback on their alpha waves as they performed a task. Subjects were asked to look at a grating pattern in the center of a screen, and told to use mental effort to increase the contrast of the pattern as they looked at it, making it more visible.

During the task, subjects were scanned using magnetoencephalography (MEG), which reveals brain activity with millisecond precision. The researchers measured alpha levels in both the left and right hemispheres of the parietal cortex and calculated the degree of asymmetry between the two levels. As the asymmetry between the two hemispheres grew, the grating pattern became more visible, offering the participants real-time feedback.
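To make the feedback loop concrete, here is a minimal sketch of how left-right alpha asymmetry could be computed and mapped to a contrast level. It is not the authors' MEG pipeline; the sampling rate, band limits, gain and simulated signals are all assumptions for illustration.

```python
# Illustrative sketch (not the authors' pipeline): compute alpha power in left and
# right parietal sensors, derive an asymmetry index, and map it to stimulus contrast.
import numpy as np

FS = 1000          # sampling rate in Hz (assumed)
ALPHA = (8, 12)    # alpha band in Hz

def alpha_power(signal, fs=FS, band=ALPHA):
    """Mean power in the alpha band from an FFT periodogram."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def asymmetry_index(left, right):
    """Normalized left-right alpha asymmetry, in [-1, 1]."""
    pl, pr = alpha_power(left), alpha_power(right)
    return (pr - pl) / (pr + pl)

def feedback_contrast(asym, gain=2.0):
    """Map asymmetry to a visible contrast level between 0 and 1."""
    return float(np.clip(0.5 + gain * asym, 0.0, 1.0))

# Example: simulated one-second epochs from left and right parietal sensors
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
left = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, FS)   # stronger alpha
right = 0.1 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, FS)  # suppressed alpha
print(feedback_contrast(asymmetry_index(left, right)))
```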

Although subjects were not told anything about what was happening, after about 20 trials (which took about 10 minutes), they were able to increase the contrast of the pattern. The MEG results indicated they had done so by controlling the asymmetry of their alpha waves.

"After the experiment, the subjects said they knew that they were controlling the contrast, but they didn't know how they did it," Bagherzadeh says. "We think the basis is conditional learning -- whenever you do a behavior and you receive a reward, you're reinforcing that behavior. People usually don't have any feedback on their brain activity, but when we provide it to them and reward them, they learn by practicing."

Although the subjects were not consciously aware of how they were manipulating their brain waves, they were able to do it, and this success translated into enhanced attention on the opposite side of the visual field. As the subjects looked at the pattern in the center of the screen, the researchers flashed dots of light on either side of the screen. The participants had been told to ignore these flashes, but the researchers measured how their visual cortex responded to them.

One group of participants was trained to suppress alpha waves in the left side of the brain, while the other was trained to suppress the right side. In those who had reduced alpha on the left side, their visual cortex showed a larger response to flashes of light on the right side of the screen, while those with reduced alpha on the right side responded more to flashes seen on the left side.

"Alpha manipulation really was controlling people's attention, even though they didn't have any clear understanding of how they were doing it," Desimone says.

Persistent effect

After the neurofeedback training session ended, the researchers asked subjects to perform two additional tasks that involved attention, and found that the enhanced attention persisted. In one experiment, subjects were asked to watch for the appearance of a grating pattern similar to the one they had seen during the neurofeedback task. In some of the trials, they were told in advance to pay attention to one side of the visual field; in others, they were not given any direction.

When the subjects were told to pay attention to one side, that instruction was the dominant factor in where they looked. But if they were not given any cue in advance, they tended to pay more attention to the side that had been favored during their neurofeedback training.

In another task, participants were asked to look at an image such as a natural outdoor scene, urban scene, or computer-generated fractal shape. By tracking subjects' eye movements, the researchers found that people spent more time looking at the side that their alpha waves had trained them to pay attention to.

"It is promising that the effects did seem to persist afterwards," says Desimone, though more study is needed to determine how long these effects might last.

Credit: 
Massachusetts Institute of Technology

Machine learning helps scientists measure important inflammation process

Image: A scanning electron microscope image of NETs engulfing fungal cells (Candida albicans) in an infected mouse lung. (Credit: Urban, et al., Max Planck Institute)

CHAPEL HILL, NC - Inflammation is a hallmark of many health conditions, but quantifying how the underlying biology of inflammation contributes to specific diseases has been difficult. For the first time, UNC School of Medicine researchers and colleagues now report the development of a new technology to identify white blood cells called neutrophils that are primed to eject inflammatory DNA into the circulation via a process called NETosis.

The findings, published in Scientific Reports, mark the first time scientists have used machine learning tools for rapid quantitative and qualitative cell analysis in basic science.

"This new test will allow investigators to measure NETosis in different diseases and to test drugs that may inhibit or promote the process," said senior author Leslie Parise, PhD, professor and chair of the UNC Department of Biochemistry and Biophysics.

When foreign invaders such as viruses or bacteria enter our bodies, white blood cells rush in to fight the invaders in various ways. One type of white cell, the neutrophil, expels its DNA into the bloodstream to trap bacteria and viruses and aid in their killing to prevent infections. This neutrophil DNA has a net-like appearance and is called Neutrophil Extracellular Traps, or NETs. The process by which this DNA is thrust into the extracellular space is called NETosis. These so-called DNA NETs are reported to contribute to inflammation in numerous diseases such as autoimmune disease, sepsis, arthritis, cancers, sickle cell disease, and thrombosis.

NETosis can be activated by different chemical stimuli, but to the eye, the final NETotic neutrophil looks the same. To help distinguish neutrophils activated by different stimuli, and to do so rapidly, the Parise laboratory leaned on machine learning, a branch of artificial intelligence built on the idea that computers can acquire knowledge through data and observations without explicit programming. The computers can then learn to generalize from examples and make predictions.

Machine learning has been used in analysis of genomics, drug discovery, modeling protein structures, and disease diagnosis. Few studies have used automated imaging technologies with machine learning in non-diagnostic and exploratory research-focused efforts, such as cell quantification in animal models.

"The machine learning revolution has come about only in the past few years due to rapid developments in parallel computing and mathematical optimization theory," said co-author Joshua Cooper, PhD, Professor in the Department of Mathematics at the University of South Carolina. "Most scientific work using these extraordinarily powerful tools thus far has focused on building predictive models, and usually on very expensive equipment. We have shown it's also possible to deploy these tools on commodity hardware to advance fundamental science by automating analyses previously requiring enormous amounts of human labor, and by transforming qualitative biological morphology into measurably quantitative features."

Cell classification is laborious, relying on heavily supervised image analysis tools with the need for continuous user interaction. Although the reduced error rates and exceptional learning speeds of machine learning have the potential to transform the field of cellular imaging, they have not been widely adopted in the biological sciences, in part because of a lack of testing and validation of such methods due to shortages of large datasets for training.

Convolutional neural networks (CNNs) are deep learning algorithms commonly used in image recognition and classification. Their structure was inspired by the structure of the mammalian visual cortex, whose function is pattern recognition and computing complex object attributes.
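For readers unfamiliar with CNNs, the sketch below shows what a small image classifier of this kind looks like in code. It is not the architecture used in the paper; the layer sizes, input dimensions and two-class setup (NETotic vs. non-NETotic) are illustrative assumptions.

```python
# Minimal sketch of a convolutional classifier for neutrophil images
# (NETotic vs. non-NETotic). NOT the authors' architecture; layer sizes,
# image dimensions, and class count are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_netosis_cnn(input_shape=(128, 128, 1), n_classes=2):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),   # low-level edge/texture filters
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),   # mid-level morphology features
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_netosis_cnn()
model.summary()
# Training would use labeled microscopy image patches, e.g.:
# model.fit(train_images, train_labels, validation_data=(val_images, val_labels), epochs=20)
```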

In the Scientific Reports paper, first author Laila Elsherif, PhD, and colleagues including co-authors Noah Sciaky at UNC and Joshua Cooper, PhD, at the University of South Carolina, demonstrated for the first time the feasibility of designing different CNNs to address key questions related to neutrophil NETosis.

Elsherif, who is now a faculty member at the University of Tennessee, applied this new technology to sickle cell disease (SCD) because chronic inflammation and hypercoagulability are well-known complications in SCD patients, and NETs are thought to be important in both inflammation and blood clotting. Also, previous studies found that plasma from SCD patients caused NET production in neutrophils from healthy individuals, leading to the conclusion that NETosis is associated with SCD pathophysiology.

However, those studies used surrogate indicators for NETosis and did not measure NETosis directly in neutrophils isolated from SCD patients.

"Our technology allows us to quantitatively assess NETosis in neutrophils of patients with SCD where patients are not experiencing pain and other symptoms associated with crisis," Elsherif said.

The investigators applied their new technology to find that one pathway of NETosis appears to be absent in the patients tested.

"The decreased potential for NETosis in neutrophils from patients with SCD could explain their increased susceptibility to certain invasive bacterial infections, as their neutrophils are unable to NETose and trap and kill bacteria efficiently," Elsherif said. "Our results could also reflect the response of neutrophils from patients with SCD to hydroxyurea or other pharmacological interventions. Additional studies with a larger patient cohort are certainly warranted and could elucidate important aspects in SCD disease mechanisms."

Credit: 
University of North Carolina Health Care

Paying taxes less 'taxing' when we recognize how those dollars help others -- study

Image: Jan Vasek; Jeshoots; CC0

There's nothing certain in life except death and taxes. But taxpayers' support for the latter could potentially be improved, according to a new study led by SFU psychology researchers Emily Thornton and Lara Aknin.

Their work, conducted alongside University of Kansas psychologist Nyla Branscombe and University of British Columbia economist John Helliwell, reveals that when taxpayers recognize their tax dollars are used to help others, they are more supportive of taxation and more willing to pay their taxes.

The researchers undertook a series of four studies that questioned participants about their views on taxation. The studies spanned a representative national sample of Americans, Canadian undergraduate students, and more than 470,000 adults from over 100 countries.

According to the Canada Revenue Agency (CRA), Canadian taxpayers who overstate deductions and understate income to minimize their taxes are contributing to an estimated $8.7-billion tax deficit.

"Many people express deep discontent about paying their taxes," says Thornton, a fourth-year psychology student and lead author of the paper.

"Some people will even go to extreme lengths to avoid paying taxes. Our findings raise an intriguing possibility-- would Canadians be more willing to pay their taxes if the CRA better publicized how their tax dollars help others?"

The research, published in PLOS ONE, says the relationship between seeing how taxes help others and supporting taxation still exists when other important factors are taken into account. For instance, Thornton and colleagues found the results still hold when controlling for demographic variables, participants' general pro-social orientation, and the perception that tax dollars are being put to good use.
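For readers curious what "controlling for" other factors looks like in practice, here is a hedged sketch of a regression with covariates. The dataframe, variable names and values are hypothetical placeholders, not the study's actual model or data.

```python
# Hedged illustration of "controlling for" other factors: regress support for
# taxation on perceived benefit to others plus covariates. Variable names and the
# dataframe are hypothetical; this is not the study's actual model or data.
import pandas as pd
import statsmodels.formula.api as smf

# df would hold one row per survey respondent
df = pd.DataFrame({
    "tax_support":       [4, 2, 5, 3, 4, 1, 5, 3],
    "perceived_benefit": [5, 2, 5, 3, 4, 1, 4, 2],
    "prosocial_orient":  [3, 2, 4, 3, 5, 2, 4, 3],
    "age":               [34, 52, 28, 61, 45, 39, 23, 50],
    "income":            [55, 40, 70, 35, 60, 30, 48, 52],  # thousands
})

# The coefficient on perceived_benefit estimates its association with tax support
# after holding the listed covariates constant.
model = smf.ols("tax_support ~ perceived_benefit + prosocial_orient + age + income",
                data=df).fit()
print(model.summary())
```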

"These findings align with a growing body of research underscoring the pro-social nature of human behaviour," says Aknin, SFU psychology professor. "And offer a more optimistic perspective on taxation."

Credit: 
Simon Fraser University

Dangerous skin tumor now has treatment guidelines

Yellow or shiny bump that doesn't go away

New treatment guidelines will prevent tumor spread and may save lives

Rare tumors often get neglected

CHICAGO --- It starts as a yellow or shiny bump on the eyelids, face or body that grows and doesn't go away, unlike a pimple.

The bump is sebaceous carcinoma, a cancer of the oil glands diagnosed in thousands of patients in the U.S. every year. If not removed and treated promptly, it can spread to other organs and cause grave harm to patients, including death. But up until now there was no commonly agreed method to treat it.

A new study from Northwestern Medicine reports the first guidelines for treating this rare tumor. Developing the guidelines required expertise from both dermatologists and ophthalmologists, because the cancer can occur on the skin as well as in the eye.

"Now there is a clear direction for treating these tumors, which may result in fewer tumors spreading to other organs and fewer deaths," said Dr. Murad Alam, vice chair and professor of dermatology at Northwestern University Feinberg School of Medicine and a Northwestern Medicine dermatologist.

The study was published Dec. 2 in The Lancet Oncology journal.

"With limited resources to develop guidelines, rare tumors often get neglected," Alam said. "It's also more challenging to develop guidelines for rare tumors because there is less data in the medical literature about these."

The guidelines tell physicians how best to perform biopsies to find (or exclude) sebaceous carcinoma. They also explain that physicians need to treat eyelid and non-eyelid sebaceous carcinoma differently. The surgical approaches described in the guidelines enable removal of sebaceous carcinoma with a high rate of cure without risking injury to important organs, like the eye.

Having a sebaceous carcinoma can be a helpful clue that the same patient has undetected Muir-Torre Syndrome, which increases the risk for colon cancer. When a sebaceous carcinoma is found in a patient, the guidelines explain whether and how the patient should also be checked for Muir-Torre Syndrome. If they have Muir-Torre, then they can be screened for colon cancer.

Northwestern Medicine has a particular expertise in the treatment of less common skin cancers. It is the lead coordinating site for several international collaborations in this area, notably the Committee on Invasive Skin Tumor Evidence-Based Recommendations, an international working group of specialists at major academic medical centers who develop tools (like guidelines) to better treat rare and uncommon skin cancers.

Credit: 
Northwestern University

Study prompts call for disaster-specific pharmacy legislation

Image: QUT's Dr Kaitlyn Watson, whose study calls for disaster-specific pharmacy legislation to cope with Australian bushfires. (QUT Media)

Pharmacists caught up in the Australian bushfire crisis are being hampered from providing timely and effective treatment to displaced people due to outdated laws, according to QUT researchers.

The legal barriers pharmacists face across Australian jurisdictions include emergency medication supply rules that cover only three days, as well as limitations on vaccination and relocation.

Dr Kaitlyn Watson, from QUT's Faculty of Health School of Clinical Sciences, said the medical impacts from disasters last much longer than three days.

"People panic and often forget to pack their vital medications," she said.

"To ensure they have supplies longer than what a pharmacist can administer they find they either go to hospitals' emergency departments to get a new script or find a GP.

"Both these options are difficult in disaster-hit areas."

Dr Watson said the current disaster medicine health care model is focused primarily on high-acuity patients; however, the demographic of those adversely affected in the longer term during a disaster is shifting.

"Disasters are increasing in frequency and severity displacing people from their homes and life-saving medicines disrupting the continuity of care particularly to the elderly, very young, people with reduced mobility and other vulnerable groups," she said.

Dr Watson has investigated pharmacy legislation specific to disasters in five countries: Australia, Canada, the United Kingdom, the United States and New Zealand.

The research has been published in Australian Health Review and co-authored by researcher Dr Judith Singleton and Professors Vivienne Tippett and Lisa Nissen.

The study found that legislative barriers limited the level of assistance pharmacists could provide during times of crisis, which in turn added to the burden on healthcare teams during disasters.

Dr Watson said pharmacists' responsibilities and capabilities should be widened legally, particularly to treat displaced people in disaster-affected regions.

"They are the community landmark and communication hub for these patients and where they can go to get healthcare," she said.

"The three-day supply rule should be extended to at least 30 days, similar to parts of the United States which changed their laws to help patients in the aftermath of Hurricane Katrina."

Dr Watson also said that pharmacists, in addition to dispensing medications, often provided basic necessities such as toiletries, hydration and sanitary products, frequently without charge.

"There are no current mechanisms in place for the reimbursement for pharmacies from local, state or federal governments when they supply these essential services," she said.

"They have a duty of care to their patients but often they're struggling as well in rural communities."

Dr Watson said pharmacists could undertake numerous clinical roles in a disaster and be involved in disaster health management planning and response.

"Pharmacies have been identified as one of the fastest community healthcare services to re-establish operations post disaster," she said.

Credit: 
Queensland University of Technology

Mobile devices blur work and personal privacy raising cyber risks, says QUT researcher

Image: QUT's Dr Kenan Degirmenci. (QUT Media)

Organisations aren't moving quickly enough on cyber security threats linked to the drive toward using personal mobile devices in the workplace, warns a QUT privacy researcher.

Dr Kenan Degirmenci from QUT's Science and Engineering Faculty's School of Information Systems said workers worldwide expected to be able to take their work with them whenever and wherever they chose.

But he warned Bring Your Own Device (BYOD) had opened up a can of worms for employers and employees alike.

Dr Degirmenci's published research, Future of Flexible Work in the Digital Age: Bring Your Own Device Challenges of Privacy Protection, will be presented at the International Conference on Information Systems in Munich in December.

"The breakneck speed of digital transformation brought with it opportunities as well as threats," he said.

"Organisations don't appear to be keeping up with the pace of change, deliberately putting the brakes on digital transformation because it comes with security challenges."

Dr Degirmenci said that, nonetheless, the BYOD and enterprise mobility market, which incorporates segments such as software, security, data management and network security, is estimated to grow to US$73 billion globally by 2021.

Data breaches including stealing of personal information are also on the rise for all kinds of businesses and workplaces.

Often employees use their personal devices, but many don't know if their employer has a policy in place to protect their data and usage.

"Some organisations wary of malware or theft of data can track employees' locations during work and non-work hours, wipe data, as well as access private emails and photos," Dr Degirmenci said.

The research involved a case study of two multinational companies and a survey of almost 550 employees from the United States, Germany and South Korea about bringing their own devices (BYOD) to work.

Dr Degirmenci said the multinational companies from the survey used mobile device management (MDM) to monitor, manage and secure devices of employees.

"American employees placed greater emphasis on BYOD risks compared to Germany and South Korea," he said.

He said Australia ranked similarly to the United States in terms of its "individualist-type culture" and while workers wanted increased flexibility there were drawbacks to using their own devices.

New technologies and digital capabilities are also omnipresent across the education sector, with schools adopting BYOD.

"We've recommended BYOD security management be improved, particularly for countries like America and Australia."

Credit: 
Queensland University of Technology

A common insulin signaling pathway across cancer and diabetes

An oncology researcher has made an unexpected contribution to the understanding of type 2 diabetes. In results published in Science Advances, Patrick Hu, M.D., associate professor of medicine at Vanderbilt University Medical Center, found a protein that modulates a signaling pathway often targeted by cancer therapies is also required for insulin biogenesis.

Hu and colleagues showed a protein controlling the PI3K/Akt pathway, a pathway targeted by more than 40 anti-tumor drugs, is absolutely required for the synthesis, processing and secretion of insulin. Hu initiated the discovery using the nematode worm Caenorhabditis elegans, a model system commonly used in research on genetics and development.

"We approached our work on the PI3K/Akt pathway from a cancer perspective, but this is a primordial pathway." Hu said. "It is equally relevant to diabetes and cancer. Understanding how this pathway is regulated could lead to new strategies to treat both diseases."

Insulin signaling is certainly involved in diabetes, but a related insulin-like growth factor signaling network is also implicated in cancer, Hu explained. In C. elegans, a primordial pathway exists that likely gave rise to both human pathways, providing a convenient research model.

"Both networks involve PI3K/Akt in humans, and we were looking for new components of this pathway," Hu said.

The researchers used a forward genetic approach to screen C. elegans worms for components of the pathway that are altered during abnormal insulin signaling. They landed on TRAP-alpha, a highly conserved protein across worms, flies and mammalian systems, including humans.

TRAP-alpha sits on a structure inside cells called the endoplasmic reticulum (ER), where it helps make proteins that will eventually be secreted. Deleting the worm equivalent of TRAP-alpha activated ER stress responses, the researchers found.

Given that some people with type 2 diabetes carry common genetic variants in the TRAP-alpha gene, Hu moved the experiments to pancreatic beta cells.

Hu collaborated with Ming Liu, M.D., and Peter Arvan, M.D., of the University of Michigan to delete TRAP-alpha from rat beta cells. The deletion caused a 90 percent reduction in total insulin content inside the cells. Instead of being shuttled through the ER for conversion to insulin and secretion, most parent molecules of insulin were degraded, and those that escaped degradation accumulated inside the beta cells; they were never processed into insulin or secreted. The finding shows that without TRAP-alpha, insulin biogenesis is drastically impaired.

Said Hu, "TRAP-alpha is the first situation where we've identified a mutant in the worm and then were able to move it into mammalian cell culture to show it affects a disease phenotype."

In both models, the researchers found deleting TRAP-alpha triggered the ER unfolded protein response. The cells detected unfolded proteins accumulating inside them, and decreased corresponding gene expression to combat it. The cells also increased expression of chaperone proteins that help fold proteins properly.

"We're moving toward the role of TRAP-alpha in maintaining protein homeostasis," Hu said. "Maintaining proper protein folding in the ER is certainly important for cellular health, and it likely contributes to human health in general."

Beyond diabetes, many other diseases are associated with abnormal protein folding responses and protein imbalances. These include neurodegenerative diseases such as amyotrophic lateral sclerosis and Alzheimer's disease.

"It's likely other secreted molecules besides insulin might be affected by TRAP-alpha deletion," Hu said. "If we can understand the broader role that TRAP-alpha is playing in maintaining protein homeostasis, we might develop new ways to approach other diseases, too."

Credit: 
Vanderbilt University Medical Center

U-M researchers discover stress in early life extends lifespan

ANN ARBOR--Some stress at a young age could actually lead to a longer life, new research shows.

University of Michigan researchers have discovered that oxidative stress experienced early in life increases stress resistance later in life.

Oxidative stress happens when cells produce more oxidants and free radicals than they can deal with. It's part of the aging process, but can also arise from stressful conditions such as exercise and calorie restriction.

Examining a type of roundworm called C. elegans, U-M scientists Ursula Jakob and Daphne Bazopoulou found that worms that produced more oxidants during development lived longer than worms that produced fewer oxidants. Their results are published in the journal Nature.

Researchers have long wondered what determines variability in lifespan, says Jakob, a professor of molecular, cellular and developmental biology. One part of that is genetics: if your parents are long-lived, you have a good chance of living longer as well. Environment is another part.

That other stochastic--or random--factors might be involved becomes clear in the case of C. elegans. These short-lived organisms are a popular model system among aging researchers in part because every hermaphroditic mother produces hundreds of genetically identical offspring. However, even if kept in the same environment, the lifespan of these offspring varies to a surprising extent, Jakob says.

"If lifespan was determined solely by genes and environment, we would expect that genetically identical worms grown on the same petri dish would all drop dead at about the same time, but this is not at all what happens. Some worms live only three days while others are still happily moving around after 20 days," Jakob said. "The question then is, what is it, apart from genetics and environment, that is causing this big difference in lifespan?"

Jakob and Bazopoulou, a postdoctoral researcher and lead author of the paper, found one part of the answer when they discovered that during development, C. elegans worms varied substantially in the amount of reactive oxygen species they produce.

Reactive oxygen species, or ROS, are oxidants that every air-breathing organism produces. ROS are closely associated with aging: the oxidative damage they elicit is what many anti-aging creams claim to combat. Bazopoulou and Jakob discovered that instead of having a shorter lifespan, worms that produced more ROS during development actually lived longer.

"Experiencing stress at this early point in life may make you better able to fight stress you might encounter later in life," Bazopoulou said.

When the researchers exposed the whole population of juvenile worms to external ROS during development, the average lifespan of the entire population increased. Though the researchers don't know yet what triggers the oxidative stress event during development, they were able to determine what processes enhanced the lifespan of these worms.

To do this, Bazopoulou sorted thousands of C. elegans larvae according to the oxidative stress levels they exhibited during development. By separating worms that produced large amounts of ROS from those that produced small amounts, she showed that the main difference between the two groups was a histone modifier whose activity is sensitive to oxidative stress conditions.
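As a rough illustration of this kind of group comparison, the sketch below splits a population by developmental ROS level and tests whether the high-ROS group lives longer. The numbers are simulated toy data, not the study's measurements, and the simple rank test stands in for whatever survival analysis the authors actually used.

```python
# Illustrative sketch of the comparison described above: split worms by
# developmental ROS level and compare lifespan distributions. Numbers are made up.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)
ros_level = rng.normal(loc=1.0, scale=0.3, size=2000)            # sorter readout per larva
lifespan = 14 + 4 * (ros_level - 1.0) + rng.normal(0, 3, 2000)   # days (toy model)

# Split at the median ROS level into "high" and "low" producers
median_ros = np.median(ros_level)
high = lifespan[ros_level >= median_ros]
low = lifespan[ros_level < median_ros]

stat, p = mannwhitneyu(high, low, alternative="greater")
print(f"median lifespan high-ROS: {np.median(high):.1f} d, "
      f"low-ROS: {np.median(low):.1f} d, p={p:.2e}")
```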

The researchers found that the temporary production of ROS during development caused changes in the histone modifier early in the worm's life. How these changes persist throughout life and how they ultimately affect and extend lifespan is still unknown. What is known, however, is that this specific histone modifier is also sensitive to oxidative stress in mammalian cells. Additionally, early-life interventions have been shown to extend lifespan in mammalian model systems such as mice.

"The general idea that early life events have such profound, positive effects later in life is truly fascinating. Given the strong connection between stress, aging and age-related diseases, it is possible that early events in life might also affect the predisposition for age-associated diseases, such as dementia and Alzheimer's disease," Jakob said.

Next, the researchers want to figure out what key changes are triggered by these early-life events. Understanding this might allow scientists to develop lifespan-extending interventions that work at later stages in life.

Credit: 
University of Michigan

Untangling the branches in the mammal tree of life

New Haven, Conn. -- The mammal tree of life is a real leaner. Some branches are weighed down with thousands of species -- we're looking at you, rodents and bats -- while others hold just a few species.

Now we may have a better idea why.

In a new study published in the journal PLOS Biology, researchers at Yale University unveil a complete overhaul of the way species data is brought together and analyzed to construct an evolutionary tree of life for mammals. It's aimed at giving scientists, conservation managers, policymakers, and environmentalists more accurate, comprehensive information about species diversity and relationships, past and present.

"The fossil and genomic data we use are often fragmentary and messy, but the reality is we are reconstructing events that occurred millions of years ago in long-extinct mammals," said Nathan Upham, a Yale postdoctoral associate in ecology and evolutionary biology and first author of the study.

Of the roughly 6,000 species of living mammals, most are rodents (42%) or bats (24%), while familiar mammals such as cows, pigs, sheep, cats, raccoons, and monkeys comprise relatively few species. Yet up to this point, attempts to formulate a tree of life for mammals have been unable to explain this unevenness of species diversity.

Upham and senior author Walter Jetz, professor of ecology and evolutionary biology at Yale, took a new approach. They reconstructed the evolutionary relationships of species by creating "patches" of smaller, more accurate evolutionary trees that were then linked to a carefully developed "backbone" representing the deep divergences in the tree. This resulted in 10,000 big trees -- designed so they can be studied individually or together -- that also point out the remaining gaps in data for the overall mammal tree of life.
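To make the "backbone-and-patch" idea more concrete, here is a schematic sketch that grafts small, well-resolved patch trees onto placeholder tips of a backbone tree. The Newick strings and taxon names are toy examples; the actual study used far more elaborate statistical methods, so this is only a conceptual illustration.

```python
# Schematic sketch of a "backbone-and-patch" assembly: graft smaller, well-resolved
# patch trees onto placeholder tips in a backbone tree. Newick strings and taxon
# names are toy examples, not the study's data or code.

def graft(backbone_newick: str, patches: dict) -> str:
    """Replace each placeholder tip in the backbone with its patch subtree."""
    tree = backbone_newick
    for placeholder, patch_newick in patches.items():
        # strip the patch's trailing semicolon before splicing it in
        tree = tree.replace(placeholder, patch_newick.rstrip(";"))
    return tree

backbone = "((RODENTIA_PATCH,Lagomorpha),(CHIROPTERA_PATCH,Carnivora));"
patches = {
    "RODENTIA_PATCH": "((Mus,Rattus),Castor);",
    "CHIROPTERA_PATCH": "(Pteropus,(Myotis,Desmodus));",
}

print(graft(backbone, patches))
# ((((Mus,Rattus),Castor),Lagomorpha),((Pteropus,(Myotis,Desmodus)),Carnivora));
```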

"We're calling it a 'backbone-and-patch' approach," Jetz said. "For the first time, we're able to characterize the genetic relationships of essentially all living mammals while transparently relaying the parts that remain uncertain. It should enable advances in a variety of fields, including comparative biology, ecology, and conservation."

The completeness and accuracy of this information is important, Jetz added, as evolutionary distinctiveness is increasingly used to set conservation priorities. It can be useful for researchers and policymakers in the U.S. to know, for example, that the closest genetic relatives of the pronghorn antelope are not nearby mammal species, but the giraffes and okapi of Africa.

The researchers also developed an interactive tool for exploring the mammal tree of life. The interface, which is downloadable, lets users examine information both at the species level and also more broadly.

Upham said further research will use the new information to look at how the uneven distribution of species in the mammal tree of life is related to geographic isolation among mammal populations, which can lead to higher rates of speciation -- the evolutionary process of forming new species -- and extinction.

Credit: 
Yale University

Immigrants who naturalize outearn their peers

The moment when an immigrant becomes a citizen of his adopted country looks remarkably similar in ceremonies around the world: a hand raised, an oath taken, a flag waved, and a celebration with family and friends. But the road leading to that moment differs widely by country. Some are long and steep and others more walkable, depending on the country's policies.

Behind this divergence is a kind of chicken-and-egg problem. Is citizenship a prize, something to be won only after considerable striving? Then it should be surrounded by hurdles, like requirements that you've mastered the language, lived in the country a long time, and achieved a certain level of economic success. Or is citizenship an invitation to build a future in the country, something that helps immigrants succeed? Then it should be easier to get.

Which side has the better of the argument? A new study from the Immigration Policy Lab at ETH Zurich and Stanford University (IPL) sheds light on the importance of citizenship in immigrants' trajectories. Looking at more than thirty years of data on thousands of immigrants in Switzerland, IPL researchers found that those who had naturalized earned more money each year than those who hadn't--and the boost in income was largest for people facing the greatest disadvantages in the labor market.

A Puzzle for Researchers

Considering the benefits usually reserved for citizens, it's easy to imagine how naturalizing early on could equip immigrants to prosper: access to advantageous jobs, eligibility for scholarships to get education and training, and the assurance that they can stay in the country indefinitely and invest in the future.

But it's hard to prove that citizenship actually delivers on this promise, because those who get citizenship and those who don't aren't similar enough to allow for meaningful comparison. People who jump the hurdles to apply for citizenship differ in many ways from those who hold back, and successful applicants differ from unsuccessful ones. If naturalized immigrants do better in the long run, this could be due to any number of factors--factors that, like work ethic or resources, also account for their ability to successfully navigate the citizenship application process.

"To accurately assess the benefits of citizenship it is essential to compare naturalized and non-naturalized immigrants that are similar in all characteristics but for their passport", said Dalston Ward, a postdoctoral researcher at ETH Zurich.

This is where Switzerland is a boon to social scientists. Between 1970 and 2003, some Swiss towns put citizenship applications to a popular vote. To become a Swiss citizen, an immigrant would have to receive more "yes" than "no" votes. For applicants who won or lost by only a handful of votes, the decision may as well have been pure chance, enabling an apples-to-apples comparison. Combine that with decades of records from the Swiss pension system showing annual earnings, and you have a trustworthy way to determine whether or not citizenship actually improves immigrants' fortunes.
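The sketch below shows, in schematic form, what such a close-vote comparison might look like: keep only applicants decided by a narrow margin, then compare the post-vote earnings of narrow winners and narrow losers year by year. The file, column names and bandwidth are hypothetical assumptions, not the study's actual data or code.

```python
# Hedged sketch of the close-vote comparison: restrict to applicants decided by a
# narrow margin, then compare post-vote earnings of narrow winners and narrow
# losers. Column names and the data file are hypothetical, not the study's data.
import pandas as pd

# One row per applicant-year: vote margin (share of "yes" minus 0.5), a boolean
# naturalized flag, years since the vote, and annual earnings from pension records.
df = pd.read_csv("applicant_earnings.csv")  # hypothetical file

BANDWIDTH = 0.05  # keep only applicants who won or lost by <= 5 percentage points
close = df[df["vote_margin"].abs() <= BANDWIDTH]

post = close[close["years_since_vote"] > 0]
gap_by_year = (post.groupby(["years_since_vote", "naturalized"])["earnings"]
                   .mean()
                   .unstack("naturalized"))
gap_by_year["earnings_gap"] = gap_by_year[True] - gap_by_year[False]
print(gap_by_year)
```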

Long-Term Benefits

After identifying those who narrowly won or lost their bid for citizenship, the researchers looked back at the five years leading up to the vote that would divide them. There, the two groups had similar incomes. But after the vote, the new citizens went on to earn more money than those who remained in permanent residency status, and the earnings gap increased as time went on. At first, they earned an average of about 3,000 Swiss francs more (roughly the same in U.S. dollars), and that gap grew to almost 8,000 francs a decade later. In any given year after the vote awarded them citizenship, these immigrants earned an average of 5,637 francs more than their peers.

"In sum, these findings provide causal evidence that citizenship is an important catalyst for economic integration, which benefits both immigrants and host communities", said Jens Hainmueller, a professor of political science at Stanford University.

If citizenship was the wedge between the two groups, how exactly did it lift one above the other? The most likely explanation, the researchers thought, was that it counteracted the discrimination that colors immigrants' lives in the job market. When immigrants apply for jobs in Switzerland, their citizenship status is almost as visible as hair color or height, and individual employers can use it to filter candidates. Immigrants who haven't become citizens may be seen as less skilled or less likely to remain in the country. On the other hand, because it is relatively difficult to gain citizenship in Switzerland, it may act as a kind of credential.

A closer look at the data bears this out. Citizenship made the greatest difference for immigrants facing obstacles--those likely to be discriminated against for their religion or country of origin, or those in low-wage occupations. When the researchers focused on immigrants from Turkey and the former Yugoslavia, who were often refugees and potentially targets of anti-Muslim sentiment, they found an average yearly earnings gain of 10,721 francs--roughly double that of the new citizens as a whole.

According to Dominik Hangartner, a professor of public policy at ETH Zurich, "the finding that the benefits are disproportionally larger for poorer and more marginalized immigrants speaks to the important role that citizenship policies can play in facilitating more equal access to employment opportunities for immigrants."

While income is only one element of an immigrant's life, the persistence of the earnings gap revealed in this study raises an important question about the public purpose of citizenship. We tend to think of citizenship as a private issue, personally meaningful to the immigrant but not necessarily something society or state should invest in.

But if citizenship can counter discrimination, boost social mobility, and act as a stepping stone toward deeper integration, then its benefits reach beyond immigrants themselves. That means that we all have a stake in the debate over whether to obstruct or ease access to citizenship. At a time when cities, states, and countries around the world are reconsidering their welcome to immigrants, it's all the more important to have solid evidence about the contributions newcomers can make--and the policies that best encourage them.

Credit: 
Stanford University - Immigration Policy Lab

Single dose of ketamine plus talk therapy may reduce alcohol use

NEW YORK, NY (Dec. 4, 2019)--A single infusion of ketamine, combined with outpatient behavioral therapy, helped alcohol-dependent individuals abstain from drinking for a few weeks after the treatment, researchers at Columbia University Vagelos College of Physicians and Surgeons and New York State Psychiatric Institute have found. The findings could lead to a new approach to the treatment of alcohol use disorder.

Results of the trial were published online on December 2 in the American Journal of Psychiatry.

"Our findings add to a growing body of evidence that a single dose of medications with powerful psychoactive effects, such as MDMA, psilocybin, and ketamine, may have immediate and long-lasting effects on behavior, especially when integrated with psychotherapy," says lead author Elias Dakwar, MD, associate professor of clinical psychiatry at Columbia University Vagelos College of Physicians and Surgeons.

In 2017, alcohol use disorder affected an estimated 14.1 million adults and 443,000 adolescents in the U.S., resulting in 88,000 deaths, according to the National Institute on Alcohol Abuse and Alcoholism. Those with alcohol use disorder are unable to stop or control alcohol use despite adverse social, occupational, or health consequences.

In a previous study, Dakwar and his colleagues showed that one dose of the anesthetic ketamine combined with behavioral modification therapy promoted abstinence and reduced cravings in people dependent on cocaine. "Based on those findings, we thought it would be valuable to test this combination of therapies in people with alcohol use disorder, which is much more pervasive," says Dakwar.

In the new study, 40 people with alcohol use disorder who were seeking treatment were randomized to receive either a single sub-anesthetic dose of ketamine or midazolam (a drug used to treat alcohol withdrawal). All participants also received motivational enhancement treatment, a type of psychotherapy that has shown only modest success in helping people with alcohol use disorder. The researchers hypothesized that motivational therapy may work better when combined with a single ketamine infusion at the beginning of therapy.

Ketamine plus therapy reduced alcohol use

The ketamine group had a higher likelihood of abstinence; 82% remained abstinent at the end of the study, 3 weeks after getting the infusion, compared to 65% of the midazolam group. The ketamine group also took longer to relapse and had fewer heavy drinking days than the midazolam group.
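As a purely illustrative aside, a comparison of abstinence rates between two trial arms can be expressed as a two-proportion test, sketched below. The counts are placeholders chosen only to roughly mirror the reported percentages; they are not the trial's data, and the published analysis used more elaborate longitudinal models.

```python
# Hedged sketch of comparing abstinence rates between two trial arms with a
# two-proportion z-test. The counts below are placeholders, not the trial's data.
from statsmodels.stats.proportion import proportions_ztest

abstinent = [16, 13]   # placeholder: abstinent participants in ketamine, midazolam arms
enrolled = [20, 20]    # placeholder: participants per arm

stat, p = proportions_ztest(count=abstinent, nobs=enrolled)
print(f"ketamine arm: {abstinent[0]/enrolled[0]:.0%} abstinent, "
      f"midazolam arm: {abstinent[1]/enrolled[1]:.0%}, z={stat:.2f}, p={p:.3f}")
```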

In addition, those who stopped drinking after the ketamine infusion were more likely to resume abstinence after relapse than the midazolam group.

Ketamine treatment was well tolerated, without any adverse effects or misuse of the study drugs, the researchers reported.

Ketamine may improve motivation

Researchers are not certain how ketamine helps people abstain from drinking. "One possibility is that ketamine addresses addiction-related vulnerabilities, like low motivation and low resilience, that contribute to problematic use. This may create a window where they can benefit more from behavioral treatment and lay the groundwork to meet their goals," says Dakwar.

"In our participants, ketamine appears to have increased resilience and reduced demoralization after a lapse," says Dakwar. "Participants may have been better able to bounce back after slipping, and they may have been more motivated to resume the work of recovery. In the midazolam group, on the other hand, there was a higher likelihood of escalating use after slipping and either relapsing or dropping out."

The researchers are currently studying whether multiple doses of ketamine can further improve abstinence in people with alcohol use disorder.

"It's gratifying and hopeful to see innovative research making progress in developing new treatments for substance use disorders," says Jeffrey A. Lieberman, chair of the Department of Psychiatry at Columbia University Vagelos College of Physicians and Surgeons.

Credit: 
Columbia University Irving Medical Center

First experimental genetic evidence of the human self-domestication hypothesis

Image: Research study from the University of Barcelona identified a genetic network involved in the unique evolutionary trajectory of the modern human face and prosociality not found in Neanderthals. (Thomas O'Rourke/UB)

The study, published in Science Advances, results from the collaboration between a UB team led by Cedric Boeckx, ICREA professor from the Section of General Linguistics at the Department of Catalan Philology and General Linguistics, and member of the Institute of Complex Systems of the UB (UBICS), and researchers from the team led by Giuseppe Testa, lecturer at the University of Milan and the European Institute of Oncology.

An evolutionary process similar to animal domestication

The idea of human self-domestication dates back to the 19th century. It is the claim that anatomical and cognitive-behavioral hallmarks of modern humans, such as docility or a gracile physiognomy, could result from an evolutionary process bearing significant similarities to the domestication of animals.

The key role of neural crest cells

Earlier research by Cedric Boeckx's team had found genetic similarities between humans and domesticated animals. The aim of the present study was to take a step further and deliver empirical evidence focusing on neural crest cells, a population of migratory, pluripotent cells - able to form all the cell types in a body - that arises during vertebrate development and plays an important role in it. "A mild deficit of neural crest cells has already been hypothesized to be the factor underlying animal domestication. Could it be that humans got a more prosocial cognition and a retracted face relative to other extinct humans in the course of our evolution as a result of changes affecting neural crest cells?" asks Alejandro Andirkó, a PhD student at the Department of Catalan Philology and General Linguistics of the UB, who took part in the study.

To test this relationship, researchers focused on Williams syndrome, a human neurodevelopmental disorder characterized by both craniofacial and cognitive-behavioral traits relevant to domestication. The syndrome is a neurocristopathy: a disorder caused by a deficit of a specific cell type during embryogenesis - in this case, neural crest cells.

In this study, researchers from the team led by Giuseppe Testa used in vitro models of Williams syndrome with stem cells derived from the skin. Results showed that the BAZ1B gene -which lies in the region of the genome causing Williams Syndrome- controls neural crest cell behavior: lower levels of BAZ1B resulted in reduced neural-crest migration, and higher levels produced greater neural-crest migration.

Comparing modern human and Neanderthal genomes

Researchers examined this gene in archaic and modern human genomes. "We wanted to understand if neural crest cell genetic networks were affected in human evolution compared to the Neanderthal genomes", Cedric Boeckx said.

Results showed that BAZ1B affects a significant number of genes that have accumulated mutations at high frequency in all living human populations, mutations not found in the archaic genomes currently available. "We take this to mean that the BAZ1B genetic network is an important reason our face is so different when compared with our extinct relatives, the Neanderthals," Boeckx said. "In the big picture, it provides for the first time experimental validation of the neural crest-based self-domestication hypothesis," he continued.

An empirical way to test evolutionary claims

These results open the way to studies tackling the role of neural crest cells in prosociality and other cognitive domains, and they also represent one of the first examples of a potential subfield for testing evolutionary claims. "This research constitutes one of the first studies that uses cutting-edge empirical technologies in a clinical setting to understand how humans have evolved since the split with Neanderthals, and establishes Williams Syndrome in particular as a unique atypical neurodevelopmental window onto the evolution of our species," Boeckx concludes.

Credit: 
University of Barcelona