Culture

Using virtual reality to help individuals with autism spectrum disorder

image: The premier peer-reviewed journal for authoritative research on understanding the social, behavioral, and psychological impact of today's social networking practices, including Twitter, Facebook, and internet gaming and commerce.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, January 28, 2020--Novel interventions that use virtual reality to help individuals with autism spectrum disorder (ASD) handle common scenarios include helping youngsters navigate air travel. This example and more are included in a Special Issue on Virtual Reality Interventions for Autism Spectrum Disorder published in Cyberpsychology, Behavior, and Social Networking, a peer-reviewed journal from Mary Ann Liebert, Inc., publishers. Click here to read the full special issue free on the Cyberpsychology, Behavior, and Social Networking website through February 28, 2020.

The article "Virtual Reality Air Travel Training with Children on the Autism Spectrum: A Preliminary Report" was coauthored by Ian Miller, Brenda Wiederhold, PhD, Catherine Miller, and Mark Wiederhold, MD, PhD, of the Interactive Media Institute (San Diego, CA), Virtual Reality Medical Center (La Jolla, CA), and Speech Tree Therapy Center (Chula Vista, CA). The researchers used an iPhone and Google Cardboard device to deliver the virtual reality intervention once a week for 3 weeks to each child with ASD. At week 4, the children went to the airport for a real-world air travel experience. Both the parents and researchers observed improvement in the children's air travel skills after the intervention, and all of the children were able to navigate the real-world airport under their own power.

Gregory Kuper, Kate Ksobiech, PhD, Jonathan Wickert, PhD, Frederick Leighton, and Edward Frederick, PhD coauthored the article entitled "An Exploratory Analysis of Increasing Self-Efficacy of Adults with Autism Spectrum Disorder Through the Use of Multimedia Training Stimuli." The researchers found significant increases in the participating adults' self-efficacy when a multimedia training approach (in this case, video and virtual reality) was used to teach them a vocational skill.

The article entitled "An Auti-Sim Intervention: The Role of Perspective Taking in Combating Public Stigma with Virtual Simulations" describes a study in which a virtual reality tool is intended to reduce the stigma associated with ASD and sensory overload symptoms. Coauthors Melanie Sarge, PhD, Hark-Shin Kim, PhD, and John Velez, PhD, from Indiana University (Bloomington) and Utah Valley University (Orem) exposed a group of college students to a virtual simulation called Auti-Sim, which is intended to simulate the sensory overload of ASD. Compared to students not exposed to Auti-Sim, those who engaged with the virtual simulation showed greater perspective taking, resulting in greater emotional concern, helping intentions, and willingness to volunteer.

"From education and training to treatment and assimilation, VR technologies are assisting across a range of experiences to help us better serve those with ASD. With the increasing focus on integrating a neurodiverse population into today's workforce, these new tools can assist us in achieving our inclusionary goals," says Editor-in-Chief Brenda K. Wiederhold, PhD, MBA, BCB, BCN, Interactive Media Institute, San Diego, California and Virtual Reality Medical Institute, Brussels, Belgium.

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Concordian examines the link between cognition and hearing or vision loss

image: This is Professor of Psychology Natalie Phillips.

Image: 
Concordia University

There is a long-established and widely recognized link between declines in sensory acuity — particularly hearing and vision — and cognition. Data from the Canadian Longitudinal Study on Aging (CLSA), involving tens of thousands of participants across the country aged 45 to 85, backs this up.

A recent study led by Montreal researchers asks why this relationship exists. Concordia’s Natalie Phillips and her colleagues found that poor hearing especially was linked to declines in memory and executive function in otherwise relatively healthy, autonomous, community-dwelling older adults.

Their paper, published in the journal Scientific Reports, asks if social factors — loneliness, depression and so on — also play a role in cognitive decline.

“We really want to look at individuals who have more restricted social networks and less social support,” says Phillips, professor of psychology in the Faculty of Arts and Science and the paper’s co-author. “Are the ones who are getting less brain stimulation and less social enrichment experiencing poorer cognition?”

Phillips says they really did not find strong evidence of that.

“All we can say at this point is that individuals who have poorer sensory abilities have poorer cognitive abilities, and we can’t explain it by more restricted social networks or social functioning.”

The study’s research team consisted of Phillips, Walter Wittich at the Université de Montréal, Kathleen Pichora-Fuller at the University of Toronto and Paul Mick at the University of Saskatchewan. The study’s lead author, Anni Hämäläinen, was a postdoctoral fellow working with the team.

Looking for the why

The authors examined data the CLSA collected between 2012 and 2015. None of the respondents lived in institutions or suffered from any kind of cognitive impairment. The researchers also controlled for other factors, such as age, gender, education and health status. They found the relationship between sensory acuity and cognitive function was strong in all circumstances.

“We wanted to know why this relationship between our sensory and cognitive abilities exists,” Phillips explains.

She lists four hypotheses. First is a common cause of deterioration: as individuals get older, their cognition simply deteriorates along with their hearing and vision.

A second hypothesis posits that sensory difficulty leads to poor-quality information entering the brain. Over the long term, poor-quality information leads to poorer cognitive functioning.

A third idea relies on resource expenditure. If your brain is expending a lot of energy trying to understand what is being said or what is being presented visually, there will be relatively fewer resources available to process that information cognitively.

The researchers were unable to test these first three hypotheses because they would need longitudinal follow-up data to compare with these baseline results. However, they were able to test the fourth hypothesis, which looks at the established link between sensory decline and negative social outcomes such as the potential for increased depression and social withdrawal and isolation.

“It becomes hard to navigate your world,” explains Phillips. “Going out to social activities or maintaining conversations becomes complicated. So we wanted to test whether this relationship between sensory function and cognition was mediated by limitations in someone’s social network.” In the end, the researchers did not find this to be the case.

Asking why not

Phillips and her colleagues note that the study of CLSA data is in its early stages. After all, this is the first wave of data made available from a study that will span 20 years. There is a lot left to learn.

For instance, they identified that poor hearing predicted poor executive function even though the majority of the tests were visual in nature. “We see this relationship between poor hearing and cognition regardless of the modality that you get the cognitive information from,” she says.

Phillips adds that her team will continue to study CLSA data going forward.

“We want to get access to the genetic data that’s available to see if there is a certain genetic profile that is more important for this sensory-cognitive relationship.”

Credit: 
Concordia University

Bad to the bone: Specific gut bacterium impairs normal skeletal growth and maturation

image: Tartrate-resistant acid phosphatase (TRAP) stained tibiae in germ-free (GF) and segmented filamentous bacteria (SFB)-monoassociated mice. Mice colonized with SFB displayed an increase in osteoclasts (stained in red), which resorb bone.

Image: 
Dr. Chad M. Novince from the Medical University of South Carolina

Microbes are often seen as pathogens that cause disease, but the picture is actually more complex than that limited view. The gut microbiome, which is the collection of microorganisms that colonize the healthy gut, is considered a supporting organ that can regulate host biological functions, including skeletal health. Researchers at the Medical University of South Carolina (MUSC) who study osteoimmunology, the interface of the skeletal and immune systems, have examined the impact of a specific microorganism, segmented filamentous bacteria (SFB), on post-pubertal skeletal development. Their results, published online on Jan. 10 in the Journal of Bone and Mineral Research Plus, showed that SFB elevated the response of specific immune cells in the gut and liver. This response led to increased osteoclast activity and decreased osteoblast activity, cumulatively impairing bone mass accrual.

"This is the first known report to show that within the complex gut microbiome, specific microbes have the capacity to affect normal skeletal growth and maturation," said Chad M. Novince, D.D.S., Ph.D., an assistant professor in the colleges of Medicine and Dental Medicine, who studies the impact of the microbiome on osteoimmunology and skeletal development.

The Novince lab focuses on the post-pubertal phase of skeletal development, the critical window of plasticity that supports the accrual of approximately 40% of a person's peak bone mass. Recent work from the lab showed that the gut microbiome heightens immune responses in the liver and bone environment, which impairs skeletal bone mass. SFB has been shown by other groups to activate the immune response of TH17 cells in an interleukin-17A (IL-17A) dependent manner. The research aimed to link these aspects of SFB-mediated immunity and determine whether specific gut microbes have the ability to affect skeletal health.

To study the effects of SFB and the gut microbiome on skeletal health, the Novince lab utilized a mouse model with a defined microbiota. This research was facilitated by MUSC's Gnotobiotic Animal Core, which is directed by Caroline Westwater, Ph.D., co-author and professor in the College of Dental Medicine. Available through the unique animal core, germ-free mice (no microbiota) and gnotobiotic mice (defined microbiota) provide an excellent model for studying the contributions of the microbiome to skeletal development.

"Whenever we think of bone, it's always the balance of osteoblasts and osteoclasts; the osteoblasts form the bone, and the osteoclasts resorb the bone," said Novince. "SFB colonization caused a shift in both sides of the axis: the osteoclast activity went up, and the osteoblast activity went down, which is detrimental to the skeleton."

To begin its studies, the Novince lab compared germ-free mice to SFB-monoassociated mice. Mice that had SFB exhibited a 20% reduction in trabecular bone volume - the type of bone that undergoes high rates of bone metabolism. Further examination of these mice showed that SFB colonization led to increased IL-17A levels in the gut and circulation, and enhanced osteoclast potential.

Next, the Novince lab examined whether the presence of SFB within a complex gut microbiota could influence normal skeletal development. They compared Taconic specific-pathogen-free mice that had SFB as part of their microbiome to Taconic specific-pathogen-free mice that lacked SFB.

The presence of SFB within a complex microbiota led to reduced trabecular bone volume, which was attributed to increased osteoclast activity and decreased osteoblast activity. SFB colonization led to a proinflammatory immune state in the gut, with increased numbers of myeloid-derived suppressor cells (MDSCs) and M1-macrophages in the lymph nodes associated with the gut. Moreover, SFB colonization increased IL-17A levels in the gut and circulation.

Interestingly, the SFB presence in a complex gut microbiota also profoundly stimulated hepatic immunity. SFB colonization upregulated proinflammatory immune factors in the liver and increased TH17 cells in the liver draining lymph nodes. Furthermore, the presence of SFB led to higher levels of acute phase reactants - immune factors that are made in the liver and secreted into the circulation. Lipocalin-2 (LCN2) was particularly interesting because it is an antimicrobial peptide that influences bone metabolism.

SFB colonization resulted in increased circulating levels of IL-17A and LCN2, which are both factors that support osteoclast activity and suppress osteoblast activity. Taken together, these data show that SFB plays a critical role in regulating the immune response in both the gut and liver, which has significant effects on the skeleton, and provide strong support that gut microbiota actions on the skeleton are mediated in part through a Gut-Liver-Bone Axis.

Additionally, the contribution of SFB to skeletal health may have significant clinical implications.

"If we can prevent the colonization or deplete specific microbes such as SFB from the microbiome, there is a clinical potential to optimize bone mass accrual during post-pubertal skeletal development," said Novince.

It is known that diet, probiotics and antibiotics have significant effects on the make-up of the microbiome, including SFB colonization. A majority of a person's bone mass accrues post-pubertally, during adolescence. As people age, they slowly begin to lose bone mass, which puts them at risk for fractures and osteoporosis. Modulating SFB through non-invasive interventions, such as diet or probiotics, could optimize peak bone mass accrual during adolescence. This would limit the risk of aging-associated low bone mass and related fractures.

In summary, Novince's group has shown that within the complex microbiome, a specific commensal microbe, SFB, has the capacity to critically change the way the microbiome interacts with the host immune system and skeleton. This immune response, mediated through a Gut-Liver-Bone Axis, induces a pro-osteoclast and anti-osteoblast environment in the bone, which impairs normal skeletal growth and maturation. The strong contribution of hepatic immunity warrants further investigation into how the acute phase products may have a feedback effect on the microbiome or additional effects on the host.

Credit: 
Medical University of South Carolina

Harrington Seed Destructor kills nearly 100 percent of US agronomic weed seeds in lab study

image: A new University of Illinois study tested the weed-seed-killing power of the Harrington Seed Destructor, an impact mill used to reduce the weed seed bank at harvest.

Image: 
Lauren Quinn, University of Illinois

URBANA, Ill. - In the battle against herbicide-resistant weeds, farmers are increasingly eager to add non-chemical control methods to their management toolbox. Impact mills, which destroy weed seeds picked up by a combine, have been shown to kill 70-99% of weed seeds in soybeans, wheat, and other small-statured cropping systems. And a recent Weed Science study from the University of Illinois shows even seeds that appear unscathed after impact milling don't germinate the following spring.

"Harvest weed seed control is really becoming an accepted part of integrated weed management," says Adam Davis, study co-author and head of the Department of Crop Sciences at U of I. "Producers are excited about it."

In the current study, Davis and his collaborators wanted to see how the Harrington Seed Destructor (HSD), an impact mill developed and widely used in Australia, handled common U.S. agronomic weeds without the complications of real field conditions.

The researchers collected seeds from 10 common weed species in soybean fields in the U.S. Midwest and Mid-Atlantic regions. They fed the seeds through a stationary HSD, and then tried germinating them in a greenhouse and in the field following a typical Illinois winter.

Davis says 0 to 15% of the seeds appeared to be undamaged immediately after milling, regardless of species and seed size. But when the undamaged seeds were buried in the field and left through the winter, fewer than 10% survived. "Basically, almost zero survived overall."
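Those two figures combine into a rough upper bound on overall survival (a back-of-the-envelope calculation from the percentages quoted above, not a statistic reported in the study):

```python
# Worst case from the study's reported ranges: 15% of milled seeds
# appear undamaged, and fewer than 10% of those survive winter burial.
undamaged_fraction = 0.15      # upper end of "0 to 15%"
survival_of_undamaged = 0.10   # upper bound of "fewer than 10%"

# Multiplying the two gives the overall survival ceiling.
overall_survival = undamaged_fraction * survival_of_undamaged
print(f"Upper bound on overall seed survival: {overall_survival:.1%}")  # 1.5%
```

At most roughly 1.5 seeds per hundred survive both milling and winter, consistent with Davis's "almost zero survived overall."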

Based on his previous research, Davis thinks microscopic abrasions from the impact mill damage the seed coat enough for microbes to enter and destroy the embryonic weed inside.

Can producers expect nearly zero weed seed survival when using the HSD or other impact mills in the field? Probably not. Davis and his collaborators have been conducting U.S. field trials with the HSD for five years, and typically see a reduction in weed seed rain by 70 to 80%.

"The difference between its efficacy as a stationary device and its efficacy in the field is largely due to shattering of the weeds," Davis explains. "As the combine is going through, it's shaking everything and causing a lot of seed dispersal. By looking at the HSD as a stationary device, we're able to quantify the theoretical max."

Whether impact mills kill 70 or 99% of weed seeds, non-chemical control strategies are important in slowing the evolution of herbicide resistance. However, over-reliance on any one strategy could select for additional problematic traits in weeds.

"If producers start using this device on a large scale, they will ultimately select for earlier shattering. It's already been shown in Australia," Davis says. "That's just the nature of weed and pest management in general. Really what you're doing is managing evolution. In order for any tactic to be successful, you've got to change it up. You need to confuse them; add diversity in the time of year and life stages you're targeting. We're just proposing this as a new tactic that's effective - not the only tactic."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Researchers generalize Fourier's heat equation, explaining hydrodynamic heat propagation

image: The new equations explain why and under which conditions heat propagation can become fluid-like, rather than diffusive.

Image: 
© Michele Simoncelli, EPFL

Fourier's well-known heat equation, introduced in 1822, describes how temperature changes in space and time when heat flows through a material. In general, this formulation works well to describe heat conduction in objects that are macroscopic (typically, a millimeter or larger), and at high temperatures. It fails, however, in describing so-called "hydrodynamic heat phenomena".
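In one dimension, Fourier's equation reads dT/dt = α d²T/dx², where α is the thermal diffusivity. Its diffusive character can be seen in a minimal explicit finite-difference sketch (the grid, time step, and diffusivity below are arbitrary illustrative values, not parameters from the study):

```python
import numpy as np

def diffuse(T, alpha, dx, dt, steps):
    """Explicit finite-difference solution of the 1D heat equation
    dT/dt = alpha * d2T/dx2, with fixed (zero) boundary temperatures.
    Stable when r = alpha*dt/dx**2 <= 0.5."""
    T = T.copy()
    r = alpha * dt / dx**2
    for _ in range(steps):
        # Update interior points from the discrete second derivative.
        T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])
    return T

# A hot spot in the middle of a cold bar spreads out and decays:
T0 = np.zeros(101)
T0[50] = 100.0
T1 = diffuse(T0, alpha=1.0, dx=1.0, dt=0.4, steps=500)
# The peak lowers and the profile broadens smoothly -- purely diffusive
# behavior, with none of the wavelike fronts seen in "second sound".
```

The gentle, spreading profile this produces is exactly the "usual (diffusive) propagation" that the hydrodynamic phenomena described below depart from.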

One such phenomenon is Poiseuille heat flow, where the heat flux becomes similar to the flow of a fluid in a pipe: it has a maximum in the center and minima at the boundaries, suggesting that heat propagates as a viscous-fluid flow. Another, called "second sound," takes place when heat propagation in a crystal is akin to that of sound in air: given portions of the crystal oscillate quickly between being hot and cold, instead of following the gentle temperature variation observed in the usual (diffusive) propagation.

Neither of these phenomena is described by Fourier's equation. Until now, researchers have only been able to analyze these phenomena using microscopic models, whose complexity and high computational cost have hindered both understanding and application to anything but the simplest geometries. In contrast, in developing the novel viscous heat equations, MARVEL researchers have condensed all the relevant physics underlying heat conduction into accurate and easily-solvable equations. This introduces a novel basic research tool for the design of electronic devices, especially those integrating diamond, graphene or other low-dimensional or layered materials where hydrodynamic phenomena are now understood to be prevalent.

The work is particularly timely. While these heat hydrodynamic phenomena have been observed since the 1960s, they were only seen at cryogenic temperatures (around -260 degrees C) and therefore thought to be irrelevant for everyday applications. These beliefs suddenly changed last March with the publication in Science of pioneering experiments that found second-sound (or wavelike) heat propagation in graphite, a material employed in several engineering devices and a promising candidate for next-generation electronics, at the record temperature of -170 degrees C.

The novel formulation, presented in the paper "Generalization of Fourier's law into viscous heat equations," yields results for graphite that are in striking agreement with these experiments and also predicts that this hydrodynamic heat propagation can be observed in diamond even at room temperature. This prediction is awaiting experimental confirmation, which would establish a new record for the maximum temperature at which hydrodynamic heat transfer is observed.

The work is very relevant for applications, since such hydrodynamic heat propagation can emerge in materials for next-generation electronic devices, where overheating is the main limiting factor for miniaturization and efficiency. Knowing how to handle the heat generated in these devices is critical to understanding how to maximize their efficiency, or even predict if they will work or just melt because of overheating. The paper provides new and original insights into transport theories, and also paves the way towards the understanding of shape and size effects in, e.g., new-generation electronic devices and so-called "phononic" devices that control cooling and heating. Finally, this novel formulation can be adapted to describe viscous phenomena involving electricity, discovered by Philip Moll in 2017, now professor at the Institute of Materials at EPFL.

For the mathematically inclined

In this work, MARVEL researchers have coarse-grained the microscopic integro-differential phonon Boltzmann transport equation into mesoscopic (simpler) differential equations, which they have called "viscous heat equations". These viscous heat equations capture the regime where the atomic vibrations in a solid ("phonons") assume a collective ("drift") velocity akin to that of a fluid. They have shown how thermal conductivity and viscosity can be determined exactly and in a closed form as a sum over the eigenvectors of the scattering matrix (the "relaxons", a concept introduced in 2016 by Cepellotti, for which he was awarded the IBM Research Prize and the Metropolis Prize of the American Physical Society). Relaxons have well-defined parities, with even relaxons determining the thermal viscosity and odd relaxons determining the thermal conductivity, and thermal conductivity and viscosity govern the evolution of the temperature and drift-velocity fields in these two coupled viscous heat equations.

In the paper, the scientists also introduced a Fourier deviation number (FDN), a dimensionless parameter that quantifies the deviation from Fourier's law due to hydrodynamic effects. The FDN is a scalar descriptor that captures the deviations from Fourier's law due to viscous effects, playing a role analogous to the Reynolds number for fluids, which is a parameter that engineers use to distinguish the different possible behaviors of the solutions to the Navier-Stokes equations.
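The paper's closed form for the FDN is not reproduced here, but the Reynolds-number analogy can be made concrete: Re compresses several flow properties into a single dimensionless regime indicator. A minimal sketch, using standard textbook values for water purely for illustration:

```python
def reynolds_number(density, velocity, length, viscosity):
    """Re = rho * u * L / mu: the dimensionless ratio of inertial to
    viscous forces, used to classify the flow regime."""
    return density * velocity * length / viscosity

# Water near 20 C (rho ~ 998 kg/m^3, mu ~ 1e-3 Pa*s)
# flowing at 1 m/s through a 0.05 m pipe:
re = reynolds_number(density=998.0, velocity=1.0, length=0.05, viscosity=1.0e-3)
# Conventional pipe-flow thresholds: laminar below ~2300, turbulent above ~4000.
regime = "turbulent" if re > 4000 else ("laminar" if re < 2300 else "transitional")
```

The FDN plays the same kind of role for the viscous heat equations: a single scalar that tells the engineer which solution regime (Fourier-like versus hydrodynamic) to expect before solving the full equations.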

Credit: 
National Centre of Competence in Research (NCCR) MARVEL

Antianxiety and antidepressant effects from a single dose of psychedelic drug persist years later in cancer patients

Following up on their landmark 2016 study, researchers at NYU Grossman School of Medicine found that a one-time, single-dose treatment of psilocybin, a compound found in psychedelic mushrooms, combined with psychotherapy appears to be associated with significant improvements in emotional and existential distress in cancer patients. These effects persisted nearly five years after the drug was administered.

In the original study, published in the Journal of Psychopharmacology, psilocybin produced immediate, substantial, and sustained improvements in anxiety and depression and led to decreases in cancer-related demoralization and hopelessness, improved spiritual well-being, and increased quality of life. At the final 6.5-month follow-up assessment, psilocybin was associated with enduring antianxiety and antidepressant effects. Approximately 60 percent to 80 percent of participants continued with clinically significant reductions in depression or anxiety, sustained benefits in existential distress and quality of life, as well as improved attitudes toward death.

The present study, publishing online Jan. 28 in the same journal, is a long-term follow-up (with assessments at about 3 years and 4.5 years following single-dose psilocybin administration) of a subset of participants from the original trial. The study reports on sustained reductions in anxiety, depression, hopelessness, demoralization, and death anxiety at both follow-up points.

Approximately 60 percent to 80 percent of participants met criteria for clinically significant antidepressant or anxiolytic responses at the 4.5-year follow-up. Participants overwhelmingly (71 to 100 percent) attributed positive life changes to the psilocybin-assisted therapy experience and rated it among the most personally meaningful and spiritually significant experiences of their lives.

"Adding to evidence dating back as early as the 1950s, our findings strongly suggest that psilocybin therapy is a promising means of improving the emotional, psychological, and spiritual well-being of patients with life-threatening cancer," says the 2016 parent study's lead investigator, Stephen Ross, MD, an associate professor of psychiatry in the Department of Psychiatry at NYU Langone Health. "This approach has the potential to produce a paradigm shift in the psychological and existential care of patients with cancer, especially those with terminal illness."

An alternative means of treating cancer-related anxiety and depression is urgently needed, says Ross. According to statistics from several sources, close to 40 percent of the global population will be diagnosed with cancer in their lifetime, with a third of those individuals developing anxiety, depression, and other forms of distress as a result. These conditions, experts say, are associated with poorer quality of life, increased rates of suicide, and lowered survival rate. Unfortunately, conventional pharmacologic treatment methods like antidepressants work for less than half of cancer patients and tend to not work any better than placebos. In addition, they have no effect whatsoever on existential distress and death anxiety, which commonly accompany a cancer diagnosis and are linked to a hastened desire for death and increased suicidality, says Ross.

The researchers say psilocybin may provide a useful tool for enhancing the effectiveness of psychotherapy and ultimately relieving these symptoms. Although the precise mechanisms are not fully understood, experts believe that the drug can make the brain more flexible and receptive to new ideas and thought patterns. In addition, previous research indicates that the drug targets a network of the brain, the default mode network, which becomes activated when we engage in self-reflection and mind wandering, and which helps to create our sense of self and sense of coherent narrative identity. In patients with anxiety and depression, this network becomes hyperactive and is associated with rumination, worry, and rigid thinking. Psilocybin appears to acutely shift activity in this network and helps people to take a more broadened perspective on their behaviors and lives.

How the Original Research and Follow-up Were Conducted

For the original study, the NYU Langone team provided 29 cancer patients with nine psychotherapy sessions, as well as a single dose of either psilocybin or an active placebo, niacin, which can produce a physical flush sensation that mimics a psychedelic drug experience. After seven weeks, all participants swapped treatments and were monitored with clinical outcome measures for anxiety, depression, and existential distress, among other factors.

Although researchers found that the treatment's antianxiety and antidepressant qualities persisted 6.5 months after the intervention, little was known of the drug's effectiveness in the long term. The new follow-up study is the longest-spanning exploration of psilocybin's effects on cancer-related psychiatric distress to date, the study authors say.

"These results may shed light on how the positive effects of a single dose of psilocybin persist for so long," says Gabby Agin-Liebes, PhD candidate, lead investigator and lead author of the long-term follow-up study, and co-author of the 2016 parent study. "The drug seems to facilitate a deep, meaningful experience that stays with a person and can fundamentally change his or her mindset and outlook," she says.

Agin-Liebes, who is pursuing her PhD in clinical psychology at Palo Alto University in California, cautions that psilocybin does not inherently lead to positive therapeutic effects when used in isolation or in uncontrolled, recreational settings, and "should be taken in a controlled and psychologically safe setting, preferably in conjunction with counseling from trained mental health practitioners or facilitators," she adds.

Next, the researchers plan to expand this research with larger trials in patients from diverse socioeconomic and ethnic groups who have advanced cancer-related psychiatric and existential distress.

"This could profoundly transform the psycho-oncologic care of patients with cancer, and importantly could be used in hospice settings to help terminally ill cancer patients approach death with improved emotional and spiritual well-being," says Ross.

Credit: 
New York University

Synthesis considers how being smart helps you at school and how school helps you become smarter

Academic achievement plays an important role in children's development because academic skills, especially in reading and math, affect many outcomes, including educational attainment, performance and income at work, health, and longevity. A new synthesis looked at the relation between academic achievement (reading, math) and cognitive abilities (working memory, reasoning, executive function), and offered suggestions on how to improve educational and cognitive outcomes.

The synthesis was carried out by researchers at the University of Texas at Austin and the Medical Research Council (MRC) Cognition and Brain Sciences Unit at the University of Cambridge. It is published in Child Development Perspectives, a journal of the Society for Research in Child Development.

"It's widely thought that being smart helps you do better in school, but does doing better in school make you smarter?" asks Peng Peng, assistant professor of special education at the University of Texas at Austin, the lead author of the synthesis. "Research has long supported the idea that cognitive abilities are foundational, that is, that being smart leads to better academic achievement. For example, students who learn how to solve math problems at school can develop the reasoning abilities to solve problems in real life. We found that sustained and high-quality education directly fosters children's academic and cognitive development, and that it may indirectly affect academic and cognitive development by triggering a sort of bidirectional action that amplifies both."

This bidirectional action is especially important for children with disadvantages, who often lack the resources or foundational skills to trigger and benefit from it. The authors note that short-term cognitive training may be insufficient to improve academic performance. This is because beneficial relations between academic skills (reading and math) and cognitive abilities (working memory, reasoning, and executive function) are modest and take time to develop. However, over time, such modest effects can have large and lasting impacts.

"This emerging field suggests that it's better to think of school-based skills such as reading and math, as well as cognitive abilities such as memory and reasoning, as parts of a system that interact positively and, together, support development," concludes Rogier A. Kievit, group leader at the MRC Cognition and Brain Sciences Unit at the University of Cambridge, who coauthored the synthesis. "The ultimate hope is to support both cognitive abilities and academic skills by better understanding these processes."

Credit: 
Society for Research in Child Development

New study identifies bumble bees' favorite flowers to aid bee conservation

image: A Vosnesensky bumble bee (Bombus vosnesenskii) and a Nevada bumble bee (Bombus nevadensis) feed on Rydberg's penstemon (Penstemon rydbergii) flowers in the Plumas National Forest.

Image: 
Travis DuBridge, The Institute for Bird Populations

Annapolis, MD; January 28, 2020--Many species of North American bumble bees have seen significant declines in recent decades. Bumble bees are essential pollinators for both native and agricultural plants, and their ability to fly in colder temperatures makes them especially important pollinators at high elevation. Bumble bee declines have been attributed to a handful of factors, including lack of flowers. Not all flowers are used equally by bumble bees, and determining which flowers bumble bees use can aid bumble bee conservation by identifying the specific plants they need to thrive.

New research published in the journal Environmental Entomology examines which flowers bumble bees select in the Sierra Nevada region of California. Researchers from The Institute for Bird Populations, the University of Connecticut, and the USDA Forest Service compared which species of flowers the bees used relative to the availability of each flower species across the landscape. They found that each bumble bee species in the study selected a different assortment of flowers, even though the bees were foraging across the same landscape. This information can be useful to land managers who are restoring or managing meadows and other riparian habitat for native bumble bees.

This study of bumble bee floral use was unusual because it factored in the relative availability of each plant species to the bees. "It's important to consider the availability of plants when determining what's selected for by bees," says Jerry Cole, a biologist with The Institute for Bird Populations and lead author of the study. "Often studies will use the proportion of captures on a plant species alone to determine which plants are most important to bees. Without comparison to how available those plants are, you might think a plant is preferentially selected by bees, when it is simply very abundant."
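The availability-weighted comparison Cole describes can be illustrated with a simple use-versus-availability selection ratio. This is a hypothetical sketch: the plant names and counts below are made up, and the ratio is a simplified stand-in for whatever statistical model the study actually used.

```python
# Hypothetical sketch of a flower-selection ratio: captures on a plant
# relative to that plant's availability. All values are illustrative,
# not taken from the study.

def selection_ratios(captures, availability):
    """Return use/availability ratios; >1 suggests preferential selection."""
    total_captures = sum(captures.values())
    total_avail = sum(availability.values())
    ratios = {}
    for plant in captures:
        use = captures[plant] / total_captures     # proportion of bee captures
        avail = availability[plant] / total_avail  # proportion of flowers present
        ratios[plant] = use / avail
    return ratios

# Example: the plant with the most captures can still be under-selected
# if it is also by far the most abundant.
captures = {"penstemon": 30, "aster": 10, "lupine": 60}
availability = {"penstemon": 100, "aster": 20, "lupine": 880}

for plant, ratio in selection_ratios(captures, availability).items():
    print(f"{plant}: {ratio:.2f}")
```

In this made-up example, lupine accounts for most captures yet comes out under-selected (ratio below 1) because it dominates the landscape, which is exactly the distinction the authors emphasize between raw capture counts and availability-adjusted preference.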

The researchers captured, identified, and released bumble bees at over 400 sample plots in the Plumas National Forest. They recorded which flower each bee was captured on and estimated the availability of flowers in the plot. Though they captured bumble bees on over 100 different species of flower, only 14 of these plant species were preferentially selected by any of the bumble bee species. Thirteen species of bumble bees were captured; among the five most common species, each selected a different assortment of flowers, and each selected at least one flower species that was not selected by the other bumble bee species.

In addition, the study found some previously undocumented bumble bee-plant associations; for instance, Bombus bifarius (sometimes known as the two-form bumble bee) preferentially selected thickstem aster flowers (Eurybia integrifolia), while the black-tailed bumble bee (Bombus melanopygus) preferentially selected Rydberg's penstemon flowers (Penstemon rydbergii). They also found that bees use different species of flowers at different times in the summer, as early-blooming flowers are replaced by late-blooming flowers.

"We discovered plants that were big winners for all bumble bee species but, just as importantly, plant species that were very important for only a single bumble bee species," says Helen Loffland, a meadow species specialist with The Institute for Bird Populations. "This study allowed us to provide a concise, scientifically based list of important plant species to use in habitat restoration that will meet the needs of multiple bumble bee species and provide blooms across the entire annual lifecycle."

The USDA Forest Service has already put the study's results to use. "Restoration planning on the Plumas National Forest is already using these results to identify areas where restoration efforts may increase availability or improve quality of bumble bee habitat," says co-author Matthew Johnson, who is the Wildlife, Fish, Rare Plants and Invasive Species Program Manager at the Plumas National Forest. In addition, Forest Service personnel have collected seed from plants favored by bumble bees and are experimenting with the best way to use them in seed mixes. Johnson says he also hopes to work with students at Greenville High School in Greenville, California, and Feather River College in Quincy, California, to propagate bumble bee-friendly plants for use in restoration.

A better understanding of bumble bees' flower preferences will aid in conservation of declining bee species. "This sort of knowledge can really increase the effectiveness of restoration for bumble bees," says Loffland, "and in a way that is relatively easy and cost-effective to implement."

Credit: 
Entomological Society of America

Mayo medical student jump-starts curriculum to identify human trafficking

PHOENIX -- Human trafficking is a growing international public health concern. An estimated 400,000 people in the U.S. are affected, with as many as 88% of victims having seen a health care professional while they were being trafficked.

As human trafficking evolves as a health concern, medical schools are starting to include the topic in education. However, it's still in the early stages, says a Mayo Clinic study in the American Journal of Preventive Medicine. The research was led by Jennifer Talbott, a third-year medical student at Mayo Clinic Alix School of Medicine, who suggested that human trafficking training be included in the curriculum at the school.

Working with the medical school faculty, Talbott helped develop coursework to train fellow students to identify and provide resources to potential victims of human trafficking. Talbott's adviser, Juliana Kling, M.D., a Mayo Clinic women's health internist, says training in identifying and providing resources to human trafficking victims is essential for medical school students.

"If we aren't trained to identify that they are victims, then they will continue being trafficked," says Dr. Kling, who is co-medical director of the Student Community Clinic at Mayo Clinic in Arizona. This clinic teaches the social determinants of health in a clinical setting to second-year medical students.

Many organizations have called for medical schools to train students to recognize the signs of trafficking and care for these patients. However, few standardized training resources are available, according to Talbott's study. So far, only four medical schools have published about their curricula specific to training on human trafficking.

The study points out that a robust educational curriculum "has the potential to close remaining educational gaps, allowing improved identification and treatment of those suffering from sex trafficking."

The study's literature search itself showcased the lack of information and training available to medical students: a search for published educational resources identified only 11 articles.

"This highlights an opportunity for improvement, since sex trafficking has become a priority on the public health agenda," says Talbott. "M.D. and D.O. (medical) schools could look to the current published curricula or consider sharing resources to identify an educational curriculum on sex trafficking that can be integrated into their existing programs."

The study also found that among the limited published resources available for medical school students, there were discrepancies in how the material addressed legal and security issues for victims.

Talbott and the Mayo Clinic Arizona American Medical Women's Association chapter also provide free training to classmates and other health care professionals around Arizona in coordination with Physicians Against the Trafficking of Humans of the American Medical Women's Association. This training includes simulated patient experiences and lectures that are now integrated into the Mayo Clinic Alix School of Medicine curriculum. Though in its early stages, the training has received positive feedback from other medical students.

"Medical students go into medicine to help patients, and it is our duty to teach them how to identify and provide guidance to victims of trafficking," says Dr. Kling. "The curriculum we are developing will hopefully close this important gap."

Credit: 
Mayo Clinic

Novel insight into chromosome 21 and its effect on Down syndrome

A UCL-led research team has, for the first time, identified the specific regions of chromosome 21 that cause memory and decision-making problems in mice with Down syndrome, a finding that provides valuable new insight into the condition in humans.

Most people have 46 chromosomes in each cell, divided into 23 pairs: people with Down syndrome (DS) have an extra copy of chromosome 21, which carries over 200 genes.

In this study, published in Cell Reports, researchers at UCL, supported by Cardiff University and the Francis Crick Institute, used mouse models to try and find out how having these extra genes causes learning disability.

Chromosome 21 and its genes are also found in mice, although the genes have dispersed onto three smaller regions on three different mouse chromosomes. These are mouse chromosomes 16, 10 and 17 containing 148 genes, 62 genes and 19 genes respectively.

The researchers looked at the effect of the genes in each of these three different mouse regions (chromosomes) on learning and memory. To do this, three different mouse strains (groups of mice) were genetically modified to carry an extra copy of one of the gene groups on mouse chromosomes 16, 10 or 17.

During navigation tests, where mice needed to negotiate a simple 'left-right' T-maze, each group was measured for both memory and decision-making ability.

During these tests, the electrical activity of brain regions important for memory and decision making was also monitored, using an electroencephalogram (EEG).

The researchers found that one of the mouse strains ('Dp10Yey' mice) had worse memory and irregular brain circuitry (signals) in a part of the brain called the hippocampus - which is known to be very important for memory.

They also found another strain ('Dp1Tyb' mice) had worse decision-making ability and had poor brain signalling between the hippocampus and the pre-frontal cortex - needed for planning and decision-making. And the third strain ('Dp17Yey' mice) had no unusual electrical activity in the brain.

Co-author, Professor Matthew Walker (UCL Queen Square Institute of Neurology), said: "These findings are a complete surprise - we did not expect the three different gene groups would act completely differently.

"Scientists have traditionally worked on the hypothesis that a single gene, or single genes, was the likely cause of intellectual disabilities associated with Down syndrome.

"We have shown - for the first time - that different and multiple genes are contributing to the various cognitive problems associated with Down syndrome."

Researchers will now look to discover specifically which gene or genes, within the smaller gene groups, are responsible for impaired memory and decision-making.

Corresponding author Professor Elizabeth Fisher (UCL Queen Square Institute of Neurology) said: "Our study provides critical insights into the mechanisms underlying neuro-disability in Down syndrome and indicates that intellectual disability in Down syndrome may result from different underlying genetic, functional and regional brain abnormalities.

"This implies that therapies for people with Down syndrome should perhaps target multiple processes, and we have made the initial steps in identifying what some of these processes are."

Credit: 
University College London

New study shows why women have to be likeable, and men don't

A new study in The Economic Journal finds that likeability is an influencing factor in interactions between women, as well as interactions between men and women, but not in all-male interactions.

The researchers conducted experiments where participants rated the likeability of other participants, based on photographs. The participants were divided into pairs, shown the photograph of their partner beforehand, and learned how their partner rated them. The pairs then played games with each other where rewards depended on the degree of cooperation.

In one version, participants chose to contribute any integer value out of an initial endowment of 6 euros to a joint project. Overall, men contributed on average 4.05 euros, and women contributed 3.92 euros. Researchers found that in same-sex pairings, men in low as well as high mutual likeability teams contributed similar amounts, suggesting likeability was not a factor in determining contribution. However, if mutual likeability in all-female teams was low, women contributed 30% less on average.

In mixed-sex pairings for the cooperation game, female participants contributed on average 4.70 euros in high mutual likeability teams, and about 37% less in low mutual likeability teams. In contrast to same-sex teams, likeability also affected men's contributions in mixed-sex teams. If mutual likeability was low, men's contribution was 50% lower than if mutual likeability was high.
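As a quick sanity check, the percentage drop quoted above can be turned into an approximate euro amount. Only the 4.70-euro figure and the 37% drop come from the article; the computed value is derived arithmetic, not a number reported by the study.

```python
# Back-of-the-envelope arithmetic from the figures quoted above.
# 4.70 euros (high mutual likeability, mixed-sex teams) and the 37%
# drop are from the article; the low-likeability value is derived.

high_female_mixed = 4.70   # average contribution, high mutual likeability
drop_female_mixed = 0.37   # 37% less when mutual likeability is low

low_female_mixed = high_female_mixed * (1 - drop_female_mixed)
print(f"Women, mixed-sex, low likeability: ~{low_female_mixed:.2f} euros")
```

That works out to roughly 2.96 euros, i.e., not far above half the 6-euro endowment that was available to contribute.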

In the ten-round coordination game, researchers found that women in same-sex pairings chose significantly lower numbers in low mutual likeability teams than in high mutual likeability teams in each round of the game. Male participants in same-sex pairings chose high numbers from the start, regardless of the level of mutual likeability. In mixed-sex teams, mutual likeability was on average positively associated with the number chosen for both women and men.

"Our results hint at the existence of a likeability factor that offers a novel perspective on gender differences in labour market outcomes," said Leonie Gerhards, the paper's lead author. "While likeability matters for women in every one of their interactions, it matters for men only if they interact with the opposite sex."

Researchers concluded that for women, likeability is an asset in all interactions. For men, likeability matters only in interactions with the opposite sex. Results suggest that the likeability factor leads to considerable advantages in terms of average performance and economic outcomes for men.

Credit: 
Oxford University Press USA

Scientists short-circuit maturity in insects, opening new paths to disease prevention

image: Image depicting the effect of the steroid hormone ecdysone on brain development in a fruit fly. The green color shows the cell layer that forms the blood-brain barrier, which physically separates the fruit fly brain from the circulation.

Image: 
Naoki Yamanaka / UCR

New research from UC Riverside shows scientists may soon be able to prevent disease-spreading mosquitoes from maturing. Using the same gene-altering techniques, they may also be able to help boost reproduction in beneficial bumblebees.

The research shows that, contrary to previous scientific belief, a hormone required for sexual maturity in insects cannot travel across a mass of cells separating the blood from the brain -- unless it is aided by a transporter protein molecule.

"Before this finding, there had been a longstanding assumption that steroid hormones pass freely through the blood-brain barrier," said Naoki Yamanaka, an assistant professor of entomology at UCR, who led the research. "We have shown that's not so."

The study, published this month in the journal Current Biology, details the effects on sexual maturity in fruit flies when the transporter protein is blocked.

Blocking the transporter not only prevented the steroid from entering the brain, it also permanently altered the flies' behavior. When flies are in their infancy or "maggot" stage, they usually stay on or in a source of food.

Later, as they prepare to enter a more adult phase of life, they exhibit "wandering behavior," in which they come out of their food to find a place to shed their outer body layer and transform into an adult fly.

When the transporter gene was blocked, Yamanaka said, the flies entered an intermediate stage between infancy and adulthood, but never wandered out of their food, and died slowly afterward without ever reaching adulthood or reproducing.

"Our biggest motivation for this study was to challenge the prevailing assumption about free movement of steroids past the blood-brain barrier, by using fruit flies as a model species," Yamanaka said. "In the long run, we're interested in controlling the function of steroid hormone transporters to manipulate insect and potentially human behaviors."

Currently, Yamanaka is examining whether altering genes in mosquitoes could have a similar effect. Since mosquitoes are vectors for numerous diseases, including Zika, West Nile Virus, malaria and Dengue fever, there is great potential for the findings to improve human health.

Conversely, it may be possible to alter the same genes to boost reproduction in beneficial insects. Bumblebees, whose populations have been declining in recent years, pollinate many favorite human food crops.

Also, there is the potential for this work to more directly impact humans. Steroid hormones affect a variety of behaviors and reactions in the human body. For example, the human body under stress makes a steroid hormone called cortisol. It enters the brain so humans can cope with the stressful situation.

However, when chronic stress is experienced, cortisol can build up in the brain and cause multiple issues. "If the same machinery exists for cortisol in humans, we may be able to block the transporter in the blood-brain barrier to protect our brain from chronic stress," Yamanaka said.

"It's an exciting finding," said study author Yamanaka. "It was just in flies, but more than 70% of human disease-related genes have equivalents in flies, so there is a good chance this holds true for humans too."

Credit: 
University of California - Riverside

More than a knee injury: ACL tears cause harmful changes in our brain structure

ANN ARBOR--It's known that some joint function is often permanently lost after anterior cruciate ligament reconstruction, and re-injury is common even with intensive physical therapy, but it's unclear why.

New research from the University of Michigan School of Kinesiology shows structural changes in the brains of patients who underwent ACL reconstruction. These changes hinder recovery and may contribute to performance deficits and re-injury, says study co-author Lindsey Lepley, U-M assistant professor of athletic training.

Lindsey Lepley and colleague Adam Lepley, clinical assistant professor of athletic training, took MRI brain scans of 10 ACL-reconstructed patients. The scans showed that part of the corticospinal tract--the pathway that shuttles messages from brain to muscles--had atrophied in the patients.

The corticospinal tract runs from front to back through both hemispheres of the brain. The side of the tract that controls the ACL-reconstructed knee was about 15% smaller than on the uninjured side, the researchers say.

Think of the altered corticospinal tract as a traffic tunnel that narrows, letting fewer cars pass through, they say. In the ACL reconstructed patients, less information gets from the brain to the muscle because less information can travel along the smaller tract.

"In essence, the brain not only alters the way it communicates with the rest of the body, joints, muscles, etc., but the structural makeup of the basic building blocks of the brain are also changed after ACL injury," Adam Lepley said. "We think that this is a protective mechanism, in which our body is trying to limit unwanted movement around a joint injury ... and can be applied to not just ACL injuries, but other musculoskeletal injuries as well."

Another recent study shows that downstream neural activity in the quadriceps is impaired during sport-like movements after ACL surgery, which suggests that poor brain structure and communication can lead to reduced functioning, the researchers say.

The bottom line for patients and clinicians is that a knee injury is not just about knees--other areas, like the brain structure, are negatively impacted, too.

"It means that during treatment, a systemic approach should be taken not just to improve range of motion or swelling at the injured joint, but also consider other impairments like poor movement patterns and muscle activation in order to get better outcomes," Lindsey Lepley said. "There is evidence of using visual retraining, different motor learning modalities like external focus of attention and biofeedback, which can help 'rewire' the brain to help the body adapt to a new normal."

Credit: 
University of Michigan

Study examines genetic testing in diverse young breast cancer patients over a decade

image: Tarsha Jones, Ph.D., lead author of the study and an assistant professor in FAU's Christine E. Lynn College of Nursing.

Image: 
Florida Atlantic University

Breast cancer patients diagnosed under age 50 represent 18 percent of new invasive breast cancer cases in the United States. Compared to postmenopausal women, younger women are more likely to develop aggressive subtypes of breast cancer, have a worse prognosis with increased risk of recurrence, and have higher overall mortality. Young breast cancer patients also are more likely to be diagnosed with triple-negative breast cancer, which is associated with a higher frequency of BRCA1 genetic mutations.

Although breast cancer mortality has steadily declined over the past decade, disparities in both incidence and mortality persist across racial/ethnic and socioeconomic groups. Among women younger than 40 years old, non-Hispanic black women have a higher incidence of breast cancer compared to non-Hispanic white women. Although Hispanic women are at lower risk for breast cancer than non-Hispanic white women, they have the second highest prevalence of BRCA1/2 gene mutations after Ashkenazi Jewish women, and breast cancer is the leading cause of cancer death among Hispanic women. The frequency of variants of uncertain significance remains high among Asian women as they are underrepresented in genomic research.

There has been increasing awareness of multigene panel testing, including BRCA1 and BRCA2 (BRCA1/2) genetic testing for breast and ovarian cancers in recent years. Despite this increase in public awareness, few studies in the U.S. have addressed germline genetic testing in young racially/ethnically diverse breast cancer patients in an era of multigene panel testing.

Tarsha Jones, Ph.D., a researcher at Florida Atlantic University's Christine E. Lynn College of Nursing, collaborated with researchers from Columbia University Irving Medical Center to examine racial and ethnic differences in genetic testing frequency and results (pathogenic/likely pathogenic, variants of uncertain significance, negative) among diverse breast cancer patients diagnosed at age 50 or younger from January 2007 to December 2017.

Results of the study, published in the Journal of Cancer Education, showed that among 1,503 diverse young breast cancer patients, less than half (46.2 percent) completed hereditary breast and ovarian cancer genetic testing. However, the percentage of women who completed genetic testing increased over time from 15.3 percent in 2007 to a peak of 72.8 percent in 2015. The study also found racial/ethnic differences in genetic testing results with non-Hispanic blacks and whites having the highest frequency of pathogenic/likely pathogenic variants (18.2 percent and 16.3 percent, respectively), whereas Asians and Hispanics had the highest frequency of variants of uncertain significance (21.9 percent and 19 percent, respectively).
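The percentages reported above translate into rough patient counts. This is illustrative arithmetic only; the exact counts are not quoted in this summary.

```python
# Illustrative arithmetic from the figures quoted above: 1,503 patients,
# of whom 46.2% completed hereditary breast and ovarian cancer testing.

total_patients = 1503
tested_fraction = 0.462  # 46.2% completed genetic testing

tested = round(total_patients * tested_fraction)
print(f"Patients who completed testing: ~{tested} of {total_patients}")
```

In other words, fewer than 700 of the roughly 1,500 eligible young patients completed testing across the decade studied, despite the sharp rise in testing rates after 2013.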

Since the burden of breast cancer is particularly high among young black women, whose mortality rate is two times greater than that of young women of European ancestry, there is a need to engage more young black breast cancer patients in genetic counseling education and to highlight the importance of performing hereditary breast and ovarian cancer genetic testing for minority women.

"Although multigene panel testing offers more comprehensive cancer risk assessment, there is greater uncertainty in clinical decision-making due to increased likelihood of variants of uncertain significance, particularly among racial/ethnic minorities who have been found to have more frequent variants of uncertain significance results compared to non-Hispanic whites," said Jones, lead author and an assistant professor in FAU's Christine E. Lynn College of Nursing.

Participants who completed genetic testing were more likely to be younger, be married, have a family history of breast cancer, have stage 1 breast cancer, and be diagnosed after 2013. There were no significant differences in the completion of genetic testing based upon race/ethnicity or primary health insurance status. Compared to patients diagnosed before 2008, women diagnosed after 2013 were more than 10 times more likely to have genetic testing.

"Our study provides insights into pathogenic/likely pathogenic variants and variants of uncertain significance in breast and ovarian cancer susceptibility genes among young breast cancer patients and highlights the need to increase genetic testing completion among racially/ethnically diverse populations," said Jones.

In addition, the study found that women with metastatic breast cancer were more than 60 percent less likely to undergo genetic testing compared to women with stage 1 disease. Women who were older at breast cancer diagnosis also were less likely to have genetic testing. The odds of a young woman completing genetic testing increased nearly three-fold if she had a family history of breast cancer. Compared to patients with stage 1 breast cancer, those with stage 0 or stage 4 disease at diagnosis were less likely to complete genetic testing.

According to Jones, next steps include conducting a future study that examines the impact of receiving pathogenic/likely pathogenic genetic testing results and promoting family risk communication since pathogenic variants can be inherited.

"This study underscores the importance of increasing awareness of genetic testing among young minority women with breast cancer," said Safiya George, Ph.D., dean of FAU's Christine E. Lynn College of Nursing. "It also signals that current and future health care providers of this population need to be reminded or made aware of the critical need for this important multigene panel testing."

Among 1,503 evaluable patients, the mean age was 42.7 years, 42.4 percent were non-Hispanic white, 13.3 percent were non-Hispanic black, 25.5 percent were Hispanic, 9.9 percent were Asian, and 8.9 percent were other or unknown race/ethnicity. The majority (60.5 percent) of patients had private insurance, 22.8 percent had Medicaid, 7.2 percent had Medicare (either due to disability or existing co-morbidities), and 9.4 percent had other insurance or were uninsured.

Credit: 
Florida Atlantic University

Increasing tropical land use is disrupting the carbon cycle

An international study led by researchers at Lund University in Sweden shows that the rapid increase in land use in the world's tropical areas is affecting the global carbon cycle more than was previously known. By studying data from a new satellite imaging system, the researchers also found that the biomass in tropical forests is decreasing.

Vegetation fills a very important function in the carbon cycle, by absorbing 30 percent of human carbon dioxide emissions and therefore mitigating the effects of climate change. But due to deforestation, the amount of carbon dioxide in the atmosphere is increasing. A new study, published in the scientific journal Nature Ecology & Evolution, shows that the intensifying land use in the world's tropical areas is causing forests in, for example, the Amazon and Southeast Asia, to contribute much less to carbon dioxide uptake than was previously known.

"Climate change is affecting us all, and with this study we have increased our understanding of the impact of land use on the global carbon cycle", says Torbern Tagesson, physical geography researcher at Lund University who led the study.

By combining a new satellite-based dataset with dynamic vegetation models, the researchers have obtained detailed information on how much carbon dioxide is absorbed by different ecosystems around the world. The study, which covers the period between 1992 and 2015, focuses on tropical and boreal forests (coniferous forests in the northern hemisphere).

"As we have verified our estimates with other satellite data, we can now say with certainty that boreal forests contribute more to carbon dioxide uptake and tropical forests contribute less. Previous studies have not shown the same decline for tropical forests", says Torbern Tagesson.

The results provide a deeper insight into the impact of land use on the global carbon cycle, but also an increased understanding of the processes that affect carbon dioxide uptake from vegetation.

"This knowledge is essential for us to be able to predict the effects of present and future climate change and therefore also highly relevant for climate change policy", concludes Torbern Tagesson.

Credit: 
Lund University