
ASU study shows positive lab environment critical for undergraduate success in research

image: An Arizona State University study conducted by 14 undergraduate students and their research mentors found that 50% of life sciences students who participated in the study considered leaving their undergraduate research experience; ultimately, more than half of those students decided to leave. The most important factors influencing whether a student decides to continue working in a research lab included a positive lab environment, enjoyable everyday research tasks, a flexible lab schedule and inclusiveness.

Image: 
Samantha Lloyd/ASU VisLab

Getting involved in research as an undergraduate can have significant benefits, such as enhancing a student's ability to think critically, increasing their understanding of how to conduct a research project and improving the odds that they'll complete a degree program in science, technology, engineering and math (STEM).

And, for students who participate in research over several years, the benefits are even greater. They often develop greater confidence in their research skills, an ability to solve problems independently and are more likely to pursue a career in STEM.

But many undergraduates drop out of their research experience before graduation, or even during their first year working in a biology lab. Until now, there has been no research into why.

In a study published today in PLOS ONE, a group of 14 undergraduate Arizona State University co-authors addressed this question as part of a class project. Led by School of Life Sciences Associate Professor Sara Brownell, graduate student Logan Gin, and University of Central Florida Assistant Professor Katelyn Cooper, students with the LEAP Scholars program surveyed more than 750 life sciences undergraduates doing research at 25 public institutions across the U.S. They found that 50% of students who participated in the study had considered leaving their undergraduate research experience, and ultimately, more than 50% of those students decided to leave.
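Those two percentages compound. A minimal arithmetic sketch, taking each "more than 50%" at exactly 50% to give lower bounds (the sample size is the approximate figure from the article):

```python
# Compound-percentage arithmetic behind the headline numbers. The ">50%"
# figures are taken at exactly 50%, so the results are lower bounds.
surveyed = 750
considered_leaving = 0.50 * surveyed      # 50% considered leaving
left = 0.50 * considered_leaving          # >50% of those ultimately left

print(f"considered leaving: ~{considered_leaving:.0f}")
print(f"ultimately left: >{left:.0f} (>{left / surveyed:.0%} of all surveyed)")
```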

They also found that the most important factors that influence whether a student decides to continue working in research included a positive lab environment and enjoying their everyday research tasks, as well as flexible schedules, positive social interactions and feeling included. Students also persisted with their research when they felt they were learning important skills and perceived the work was important to their career goals.

"We often assume that all undergraduate research experiences are positive for students, but this study shows that this is not the case. If 50% of students consider leaving their undergraduate research experience, then that means that we have a structural problem with how we are integrating students in undergraduate research," said senior author Brownell. "We can empower students with more knowledge about undergraduate research to help them choose a suitable lab, but we also need to find ways to make our research labs more positive environments for all students."

Demographic and academic characteristics, such as race, gender, GPA and college generation status, also shape which factors influence students to persist in their research experiences. Men were more likely than women to stay in research because it's important for their future careers. Men were also more likely to leave their research experience because they didn't enjoy their specific lab tasks, while women were more likely to consider leaving because of a lack of flexibility in the lab.

Underrepresented minority students were more likely to leave their research work because they felt they were not learning important skills, while white students were more likely to stay in research because they enjoyed their everyday lab tasks. And, students with lower GPAs were more likely to stay in research because they were unsure about future research opportunities, while those with higher GPAs were more likely to leave research because they did not enjoy the everyday lab tasks.

"We were excited to identify factors that disproportionately affected underrepresented and marginalized students' decisions to leave research. It will be challenging to identify solutions, but identifying these issues is a critical step in developing a more diverse and inclusive scientific community," said Gin.

For faculty members who invest time and resources to train undergraduates to work in their labs, this study provides important insight that can be used to shape their student lab experiences, develop support policies and improve mentor and mentee relationships.

"What was most surprising to us was the importance of the lab environment and the interactions among people in the lab," said lead author Katelyn Cooper. "When we hire faculty members to run research labs, we often are looking for the smartest people with the best research ideas. However, this study highlights that if we want to maximize the success of undergraduates in research, we need to be selecting for supportive faculty who can create positive working environments."

Brownell and her co-instructors lead ASU's LEAP Scholars program, a four-semester scholarship program funded by the National Science Foundation to help community college transfer students get involved in undergraduate science research. Because many transfer students need to work a job while attending college, the LEAP program provides scholarships and mentors so they can work in a research lab instead of an outside job and focus on their coursework full time.

Credit: 
Arizona State University

Microbes have adapted to live on food that is hundreds of years old

Microbial communities living in deep aquatic sediments have adapted to survive on degraded organic matter, according to a study published in Applied and Environmental Microbiology and coauthored by professors at the University of Tennessee, Knoxville.

"There are microbes living in deep ocean sediments eating carbon, like proteins and carbohydrates, that is hundreds of years old," said Andrew Steen, lead author of the study and assistant professor of environmental geology at UT. "However, we don't know much about how those microbes eat that old, poor-quality food."

Understanding how these microorganisms function on low-quality foods at a very slow pace could have future uses in biomedical applications such as a technology that could slow down cell metabolism in human organs so they can survive longer during a transplant process.

"It could also aid in preserving underground microbes that play a role in carbon sequestration, a key process in the fight against climate change," said Steen.

To better understand how these microorganisms access this food, researchers tested different types of peptidases--digestive enzymes that work to degrade proteins--in sediment cores from the White Oak River estuary in North Carolina.

"These microbes live incredibly slow lives, with cells multiplying somewhere between every 10 years and every 10,000 years, but we aren't sure how," said Steen. "Our work shows that those microbes are living the same way any other microbe does, just way more slowly and with some improved ability to eat the low-quality food in their environment."

The data collected by the researchers represented about 275 years of sediment deposition from the White Oak River estuary. Using DNA analysis of the microbes in these sediments, and by measuring peptidases, researchers evaluated how these microorganisms metabolize with little access to fresh organic matter.

Organic carbon buried in aquatic sediments is a long-term sink for atmospheric carbon dioxide, and about 40 percent of organic carbon burial occurs in estuaries and deltaic systems. Steen's study gives insight into how these subsurface microbial communities begin the process of degrading organic carbon in such environments.

"Our study shows that, in some sense, subsurface microbes are happy to be where they are--or at least they're well adapted to a terrible environment," said Steen.

Credit: 
University of Tennessee at Knoxville

New study links high-fat diet and gut bacteria to insulin resistance

(TORONTO, Canada - August 13, 2019) Researchers have discovered how our choice of diet can weaken our gut immune system and lead to the development of diabetes.

A growing body of research supports the idea that during obesity, our immune system often responds to components of bacteria that "leak" through the intestinal tissue, resulting in inflammation. In turn, inflammation can drive insulin resistance, which predisposes people to diabetes.

In new research published in Nature Communications this week, Dr. Dan Winer, Scientist, Toronto General Hospital Research Institute and the Department of Pathology at University Health Network (UHN), and his team, including graduate students Helen Luck and Saad Khan, and co-lead author, Dr. Shawn Winer at St. Michael's Hospital, highlight how a high fat diet influences one component of the gut immune system called B cells, specifically those that produce a protein called IgA.

"We discovered that during obesity, there are lower levels of a type of B cell in the gut that make an antibody called IgA," says lead author Helen Luck.

"IgA is naturally produced by our bodies and is crucial to regulating the bacteria that live in our gut. It acts as a defense mechanism that helps neutralize potentially dangerous bacteria that take advantage of changes to the environment, such as when we consume an imbalanced or fatty diet."

In their experiments, they also observed that IgA deficient pre-clinical models, which lack protective IgA, had worsened blood sugar levels when fed a high fat diet. As well, transplantation of gut bacteria from these IgA deficient models into models that had no gut bacteria was able to transfer the disease, demonstrating that IgA can regulate the amounts of harmful bacteria in the gut during diet-related obesity.

In collaboration with a bariatric surgery research team at UHN led by Dr. Johane Allard and Dr. Herbert Gaisano, the team saw increased levels of IgA within the stool of patients soon after bariatric surgery, supporting the importance of IgA and the gut immune system in humans with obesity.

Overall, the research highlights a robust connection between high fat diets, obesity and the lack of gut IgA in promoting inflammation and insulin resistance. The knowledge that this class of antibodies regulates pathogenic bacteria, protects against a "leaky gut" and guards against additional complications of obesity is a powerful tool in the fight against diabetes.

"If we can boost these IgA B cells or their products, then we may be able to control the type of bacteria in the gut," says Dr. Dan Winer. "Especially the ones that are more likely to be linked to inflammation and ultimately insulin resistance. Going forward, this work could form the basis for new gut immune biomarkers or therapies for obesity and its complications, like insulin resistance and type 2 diabetes."

Credit: 
University Health Network

Pinpointing how cells regulate long-lasting memories

image: CPEB3 (green) localizing to a neuron's dendrites after stimulation.

Image: 
Lenzie Ford and Luana Fioriti/Kandel lab/Columbia's Zuckerman Institute

NEW YORK -- The brain has a knack for safekeeping our most treasured memories, from a first kiss to a child's birth. In a new study in mouse cells, Columbia neuroscientists have mapped some of the molecular machinery that helps the brain maintain these kinds of long-term memories. By observing the activity of nerve cells of the brain, called neurons, that were extracted from the brain's memory center, the researchers outlined how the protein CPEB3 primes neurons to store memories that stand the test of time.

These findings, published today in the Proceedings of the National Academy of Sciences, provide a never-before-seen view into one of the brain's most universal and basic cellular functions. They also suggest new targets against neurodegenerative diseases characterized by memory loss, most notably Alzheimer's disease.

"Memory is what makes us who we are. It permeates our lives and is fundamental to our very existence," said Eric Kandel, MD, the study's co-senior author who is codirector of Columbia's Mortimer B. Zuckerman Mind Brain Behavior Institute, a University Professor and a Kavli Professor of Brain Science at Columbia. "But at its core, memory is a biological process, not unlike a heartbeat. With today's study, we've shed new light on the molecular underpinnings behind our brain's ability to make, keep and recall memories over the course of our lives."

All memories, even fleeting ones, are made when tiny branches called axons, which extend out from neurons, connect to each other. These connection points, called synapses, are like handshakes: They can be strong or weak. When they weaken, memories vanish. But when they strengthen, memories can stand the test of time. Strengthening a synapse, researchers recently reported, causes an observable change to neurons' anatomy.

In 2015, Dr. Kandel and his team identified a protein in mice, CPEB3, that plays a critical role in this anatomical change. They found that CPEB3 is present at the brain's synapses when memories are formed and recalled. When the researchers prevented mice from making CPEB3, the animals could form a new memory but could not keep it intact.

"Without CPEB3, the synaptic connections collapsed and the memory faded," said Luana Fioriti, PhD, Laboratory Head at Mario Negri Institute for Pharmacological Research, Assistant Telethon Scientist at Dulbecco Telethon Institute in Milan, Italy and adjunct associate research scientist on sabbatical in the Kandel lab. Dr. Fioriti is the paper's co-senior author. "Discovering CPEB3's precise function inside the neurons was the impetus for today's study."

Within the hippocampus, the brain's memory center, CPEB3 is produced at regular intervals inside the centers of neurons. In today's study, the Columbia team found that once CPEB3 is produced it is transferred to P bodies, isolation chambers that keep CPEB3 dormant and ready for use.

"P bodies do not have a physical barrier, like a membrane, to contain CPEB3," said Lenzie Ford, PhD, a postdoctoral research scientist in the Kandel lab and the paper's co-first author. "Instead, P bodies are denser than their surroundings. This difference in densities holds P bodies together, creating a kind of biophysical forcefield that keeps CPEB3 contained inside and away from the other parts of the cell."

Once laden with dormant CPEB3, the researchers found, P bodies leave a neuron's center and travel down its branches toward the synapses. When an animal has an experience and begins to form a memory, the P bodies dissolve. CPEB3 is released into synapses to help create that memory. Over time, as more CPEB3 is released, these synapses strengthen. This alters the neurons' anatomy and, as a result, stabilizes that memory.

"Our results underscore the central role that protein synthesis plays in the maintenance of memory," said Dr. Kandel, a Howard Hughes Medical Institute Investigator whose pioneering work on the molecular basis of memory earned him the 2000 Nobel Prize in Physiology or Medicine. "And while there are likely additional processes involved that we have yet to discover, this study, which incorporated state-of-the-art biochemical, genetic and microscopy tools, reveals an elegant biological mechanism of memory in unmatched detail."

Beyond what this research reveals about memory, it also provides insight into neurodegenerative diseases characterized by memory loss. Because of CPEB3's demonstrated importance in memory storage, and because a version of CPEB3 is also present in the human brain, this protein represents a promising area of focus.

"The science of how synapses form and are strengthened over time is important for deciphering any disorder in which synapses -- and the memories associated with them -- degrade and die, such as Alzheimer's disease," said Dr. Fioriti. "By continuing to build this understanding, we could one day develop useful methods to boost CPEB3 in a way that prevents synaptic degradation, thus slowing memory loss."

Another area of focus, according to the Columbia team, relates to the protein SUMO, which the team also found played a central role in this process.

"One of our most intriguing findings is that CPEB3 does not move into P bodies on its own; another protein called SUMO guides it there," said Dr. Ford. "This process, called SUMOylation, represents another promising avenue for the further study of memory -- both in health and disease."

Credit: 
The Zuckerman Institute at Columbia University

Structurally complex forests better at carbon sequestration

Forests in the eastern United States that are structurally complex - meaning the arrangement of vegetation is highly varied - sequester more carbon, according to a new study led by researchers at Virginia Commonwealth University.

The study demonstrates for the first time that a forest's structural complexity is a better predictor of carbon sequestration potential than tree species diversity. The discovery may hold implications for the mitigation of climate change.

"Carbon dioxide, a potent greenhouse gas, is taken up by trees through the process of photosynthesis and some of that 'fixed' carbon is allocated to wood," said Chris Gough, Ph.D., corresponding author on the study and an associate professor in the Department of Biology in the College of Humanities and Sciences. "Our study shows that more complex forests are better at taking up and sequestering carbon in wood and, in doing so, they leave less carbon dioxide in the air."

The study, "High Rates of Primary Production in Structurally Complex Forests," will be published in a forthcoming issue of Ecology, a journal of the Ecological Society of America.

Carbon sequestration is the process by which atmospheric carbon dioxide is taken up by trees, grasses and other plants through photosynthesis and stored as carbon in soil and plant biomass, such as tree trunks, branches, foliage and roots. Carbon sequestration in forests and wood helps offset sources of carbon dioxide to the atmosphere, such as deforestation, forest fires and fossil-fuel emissions, according to the Forest Service of the U.S. Department of Agriculture.

Why are structurally complex forests better at carbon sequestration? Gough suggests that multiple layers of leaves may optimize how efficiently light is used to power carbon sequestration in wood.

"In other words, forests that are structurally variable and contain multiple layers of leaves outperform structurally simple forests with a single concentrated band of vegetation," he said.

To conduct the study, the researchers used a combination of their own data, as well as data from the National Ecological Observatory Network, or NEON, which is funded by the National Science Foundation. NEON is generating long-term, publicly available data for different ecosystems in the U.S., with the aim of understanding decadeslong ecological processes.

VCU biology post-doctoral scholar Jeff Atkins, Ph.D., led field data collection with researchers from the University of Connecticut and Purdue University serving as collaborators and co-authors.

Understanding how forest structure drives carbon sequestration is important for ecologists, climate modelers and forest managers.

"Many of the ecological indicators of forest growth and carbon sequestration fail to explicitly account for complexity," Gough said. "We wanted to test whether more novel indicators of structural complexity are superior predictors of carbon sequestration in wood. We also wanted to know whether these predictors extend to a number of different forest types residing in various parts of the eastern half of U.S., from Florida to New Hampshire to Wisconsin."

The study builds on previous research supported by the National Science Foundation that demonstrated how laser-based technology called lidar can map the distribution of leaves within a forest canopy at very high resolution.

The new study suggests that using lidar to map forest structure could predict the potential of forests to sequester carbon in biomass better than conventional approaches characterizing biodiversity and leaf quantity.
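As a toy illustration of the idea, one simple proxy for structural complexity is the variability of lidar-derived canopy height across a plot; the heights below are invented, and real structural metrics are far richer:

```python
import statistics

# A simple structural-complexity proxy: the standard deviation of canopy
# height across a plot. These heights are invented for illustration.
simple_canopy  = [20.1, 20.3, 19.8, 20.0, 20.2]   # single even band (m)
complex_canopy = [5.2, 18.7, 30.1, 12.4, 24.9]    # multilayered canopy (m)

for name, heights in [("simple", simple_canopy), ("complex", complex_canopy)]:
    print(f"{name} canopy height SD: {statistics.stdev(heights):.2f} m")
```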

"This could be a major advance because we can likely use aircraft and, just in the last year, satellite data to collect the data needed to predict carbon sequestration from structural complexity," Gough said. "If we can estimate structural complexity from satellites in the future, then it may be possible to greatly improve our capacity to estimate and predict global forest carbon sequestration."

The study's results show what ecologists can do when they embrace new technologies and apply them to fundamental questions such as: What affects forest growth and carbon sequestration?

"These results, we hope, push the science forward by showing that how a forest is put together matters for carbon sequestration," Gough said. "And this relationship extends broadly to a number of different forests, from evergreen to deciduous and mid-Atlantic to Midwest."

While the researchers found that structural complexity outperformed species diversity measures as predictors of carbon sequestration, they noted that diversity is also important as one of many components that determine how structurally complex a forest is.

"We think structural complexity measures are powerful because they integrate multiple features of a forest that are critical to carbon sequestration," Gough said. "It takes tree diversity to produce a variety of leaf and plant shapes and, additionally, a critical quantity of leaves to supply the building blocks required to assemble a structurally complex forest capable of sequestering lots of carbon."

In addition to Gough, the paper was authored by Atkins, Robert T. Fahey, Ph.D., an assistant professor of forest ecology and management at the University of Connecticut, and Brady S. Hardiman, Ph.D., an assistant professor of urban ecology at Purdue University.

Credit: 
Virginia Commonwealth University

Launch of standardised tool to assess cognitive and language development in two-year-olds

A new paper published in The Lancet Child & Adolescent Health co-authored by a University of Warwick researcher provides standardised scores for The Parent Report of Children's Abilities Revised (PARCA-R) questionnaire. The PARCA-R is recommended for routine use in the UK to screen for cognitive and language developmental delay in children born preterm and can be completed by parents in 10 to 15 minutes. A new website has also been launched (http://www.parca-r.info) with an online version of the questionnaire and a pre-programmed calculator for deriving the standardised scores.

Researchers at the University of Warwick collaborated with colleagues at Leicester, Oxford, Birmingham and University College London to standardise the PARCA-R as part of research funded by Action Medical Research. Anonymised data from over 6,000 PARCA-R questionnaires completed by parents of two-year-old children in three previous studies were used as a standardisation sample. This sample was representative of the UK population in terms of sex, gestational age, multiple birth, ethnicity and socio-economic status. Anonymised data from three further studies were used to assess external validity of the standardised scores.

Standardisation of the PARCA-R will enable it to be used to quantify a child's developmental level relative to the UK population of two-year-olds, and identify advanced or delayed development. It is freely available for parents to use and has been translated into 14 languages, but standardised scores have, so far, only been developed for the original English version in the UK population. Developmental assessment can be costly to administer and the PARCA-R provides a reliable, cost-effective alternative that could eventually be extended for use in low- and middle-income countries.
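As a rough illustration of what standardisation means, a raw questionnaire score is re-expressed relative to the normative sample's mean and standard deviation. The norms below are invented for illustration; real PARCA-R scores come from the published norms and the calculator at http://www.parca-r.info:

```python
# Minimal sketch of a standardised score. NORM_MEAN and NORM_SD are
# hypothetical norms for one age band, not the published PARCA-R values.
NORM_MEAN, NORM_SD = 60.0, 12.0

def standard_score(raw):
    z = (raw - NORM_MEAN) / NORM_SD    # position within the population
    return 100 + 15 * z                # conventional mean-100, SD-15 scale

print(standard_score(48.0))            # 85.0 -> one SD below the mean
```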

Professor Dieter Wolke, from the University of Warwick Department of Psychology, said: "Over the last 15 years my colleagues and I adapted a research tool and tested its suitability for easy screening of cognitive and language developmental delay of two year olds in the community. We now publish the norms - how well a particular child does in relation to all children in the UK on cognition or language at a particular age - for the PARCA-R and make it freely available to clinicians and parents."

Credit: 
University of Warwick

Arctic sea-ice loss has 'minimal influence' on severe cold winter weather, research shows

The dramatic loss of Arctic sea ice through climate change has only a "minimal influence" on severe cold winter weather across Asia and North America, new research has shown.

The possible connection between Arctic sea-ice loss and extreme cold weather - such as the deep freezes that can grip the USA in the winter months - has long been studied by scientists.

Observations show that when the regional sea-ice cover is reduced, swathes of Asia and North America often experience unusually cold and hazardous winter conditions.

However, previous climate modelling studies have suggested that reduced sea ice cannot fully explain the cold winters.

Now, a new study by experts from the University of Exeter, the Royal Netherlands Meteorological Institute and the Energy and Sustainability Research Institute in Groningen, has shed new light on the link between sea-ice loss and cold winters.

For the research, the international team combined observations over the past 40 years with results from sophisticated climate modelling experiments. They found that the observations and models agreed that reduced regional sea ice and cold winters often coincide with each other.

They found that the correlation between reduced sea ice and extreme winters across the mid-latitudes occurs because both are simultaneously driven by the same large-scale atmospheric circulation patterns.

Crucially, the study shows that reduced sea ice only has a minimal influence on whether a harsh and severe winter will occur.

The study is published in leading science journal, Nature Climate Change.

Dr Russell Blackport, a Mathematics Research Fellow at the University of Exeter and lead author of the paper, said: "The correlation between reduced sea ice and cold winters does not mean one is causing the other. We show that the real cause is changes in atmospheric circulation which moves warm air into the Arctic and cold air into the mid-latitudes."
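A minimal sketch of that logic, with invented numbers: when a single circulation index drives both quantities, sea ice and mid-latitude winter temperature correlate even though neither causes the other.

```python
import numpy as np

# Toy common-driver illustration: a circulation index pushes warm air into
# the Arctic (less ice) and cold air into the mid-latitudes (colder winters)
# in the same years. All numbers are invented.
rng = np.random.default_rng(0)
n = 40                                            # "winters"
circulation = rng.standard_normal(n)              # large-scale driver
sea_ice = -0.8 * circulation + 0.5 * rng.standard_normal(n)
midlat_temp = -0.8 * circulation + 0.5 * rng.standard_normal(n)

# Ice and temperature correlate (low ice with cold winters) even though,
# by construction, neither causes the other.
r = np.corrcoef(sea_ice, midlat_temp)[0, 1]
print(f"corr(sea ice, mid-latitude temperature) = {r:.2f}")
```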

Over recent decades, the Arctic region has experienced warming temperatures through climate change, which has led to a large decline in sea-ice cover.

This reduction in sea-ice cover means that areas of open water increase, which in turn allows the ocean to lose more heat to the atmosphere in winter - this can potentially alter the weather and climate, even well outside the Arctic.

Recent studies have suggested that the reduced sea ice or Arctic warming has contributed to recent cold winters experienced in the mid-latitude region - and that as the sea-ice reduces further through climate change, cold winters will become more frequent and severe.

Now, this new study suggests that reduced sea ice is not the main cause of the cold winters. Instead, the cold winters are likely caused by random fluctuations in the atmospheric circulation.

Professor James Screen, an Associate Professor in Climate Science at the University of Exeter, said: "There are many reasons to be concerned about the dramatic loss of Arctic sea ice, but an increased risk of severe winters in North America and Asia is not one of them."

Dr John Fyfe, a Research Scientist at the Canadian Centre for Climate Modelling and Analysis, who was not involved in the research, writes in Nature Climate Change: "Blackport and colleagues put to rest the notion that Arctic sea-ice loss caused the cold mid-latitude winters, showing instead that atmospheric circulation changes preceded, and then simultaneously drove sea-ice loss and mid-latitude cooling".

Credit: 
University of Exeter

More than just jaundice: Mouse study shows bilirubin may protect the brain

image: Collage of neuron cells used in this study. Neurons here are expressing a protein that fluoresces upon binding bilirubin.

Image: 
Chirag Vasavda

In studies in mice, Johns Hopkins Medicine researchers report they have found that bilirubin, a bile pigment most commonly known for yellowing the skin of people with jaundice, may play an unexpected role in protecting brain cells from damage from oxidative stress.

Bilirubin is commonly measured in lab tests as a marker for liver or blood health, and high levels may indicate disease. However, whether it has a role in healthy people has remained unclear.

The Johns Hopkins Medicine team says its interest in the compound's function in the brain arose from testing which tissues in the mouse body produced bilirubin. Surprisingly, the researchers found "exceptional levels" of the stuff in mouse brains -- five to 10 times higher production than in rodents' livers.

"Bilirubin is normally considered a waste product, but this level of production takes a lot of metabolic energy, and it seemed bizarre for bilirubin to not have a function," says Bindu Paul, Ph.D., faculty research instructor at the Johns Hopkins University School of Medicine's Solomon H. Snyder Department of Neuroscience, and a member of the research team.

The new study, described in a report published July 25 in Cell Chemical Biology, set out to find the purpose for harboring so much bilirubin in the brain. The team noted that past studies proposed that bilirubin might be an important antioxidant. Since the brain is so metabolically active and vulnerable to oxidative damage, the research group considered the possibility that bilirubin might be particularly important to protecting the brain against oxidative stress.

For their experiments, the team used mouse neurons grown in the laboratory that were genetically engineered to not produce bilirubin. As the cells grew, the researchers exposed them to various sources of oxidative stress by introducing reactive molecules to their environment.

The researchers found that, compared with normal mouse brain cells, the genetically modified mouse neurons were far more vulnerable to these stressors -- particularly at the hands of a harmful form of oxygen called superoxide.

Chirag Vasavda, an M.D./Ph.D. student in Solomon Snyder's laboratory and first author on the study, notes that superoxide is an important chemical cell messenger linked to learning, memory and development in the brain.

However, excessive brain cell activity can lead to uncontrolled superoxide levels, which can trigger oxidative stress and initiate a series of harmful reactions that cause damage to the brain. "Our initial experiments hinted to us that bilirubin might play an important role in controlling the levels of superoxide in the brain," says Vasavda.

The research team suspected that bilirubin's ability to regulate superoxide originated in its chemical structure, which allows it to grab on to and neutralize the harmful molecule in a way that other antioxidants, such as glutathione and cysteine, cannot.

To test this, the researchers stimulated excessive brain cell activity in normal brains and brains engineered to lack bilirubin. They found that brains lacking the bilirubin-production gene accumulated excessive superoxide. Then they stimulated brain activity in normal mice and mice lacking bilirubin to test whether removing bilirubin worsens brain damage or cell death.

The researchers found that mice that lacked bilirubin had about two to three times more brain damage than their normal counterparts, suggesting that bilirubin protected normal brains against harmful superoxide reactions.

This discovery, the investigators say, advances scientific understanding of bilirubin's role in the brain and elsewhere and could lead to novel treatments for neurodegenerative diseases such as Huntington's and Parkinson's that are marked by excessive superoxide levels and oxidative stress.

Credit: 
Johns Hopkins Medicine

When naproxen breaks down, toads croak

image: Adult southern toads in amplexus (breeding) at a study site.

Image: 
Allison Welch

A new study in Environmental Toxicology and Chemistry takes a harder look at the effects a common anti-inflammatory medication and its degradation products have on amphibians. There have been many studies that review the toxicity of naproxen, a common over-the-counter pain reliever, but none until now that have reviewed the effects it or its degradation products might have on amphibians.

Almost 95% of naproxen ingested by an individual is ultimately excreted. Wastewater treatment plants are not equipped to remove the drug, so it, along with other pharmaceuticals, is often discharged into waterways untouched. Naproxen (NAP) itself degrades when exposed to sunlight, but not before casting off two primary phototransformation (PT) products, 1-(6-methoxy-2-naphthyl) ethanol (NAP-PT1) and 2-acetyl-6-methoxynaphthalene (NAP-PT2). Researchers evaluated the amount of time it took these products to degrade and measured their toxicity individually and in mixtures to southern toad larvae. Toads were selected as the model organism since amphibian larvae can be prevalent in habitats with risk of exposure to pharmaceuticals like naproxen.

Lead authors Allison Welch and Wendy Cory, both of the College of Charleston, and their colleagues found that these PT products not only persisted in the environment longer than naproxen, but NAP-PT1 was six times more toxic to the southern toad larvae, and NAP-PT2 was up to 15 times more toxic to the amphibians. Welch noted that "our results demonstrate that a relatively safe pharmaceutical can be transformed into compounds and mixtures that are many times more toxic and underscore the need to consider the consequences of pharmaceutical transformation when evaluating the risks posed by pharmaceuticals in the environment." The study highlights the need to evaluate the full life cycle of medications to properly understand their effect on the environment.
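A minimal sketch of how such "times more toxic" factors are commonly expressed, as ratios of LC50 values (the concentration lethal to half the test organisms); the naproxen LC50 below is a placeholder, not a measurement from the study:

```python
# Relative toxicity as a ratio of LC50 values: a lower LC50 means a more
# toxic compound. Only the 6x and 15x factors come from the study; the
# parent-compound LC50 is a hypothetical placeholder.
nap_lc50 = 60.0                              # hypothetical naproxen LC50 (mg/L)
factors = {"NAP-PT1": 6, "NAP-PT2": 15}

for product, factor in factors.items():
    print(f"{product}: LC50 ~ {nap_lc50 / factor:.0f} mg/L "
          f"({factor}x more toxic than naproxen)")
```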

Credit: 
Society of Environmental Toxicology and Chemistry

Laser and sensor research to be advanced by new inquiries into plasmonic-photonic crystals

image: Schemes of PPCs with equal effective refractive index and structure period: (a) 1D PPC and (b) 3D opal-like PPC.

Image: 
Kazan Federal University

A group of researchers led by Professor Myakzyum Salakhov has been working on the problem of optical states in plasmonic-photonic crystals (PPCs). The group mostly consists of young scientists, some of whom started their participation in the project during their student years.

First Category Engineer Artyom Koryukin comments that the research was dedicated to modelling light transmission through photonic crystals with a continuous gold layer on their surface. Photonic crystals don't pass a certain wavelength (color) of light. This is called the photonic bandgap - the range of light wavelengths where propagation through a crystal is difficult. PPCs, on the contrary, can pass light of a certain wavelength within this photonic bandgap. The problem with three-dimensional opal-like PPCs (OLPPCs), however, is that they ordinarily do not admit light at these wavelengths.

In this work, conditions are defined for passing a beam of light with a wavelength inside the photonic bandgap and a certain polarization through an OLPPC. To achieve this goal, different versions of PPCs were modelled. The main conditions for passing such a beam are a continuous gold layer with a thickness of about 40 nm and the use of polarized light. Transmission of light across a PPC is accompanied by excitation of optical Tamm states. A one-dimensional PPC has light transmission pass bands inside the photonic bandgap in both polarizations. Three-dimensional PPCs do not have such pass bands because their gold layer is non-continuous (shaped like separate nano-caps or nano-crescents on the surface of the PPC). The OLPPCs used here thus have a unique feature: a light transmission pass band inside the photonic bandgap for a certain polarization, owing to the excitation of a hybrid mode of the optical states.
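For a rough sense of where such a bandgap sits, the textbook Bragg-Snell relation can be evaluated for an idealized opal; the sphere size, filling fraction and refractive indices below are illustrative assumptions, not parameters from this study:

```python
import math

# Bragg-Snell estimate of an opal's photonic bandgap position:
#   lambda = 2 * d_111 * sqrt(n_eff**2 - sin(theta)**2)
# All parameters are illustrative assumptions (silica spheres in air).
sphere_d = 300e-9                          # sphere diameter (m)
d_111 = math.sqrt(2.0 / 3.0) * sphere_d    # (111) spacing of an fcc opal
f = 0.74                                   # fcc filling fraction
n_eff = math.sqrt(f * 1.45**2 + (1 - f) * 1.00**2)   # effective index

theta = 0.0                                # normal incidence (radians)
lam = 2 * d_111 * math.sqrt(n_eff**2 - math.sin(theta)**2)
print(f"bandgap center ~ {lam * 1e9:.0f} nm")
```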

OLPPCs with the hybrid mode of the optical states can be used in high-polarization-sensitive sensors. "We assume that the hybrid mode can be useful for improving the control of light in PPCs. New types of resonators based on OLPPCs can be used for the strong interaction of light and matter," adds Mr. Koryukin.

The group is planning to create a theoretical description of the model of such processes. Additionally, they want to find effective applications for OLPPCs, such as strong light-matter interactions with a single photon source.

Credit: 
Kazan Federal University

Why stress and anxiety aren't always bad

CHICAGO -- People generally think of stress and anxiety as negative concepts, but while both stress and anxiety can reach unhealthy levels, psychologists have long known that both are unavoidable -- and that they often play a helpful, not harmful, role in our daily lives, according to a presentation at the annual convention of the American Psychological Association.

"Many Americans now feel stressed about being stressed and anxious about being anxious. Unfortunately, by the time someone reaches out to a professional for help, stress and anxiety have already built to unhealthy levels," said Lisa Damour, PhD, a private-practice psychologist who presented at the meeting. Damour also writes a regular column for The New York Times and is author of the book "Under Pressure: Confronting the Epidemic of Stress and Anxiety in Girls."

Stress usually occurs when people operate at the edge of their abilities - when they push themselves or are forced by circumstances to stretch beyond their familiar limits, according to Damour. It's also important to understand that stress can result from both bad and good events. For instance, being fired is stressful but so is bringing a baby home for the first time.

"It's important for psychologists to share our knowledge about stress with broad audiences: that stress is a given in daily life, that working at the edge of our abilities often builds those capacities and that moderate levels of stress can have an inoculating function, which leads to higher than average resilience when we are faced with new difficulties," she said.

Anxiety, too, gets an unnecessarily bad rap, according to Damour.

"As all psychologists know, anxiety is an internal alarm system, likely handed down by evolution, that alerts us to threats both external - such as a driver swerving in a nearby lane - and internal - such as when we've procrastinated too long and it's time to get started on our work," said Damour.

Viewing anxiety as sometimes helpful and protective allows people to make good use of it. For example, Damour said she often tells the teenagers she works with in her practice to pay attention if they start to feel anxious at a party because their nerves may be alerting them to a problem.

"Similarly, if a client shares that she's worried about an upcoming test for which she has yet to study, I am quick to reassure her that she is having the right reaction and that she'll feel better as soon as she hits the books, " she said.

That doesn't mean that stress and anxiety can't be harmful, said Damour. Stress can become unhealthy if it is chronic (allowing for no possibility of recovery) or if it is traumatic (psychologically catastrophic).

"In other words, stress causes harm when it exceeds any level that a person can reasonably absorb or use to build psychological strength," she said. "Likewise, anxiety becomes unhealthy when its alarm makes no sense. Sometimes, people feel routinely anxious for no reason at all. At other times, the alarm is totally out of proportion to the threat, such as when a student has a panic attack over a minor quiz."

Untreated stress and anxiety can cause persistent misery but can also contribute to a host of additional psychological and medical symptoms, such as depression or an increased risk of cardiovascular disease, according to Damour.

"Anyone feeling overwhelmed by stress should, if possible, take measures to reduce his or her stress and/or seek help from a trained professional to learn stress management strategies. For the management of anxiety, some people find relief through workbooks that help them to evaluate and challenge their own irrational thoughts. If that approach isn't successful, or preferred, a trained professional should be consulted," said Damour. "In recent years, mindfulness techniques have also emerged as an effective approach to addressing both stress and anxiety."

Damour also urged psychologists to take an active role in providing counter-messaging to what she called "the happiness industry," or those wellness companies that are selling the idea that people should feel calm and relaxed most of the time.

"Psychologists are good at taking a more measured approach to thinking about the human experience. We want to support well-being, but don't set the bar at being happy nearly all of the time. That is a dangerous idea because it is unnecessary and unachievable," she said. "If you are under the impression that you should always be joyful, your day-to-day experience may ultimately turn out to be pretty miserable."

Credit: 
American Psychological Association

There are no water molecules between the ions in the selectivity filter of potassium channels

image: Potassium transport through the selectivity filter of a potassium-selective ion channel. The channel -- presented in orange -- is only permeable for potassium ions (large green spheres). Water molecules (small blue spheres) and other ions, such as sodium (not shown) cannot pass through the channel.

Image: 
Barth van Rossum / FMP

Do only potassium ions pass through the selectivity filter of a potassium channel, or are there water molecules between the ions? This question has been a source of controversy for years. Researchers led by Prof. Adam Lange from the Leibniz-Forschungsinstitut für Molekulare Pharmakologie (FMP) in Berlin have now been able to show that water molecules do not co-migrate through the potassium channel. Since the experiments were carried out on cell membranes under natural conditions for the first time, the researchers now have strong evidence in hand. Their work has just been published in the journal Science Advances.

Our cells need potassium ions, for example to transmit nerve impulses or to control the heart rate. This is why virtually every human cell - or to be more precise, the membrane of a cell - is equipped with potassium channels. Because potassium channels are of fundamental importance for biological processes, and even the most minute changes can result in serious diseases, the tiny protein molecules are the focus of research efforts around the world. Indeed, in 2003, a US researcher was awarded the Nobel Prize in Chemistry for his elucidation of the structure of potassium channels.

Controversial debate on two different mechanisms

However, the question of how potassium actually passes through the channel in order to cross the cell membrane, still remained unclear. For a long time, it was assumed that each potassium ion was followed by a water molecule and that the elements then lined up, like links in a chain, and passed through the narrowest part of the potassium channel, the so-called selectivity filter, one after the other. This was based on the fact that potassium ions are positively charged and would repel each other without the intermediate water molecules. However, this mechanism was questioned in 2014 by Göttingen researchers led by Prof. Bert de Groot: Computer simulations showed that there are no water molecules in the selectivity filter of potassium channels. However, this did not end the debate. Subsequently, additional studies were published that seemed to support the older mechanism and apparently disproved the new one.

Now researchers from the FMP in Berlin have brought clarity into the controversial debate: Dr. Carl Öster and Kitty Hendriks of Prof. Adam Lange's research group and other colleagues at the FMP were able to show for the first time by means of solid-state nuclear magnetic resonance (NMR) spectroscopy that potassium ions actually do migrate through the potassium channels without water molecules in between. Their findings show that the potassium ions are positioned directly behind each other and push each other through the potassium channel, from bottom to top.

Under natural conditions the selectivity filter of potassium channels is free of water

"The technology we used enables us to look at membrane proteins in real cell membranes under natural conditions, for example at room temperature or physiological salt concentrations," explains Kitty Hendriks. "Thus we have been able to show that under these conditions there is definitely no water between the potassium ions in the selectivity filter."

The first indications of this came from computer simulations and there is also X-ray crystallographic data suggesting the absence of water molecules in the selectivity filter of potassium channels. "However, these investigations were conducted under artificial conditions," emphasizes Dr. Carl Öster. "With our supplementary data obtained through NMR spectroscopy, we now have a heavyweight argument in hand that the newer mechanism is the right one."

The FMP researchers and their colleagues from the Max Planck Institute for Biophysical Chemistry led by Prof. Bert de Groot, whose computer-aided molecular dynamics simulations were also incorporated into the study, have demonstrated that there are no water molecules between the potassium ions.

Progress for research

An important factor in elucidating the mechanism was the fact that the FMP is one of the world's leading centers for NMR spectroscopic investigations and is involved in ongoing further development of this complex technology. "Five years ago, we certainly would not have been able to demonstrate this in such a manner, but now we have reached a point where we are able to effectively answer this important question," said Prof. Adam Lange, head of the research group that focuses on the investigation of membrane proteins, such as ion channels. He adds: "Since the processes in potassium channels are fundamental to our health, our results have great significance that also extends beyond basic research."

The work was financially supported by the European Research Council as part of an ERC grant to Prof. Lange and by the German Research Foundation (DFG; Research Group 2518).

Credit: 
Forschungsverbund Berlin

Take a break! Brain stimulation improves motor learning

image: If the scientists stimulated the brain in the breaks between the short exercise sequences, the trained sequence could be better recalled by participants.

Image: 
MPI CBS/Jost-Julian Rumpf

The ability to learn new motor skills is a lifelong prerequisite for mastering everyday tasks independently and flexibly. There are many skills that we do automatically every day without thinking, such as operating a smartphone, typing on a keyboard, or riding a bicycle. But these had to initially be acquired, through repeated practice. The learning of new motor skills takes place both during the active practice of new processes and during breaks between learning sessions, even though one is not practicing. The pauses after practice are particularly important for motor learning. What has been learned solidifies in the brain so that it can be better recalled and executed later. In a joint study, Jost-Julian Rumpf from the Department of Neurology at the University of Leipzig and Gesa Hartwigsen from MPI CBS suggest the process probably already begins during short interruptions of practice. Further, the solidification process can be improved with brain stimulation.

When exactly does the brain "remember" a newly learned motor sequence? Previously, it was assumed that the stabilization of learned motor processes does not begin until the exercise is complete and then runs for several hours. However, recent work shows that even short breaks within practice sessions shape later recall. "We wanted to understand how relevant the so-called consolidation or perpetuation in these short pauses during practice is for later recall, after several hours, and whether we can influence these processes with brain stimulation," said first author Jost-Julian Rumpf.

The neurologist developed the study with healthy participants together with MPI researcher Gesa Hartwigsen. Participants had to type a simple sequence of numbers on a keyboard as quickly and accurately as possible. During practice, short pauses were taken after a certain number of trials and participants received magnetic stimulation of the brain during these breaks. The scientists wondered what was going on in the brain during the pauses - could it be that it was already learning at this point?

"The idea was to use magnetic stimulation to specifically influence the motor cortex during the short breaks between individual exercise units," reports Gesa Hartwigsen. It turned out that the brain stimulation during the short breaks improved recall of the learned number sequence when participants were tested six hours later. Although the participants had stopped practicing, the brain processed the practiced sequences more effectively if the brain had been stimulated during the short pauses within the training, presumably creating a stronger mental trace.

The researchers were also able to observe a so-called "transfer effect", from the trained hand to the other hand. "If we stimulated the brain in the breaks between the short exercise sequences, the trained sequence could be better recalled, not only with the trained hand, but also with the other hand," says the scientist.

Next, the researchers want to investigate the effects of their stimulation approach in older people who, in comparison with younger people, often have limitations in consolidation after motor learning and could, in the long term, benefit greatly from stimulation.

Credit: 
Max Planck Institute for Human Cognitive and Brain Sciences

New perovskite material shows early promise as an alternative to silicon

image: To minimize the loss of electrons from CsPbI3 (red, central layer) into adjacent layers, it is important that the energy levels (eV, on the graph) of all layers are similar.

Image: 
OIST

Silicon dominates solar energy products -- it is stable, cheap, and efficient at turning sunlight into electricity. Any new material taking on silicon must compete, and win, on those grounds. As a result of an international research collaboration, Shanghai Jiao Tong University, the Ecole Polytechnique Fédérale de Lausanne (EPFL), and the Okinawa Institute of Science and Technology Graduate University (OIST) have found a stable material that efficiently creates electricity -- which could challenge silicon's hegemony.

Writing in Science, the collaborating teams show how the material CsPbI3 has been stabilized in a new configuration capable of reaching high conversion efficiencies. CsPbI3 is an inorganic perovskite, a group of materials gaining popularity in the solar world due to their high efficiency and low cost. This configuration is noteworthy as stabilizing these materials has historically been a challenge.

"We are pleased with results suggesting that CsPbI3 can compete with industry-leading materials," says Professor Yabing Qi, head of OIST's Energy Materials and Surface Sciences Unit, who led on the surface science aspect of the study.

"From this preliminary result we will now work on boosting the material's stability -- and commercial prospects."

Energy level alignment

CsPbI3 is often studied in its alpha phase, a well-known configuration of the crystal structure appropriately known as the dark phase because of its black color. This phase is particularly good at absorbing sunlight. Unfortunately, it is also unstable -- and the structure rapidly degrades into a yellowish form, less able to absorb sunlight.

This study instead explored the crystal in its beta phase, a less well-known arrangement of the structure that is more stable than its alpha phase. While this structure is more stable, it shows relatively low power conversion efficiency.

This low efficiency partly results from the cracks that often emerge in thin-film solar cells. These cracks induce the loss of electrons into adjacent layers in the solar cell -- electrons that can no longer flow as electricity. The team treated the material with a choline iodide solution to heal these cracks, and this solution also optimized the interface between layers in the solar cell, known as energy level alignment.

"Electrons naturally flow to materials with lower potential energy for electrons, so it is important that the adjacent layers' energy levels are similar to CsPbI3," says Dr. Luis K. Ono, a co-author from Professor Qi's lab. "This synergy between layers results in fewer electrons being lost -- and more electricity being generated."

The OIST team, supported by the OIST Technology Development and Innovation Center, used ultraviolet photoemission spectroscopy to investigate the energy level alignment between CsPbI3 and the adjacent layers. These data showed how electrons can then move freely through the different layers, generating electricity.

The results showed a low loss of electrons to adjacent layers following treatment with choline iodide --due to better energy level alignments between the layers. By repairing the cracks that naturally emerge, this treatment led to an increase in conversion efficiency from 15% to 18%.

While that leap may seem small, it brings CsPbI3 into the realm of certified efficiencies competitive with those offered by rival solar materials. Although this early result is promising, inorganic perovskite is still lagging. For CsPbI3 to truly compete with silicon, the team will next work on the trinity of factors allowing silicon's reign to continue -- stability, cost and efficiency.
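For context, conversion efficiency follows the standard photovoltaic definition below; the device parameters are illustrative assumptions, chosen only to land near the reported 18%, not measurements from the study:

```python
# Standard definition of a solar cell's power conversion efficiency:
#   PCE = (Voc * Jsc * FF) / P_in
# All device parameters below are illustrative placeholders.
v_oc = 1.11     # open-circuit voltage (V)
j_sc = 20.0     # short-circuit current density (mA/cm^2)
ff = 0.81       # fill factor (dimensionless)
p_in = 100.0    # AM1.5 standard incident power (mW/cm^2)

pce = v_oc * j_sc * ff / p_in
print(f"PCE = {pce:.1%}")    # ~18.0%
```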

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University

Reevaluating the impacts of smoke plumes aloft, based on the 2017 Pacific Northwest wildfires

Extensive wildfires in the Pacific Northwest in the summer of 2017 unleashed a vast plume of smoke that ascended high into the stratosphere, persisted for more than eight months and provided researchers with a rare opportunity to evaluate current models of smoke ascent. Their study reveals gaps in the way smoke plume rise and duration are modeled now, they say. Powerful firestorms will occasionally cause pyrocumulonimbus clouds (pyroCbs) to erupt violently into the atmosphere. The rapidly rising, super-heated air of the fires below results in a towering, smoke-infused, thunderstorm-like cloud, which - like a chimney - funnels smoke particles directly into the Earth's stratosphere, with lingering global implications. While pyroCb events have been previously observed, they are relatively rare and, outside of model simulations, little is known about their physical and chemical impacts. To assess current model-based assumptions of these events, Pengfei Yu and colleagues compared direct observations of the 2017 Pacific Northwest wildfires from the Stratospheric Aerosol and Gas Experiment III (SAGE III-ISS) and the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellite platforms with output from model simulations of these fires. Yu et al. report that solar heating of the black carbon particles warmed the air within the smoke plume, causing it to self-loft, ascending from 12 kilometers to nearly 23 kilometers within two months, increasing its ability to spread latitudinally throughout the stratosphere. However, the observed smoke lifetime in the stratosphere was 40% shorter, the authors say, than what would be calculated using a standard model. This is because standard models do not necessarily consider photochemical loss of organic carbon - a phenomenon apparent in the 2017 plume rise. For future large wildfire events, photochemical reaction rates are important characteristics to measure, say the authors, to improve predictability of the 3-D transport of the smoke. They note their observational data confirm predictions of numerous Nuclear Winter models related to how smoke injected into the upper troposphere from urban fires will self-loft high into the stratosphere. However, the persistence of the smoke in the 2017 fire plume also calls into question the assumption in Nuclear Winter models that organics in smoke can be ignored due to their rapid loss, which was not observed here.
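A minimal sketch of what a shorter stratospheric lifetime implies, assuming simple first-order decay; the modeled lifetime is a hypothetical placeholder, with only the 40% reduction taken from the study:

```python
import math

# First-order (exponential) loss of stratospheric smoke:
#   m(t) = m0 * exp(-t / tau), with tau the e-folding lifetime.
# tau_model is a placeholder; the study reports only that the observed
# lifetime was ~40% shorter than the standard-model estimate.
tau_model = 12.0                  # hypothetical modeled lifetime (months)
tau_observed = 0.6 * tau_model    # 40% shorter

for label, tau in [("standard model", tau_model), ("observed", tau_observed)]:
    remaining_after_6mo = math.exp(-6.0 / tau)
    print(f"{label}: tau = {tau:.1f} months, "
          f"{remaining_after_6mo:.0%} of smoke left after 6 months")
```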

Credit: 
American Association for the Advancement of Science (AAAS)