
Combo-drug treatment for Type 2 diabetes remains effective after two years

PHILADELPHIA - Patients with Type 2 diabetes often take metformin as first-line therapy to help stabilize their blood glucose. Eventually, some patients no longer respond to metformin and require additional treatment. A few years ago, pivotal short-term trials showed that a combination of two drugs controlled diabetes progression better than either single drug alone. Now, new research demonstrates that this drug combo of dapagliflozin and exenatide remains effective, with no loss of effect, after two years of continuous use.

"Many therapies in diabetes management are short-lived, which is why it's useful to test for long-term effect," says senior author Serge Jabbour, MD, director of the division of endocrinology and the Diabetes Center at Thomas Jefferson University. "Our study showed that a combo regimen of dapagliflozin and exenatide continued to control patients' glucose for over two years. This is very encouraging."

The results of this multicenter, double-blind, phase 3, randomized controlled trial were published in the journal Diabetes Care.

A total of 695 adults whose Type 2 diabetes was not controlled with metformin were randomly assigned to three study groups. One group received weekly exenatide injections in addition to metformin. Another group took daily dapagliflozin pills in addition to metformin, and a third group received both drugs together. This was an extension of the pivotal DURATION-8 trial, meaning that patients were given the option to continue in the trial for a longer period.

The two classes of drugs act additively, improving a number of diabetes indicators. Dapagliflozin belongs to a class of drugs called sodium-glucose cotransporter-2 (SGLT2) inhibitors, which cause excess glucose to be excreted in the urine. Exenatide belongs to a class of drugs called glucagon-like peptide-1 receptor agonists (GLP-1RAs), which enhance glucose-dependent insulin secretion, lower hepatic glucose output, slow gastric emptying and increase satiety. Together, the two drugs promote and maintain better glucose control, produce additive weight loss, and improve blood pressure.

The results confirmed that patients receiving both drugs had better glycemic control than patients receiving just one of the drugs - and demonstrated, for the first time, that the effect was stable for the duration of the extended two-year study period. The study also showed a clinically relevant reduction in weight and blood pressure, measures that influence both Type 2 diabetes and overall health. The researchers saw no unexpected safety concerns related to the drug combination in the study participants.

Other studies of both drugs have also suggested improvements in metabolic markers such as lipid profile.

"These two classes work synergistically to help control a Type 2 diabetes patient's glucose levels and other measures associated with diabetes," says Dr. Jabbour. "We can now feel more confident about prescribing these medications long term."

Credit: 
Thomas Jefferson University

Do octopuses' arms have a mind of their own?

video: In the experiments that tested for proprioception, the octopuses tended to use "straight" movements, aimed at either the left or right side of the maze. In the experiments that tested for tactile discrimination, the octopuses chose to use slower "search" movements.

Image: 
Reproduced with permission from Elsevier. Originally published 10 Sep 2020 by Current Biology in "Use of Peripheral Sensory Information for Central Nervous Control of Arm Movement by Octopus vulgaris"

Often described as aliens, octopuses are one of the most unusual creatures on the planet, with three hearts, eight limbs and a keen intelligence. They can open jars, solve puzzles and even escape from their tanks, aided by their eight ultra-flexible and versatile arms. But determining how exactly octopuses control all eight limbs is a puzzle that scientists are still trying to crack.

"Octopus arms are completely unique. First off, there are eight of them, each with over 200 suckers that can feel, taste and smell the surroundings. And everything is moveable. The suckers can grasp, and the arms can twist in an almost limitless number of ways," said Dr. Tamar Gutnick, an octopus researcher formerly at the Okinawa Institute of Science and Technology Graduate University (OIST). "So this raises a huge computational issue for the brain, and their nervous system has to be organized in a really unusual way to deal with all this information."

Octopuses have an extensive nervous system, with over 500 million neurons, similar in number to that of a dog. But unlike dogs and other vertebrates, where the majority of neurons are in the brain, over two thirds of the octopuses' neurons are located within their arms and body.

With such a strangely-built nervous system, scientists have long suspected that octopuses' arms may have a mind of their own and act autonomously from the central brain. Research has shown that octopuses' arms use reflex loops to create coordinated movements, and some octopuses can even distract predators by discarding limbs that continue to move for long periods of time.

"Some scientists think about octopuses as nine-brained creatures, with one central brain and eight smaller brains in each arm," said Dr. Gutnick. But her new research, published in Current Biology, suggests that the arms and the brain are more connected than previously thought.

Dr. Gutnick and her colleagues have shown that octopuses are capable of learning to associate inserting a single arm into a specific side of a two-choice maze with receiving a food reward, even when neither the reward nor the arm in the maze is visible to the octopus. But crucially, while the learning process takes place in the central part of the brain, the information needed for the brain to choose the correct path is detected only by the arm in the maze.

"This study makes it clear that octopuses' arms don't behave totally independently from the centralized brain - there's information flow between the peripheral and central nervous system," said Dr. Gutnick. "Rather than talking about an octopus with nine brains, we're actually talking about an octopus with one brain and eight very clever arms."

Navigating the maze

The scientists tested whether single arms were able to provide the brain with two different types of sensory information - proprioception (the ability to sense where a limb is and how it's moving) and tactile information (the ability to feel texture).

Humans have a strong sense of proprioception. Sensory receptors located within skin, joints and muscles provide feedback to the brain, which stores and constantly updates a mental map of our body. Proprioception allows us to walk without looking at our feet and touch a finger to our nose with our eyes closed.

But whether octopuses have the same ability is not yet proven.

"We don't know whether an octopus actually knows where its arm is, or what its arm is doing," said Dr. Gutnick. "So our first question was - can the octopus direct its arm, based only on sensing where its arm was, without being able to see it?"

The researchers created a simple Y-shaped opaque maze and trained six common Mediterranean octopuses to associate either the right or left path with a food reward.

Rather than slowly exploring the internal shape of the maze, the octopuses immediately used fast arm movements, pushing or unravelling their arm straight through the side tube into the goal box. If they pushed their arm into the right goal box, they could retrieve the food, but if their arm entered the wrong goal box, the food was blocked by a net and the scientists removed the maze.

Five out of the six octopuses eventually learned the correct direction to push or unroll their arm through the maze in order to get the food.

"This shows us that the octopuses clearly have some sense of what their arm is doing, because they learn to repeat the movement direction that resulted in a food reward," said Dr. Gutnick. "It's unlikely to be to the same extent as humans have with our mental maps and the representations we have of our body in the brain, but there is some sense of self-movement from the arms that is available to the central brain."

The team then explored whether octopuses were able to determine the correct path when using a single arm to sense the texture of the maze.

The researchers presented another six octopuses with a Y-shaped maze where one side tube was rough, and the other side tube was smooth. For each octopus, picking either the rough side or the smooth side of the maze led to a food reward.

After many trials, five out of the six octopuses were able to successfully navigate the maze, regardless of whether the correct texture was located on the left or right side tube, showing that they had learnt which texture was correct for them. This time, the octopuses opted for a slower searching movement inside the maze, first determining a side tube's texture and then deciding whether to continue down that side tube, or to switch sides.

Importantly, the team found that for both types of mazes, once octopuses had learnt the correct association, they could successfully navigate the maze using arms that hadn't been used before. "This further rules out the idea that each arm could be learning the task independently - the learning occurs in the brain and then the information is made available to each arm."

But where this information is stored within the brain, Dr. Gutnick isn't sure; that is a question left for future experiments.

"The brain of octopuses is so different - it's still a black box to us really," she said. "There's so much more to learn."

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University

Higher risk of future fecal incontinence after sphincter injuries

image: Maria Gyhagen, MD/PhD, Sahlgrenska Academy, University of Gothenburg.

Image: 
Photo: Sodra Alvsborg Hospital

The risk of subsequent fecal incontinence and intestinal gas leakage is significantly higher among women who have suffered a sphincter injury during childbirth, with consequent damage to the anal sphincter muscle, a new study from the University of Gothenburg shows.

Most births in Sweden occur without complications for the mother and child. Mild perineal tears are relatively common, but self-healing usually takes place. Alternatively, these tears can be sutured without lasting ill-effects.

However, damage to the ring of muscles around the anus -- the anal sphincter, the focus of the current study -- also sometimes occurs. In Sweden, according to the latest statistics in the Swedish Medical Birth Register, approximately one woman in 20 sustains a rupture of the anal sphincter the first time she gives birth.

The study was based on register data on 7,741 women who gave birth to two children between 1992 and 1998. This information was combined with their responses to a questionnaire about fecal incontinence symptoms and, if the women had current symptoms, how these were affecting them some 20 years later, when they were aged 50-60.

The results, published in the American Journal of Obstetrics and Gynecology, show that for women sustaining sphincter injuries at the first birth, the risk of sphincter injury rose threefold at the second; and, moreover, the likelihood of future fecal incontinence increased with the number of injuries.

Of the women who avoided sphincter injury entirely in two vaginal births, 11.7% reported some degree of fecal incontinence 20 years later. The corresponding proportion among those who incurred sphincter damage at one of the births was 23.8%, and at both births 36.1%.

Severe fecal incontinence followed a similar pattern: from 1.8% in the women without sphincter damage to 5.4% and 9% after, respectively, one or two births causing sphincter injury. Leakage of liquid feces and gas was most common; in severe cases, solid feces were leaked as well.

"For women affected by sphincter damage who have had fecal incontinence later in life, there are few or no effective surgical treatments, and it often leads to chronic suffering, with social isolation as a result. Preventing sphincter injuries is the paramount measure," says Maria Gyhagen, researcher in obstetrics and gynecology at Sahlgrenska Academy, University of Gothenburg, and gynecologist at Södra Älvsborg Hospital in Borås, who was responsible for the study.

Although a sphincter injury can occur without known risk factors, the likelihood is greater in first-time mothers; after previous sphincter damage; during an assisted delivery using a ventouse (suction cup) or forceps; and, if the baby is large, at the first vaginal birth after a previous cesarean section. An elevated risk of fecal incontinence after ventouse delivery was demonstrated in a previous study from the same research group.

"In this study, we've been able to determine how sphincter injuries are associated with involuntary intestinal incontinence much later on in life, which has been questioned before. This must be taken into consideration in counseling for mothers-to-be whose risk of injury is highest," Gyhagen concludes.

Credit: 
University of Gothenburg

Follow your gut: How farms protect against childhood asthma

We are born into an environment full of small organisms called microbiota. Within the first minutes and hours of our lives, they start challenging but also educating our immune system. The largest immune organ is our gut, where maturation of the immune system and maturation of the colonizing bacteria, the gut microbiome, go hand in hand. After profound perturbations during this maturation process in the first year of life, the composition of the gut microbiome gradually stabilizes and accompanies us throughout our lives. Previous research by the Munich scientists showed an asthma-protective effect of a diverse environmental microbiome, which was particularly pronounced in farm children. The question now was whether this effect could be attributed to the maturation process of the early gut microbiome.

Farm life boosts gut microbiome maturation in children
The researchers analyzed fecal samples, collected between the ages of 2 and 12 months, from more than 700 infants, some of whom grew up on traditional farms, who took part in PASTURE, a European birth cohort that has been running for almost 20 years with funding from the European Commission.

"We found that a comparatively large part of the protective farm effect on childhood asthma was mediated by the maturation of the gut microbiome in the first year of life," states Dr. Martin Depner, biostatistician at Helmholtz Zentrum München, who further concludes: "This suggests that farm children are in contact with environmental factors, possibly environmental microbiota, that interact with the gut microbiome and lead to this protective effect."

The researchers anticipated effects of nutrition on the gut microbiome maturation but were surprised to find strong effects of farm-related exposures such as stays in animal sheds. This emphasizes the importance of the environment for the protective effect. In addition, vaginal delivery and breastfeeding fostered a protective microbiome in the first two months of life.

Furthermore, the researchers discovered an inverse association between asthma and the measured level of fecal butyrate. Butyrate is a short-chain fatty acid known to have an asthma-protective effect in mice. The researchers concluded that gut bacteria such as Roseburia and Coprococcus, which have the potential to produce short-chain fatty acids, may contribute to asthma protection in humans as well. Children with a matured gut microbiome harbored higher amounts of these bacteria than other children.

"Our study provides further evidence that the gut may have an influence on the health of the lung. A mature gut microbiome with a high level of short-chain fatty acids had a protective effect on the respiratory health of the children in this study. This supports the idea of a relevant gut-lung axis in humans," says Dr. Markus Ege, professor of clinical respiratory epidemiology at the Dr. von Hauner Children's Hospital. "This also means, however, that an immature gut microbiome may contribute to the development of diseases. This emphasizes the need for prevention strategies in the first year of life, when the gut microbiome is highly plastic and amenable to modification."

Probiotic prevention strategies
The researchers demonstrated that the asthma-protective effect does not depend on a single bacterium, but on the maturation of the entire gut microbiome. This finding calls into question the approach of using single bacterial strains as probiotics for the prevention of asthma. Probiotics should rather be tested with respect to their sustained effect on the compositional structure of the gut microbiome and its maturation early in life.

Further studies on cow milk
Nutritional factors analyzed in this study, such as the consumption of cow's milk, may serve as prevention strategies. Unprocessed raw milk, however, cannot be recommended because of the risk of life-threatening infections such as EHEC. Scientists at the Dr. von Hauner Children's Hospital are currently running a clinical trial on the effects of minimally processed but microbiologically safe milk for the prevention of asthma and allergies (MARTHA trial).

Helmholtz Zentrum München
Helmholtz Zentrum München is a research center with the mission to discover personalized medical solutions for the prevention and therapy of environmentally-induced diseases and promote a healthier society in a rapidly changing world. It investigates important common diseases which develop from the interaction of lifestyle, environmental factors and personal genetic background, focusing particularly on diabetes mellitus, allergies and chronic lung diseases. Helmholtz Zentrum München is headquartered in Neuherberg in the north of Munich and has about 2,500 staff members. It is a member of the Helmholtz Association, the largest scientific organization in Germany with more than 40,000 employees at 19 research centers.

Credit: 
Helmholtz Munich (Helmholtz Zentrum München Deutsches Forschungszentrum für Gesundheit und Umwelt (GmbH))

Analyzing biological and chemical damage on 20th-century construction materials

image: Photo of Dr Iratxe Ibarrondo using portable high-resolution spectroscopic techniques to analyse the impact on 20th-century construction materials of the environment and biological conditions in the surrounding area.

Image: 
Iratxe Ibarrondo / UPV/EHU

It is customary for the research conducted by the IBeA group in the UPV/EHU's department of Analytical Chemistry to be approached from a multidisciplinary perspective. One of the group's lines of work is the diagnosis and restoration of historical and cultural heritage, for which spectroscopic analytical techniques are used. Although the group's research has until now focused on historical monuments, it has now begun to study the origin and types of deterioration of a range of synthetic materials used in 20th-century buildings.

New buildings are constructed from synthetic materials such as concrete, reinforced concrete, mortars and bricks. Once the work has been completed, the materials are at the mercy of the environment: degrading agents in the surrounding area, such as acid gases in the atmosphere, water seepage, and lichens, among other things, can cause physical, chemical and biological damage. "The increasing contamination of the environment, the atmosphere and aquatic media is clear," said the researcher Iratxe Ibarrondo, "and we wanted to find out what effect the chemical agents arising from that contamination exert on the materials, and what kinds of compounds are formed as these chemicals react with one another".

The high-resolution spectroscopic analysis techniques used by the IBeA research group are essential when characterizing not only the original compounds of the stone materials but also the compounds formed as a result of deterioration processes. Raman analyses made in situ in the buildings have allowed the use of this technique to be proposed as a protocol to diagnose the type of environmental damage affecting 20th-century buildings in the process of refurbishment and renovation. Furthermore, by means of analyses conducted with greater resolution in the lab, they have been able to determine the types of damage found in new construction materials as well as the way in which they occurred. "We have managed to identify many different degradation compounds," explained the author of the research.

Method for measuring environmental contamination based on lichen pigments

In addition, "we analysed the effects that can be caused by lichens or biological agents in these materials; the kind of reactions they cause or which take place in their presence as well as the damage that can be detected in the materials. The fact is that materials of this type tend to be in a very poor condition and display serious degradation problems", added Ibarrondo. That way they managed to establish that lichens play an active role in biodegradation processes: they discovered that they absorb different types of atmospheric particles, incorporate them into their metabolisms and synthesise new biominerals (minerals not found in the original stone substrate).

Finally, by using carotenoid pigments from the lichens, they developed a new method of measuring the degree of environmental contamination. "We established that lichens are highly resistant in contaminated environments and, in addition, that the carotenoids they synthesise change when contamination levels change and greater degrees of oxidation are reached," explained Dr Ibarrondo.

In the researcher's view, "this research has opened up new paths, mainly ones relating to biological agents. They can be analysed better and more deeply, and under no circumstances should they be underestimated, because they cause significant damage". The research group collaborates with professionals in architecture, and "we saw that it is possible to establish this collaboration in the form of a protocol". Ibarrondo hopes that, depending on the case, when a building project is designed, there will be a possibility of receiving advice about the effects exerted by the environment on the materials. "Building professionals see the type of deterioration the materials undergo, but we analytical chemists can determine the problem with greater precision. But that will be the next step; it is not yet a reality".

Credit: 
University of the Basque Country

New UTSA research identifies link between food insecurity and unengaged distance learning

How do you feel when you're hungry? Are you at your best? A new study by the UTSA Urban Education Institute (UEI) found that food insecure students in San Antonio struggled with distance learning and academic engagement more than their peers.

The findings linking food insecurity and learning signify how hunger and larger issues of family instability during the ongoing pandemic threaten student growth.

"It is well understood that we all have basic needs that must be met if we are to pursue and realize our fullest potential," said Mike Villarreal, UEI director. "It is urgent that we as a community find effective ways to care for and respond to families in crisis so children won't lag behind their peers in cognitive, emotional, and physical development."

The research, part of the multi-part Teaching and Learning in the Time of COVID-19 survey project, found that 26 percent of local K-12 students and parents surveyed said they were experiencing food insecurity, meaning food ran out and there was no money to buy more.

The UEI spent the spring and summer surveying almost 2,000 K-12 public school students, parents and teachers across eight Bexar County school systems for the overall project. This research is helping local education leaders plan and improve their operations during the pandemic.

Families in each school system reported experiencing food insecurity, with the highest rates of 49 percent found in Edgewood ISD and 41 percent in both Harlandale and Southwest ISDs. Other school-district-level data can be found in the report.

Bexar County's pre-pandemic rate of food insecurity was already hovering at about 14 percent in 2018. That number nearly doubled for public school families by spring 2020.

"A majority of our public school children depend on our public schools for food, safety and community. The pandemic not only stranded students, but also caused many of them and their parents (48%) to lose their jobs or suffer pay cuts," Villarreal said.

Schools responded quickly with massive technology and meal distributions for students learning at home.

"Despite many obstacles and regulations by the federal government, districts pivoted quickly and offered free meals with little to no questions asked while physical campuses were closed," Villarreal said. "Their efforts were heroic - and yet there is still more work to be done. Our research is intended to support our school leaders by providing them rigorous and actionable research and nurturing an ongoing community conversation about the challenges and solutions in education."

Some of the key findings in the research report include:

26 percent of students and parents surveyed said they were experiencing food insecurity

Families in each school system surveyed reported having experienced food insecurity, with the highest rates of 49 percent found in Edgewood ISD and 41 percent in Harlandale and Southwest ISDs

A higher incidence of food insecurity was correlated with higher numbers of children in households

Food insecure high school students were less motivated during distance learning. While food insecure high school students represented 20 percent of all high school students, they represented 65 percent of high school students who said they never turned in assignments

Food insecure high school students were also overrepresented among those who said they were never engaged by school lessons.

Credit: 
University of Texas at San Antonio

Novel technique spotlights neuronal uptake of amyloid beta in Alzheimer's disease

image: Confocal images of cells with and without the cellular prion protein (labeled red), showing uptake of amyloid beta (labeled green) in cells with the prion protein (above) and no uptake in cells lacking the prion protein (below). (Magenta is membrane dye and blue is nuclear dye.)

Image: 
Graham Roseman, UCSC

One of the hallmarks of Alzheimer's disease is the formation of amyloid plaques--sticky clumps of a protein called amyloid beta--that collect between neurons in the brain. Increasingly, however, attention has turned from these insoluble plaques to soluble forms of amyloid beta that can be taken up into neurons and are highly neurotoxic.

A new study by researchers at the University of California, Santa Cruz, pinpoints a segment of the amyloid beta protein that is recognized by receptors involved in neuronal uptake of this toxic peptide. The researchers used a novel approach to study the mechanisms of cellular uptake of amyloid beta. Their findings, published November 2 in Proceedings of the National Academy of Sciences, suggest that targeting this process may be a promising approach for Alzheimer's drug development.

"There are many different ways that amyloid beta can be toxic inside cells, so wouldn't it be nice if we could block its uptake by neurons? This is a pathway we can target," said corresponding author Jevgenij Raskatov, assistant professor of chemistry and biochemistry at UC Santa Cruz.

In the new study, Raskatov teamed up with co-corresponding author Glenn Millhauser, distinguished professor of chemistry and biochemistry at UCSC, to study the interactions of amyloid beta with the cellular prion protein. Millhauser's lab studies the structure and function of the prion protein, a membrane protein found on the surfaces of various cell types, including neurons in the brain.

Previous studies by other researchers have indicated that the normal cellular prion protein (not the abnormal variant that causes prion diseases) binds to clumps of amyloid beta and is involved in uptake and neurotoxicity. The UCSC researchers, led by graduate student Alejandro Foley and postdoctoral researcher Graham Roseman, sought to test whether the prion protein also acts as a receptor to take up soluble forms of amyloid beta, and to identify the site within amyloid beta that binds to this receptor.

The researchers took an approach based on previous work from Raskatov's lab using mirror-image versions of amyloid beta to show that cellular uptake is mostly mediated by receptors on the cell surface. In the earlier work, researchers compared the uptake of natural amyloid beta with a synthetic version in which the arrangement of the atoms in the protein's amino acids is a mirror image of the natural arrangement. This rearrangement subtly changes the protein's structure in ways that would interfere with its binding to a receptor, so the finding that cellular uptake of the mirror-image version was greatly reduced pointed to receptor-mediated uptake.

Amyloid beta can be different lengths, but the most toxic variant is 42 amino acids long. To isolate the site involved in receptor binding, the researchers created a library of peptide segments of amyloid beta composed of, for example, just amino acids 1 through 16 or 1 through 30. For each segment, they made a version with natural amino acids (the "L stereoisomer") and one with mirror-image amino acids (the "D stereoisomer").

After testing all the peptides in the library for cellular uptake, they found that amino acids 1-30 showed the same stereoselectivity as full-length amyloid beta, with much greater uptake of the L form than the D form. In addition, this segment is completely soluble and does not form aggregates because it is missing a long hydrophobic domain involved in the aggregation of amyloid beta into clumps and fibrils.

"With this shortened amyloid beta, we are able to decouple cellular uptake from aggregation, giving us a great model for studying uptake," Raskatov said.

By assessing amyloid beta uptake in cells with and without the prion protein, the researchers demonstrated for the first time the prion protein's role in cellular uptake of soluble amyloid beta, consistent with its selectivity for the L stereoisomer of amyloid beta.

The non-aggregating amyloid beta segment 1-30 showed the same prion-dependent uptake and stereoselectivity. The researchers also used nuclear magnetic resonance spectroscopy to gather detailed information about the interactions involved in the binding of amyloid beta to the cellular prion protein.

Their findings show that the binding of amyloid beta at the cell surface, leading to its internalization, is largely due to the amino acid sequence 1-30 and not the state of aggregation. When amyloid beta molecules begin to aggregate, they form "oligomers" consisting of a small number of molecules stuck together that are still soluble and can be taken up by neurons. These soluble oligomers are increasingly considered to be the form of amyloid beta that triggers the pathological processes leading to Alzheimer's disease, but there are many different aggregated forms.

"There are different levels of aggregation, different sizes and types of oligomers, and a big question has been which forms are internalized and cause neurotoxicity," Raskatov said. "Our findings suggest that any form in which that sequence of amino acids is exposed will bind to the prion protein."

According to Millhauser, "The initial steps leading to Alzheimer's disease may be the prion protein-mediated transport of soluble amyloid beta into neurons, where it then clumps, forming toxic aggregates that ultimately lead to the characteristic plaques associated with the disease."

Although the prion protein seems to be responsible for most of the amyloid beta uptake, the study suggests other receptors may offer alternate routes. Nevertheless, drugs that target the 1-30 segment of amyloid beta or the sites it binds to on receptors may hold therapeutic potential for treating Alzheimer's disease.

"Our findings open up new avenues for understanding Alzheimer's and suggest promising strategies for therapeutics," Millhauser said.

Credit: 
University of California - Santa Cruz

Teens who participate in extracurriculars and get less screen time have better mental health

A new study from UBC researchers finds that teens, especially girls, have better mental health when they spend more time taking part in extracurricular activities, like sports and art, and less time in front of screens.

The study, published in the journal Preventive Medicine, found that spending less than two hours per day on recreational screen time (such as browsing the internet, playing video games, and using social media) was associated with higher levels of life satisfaction and optimism, and lower levels of anxiety and depressive symptoms, especially among girls. Similarly, extracurricular participation was associated with better mental health outcomes.

"Although we conducted this study before the COVID-19 pandemic, the findings are especially relevant now when teens may be spending more time in front of screens in their free time if access to extracurricular activities, like sports and arts programs is restricted due to COVID-19," says the study's lead author Eva Oberle, assistant professor with the Human Early Learning Partnership in the UBC school of population and public health. "Our findings highlight extracurricular activities as an asset for teens' mental wellbeing. Finding safe ways for children and teens to continue to participate in these activities during current times may be a way to reduce screen time and promote mental health and wellbeing."

Data for this study was drawn from a population-level survey involving 28,712 Grade 7 students from 365 schools in 27 school districts across B.C. The researchers examined recreational screen time, such as playing video games, watching television, and browsing the internet, as well as participation in extracurricular activities after school, such as sports and arts programs. They then compared the associations of each with positive and negative mental health indicators.

Highlights of the study's findings include the following:

Adolescents who participated in extracurricular activities were significantly less likely to engage in recreational screen-based activities for two or more hours after school

Taking part in extracurricular activities was associated with higher levels of life satisfaction and optimism, and lower levels of anxiety and depressive symptoms

Longer screen time (more than two hours a day) was associated with lower levels of life satisfaction and optimism, and higher levels of anxiety and depressive symptoms

Differences emerged between boys and girls, with longer screen time affecting girls' mental health more negatively than boys'

Among both boys and girls, however, mental health was strongest when teens both participated in extracurricular activities and spent less than two hours on screen time

Oberle says further research is needed to examine why the negative effects of screen time were more detrimental for girls than for boys. She also hopes to focus future research on the effects of different types of screen time.

"We do know that some forms of screen time can be beneficial, like maintaining connections with friends and family members online if we cannot see them in person, but there are other types of screen time that can be quite harmful," she says. "There are many nuances that are not well understood yet and that are important to explore."

Credit: 
University of British Columbia

Fossils reveal mammals mingled in age of dinosaurs

The fossil remains of several small mammals discovered in tightly packed clusters in western Montana provide the earliest evidence of social behavior in mammals, according to a new study co-authored by a Yale scientist.

The fossils, which are about 75.5 million years old but exquisitely preserved, offer a rare glimpse into mammalian behavior during the Late Cretaceous Period when dinosaurs dominated, and indicate that mammals developed sociality much earlier than previously thought, said Eric Sargis, professor of anthropology in Yale's Faculty of Arts and Sciences, and a co-author of the study.

The findings were published in the journal Nature Ecology & Evolution.

"It's an exceptional set of fossils -- the most complete and well-preserved mammal specimens from the Mesozoic Era ever discovered in North America," said Sargis, curator of mammalogy and vertebrate paleontology at the Yale Peabody Museum of Natural History.

"They provide a wealth of information about how these animals lived. They tell us that they burrowed and nested together. It's multiple mature adults and subadults congregating, which we'd never seen before from this period."

Excavated from Egg Mountain in Montana --a well-known dinosaur nesting site -- the fossils include skulls or skeletons of at least 22 individuals of Filikomys primaevus, a newly named genus of multituberculate. The small mammal, which was omnivorous or herbivorous, was as abundant in the Mesozoic Era as rodents are today. (The genus name translates to "neighborly mouse.") Multituberculates existed for more than 130 million years from the Middle Jurassic to the late Eocene epoch, the longest lineage of any mammal. The fossils were discovered in groups of two to five specimens. At least 13 of the specimens were located in the same rock layer within a 344 square-foot area, a space smaller than an average-sized studio apartment.

"These fossils are game changers," said Gregory Wilson Mantilla, professor of biology at the University of Washington, curator of vertebrate paleontology at the Burke Museum of Natural History & Culture in Seattle, and the study's senior author. "As paleontologists working to reconstruct the biology of mammals from this time period, we're usually stuck staring at individual teeth and maybe a jaw that rolled down a river, but here we have multiple, near complete skulls and skeletons preserved in the exact place where the animals lived. We can now credibly look at how mammals really interacted with dinosaurs and other animals that lived at this time."

The researchers found no evidence that the fossils had accumulated through some natural phenomenon, such as transport by a river, or that they were deposited by predators. The specimens were extracted from suspected burrows where the animals had lived. The bones show no signs of bite marks from predation, and the skeletons are too complete and well-preserved to have been moved by water or to have accumulated on the surface, according to the study.

An analysis of the specimens' teeth -- the level of wear and whether they had fully erupted into the oral cavity -- and their skeletons -- the degree of fusion in cranial and long bones -- showed that the individuals were either mature adults or subadults, not nursing infants. The groupings of multiple mature adults and subadults strongly suggest social behavior, Sargis said.

Social behavior occurs in about half of today's placental mammals, which bear live young that are nourished in the mother's uterus into late stages of development. Some marsupials, such as kangaroos, also behave socially. The trait was thought to have evolved following the Cretaceous-Paleogene mass extinction that killed off most dinosaur species about 66 million years ago, but the study shows that sociality occurred earlier than that, the researchers concluded.

"Because humans are such social animals, we tend to think that sociality is somehow unique to us, or at least to our close evolutionary relatives, but now we can see that social behavior goes way further back in the mammalian family tree," said Luke Weaver, the study's lead author and a graduate student in biology at the University of Washington in Seattle. "Multituberculates are one of the most ancient mammal groups, and they've been extinct for 35 million years, yet in the Late Cretaceous they were apparently interacting in groups similar to what you would see in modern-day ground squirrels."

The fossils also provide the earliest evidence of burrowing behavior in multituberculates, the study states. The skeletons indicate that F. primaevus could move its shoulders, elbows, and paws similarly to today's burrowing species, such as chipmunks. The researchers found that its hips and knees had flexibility adapted for maneuvering in tight, confined spaces, such as burrows, rather than running on open ground.

Credit: 
Yale University

Consequences of glacier shrinkage

image: The ice-covered Gya glacier lake shortly after the GLOF event

Image: 
Marcus Nüsser

Researchers from the South Asia Institute and the Heidelberg Center for the Environment of Ruperto Carola investigated the causes of a glacial lake outburst with subsequent flooding in the Ladakh region of India. In order to frame the case study in a larger picture, the research team led by geographer Prof. Dr Marcus Nüsser used satellite images to create a comprehensive survey of glacial lakes for the entire Trans-Himalayan region of Ladakh. They were able to identify changes in the extent and number of glacial lakes over a 50-year period, including previously undocumented floods. This analysis allows them to better assess the future risk of such events, known as glacial lake outburst floods (GLOFs).

"In the wake of global glacier shrinkage caused by climate change, the danger from glacial lake outburst floods is seen as an increasingly pressing problem," explains Prof. Nüsser of the South Asia Institute. An event like this unleashes huge amounts of water. Flash floods, for example, can wreak havoc on villages, agricultural areas, and infrastructure. To find out more about such events, the Heidelberg researchers studied a glacial lake outburst flood in Ladakh that destroyed houses, fields, and bridges in the village of Gya in August of 2014. Studies of the glacial lake, which is situated at 5,300 m above sea level, revealed a drastic short-tem lake level rise prior to the GLOF.

The cause, as the scientists discovered, was a "previously little-known mechanism". "Increased glacial melting caused the lake level to rise quite rapidly. Instead of resulting in spillover, however, the thawing ice cores in the moraine, i.e. the glacier's debris field, drained through subsurface tunnels without disturbing the moraine's surface," states Nüsser. In addition to conducting field surveys, the scientists also interviewed the locals about their recollections of the GLOF event. On the basis of satellite images, the team additionally studied the evolution of the glacial lake since the 1960s in order to reconstruct possible GLOF events.

"The high temporal and high spatial remote sensing imagery from satellites supported by field surveys will allow us to better assess the possible risk of future outbursts in this region," adds the Heidelberg geographer. In light of recurring GLOFs, the new inventory can help us "re-evaluate the hazards, identify vulnerable locations, and develop possible adaptation measures," continues Nüsser. After the flood in Gya, for example, concrete walls were constructed along the undercut bank as a flood protection measure to safeguard the villages and fields from future floods.

Credit: 
Heidelberg University

Immunotherapy side effect could be a positive sign for kidney cancer patients

image: Microscopic image of kidney tubules (yellow arrows) under an immune attack (green arrows) characteristic of acute interstitial nephritis (AIN). Micrograph taken from a biopsy of a kidney cancer patient treated with immunotherapy who had a good response against the cancer.

Image: 
UT Southwestern Medical Center

DALLAS - Nov. 2, 2020 - An autoimmune side effect of immune checkpoint inhibitor (ICI) drugs could signal improved control of kidney cancer, according to a new study by researchers in UT Southwestern's Kidney Cancer Program (KCP).

The study, published today in the Journal for ImmunoTherapy of Cancer, may have broad implications for patients being treated with ICIs, a type of immunotherapy that is used against a large number of cancers, including lung, breast, liver, and cervical.

Renal cell carcinoma, the most common type of kidney cancer, is the ninth most common cancer in the U.S. Once the cancer has spread to other organs, or metastasized, the five-year survival rate averages 12 percent.

With the advent of ICIs, kidney cancer survival rates are improving. However, only a fraction of patients respond to ICIs - and who will respond is unpredictable. ICIs disable cloaking mechanisms put in place by tumors to evade killing by immune cells. But taking down these cloaking mechanisms also increases the chances that the immune system will turn against the body, causing autoimmune side effects.

KCP investigators hypothesized that kidney cancer patients whose immune systems attacked their kidneys may be more likely to benefit from ICIs. Using Kidney Cancer Explorer, a proprietary tool that extracts information from electronic health records, investigators identified metastatic kidney cancer patients treated with ICIs between 2014 and 2018. Using this tool, they analyzed thousands of laboratory test results to identify patients whose kidney function became impaired. Out of 177 patients, they found 36 such patients.
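The study's exact screening criteria are not described here, and Kidney Cancer Explorer is proprietary, but the general approach of flagging impaired kidney function from serial lab values can be sketched as follows. The field layout and the 1.5x-baseline creatinine threshold (a common clinical rule of thumb for acute kidney injury) are assumptions for illustration only:

```python
# Illustrative sketch only: the study's actual criteria and the internals of
# Kidney Cancer Explorer are not described in this article. The data layout
# and the 1.5x-baseline threshold are assumptions.

def flag_impaired_kidney_function(creatinine_series, threshold=1.5):
    """Flag a patient whose serum creatinine rises to >= threshold x baseline.

    creatinine_series: list of (date, value_mg_dL) tuples in chronological order.
    """
    if not creatinine_series:
        return False
    baseline = creatinine_series[0][1]  # earliest available value as baseline
    return any(value >= threshold * baseline
               for _, value in creatinine_series[1:])

# Example: a baseline of 1.0 mg/dL rising to 1.8 mg/dL would be flagged.
patient = [("2016-01-05", 1.0), ("2016-03-12", 1.1), ("2016-06-20", 1.8)]
print(flag_impaired_kidney_function(patient))  # True
```

Applied across a cohort, a filter like this would narrow thousands of lab results down to the subset of patients (36 of 177 in the study) warranting chart review.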

In three of the 36 patients, the impairment was due to an immune-mediated attack. A fourth more recent patient was also identified. All four patients developed acute interstitial nephritis (AIN), an autoimmune condition in which immune cells attack kidney cells, causing inflammation and swelling. While ICIs induce responses in the cancer in up to 40 percent of patients, in this case all four patients with AIN had a response.

"For 100 percent of patients to respond is quite significant," says Roy Elias, M.D., assistant instructor of internal medicine and co-first author of the study. "If a patient develops AIN, it is a sign that the treatment may be working."

This is not the first time that a particular autoimmune effect has been linked to increased response rates to ICIs, says James Brugarolas, M.D., Ph.D., professor of internal medicine and director of the Kidney Cancer Program. In fact, patients with melanoma who developed vitiligo, a condition in which skin pigment cells are killed by immune cells, also had higher chances of response.

Upon identifying a second example of an immune attack on the tissue of origin associated with a favorable cancer response, KCP investigators propose that this finding may be generalizable to other tumor types.

"All cancer cells start out as normal cells," says Brugarolas. "Even after turning malignant, they retain some of their original traits. Thus, an attack against the tissue of origin may signal a higher chance that the immune system will also recognize and attack the cancer."

Further study will be needed to determine whether positive outcomes could be generalizable to patients with other cancers who are experiencing similar immune attacks against the cell of origin of the cancer.

Credit: 
UT Southwestern Medical Center

In your gut: How bacteria survive low oxygen environments

Researchers from ITQB NOVA, in collaboration with the Institut Pasteur in Paris, have shed light on the mechanisms that allow Clostridioides difficile, a pathogen that can only grow in oxygen-free environments, to survive low oxygen levels. C. difficile is a major cause of intestinal problems associated with the use of antibiotics, responsible for an estimated 124,000 cases per year in the EU and costing an average of €5,000 per patient as a direct consequence of healthcare-associated infection. Particularly pathogenic varieties of C. difficile are an important cause of highly prevalent infections in health care environments and will continue to hinder the optimal use of antimicrobial therapy unless their resistance mechanisms are understood faster than these organisms evolve.

A healthy human gut is generally regarded as mainly free of oxygen but, in reality, there are varying levels of oxygen along the gastrointestinal tract, which poses a challenge to anaerobic organisms of the human microbiome, such as C. difficile. In organisms similar to this bacterium, two families of enzymes, flavodiiron proteins and rubrerythrins, have been shown to play an important role in protection against oxidative stress.

"Little was known about the actual proteins involved in the ability of C. difficile to tolerate O2, and our studies have demonstrated a key role of flavodiiron proteins and rubrerythrins proteins in providing C. difficile with the ability to grow in conditions such as those encountered in the colon", says Miguel Teixeira, head of the Functional Biochemistry of Metalloenzymes Lab.

This finding led the ITQB NOVA team, along with the I. Martin-Verstraete Lab at Institut Pasteur, to develop a comprehensive study of four of these proteins. It had been previously established that a flavodiiron protein is able to reduce both oxygen and hydrogen peroxide, and this study confirmed the same for two types of rubrerythrin proteins. In a mutant strain of C. difficile, inactivation of both rubrerythrins prevented the bacteria from growing at oxygen levels above 0.1%, well below their usual tolerance of up to 0.4% O2.

By demonstrating that flavodiiron and reverse rubrerythrin proteins are essential in C. difficile's ability to tolerate damage to its cells in the presence of oxygen, the two teams of researchers have managed a significant step towards better understanding its mechanisms of resistance. The researchers will now move on to explore other survival mechanisms of these bacteria.

Credit: 
Instituto de Tecnologia Química e Biológica António Xavier da Universidade NOVA de Lisboa ITQB NOVA

Decennial 2020 research sets the agenda for advancing safe healthcare

NEW YORK (November 2, 2020) -- More than 700 studies, including 250 international abstracts, highlighting worldwide progress in preventing and controlling healthcare-associated infections and addressing antibiotic resistance were published today as part of the proceedings from the Sixth Decennial International Conference on Healthcare-Associated Infections. The Sixth Decennial, a conference co-hosted by the Centers for Disease Control and Prevention and the Society for Healthcare Epidemiology of America, was cancelled in March due to the COVID-19 pandemic. All abstracts accepted for the meeting appear in a supplement to the journal Infection Control & Hospital Epidemiology.

"While COVID-19 disrupted the plans for sharing the progress of preventing healthcare-associated infections and combating antibiotic resistance over the last decade, it is critical we disseminate, learn from, and promote successes across the world," said Denise Cardo, MD, Director of the Division of Healthcare Quality Promotion at the Centers for Disease Control and Prevention. "Despite existing and emerging threats, safe care must be delivered, and patients must be protected from harm. We can meet this challenge with robust and informed action."

As part of the planning for the conference, coordinated once every 10 years for the past 60 years, the Decennial Program Committee of 25 experts in infectious diseases selected the central theme, Global Solutions to Antibiotic Resistance in Healthcare, to acknowledge the need for coordinated international collaboration in the fight against healthcare-associated infections, including those caused by antibiotic-resistant bacteria.

The committee reviewed international advances of the previous decade and future trends in the fields of healthcare epidemiology, infectious diseases, infection prevention, patient safety and antibiotic stewardship. Three key topics emerged:

Innovation - The development of novel prevention tools, strategies, diagnostics, and therapeutics has been critical to progress in infection prevention and in addressing the threat of antibiotic resistance. Further innovation in healthcare technology, practices, policies, and programs is needed to continue moving toward the goal of eliminating HAIs and slowing antibiotic resistance.

Data for Action - Facilities, states, clinicians, and other stakeholders need data to drive detection and prevention strategies to eliminate healthcare-associated infections and combat antibiotic resistance. Improvements in use of surveillance, epidemiologic, clinical, and laboratory data are critical to help close knowledge gaps and allow for the implementation of effective strategies to provide safe care.

Addressing Antimicrobial Resistance Without Borders - Many factors impact the local and global burden and transmission of antibiotic resistance. To prevent resistant pathogens from spreading within and between healthcare facilities and the environment, constant vigilance and action are needed. Global success in containing spread of healthcare-associated infections and antibiotic resistance will require coordinated responses at the local, regional, and international levels. Public health and healthcare systems must work together to share information to detect and to implement effective practices to prevent infections from occurring and spreading.

"While we have worked to manage, control, and understand the COVID-19 pandemic, we still need to address the constant threat of treatment-resistant infections and our imperative to prevent healthcare-associated infections and foster appropriate use of antimicrobial treatments," said David Henderson, MD, FSHEA, president of SHEA. "This volume of research is invaluable for helping us identify how best to move the field ahead to deliver safer healthcare for all."

The meeting was planned in collaboration with the Association for Professionals in Infection Control and Epidemiology (APIC) and the Infectious Diseases Society of America (IDSA).

To access the Decennial 2020 supplement to Infection Control & Hospital Epidemiology, go to https://bit.ly/3mLRRhm

Credit: 
Society for Healthcare Epidemiology of America

A loan for lean season

For farmers in rural Zambia, payday comes just once a year, at harvest time. This fact impacts nearly every aspect of their lives, but until now researchers hadn't realized the true extent.

Economist Kelsey Jack, an associate professor at UC Santa Barbara, sought to investigate how this extreme seasonality affects farmers' livelihoods, as well as development initiatives aimed at improving their condition. She and her coauthors conducted a two-year experiment in which they offered loans to help families through the months before harvest.

The researchers found that small loans in the lean season led to higher quality of life, more time invested in one's own farm, and greater agricultural output, all of which contributed to higher wages in the labor market. The study, which appears in the American Economic Review, is part of a new wave of research re-evaluating the importance of seasonality in rural agricultural settings.

Jack came to this research topic through her personal experience working with communities in rural Zambia over the past 12 years. She would often ask folks what made their lives harder, and she kept hearing the same story. These farmers rely on rainfall, rather than irrigation, for their crops. So their harvest follows the seasons. This means that all of their income arrives at once, during harvest time in June.

"Imagine if you got your paycheck once a year, and then you had to make that last for the remaining 11 months," Jack said. This leads to what's referred to locally as the hungry season, or lean season, in the months preceding harvest.

When households find themselves low on food and cash, they rely on selling labor in a practice known as ganyu to make ends meet. Instead of working on their own farms, family members work on other people's farms, essentially reallocating labor from poor families to those of better means - though it's not always the same people in these positions from year to year.

When Jack spoke about this with her collaborator Günter Fink at the University of Basel, in Switzerland, he mentioned hearing the same story during his work in the region. They contacted another colleague, Felix Masiye, chair of the economics department at the University of Zambia, who said that while this was a known phenomenon in Zambia, no one had researched it yet. The three decided to validate the farmers' story and quantify its effects.

"This is basically the farmers' paper," said Jack. "They told us to write it and we did. And it turned out to be a really interesting story."

Before even launching this project, the researchers met with communities and conducted a full 1-year pilot study across 40 villages. They designed the experiment around the input they received, including loan sizes, interest rates, payment timeframes and so forth. Throughout the project the team worked with village leadership and the district agricultural office, and had their proposal evaluated by institutional review boards in both the United States and Zambia.

The experiment consisted of a large randomized control trial with 175 villages in Zambia's Chipata District. It essentially spanned the whole district, Jack said. The project lasted two years and comprised over 3,100 farmers.

The researchers randomly assigned participants to three groups: a control group in which business proceeded as usual, a group that received cash loans, and a group that received loans in the form of maize. The loans were designed to feed a family of four for four months and were issued at the start of the lean season in January, with payments due in July, after harvest.
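The assignment described above, with participants split across control, cash-loan, and maize-loan arms, can be sketched as a simple village-level randomization. The trial's actual randomization protocol and any stratification are not detailed here; roughly equal arm sizes and a fixed seed are assumptions for illustration:

```python
# Minimal sketch of randomizing 175 villages into three study arms.
# The actual DURATION of assignment, stratification, and arm proportions used
# in the Zambia trial are assumptions here, not documented procedure.
import random

def assign_arms(villages, arms=("control", "cash", "maize"), seed=0):
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    shuffled = villages[:]
    rng.shuffle(shuffled)
    # Deal villages round-robin into arms so group sizes differ by at most one.
    return {v: arms[i % len(arms)] for i, v in enumerate(shuffled)}

assignment = assign_arms([f"village_{i}" for i in range(175)])
```

Randomizing at the village level, rather than per household, avoids contaminating the comparison when neighbors share food or labor within a village.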

"They were designed to coincide with people's actual income flows," Jack said. She contrasted this with most lending and microfinance in rural areas, which doesn't account for the seasonality of income.

The project provided loans to around 2,000 families the first year and about 1,500 the second year. Some of the households were assigned to different groups in the second year to measure how long the effect of the loan persisted.

In addition to collecting data on metrics like crop yield, ganyu wages and default rates, the team conducted thousands of surveys over the course of the study to learn about behaviors like consumption and labor.

Overall, the results affirmed the importance of seasonal variability to the livelihoods of rural farmers and the impact of any economic interventions. "Transferring money to a rural agricultural family during the hungry season is a lot more valuable to that family than transferring money at harvest time," Jack said.

The experiment's most striking result was simply how many people took the loan. "The take-up rates that we saw were absolutely astounding," Jack exclaimed. "I don't think there's an analogue for it in any kind of lending intervention."

A full 98% of eligible households took the loan the first year, and more surprisingly, the second year as well. "If the only measure for whether this intervention helped people was whether they wanted it again, that alone would be enough to say people were better off," Jack stated.

For the most part farmers were able to repay their loans. Only 5% of families defaulted in the first year, though this rose a bit to around 15% in year two. Though she can't be certain, Jack suspects poorer growing conditions in the second year may have contributed to this increase.

Of course, loan uptake was far from the only promising sign the researchers saw. Food consumption in the lean season increased by 5.5% for households in the treatment groups, relative to the control, which essentially bridged the difference between the hungry season and the harvest season.

Families that received loans were also able to devote more energy to their own fields. These households reported a 25% drop in total hours working ganyu, which translated to around 60 hours of additional labor on their own land over the course of the season. This saw agricultural production rise by about 9% in households eligible for the loan, which was more than the value of the loan itself.

With fewer people selling their labor, those who did choose to do ganyu saw their wages increase by 17 to 19% in villages where the program was offered. This was buoyed by a 40% rise in hiring from those who received loans, which helped address economic inequality in the community.

What's more, Jack and her colleagues found little difference in the outcomes between families in the cash group versus those who received shipments of maize. It was a welcome finding, since cash is much cheaper to deliver than sacks of corn, though by no means inexpensive.

In fact, a huge challenge the researchers faced was simply the cost of delivering and collecting the small loans. In rural Zambia people are spread out, financial institutions are rudimentary, and infrastructure like roads are underdeveloped.

"If it was profitable to get these farmers loans then people would be giving them loans," Jack said. "But loans for things like food, school fees, and other basic needs just don't exist at reasonable interest rates."

To account for the large transaction costs, a lender could simply increase the size of their loans. That way the same interest rate yields more money to cover the fixed costs. But according to Jack, most families don't want to take on the burden of a large loan.

The alternative is to charge higher interest on small loans. Interest rates for the loans in the study were 4.5% per month over the course of half a year, which worked out to a 30% interest rate over the six-month loan. This is steep compared to most lenders in countries like the United States; however, it was vastly lower than the 40-100% monthly interest rates otherwise available in these communities.

Several other factors contribute to these sky-high interest rates in addition to the transaction costs, including high risks and the difficulty of enforcing contracts. What's more, the low availability of creditors makes it essentially a lender's market. Economists continue to search for solutions to these challenges.

Until recently, economists had largely written off seasonality as an important factor in rural development, Jack explained. But the results of this study underscore how everything -- from grain prices to wages to labor allocation -- fluctuates around the fact that everyone is poorer at one time of year and better off at another.

"As a result, there are potentially large gains for interventions that help people smooth their very infrequent income over the rest of the year," she said. These can take many forms in addition to loans, from irrigation and new crops to bank accounts and farmer cooperatives -- basically anything that helps smooth out resources or enables income to arrive more frequently.

The upshot is that governments and NGOs can increase their impact by incorporating seasonality into their interventions. Making better use of resources is particularly critical in light of budget cuts and economic hardships caused by the COVID-19 pandemic.

In fact, many of Jack's current projects have been disrupted by the pandemic, including another randomized control trial that sought to build on the understanding gleaned from this experiment. She hopes to resume these studies, as well as discussions with different governments, as conditions improve.

Nevertheless, this study has provided a wealth of insights in its own right. "Crucially, the value of a dollar depends a lot on how many dollars you have," Jack said, "so you want to direct assistance to those times of the year when they will be most helpful."

Credit: 
University of California - Santa Barbara

Vitamin D levels during pregnancy linked with child IQ

Vitamin D is a critical nutrient and has many important functions in the body. A mother’s vitamin D supply is passed to her baby in utero and helps regulate processes including brain development. A study published today in The Journal of Nutrition showed that mothers’ vitamin D levels during pregnancy were associated with their children’s IQ, suggesting that higher vitamin D levels in pregnancy may lead to greater childhood IQ scores. The study also identified significantly lower vitamin D levels among Black pregnant women.

Melissa Melough, the lead author of the study and a research scientist in the Department of Child Health, Behavior, and Development at Seattle Children’s Research Institute, says vitamin D deficiency is common among the general population as well as pregnant women, but notes that Black women are at greater risk. Melough says she hopes the study will help health care providers address disparities among women of color and those who are at higher risk for vitamin D deficiency.

“Melanin pigment protects the skin against sun damage, but by blocking UV rays, melanin also reduces vitamin D production in the skin. Because of this, we weren’t surprised to see high rates of vitamin D deficiency among Black pregnant women in our study. Even though many pregnant women take a prenatal vitamin, this may not correct an existing vitamin D deficiency,” Melough said. “I hope our work brings greater awareness to this problem, shows the long-lasting implications of prenatal vitamin D for the child and their neurocognitive development, and highlights that there are certain groups providers should be paying closer attention to. Widespread testing of vitamin D levels is not generally recommended, but I think health care providers should be looking out for those who are at higher risk, including Black women.”

Addressing disparities

According to Melough, as many as 80% of Black pregnant women in the U.S. may be deficient in vitamin D. Of the women who participated in the study, approximately 46% of the mothers were deficient in vitamin D during their pregnancy, and vitamin D levels were lower among Black women compared to White women.

Melough and her co-authors used data from a cohort in Tennessee called the Conditions Affecting Neurocognitive Development and Learning in Early Childhood (CANDLE) study. CANDLE researchers recruited pregnant women to join the study starting in 2006 and collected information over time about their children’s health and development.

After the researchers controlled for several other factors related to IQ, higher vitamin D levels in pregnancy were associated with higher IQ in children ages 4 to 6 years old. Although observational studies like this one cannot prove causation, Melough believes her findings have important implications and warrant further research.

Vitamin D deficiency

“Vitamin D deficiency is quite prevalent,” Melough said. “The good news is there is a relatively easy solution. It can be difficult to get adequate vitamin D through diet, and not everyone can make up for this gap through sun exposure, so a good solution is to take a supplement.”

The recommended daily intake of vitamin D is 600 international units (IU). On average, Americans consume less than 200 IU in their diet, so if people aren’t making up that gap through sun exposure or supplementation, Melough says, they will probably become deficient. Foods that contain higher levels of vitamin D include fatty fish, eggs and fortified sources like cow’s milk and breakfast cereals. However, Melough notes that vitamin D is one of the most difficult nutrients to get in adequate amounts from our diets.

Additional research is needed to determine the optimal levels of vitamin D in pregnancy, but Melough hopes this study will help to develop nutritional recommendations for pregnant women. Especially among Black women and those at high risk for vitamin D deficiency, nutritional supplementation and screening may be an impactful strategy for reducing health disparities.

Key takeaways

Melough says there are three key takeaways from the study:

Vitamin D deficiency is common during pregnancy, and Black women are at greater risk because melanin pigment in the skin reduces production of vitamin D
Higher vitamin D levels among mothers during pregnancy may promote brain development and lead to higher childhood IQ scores
Screening and nutritional supplementation may correct vitamin D deficiency for those at high risk and promote cognitive function in offspring

“I want people to know that it’s a common problem and can affect children’s development,” Melough said. “Vitamin D deficiency can occur even if you eat a healthy diet. Sometimes it’s related to our lifestyles, skin pigmentation or other factors outside of our control.”

Credit: 
Seattle Children's