Real cost of heart attacks and strokes: Double the direct medical expense

Sophia Antipolis, 7 April 2019: The full financial cost of a heart attack or stroke is twice as much as the medical costs when lost work time for patients and caregivers is included.

That's the finding of research published today, World Health Day, in the European Journal of Preventive Cardiology, a journal of the European Society of Cardiology (ESC).1 The study concludes that victims of heart attack and stroke who return to work are 25% less productive in their first year back.

In the year after the event, heart patients lost 59 workdays and caregivers lost 11 workdays, for an average cost of €13,953, and ranging from €6,641 to €23,160 depending on the country. After stroke, 56 workdays were lost by patients and 12 by caregivers, for an average €13,773, ranging from €10,469 to €20,215.

Study author Professor Kornelia Kotseva, of Imperial College London, UK, said: "Patients in our study returned to work, meaning their events were relatively mild. Some still had to change jobs or careers, or work less, and caregivers lost around 5% of work time. Not included in our study are those with more severe events who quit work altogether and presumably need even more help from family and friends."

The study enrolled 394 patients from seven European countries - 196 with acute coronary syndrome (86% heart attack, 14% unstable chest pain) and 198 with stroke - who returned to work 3 to 12 months after the event. Patients completed a questionnaire2,3 during a visit to a cardiologist, neurologist, or stroke physician. Hours lost were valued according to country labour costs in 2018. The average age of patients was 53 years.
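The human-capital costing described above (workdays lost, valued at country labour costs) reduces to a simple multiplication. The sketch below is illustrative only: the daily labour cost is a back-solved figure chosen to reproduce the reported average, not a value taken from the study.

```python
# Sketch of the human-capital costing approach: indirect cost =
# (patient workdays lost + caregiver workdays lost) * daily labour cost.
# The labour-cost figure below is hypothetical (back-solved for illustration);
# the study used actual 2018 country labour costs.

def indirect_cost(patient_days: float, caregiver_days: float,
                  daily_labour_cost_eur: float) -> float:
    """Estimate first-year productivity loss in euros."""
    return (patient_days + caregiver_days) * daily_labour_cost_eur

# Heart attack example from the study: 59 patient days + 11 caregiver days.
# A daily labour cost of ~199 EUR would reproduce the reported average.
cost = indirect_cost(59, 11, 199.33)
print(round(cost))  # close to the reported 13,953 EUR average
```

The country range reported (€6,641 to €23,160) then follows from varying the labour-cost input, not the days lost, which were broadly consistent across countries.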

According to published estimates for Europe, the direct medical costs of acute coronary syndrome are €1,547 to €18,642, and €5,575 to €31,274 for stroke.4 "This is the metric commonly used to estimate the costs of medical conditions while indirect costs from productivity loss are often not taken into account by clinicians, payers or policymakers," said Professor Kotseva. "Taken together, the actual burden on society is more than twice the amount previously reported."

Reasons for lost productivity were consistent across countries: 61% was due to the initial hospitalisation and sick leave after discharge; 23-29% to absence from work after the initial sick leave (for medical appointments and shorter sick leaves); and 9-16% to working below full capacity because of feeling unwell.

Even more workdays were lost in the first year after the event for patients with previous events or established cardiovascular disease. When adding days lost by patients and caregivers together, this was 80 for acute coronary syndrome and 73 for stroke, costing €16,061 and €14,942, respectively.

In the study, 27% of heart patients and 20% of stroke patients were obese, while 40% of heart patients and 27% of stroke patients were current smokers.

"Productivity loss associated with cardiovascular events is substantial and goes beyond the patient," said Professor Kotseva. "Preventing acute coronary syndrome and stroke is the key to improving health and longevity and avoiding the myriad of costs that come with such an event. The true tragedy is that so many heart attacks and strokes could be averted by not smoking, being physically active, eating healthily, and controlling blood pressure and cholesterol. The evidence could not be stronger."

Credit: 
European Society of Cardiology

Research identifies genetic causes of poor sleep

The international collaboration, led by the University of Exeter and published in Nature Communications, has found 47 links between our genetic code and the quality, quantity and timing of how we sleep. They include ten new genetic links with sleep duration and 26 with sleep quality.

The Medical Research Council-funded study looked at data from 85,670 participants of UK Biobank and 5,819 individuals from three other studies, who wore accelerometers - wrist-worn devices (similar to a Fitbit) that record activity levels - continuously for seven days. This gave more detailed sleep data than previous studies, which relied on people accurately reporting their own sleep habits.

Among the genomic regions uncovered is a gene called PDE11A. The research team discovered that an uncommon variant of this gene affects not only how long you sleep but your quality of sleep too. The gene has previously been identified as a possible drug target for treatment of people with neuropsychiatric disorders associated with mood stability and social behaviours.

The study also found that among people with the same hip circumference, a higher waist circumference resulted in less time sleeping, although the effect was very small - around 4 seconds less sleep per 1cm waist increase in someone with the average hip circumference of around 100cm.
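To put the reported effect size in perspective, it is worth working through the arithmetic: at roughly 4 seconds less sleep per extra centimetre of waist (hip circumference held fixed), even a large difference in waist size translates to only about a minute of sleep. A minimal sketch:

```python
# Back-of-envelope scale of the reported waist/sleep association:
# roughly 4 seconds less sleep per 1 cm of extra waist circumference,
# with hip circumference held constant (per the study's description).

SECONDS_LOST_PER_CM = 4  # approximate effect size quoted in the text

def sleep_lost_seconds(extra_waist_cm: float) -> float:
    """Estimated nightly sleep reduction for a given extra waist circumference."""
    return SECONDS_LOST_PER_CM * extra_waist_cm

# Even a 15 cm larger waist corresponds to only about one minute less sleep:
print(sleep_lost_seconds(15))  # 60 seconds
```

This illustrates why the authors describe the effect as "very small" despite it being statistically detectable in a sample of this size.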

The team involved colleagues from the Center for Sleep and Circadian Neurobiology in Pennsylvania, Massachusetts General Hospital as well as the Netherlands, France and Switzerland. They found that collectively, the genetic regions linked to sleep quality are also linked to the production of serotonin - a neurotransmitter associated with feelings of happiness and wellbeing. Serotonin is known to play a key role in sleep cycles and is theorised to help promote deeper and more restful sleep.

Senior author Dr Andrew Wood, of the University of Exeter Medical School, said: "We know that getting enough sleep improves our health and wellbeing, yet we still know relatively little about the mechanisms in our bodies that influence how we sleep. Changes in sleep quality, quantity and timing are strongly associated with several human diseases such as diabetes and obesity, and psychiatric disorders.

Lead author Dr Samuel Jones, of the University of Exeter Medical School, said: "This study identifies genetic variants influencing sleep traits, and will provide new insights into the molecular role of sleep in humans. It is part of an emerging body of work which could one day inform the development of new treatments to improve our sleep and our overall health."

From the genetic variants they found to be associated with the accelerometer-derived sleep measures, the group also found further evidence that Restless Legs Syndrome is linked to poorer sleep.

The full paper is entitled 'Genetic studies of accelerometer-based sleep measures yield new insights into human sleep behaviour'.

Credit: 
University of Exeter

The carbon offset market: Leveraging forest carbon's value in the Brazilian Amazon

As companies seek, and are increasingly required, to reduce their greenhouse gas emissions, the world's carbon markets are expanding. A government-run program in the Amazon rainforest in northwestern Brazil transforms forest carbon value into public wealth by focusing on labor rather than land rights. In the Brazilian state of Acre, some of the revenue from carbon credits is distributed to rural laborers and family farmers without land rights. A newly published Dartmouth study examines Acre's forest carbon program and the benefits and risks associated with this approach, including the potential impact on labor, state power and efforts to prevent deforestation. The study's findings are published in the Journal of Peasant Studies.

"In resource-rich areas in the rainforest, powerful actors have often profited at the expense of indigenous peoples and other local communities, who are often displaced from their land. Efforts to value forest carbon threaten to do the same. Through Acre's carbon value program, in contrast, the state government attempts to create an inclusive, green economy in a way that also expands its social welfare initiatives. There are also risks to this approach," says study author Maron Greenleaf, a lecturer and research associate in the department of anthropology at Dartmouth College.

In many other forest carbon credit programs, land has to be privately owned. In contrast, "through Acre's system, poorer people are not excluded from forest carbon's value because of their lack of formal land rights. Rather, many benefit from it," explained Greenleaf. "This is an example of how we're seeing environmental policy being harnessed to create new kinds of economies."

Acre's 2010 System of Incentives for Environmental Services (SISA) focuses on environmental services, including carbon sequestration, to protect the approximately 86 percent of the state (total of 164,000 sq km) that is forested. SISA's carbon program is an example of Reducing Emissions from Deforestation and Forest Degradation (REDD+), which prioritizes sustainable forestry management and conservation practices to combat climate change.

With Acre's 2017 agreements with Britain and Germany, and others, demand for Acre's forest carbon value appears to be relatively strong. One such agreement is Acre's 2010 Memorandum of Understanding with California. This month, the California Air Resources Board, which regulates the carbon market within the state, will determine whether to adopt the "Tropical Forest Standard." Acre, considered a model of tropical forest protection, is the most likely jurisdiction to be approved under the Standard, which would allow its carbon credits to be purchased by California-based companies. This would be the first time that REDD+ credits are accepted into a cap-and-trade market.

Greenleaf's insights on Acre's carbon offset program are based on her fieldwork conducted in 2012 and 2013-14. As part of her research, she interviewed "rural producers," who practice small-scale farming for subsistence and markets, and engage in local wage labor and/or cattle ranching. These include "posseiros," who have land claims based on informal possession.

In the study, Greenleaf discusses the complexity of the land tenure process in Acre. Land conflicts arise, and land use may be dictated by culturally specific arrangements, including overlapping individual and/or collective claims -- nuances that are not reflected on maps. She explains that if REDD+ were not administered by the Acreano government, many rural laborers and family farmers might not be able to participate in the program: obtaining formal title from the government can be challenging or next to impossible, requiring resources and documentation that many lack.

Shifting allocation of carbon value to "green labor" provides an alternative that bypasses some of the challenges associated with the recognition of land rights, but does not eliminate them, as seeking land title is not only a path for upward mobility for the poor but also integral to identity and status. Even if rural laborers lack sufficient evidence to demonstrate formal land claims, Greenleaf explains how the small economic benefit that they may receive from the state could help make their land claims "less easily dismissed" in the future. At the same time, she points out that this model reinforces Acre's state power and enables the government to make decisions over the land "as forest carbon's presumptive owner." The question looms as to whether Acre's progressive, carbon value program will continue on its path of inclusive forest protection, particularly in the wake of the 2018 Brazilian elections, and whether other entities will look to adopt this model.

Credit: 
Dartmouth College

Seed dispersal by invasive birds in Hawaii fills critical ecosystem gap

On the Hawaiian island of O'ahu, where native birds have nearly been replaced by invasive ones, local plants depend almost entirely on invasive birds to disperse their seeds, new research shows. The findings are an example of how ecological communities dominated by introduced or invasive non-native species can be as dynamic, complex and stable as native communities, with invaders maintaining crucial ecosystem functions.

Previous studies have shown that the network of interactions between plants and animals within an ecological community is particularly sensitive to invasive species or extinctions. Many of these networks, however, have only been investigated in native-dominated communities where invasive species are not present and where complex, coevolved interactions between native plants and animals have formed over long time spans. While human-caused species invasions and extinctions are becoming common across the globe, the novel ecological communities that result from these activities - and the ways in which they compare to native communities - remain virtually unknown.

To address this, Jefferson Vizentin-Bugoni and colleagues investigated bird-based seed dispersal networks on O'ahu, Hawaii, where native birds have been nearly replaced by invasive bird species. Vizentin-Bugoni et al. identified over 100,000 seeds collected from bird feces across O'ahu and found that native plants are almost entirely dependent on invasive birds for seed dispersal. What's more, the authors find that the novel networks on O'ahu bear a striking resemblance to the structure and stability of native-dominated networks worldwide, suggesting that novel ecological communities can arise over a short period of time and do so independently of the species involved. The results demonstrate that invasive species can quickly become well-integrated into native communities and help to fill critical gaps in ecosystem networks in the absence of native species.
However, this role is not fulfilled completely, suggest the authors. While the birds on O'ahu are the only dispersers of native plants, they spread the seeds of non-native plants in much higher proportions.

Credit: 
American Association for the Advancement of Science (AAAS)

The Lancet: Globally, 1 in 5 deaths are associated with poor diet

Globally, one in five deaths (11 million deaths) in 2017 were associated with poor diet, with cardiovascular disease being the biggest contributor, followed by cancers and type 2 diabetes.

The largest shortfalls in global consumption were seen for foods such as nuts and seeds, milk, and whole grains, while sugary drinks, processed meat and sodium were consumed in excess.

The largest number of diet-related deaths were associated with eating too much sodium, not enough whole grains and not enough fruits. Across all 15 dietary factors, more deaths were associated with not eating enough healthy foods compared with eating too many unhealthy foods.

Out of all 195 countries, the proportion of diet-related deaths was highest in Uzbekistan, and lowest in Israel. The UK ranked 23rd, the United States 43rd, China 140th, and India 118th.

People in almost every region of the world could benefit from rebalancing their diets to eat optimal amounts of various foods and nutrients, according to the Global Burden of Disease study tracking trends in consumption of 15 dietary factors from 1990 to 2017 in 195 countries, published in The Lancet.

The study estimates that one in five deaths globally - equivalent to 11 million deaths - are associated with poor diet, and diet contributes to a range of chronic diseases in people around the world. In 2017, more deaths were caused by diets with too low amounts of foods such as whole grains, fruit, nuts and seeds than by diets with high levels of foods like trans fats, sugary drinks, and high levels of red and processed meats (see figure 3 [1] and supplemental table 9).

The authors say that their findings highlight the urgent need for coordinated global efforts to improve diet, through collaboration with various sections of the food system and policies that drive balanced diets.

"This study affirms what many have thought for several years - that poor diet is responsible for more deaths than any other risk factor in the world," says study author Dr Christopher Murray, Director of the Institute for Health Metrics and Evaluation, University of Washington, USA. "While sodium, sugar, and fat have been the focus of policy debates over the past two decades, our assessment suggests the leading dietary risk factors are high intake of sodium, or low intake of healthy foods, such as whole grains, fruit, nuts and seeds, and vegetables. The paper also highlights the need for comprehensive interventions to promote the production, distribution, and consumption of healthy foods across all nations." [2]

Impact of diet on non-communicable diseases and mortality

The study evaluated the consumption of major foods and nutrients across 195 countries and quantified the impact of poor diets on death and disease from non-communicable diseases (specifically cancers, cardiovascular diseases, and diabetes). It tracked trends between 1990 and 2017.

Previously, population level assessment of the health effects of suboptimal diet has not been possible because of the complexities of characterising dietary consumption across different nations. The new study combines and analyses data from epidemiological studies - in the absence of long-term randomised trials which are not always feasible in nutrition - to identify associations between dietary factors and non-communicable diseases.

The study looked at 15 dietary elements - diets low in fruits, vegetables, legumes, whole grains, nuts and seeds, milk, fibre, calcium, seafood omega-3 fatty acids, polyunsaturated fats, and diets high in red meat, processed meat, sugar-sweetened beverages, trans fatty acids, and sodium [3]. The authors note that there were varying levels of data available for each dietary factor, which increases the statistical uncertainty of these estimates - for example, while intake data for most dietary factors were available for almost all countries (95%), data underpinning the sodium estimates were available for only around one in four countries [4].

Overall in 2017, an estimated 11 million deaths were attributable to poor diet. Diets high in sodium, low in whole grains, and low in fruit together accounted for more than half of all diet-related deaths globally in 2017 [1].

The causes of these deaths included 10 million deaths from cardiovascular disease, 913,000 cancer deaths, and almost 339,000 deaths from type 2 diabetes. Deaths related to diet have increased from 8 million in 1990, largely due to increases in the population and population ageing.

Global trends in consumption

The authors found that intakes of all 15 dietary elements were suboptimal for almost every region of the world - no region ate the optimal amount of all 15 dietary factors, and not one dietary factor was eaten in the right amounts by all 21 regions of the world.

Some regions did manage to eat some dietary elements in the right amounts. For example, intake of vegetables was optimal in central Asia, as was seafood omega-3 fatty acids intake in high-income Asia Pacific, and legume intake in the Caribbean, tropical Latin America, south Asia, western sub-Saharan Africa, and eastern sub-Saharan Africa.

The largest shortfalls in optimal intake were seen for nuts and seeds, milk, and whole grains, and the largest excesses were seen for sugar-sweetened beverages, processed meat and sodium. On average, the world ate only 12% of the recommended amount of nuts and seeds (around 3g average intake per day, compared with 21g recommended per day), and drank more than ten times the recommended amount of sugar-sweetened beverages (49g average intake, compared with 3g recommended).

In addition, the global diet included 16% of the recommended amount of milk (71g average intake per day, compared with 435g recommended per day), about a quarter (23%) of the recommended amount of whole grains (29g average intake per day, compared with 125g recommended per day), almost double (90% more) the recommended amount of processed meat (around 4g average intake per day, compared with 2g recommended per day), and 86% more sodium than recommended (around 6g average intake per day, measured as 24-hour urinary sodium, compared with 3g recommended per day).
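The percentage-of-recommendation figures above follow directly from dividing average intake by the recommended amount; the sketch below recomputes two of them from the reported grams. Because the published gram values are rounded, recomputed percentages land close to, not exactly on, the quoted figures.

```python
# The "percentage of recommended intake" figures are simply
# average intake / recommended intake, using the grams quoted in the text.
# Reported grams are rounded, so some recomputed percentages differ
# slightly from the quoted ones.

def pct_of_recommended(intake_g: float, recommended_g: float) -> float:
    """Average intake as a percentage of the recommended daily amount."""
    return 100 * intake_g / recommended_g

milk = pct_of_recommended(71, 435)          # text quotes ~16%
whole_grains = pct_of_recommended(29, 125)  # text quotes ~23%
print(round(milk), round(whole_grains))     # 16 23
```

The same calculation underlies the excess figures: intakes above 100% of the recommendation (processed meat, sodium) are reported as "X% more" than recommended.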

Regional variations

Regionally, high sodium intake (above 3g per day) was the leading dietary risk for death and disease in China, Japan, and Thailand. Low intake of whole grains (below 125g per day) was the leading dietary risk factor for death and disease in the USA, India, Brazil, Pakistan, Nigeria, Russia, Egypt, Germany, Iran, and Turkey. In Bangladesh, low intake of fruits (below 250g per day) was the leading dietary risk, and, in Mexico, low intake of nuts and seeds (below 21g per day) ranked first. High consumption of red meat (above 23g per day), processed meat (above 2g per day), trans fat (above 0.5% total daily energy), and sugar-sweetened beverages (above 3g per day) were towards the bottom in ranking of dietary risks for death and disease for highly populated countries [3].

In 2017, there was a ten-fold difference between the country with the highest rate of diet-related deaths (Uzbekistan) and the country with the lowest (Israel). The countries with the lowest rates of diet-related deaths were Israel (89 deaths per 100,000 people), France, Spain, Japan, and Andorra. The UK ranked 23rd (127 deaths per 100,000) above Ireland (24th) and Sweden (25th), and the United States ranked 43rd (171 deaths per 100,000) after Rwanda and Nigeria (41st and 42nd), China ranked 140th (350 deaths per 100,000 people), and India 118th (310 deaths per 100,000 people). The countries with the highest rates of diet-related deaths were Uzbekistan (892 deaths per 100,000 people), Afghanistan, Marshall Islands, Papua New Guinea, and Vanuatu.

The magnitude of diet-related disease highlights that many existing campaigns have not been effective and the authors call for new food system interventions to rebalance diets around the world. Importantly, they note that changes must be sensitive to the environmental effects of the global food system to avoid adverse effects on climate change, biodiversity loss, land degradation, depleting freshwater, and soil degradation.

In January 2019, The Lancet published the EAT-Lancet Commission [5], which provides the first scientific targets for a healthy diet from a sustainable food production system that operates within planetary boundaries for food. This report used 2016 data from the Global Burden of Disease study to estimate how far the world is from the healthy diet proposed.

The authors note some limitations of the current study, including that while it uses the best available data, there are gaps in nationally representative individual-level data for intake of key foods and nutrients around the world. Therefore, generalising the results may not be appropriate as most of the studies of diet and disease outcomes are largely based on populations of European descent, and additional research in other populations is desirable. The strength of epidemiological evidence linking dietary factors and death and disease is mostly from observational studies and is not as strong as the evidence linking other major risk factors (such as tobacco and high blood pressure) to ill health. However, most of the diet and health associations are supported by short term randomized studies with risk factors for disease as the outcomes.

For sodium, estimates were based on 24-hour urinary sodium measurements rather than spot urine samples; these measurements were only available for around a quarter of the countries in the study [3]. Accurate estimation of some nutrients (such as fibre, calcium, and polyunsaturated fatty acids) is complex. As a result, the authors call for increased national surveillance and monitoring systems for key dietary risk factors, and for collaborative efforts to collect and harmonise dietary data from cohort studies.

In addition, the authors only looked at food and nutrient intake and did not evaluate whether people were over- or underweight. Lastly, some deaths could have been attributed to more than one dietary factor, which may have resulted in an overestimation of the burden of diseases attributable to diet.

Writing in a linked Comment, Professor Nita G Forouhi, Medical Research Council Epidemiology Unit, University of Cambridge School of Clinical Medicine, UK, says: "Limitations notwithstanding, the current GBD findings provide evidence to shift the focus, as the authors argue, from an emphasis on dietary restriction to promoting healthy food components in a global context. This evidence largely endorses a case for moving from nutrient-based to food-based guidelines... There are of course considerable challenges in shifting populations' diets in this direction, illustrated by the disproportionately prohibitive cost of fruits and vegetables: two servings of fruits and three servings of vegetables per day per individual accounted for 52% of household income in low-income countries, 18% in low to middle-income countries, 16% in middle to upper-income countries, and 2% in high-income countries. A menu of integrated policy interventions across whole food systems, internationally and within countries, is essential to support the radical shift in diets needed to optimise human, and protect planetary, health."

Credit: 
The Lancet

Transparent wood can store and release heat

image: A new transparent wood becomes cloudier (right) upon the release of stored heat.

Image: 
American Chemical Society

ORLANDO, Fla., April 3, 2019 -- Wood may seem more at home in log cabins than modern architecture, but a specially treated type of timber could be tomorrow's trendy building material. Today, scientists report a new kind of transparent wood that not only transmits light, but also absorbs and releases heat, potentially saving on energy costs. The material can bear heavy loads and is biodegradable, opening the door for its eventual use in eco-friendly homes and other buildings.

The researchers will present their results today at the American Chemical Society (ACS) Spring 2019 National Meeting & Exposition. ACS, the world's largest scientific society, is holding the meeting here through Thursday. It features nearly 13,000 presentations on a wide range of science topics.

A brand-new video on the research is available at http://bit.ly/HLS_Transparent_Wood.

"Back in 2016, we showed that transparent wood has excellent thermal-insulating properties compared with glass, combined with high optical transmittance," says Céline Montanari, a Ph.D. student who is presenting the research at the meeting. "In this work, we tried to reduce the building energy consumption even more by incorporating a material that can absorb, store and release heat."

As economic development progresses worldwide, energy consumption has soared. Much of this energy is used to light, heat and cool homes, offices and other buildings. Glass windows can transmit light, helping to brighten and heat homes, but they don't store energy for use when the sun goes down.

Three years ago, lead investigator Lars Berglund, Ph.D., and colleagues at KTH Royal Institute of Technology in Stockholm, Sweden, reported an optically transparent wood in the ACS journal Biomacromolecules. The researchers made the material by removing a light-absorbing component called lignin from the cell walls of balsa wood. To reduce light scattering, they incorporated acrylic into the porous wood scaffold. The team could see through the material, yet it was hazy enough to provide privacy if used as a major building material. The transparent wood also had favorable mechanical properties, enabling it to bear heavy loads.

Building on this work, Montanari and Berglund added a polymer called polyethylene glycol (PEG) to the de-lignified wood. "We chose PEG because of its ability to store heat, but also because of its high affinity for wood," Montanari says. "In Stockholm, there's a really old ship called Vasa, and the scientists used PEG to stabilize the wood. So we knew that PEG can go really deep into the wood cells."

Known as a "phase-change material," PEG is a solid that melts at a temperature of 80 F, storing energy in the process. The melting temperature can be adjusted by using different types of PEGs. "During a sunny day, the material will absorb heat before it reaches the indoor space, and the indoors will be cooler than outside," Montanari explains. "And at night, the reverse occurs -- the PEG becomes solid and releases heat indoors so that you can maintain a constant temperature in the house."
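The energy storage described above is latent heat of fusion: the PEG absorbs heat Q = m x L when it melts and releases the same amount when it solidifies. The sketch below uses an assumed ballpark latent heat for PEG; the press release does not report the actual value, which depends on the PEG molecular weight chosen.

```python
# A phase-change material stores energy as latent heat of fusion: Q = m * L.
# L = 150 J/g is an assumed ballpark value for PEG, used here for
# illustration only; the real value depends on the PEG grade.

LATENT_HEAT_PEG_J_PER_G = 150.0  # assumed, not from the press release

def stored_heat_joules(peg_mass_g: float,
                       latent_heat_j_per_g: float = LATENT_HEAT_PEG_J_PER_G) -> float:
    """Heat absorbed when the PEG in a panel melts (and released when it solidifies)."""
    return peg_mass_g * latent_heat_j_per_g

# Under this assumption, a panel holding 1 kg of PEG would cycle about 150 kJ:
print(stored_heat_joules(1000) / 1000)  # 150.0 kJ per melt/freeze cycle
```

Because the melting point can be tuned by choosing different PEGs, the same calculation applies at whatever transition temperature suits the climate.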

The team encapsulated PEG within the de-lignified wood scaffold, which prevented leakage of the polymer during phase transitions. They also incorporated acrylic into the material to protect it from humidity. Like their earlier version, the modified wood was transparent, though slightly hazy, and strong, but had the added bonus of storing heat.

The researchers point out that the transparent wood has the potential to be more environmentally friendly than other building materials such as plastic, concrete and glass. In addition to its thermal-storage capabilities, the transparent wood could be easier to dispose of after it has served its purpose. "The PEG and wood are both bio-based and biodegradable," Berglund notes. "The only part that is not biodegradable is the acrylic, but this could be replaced by another bio-based polymer."

Now, the focus turns to scaling up the production process to be industrially feasible. The researchers estimate that transparent wood could be available for niche applications in interior design in as little as five years. They are also trying to increase the storage capacity of the material to make it even more energy-efficient.

A press conference on this topic will be held Wednesday, April 3, at 9 a.m. Eastern time in the Orange County Convention Center. Reporters may check in at the press center, Room W231B, or watch live on YouTube http://bit.ly/ACSLive_Orlando2019 ("ACSLive_Orlando2019" is case sensitive). To ask questions online, sign in with a Google account.

Credit: 
American Chemical Society

Stress in childhood and adulthood have combined impact on hormones and health

Adults who report high levels of stress and who also had stressful childhoods are most likely to show hormone patterns associated with negative health outcomes, according to findings published in Psychological Science, a journal of the Association for Psychological Science.

One of the ways that our brain responds to daily stressors is by releasing a hormone called cortisol -- typically, our cortisol levels peak in the morning and gradually decline throughout the day. But sometimes this system can become dysregulated, resulting in a flatter cortisol pattern that is associated with negative health outcomes.

"What we find is that the amount of a person's exposure to early life stress plays an important role in the development of unhealthy patterns of cortisol release. However, this is only true if individuals also are experiencing higher levels of current stress, indicating that the combination of higher early life stress and higher current life stress leads to the most unhealthy cortisol profiles," says psychological scientist Ethan Young, a researcher at the University of Minnesota.

For the study, Young and colleagues examined data from 90 individuals who were part of a high-risk birth cohort participating in the Minnesota Longitudinal Study of Risk and Adaptation.

The researchers specifically wanted to understand how stressful events affect the brain's stress-response system later in life. Is it the total amount of stress experienced across the lifespan that matters? Or does exposure to stress during sensitive periods of development, specifically in early childhood, have the biggest impact?

Young and colleagues wanted to investigate a third possibility: Early childhood stress makes our stress-response system more sensitive to stressors that emerge later in life.

The researchers assessed data from the Life Events Schedule (LES), which surveys individuals' stressful life events, including financial trouble, relationship problems, and physical danger and mortality. Trained coders rate the level of disruption of each event on a scale from 0 to 3 to create an overall score for that measurement period. The participants' mothers completed the interview when the participants were 12, 18, 30, 42, 48, 54, and 64 months old; when they were in Grades 1, 2, 3, and 6; and when they were 16 and 17 years old. The participants completed the LES themselves when they were 23, 26, 28, 32, 34, and 37 years old.

The researchers grouped participants' LES scores into specific periods: early childhood (1-5 years), middle childhood (Grades 1-6), adolescence (16 and 17 years), early adulthood (23-34 years), and current (37 years).
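The grouping of many measurement waves into a handful of developmental periods is easy to make concrete. The sketch below is illustrative only: the grade-school ages are approximations, and the use of a simple mean to combine waves within a period is an assumption, not the paper's stated method.

```python
import statistics

# Measurement waves from the article, grouped into the study's developmental
# periods. Grade-school waves are approximated by typical ages; that mapping
# and the use of a simple mean are illustrative assumptions.
PERIOD_WAVES = {
    "early_childhood": [1.0, 1.5, 2.5, 3.5, 4.0, 4.5, 5.3],  # 12-64 months
    "middle_childhood": [6, 7, 8, 11],                        # Grades 1, 2, 3, 6
    "adolescence": [16, 17],
    "early_adulthood": [23, 26, 28, 32, 34],
    "current": [37],
}

def period_scores(les_by_age):
    """Collapse per-wave LES scores (age -> overall disruption score for that
    wave) into one mean score per developmental period."""
    return {
        period: statistics.mean(les_by_age[age] for age in ages if age in les_by_age)
        for period, ages in PERIOD_WAVES.items()
        if any(age in les_by_age for age in ages)
    }
```

For example, a participant scored only at ages 16, 17, and 37 would receive an adolescence score averaging the two teenage waves and a current score from the single age-37 wave.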

At age 37, the participants also provided daily cortisol data over a 2-day period. They collected a saliva sample immediately when they woke up and again 30 minutes and 1 hour later; they also took samples in the afternoon and before going to bed. They sent the saliva samples to a lab for cortisol-level testing.

The researchers found that neither total life stress nor early childhood stress predicted cortisol level patterns at age 37. Rather, cortisol patterns depended on both early childhood stress and stress at age 37. Participants who experienced relatively low levels of stress in early childhood showed relatively similar cortisol patterns regardless of their stress level in adulthood. On the other hand, participants who had been exposed to relatively high levels of early childhood stress showed flatter daily cortisol patterns, but only if they also reported high levels of stress as adults.
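The headline result is a statistical interaction: early life stress predicts flatter adult cortisol slopes only when current stress is also high. As a minimal illustration of such an analysis (not the authors' actual model), the sketch below computes a diurnal slope from timed samples and fits an interaction term by ordinary least squares; all function names and numbers are hypothetical.

```python
import numpy as np

def diurnal_slope(times_h, cortisol):
    """Least-squares slope of cortisol over the day; a flatter (less
    negative) slope is the pattern associated with poorer health outcomes."""
    slope, _intercept = np.polyfit(np.asarray(times_h, float),
                                   np.asarray(cortisol, float), 1)
    return slope

def stress_interaction(early, current, slope):
    """OLS fit of slope ~ b0 + b1*early + b2*current + b3*(early*current).
    The coefficient b3 captures the early-by-current interaction the study
    reports: early stress matters only at high current stress."""
    early, current, slope = map(np.asarray, (early, current, slope))
    X = np.column_stack([np.ones(len(early)), early, current, early * current])
    beta, *_ = np.linalg.lstsq(X, slope.astype(float), rcond=None)
    return beta
```

With synthetic data in which slopes flatten only for the high-early, high-current group, the fitted b3 is positive while the main effects are near zero, mirroring the pattern described above.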

The researchers also investigated whether life stress in middle childhood, adolescence, and early adulthood were associated with adult cortisol patterns, and found no meaningful relationships.

These findings suggest that early childhood may be a particularly sensitive time in which stressful life events -- such as those related to trauma or poverty -- can calibrate the brain's stress-response system, with health consequences that last into adulthood.

Young and colleagues note that cortisol is one part of the human stress-response system, and they hope to investigate how other components, such as the microbiome in our gut, also play a role in long-term health outcomes.

Credit: 
Association for Psychological Science

Programmable 'Legos' of DNA and protein building blocks create novel 3D cages

image: This protein-DNA 'Lego' was co-assembled with a triangular DNA structure bearing three complementary arms to the handles, resulting in tetrahedral cages comprised of six DNA sides capped by the protein trimer.

Image: 
Nicholas Stephanopoulos

The central goal of nanotechnology is the manipulation of materials on an atomic or molecular scale, especially to build microscopic devices or structures. Three-dimensional cages are one of the most important targets, both for their simplicity and their application as drug carriers for medicine. DNA nanotechnology uses DNA molecules as programmable "Legos" to assemble structures with a control not possible with other molecules.

However, the structure of DNA is very simple and lacks the diversity of proteins that make up most natural cages, like viruses. Unfortunately, it is very difficult to control the assembly of proteins with the precision of DNA. That is, until recently. Nicholas Stephanopoulos -- an assistant professor in Arizona State University's Biodesign Center for Molecular Design and Biomimetics, and the School of Molecular Sciences -- and his team built a cage constructed from both protein and DNA building blocks through the use of covalent protein-DNA conjugates.

In a paper published in ACS Nano, Stephanopoulos modified a homotrimeric protein (a natural enzyme called KDPG aldolase) with three identical single-stranded DNA handles by functionalizing a reactive cysteine residue they introduced onto the protein surface. This protein-DNA "Lego" was co-assembled with a triangular DNA structure bearing three complementary arms to the handles, resulting in tetrahedral cages comprised of six DNA sides capped by the protein trimer. The dimensions of the cage could be tuned through the number of turns per DNA arm, and the hybrid structures were purified and characterized to confirm the three-dimensional structure.


The cages were also modified with DNA using click chemistry, a class of rapid, highly reliable reactions for joining molecular building blocks, demonstrating the generality of the method.

"My lab's approach will allow for the construction of nanomaterials that possess the advantages of both protein and DNA nanotechnology, and find applications in fields such as targeted delivery, structural biology, biomedicine, and catalytic materials," Stephanopoulos said.

Stephanopoulos and his team see an opportunity with hybrid cages -- merging self-assembling protein building blocks with a synthetic DNA scaffold -- that could combine the bioactivity and chemical diversity of the former with the programmability of the latter. And that is what they set out to create -- a hybrid structure constructed through chemical conjugation of oligonucleotide (a synthetic DNA strand) handles on a protein building block. The triangular base bearing three complementary single-stranded DNA handles was self-assembled by thermal annealing and purified separately.

"We reasoned that by designing these two purified building blocks, they would spontaneously snap together in a programmable way, using the recognition properties of the DNA handles," Stephanopoulos said. "It was especially critical to use a highly thermally stable protein like this aldolase, because this self-assembly only works at 55 degrees Celsius, and many proteins fall apart at those temperatures."

Another advantage of DNA, which is not possible with proteins, is tuning the cage size without having to redesign all the components. Stephanopoulos continued, "The size of this assembly could then be rationally tuned by changing the length of each DNA edge, whereas the protein would provide a scaffold for the attachment of small molecules, targeting peptides or even fusion proteins."

While other examples of hybrid structures exist, this particular cage is the first one constructed through chemical conjugation of oligonucleotide handles on a protein building block. This strategy can in principle be expanded to a wide range of proteins (some with cancer targeting abilities, for example). Thus, Stephanopoulos's work has the potential to enable a whole new hybrid field of protein-DNA nanotechnology with applications not possible with either proteins or DNA alone.

Credit: 
Arizona State University

Brightly colored fairy wrens not attacked by predators more often than their dull counterparts

In "Conspicuous Plumage Does Not Increase Predation Risk: A Continent-Wide Test Using Model Songbirds," published in the American Naturalist, Kristal E. Cain examines the factors that drive the predation levels of Australia's fairy wrens. After measuring attack rates on both conspicuously colored and dull-colored 3D fairy wren models in various habitats, Cain found that bright or "conspicuous" plumage is not associated with an increase in predation.

"These findings do not support the long-standing hypothesis that conspicuous plumage, in isolation, is costly due to increased attraction from predators," Cain writes. "Our results indicate that conspicuousness interacts with other factors in driving the evolution of plumage coloration."

The forces shaping the plumage color of female birds -- who are sometimes brightly colored like their male counterparts and other times much duller -- are a long-debated topic of evolutionary biology that remains unresolved. Fairy wrens, who vary greatly in both female coloration and the habitats in which they live, are an excellent group for investigating the evolutionary forces shaping female plumage, Cain writes.

Cain and her co-authors produced 60 3D fairy wren models at the University of Melbourne School of Engineering. The models had three types: conspicuously-colored male purple-backed fairy wren, dull-colored female purple-backed fairy wren, and conspicuously-colored female lovely fairy wren. Field experiments took place at eight different fairy wren habitats across Australia that ranged from open savannah to dense forest. Mimicking actual fairy wrens' foraging habits, models were placed approximately 5 meters apart on bare ground or in short vegetation, attached to metal stakes with magnets. Cameras were used at some locations to determine whether the models were knocked off their perches by attack or by accident and, in the case of attacks, to identify the predator.

Cain found that, contrary to their predictions, there was not a significant difference between attack rates on the conspicuous models and the dull models. Attack rates did vary, however, depending on habitat and latitude: predator pressure was stronger at sites that were open savannahs, as well as at sites that were further from the equator. These increases in predation in open habitats occurred more dramatically for female models, both dull and conspicuous, than for the conspicuous male models. Females also saw a greater decrease in predation pressure in habitats that fell on the opposite end of the spectrum: dense forest habitats or habitats closer to the equator.

"Our data suggest that adult birds living in open Australian habitats experience higher predation pressure than those in closed habitats, though it is unclear whether this pattern is due to differences in detectability, predator density, or both," Cain writes.

The lack of difference in attack rates between the conspicuous and dull models implies that coloration alone doesn't predict how often a bird will be attacked. This doesn't mean, however, that conspicuousness is not an important factor, Cain writes. For instance, it is possible that a combination of predator behavior, sex differences in behavior, and conspicuousness against particular backgrounds may all play a role in the evolution of plumage colors.

As for the female models experiencing a wider range of predation across sites, Cain writes that this finding joins a substantial body of empirical evidence suggesting that predators may avoid male birds, or that they preferentially attack females or cryptic bird species. There are many theories as to why this may be the case; some studies, for example, have found that birds that are conspicuously colored, male, or both are more vigilant against predators.

Further research is needed to determine the mechanisms that are at play with fairy wrens, Cain writes. "These conflicting patterns suggest that this relationship may be less straightforward than is often assumed and that explicit tests of the relationship between color and predation risk are required."

Credit: 
University of Chicago Press Journals

Breast cancer study by UCR medical student could help patients live longer

image: Ross Mudgway is a third-year student in the UCR School of Medicine.

Image: 
AACR

A student at the University of California, Riverside, presented research results at the annual meeting of the American Association for Cancer Research, or AACR, in Atlanta showing that surgery is associated with higher survival rates for patients with HER2-positive stage 4 breast cancer compared with those who did not undergo surgery.

The protein HER2, or human epidermal growth factor receptor 2, can play a role in the development of breast cancer.

"Between 20% and 30% of all newly diagnosed stage 4 breast cancer cases are HER2-positive," said Ross Mudgway, the study's lead author and a third-year student in the UCR School of Medicine. "This form of breast cancer once had poor outcomes, but in recent years, advances in targeted therapy have led to improved survival."

Mudgway explained at the meeting that in recent years, most patients with HER2-positive breast cancer have been treated with systemic therapy, which could include chemotherapy, targeted therapy, or endocrine therapy.

"Surgery is sometimes offered to these patients, but previous research on whether surgery improves survival has yielded mixed results," he said.

According to Mudgway and senior author Dr. Sharon Lum, a professor in the Department of Surgery-Division of Surgical Oncology and medical director of the Breast Health Center at Loma Linda University Health, HER2 status has been reported in large registry data sets since 2010, but the impact of surgery on this type of breast cancer has not been well documented across hospital systems. The researchers conducted a retrospective cohort study of 3,231 women with HER2-positive stage 4 breast cancer, using records from the National Cancer Database from 2010 to 2012.

They found that 89.4% of the women had received chemotherapy or targeted therapies, 37.7% had received endocrine therapy, and 31.8% had received radiation. Overall, 1,130 women, or 35%, received surgery.

The researchers found that surgery was associated with a 44% increased chance of survival; the majority of these patients had also received systemic treatment.

"This suggests that, in addition to standard HER2-targeted medications and other adjuvant therapy, if a woman has stage 4 HER2-positive breast cancer, surgery to remove the primary breast tumor should be considered," Lum said.

The study also examined factors associated with receipt of surgery and found that women with Medicare or private insurance were more likely to have surgery and less likely to die of their disease than those with Medicaid or no insurance. White women were also more likely than non-Hispanic black women to have surgery and less likely to die of their cancer.

"These results suggest disparities in health care due to race and socioeconomic factors, and these must be addressed," Mudgway said.

According to Mudgway and Lum, numerous factors may contribute to a physician's decision about whether to recommend surgery, including other chronic diseases in the patient, response to other forms of treatment, and overall life expectancy.

"Our findings should be considered in the context of all other factors," Mudgway said. "For patients, the decision to undergo breast surgery, especially a mastectomy, can often be life-changing as it affects both physical and emotional health. The patient's own feelings about whether or not she wishes to have surgery should be considered."

Mudgway was invited to discuss the research project at a press conference held at the AACR meeting. The meeting received more than 5,300 abstract submissions; Mudgway's abstract was one of only 17 selected for the press program.

Credit: 
University of California - Riverside

Laser technology helps researchers scrutinize cancer cells

image: A scanned image of a grid containing one cancer cell and some blood inside each colored box. The color of the boxes indicates the amount of oxygen dissolved in the blood.

Image: 
Caltech

Devising the best treatment for a patient with cancer requires doctors to know something about the traits of the cancer from which the patient is suffering. But one of the greatest difficulties in treating cancer is that cancer cells are not all the same. Even within the same tumor, cancer cells can differ in their genetics, behavior, and susceptibility to chemotherapy drugs.

Cancer cells are generally much more metabolically active than healthy cells, and some insights into a cancer cell's behavior can be gleaned by analyzing its metabolic activity. But getting an accurate assessment of these characteristics has proven difficult for researchers. Several methods, including positron emission tomography (PET) scans, fluorescent dyes, and contrast agents, have been used, but each has drawbacks that limit its usefulness.

Caltech's Lihong Wang believes he can do better through the use of photoacoustic microscopy (PAM), a technique in which laser light induces ultrasonic vibrations in a sample. Those vibrations can be used to image cells, blood vessels, and tissues.

Wang, Bren Professor of Medical Engineering and Electrical Engineering, is using PAM, in collaboration with Professor Jun Zou at Texas A&M University, to improve on an existing technology for measuring the oxygen-consumption rate (OCR). That existing technology takes many cancer cells and places them each into individual "cubbies" filled with blood. Cells with higher metabolisms will use up more oxygen and will lower the blood oxygen level, a process which is monitored by a tiny oxygen sensor placed inside each cubby.

This method, like those previously mentioned, has weaknesses. To get a meaningful sample size of metabolic data for cancer cells would require researchers to embed thousands of sensors into a grid. Additionally, the presence of the sensors within the cubbies can alter the metabolic rates of the cells, causing the collected data to be inaccurate.

Wang's improved version does away with the oxygen sensors and instead uses PAM to measure the oxygen level in each cubby. He does this with laser light that is tuned to a wavelength that the hemoglobin in blood absorbs and converts into vibrational energy--sound. As a hemoglobin molecule becomes oxygenated, its ability to absorb light at that wavelength changes. Thus, Wang is able to determine how oxygenated a sample of blood is by "listening" to the sound it makes when illuminated by the laser. He calls this single-cell metabolic photoacoustic microscopy, or SCM-PAM.
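SCM-PAM infers oxygenation from how strongly hemoglobin absorbs light at an oxygen-sensitive wavelength. As a rough illustration of how absorption maps to oxygenation, the sketch below shows the textbook two-wavelength variant of photoacoustic oximetry (linear unmixing of oxy- and deoxyhemoglobin), not the paper's exact single-wavelength method; the extinction coefficients are illustrative placeholders, not real values.

```python
import numpy as np

# Illustrative (not measured) extinction coefficients at two wavelengths;
# rows = wavelengths, columns = [HbO2, Hb].
EXTINCTION = np.array([[2.0, 10.0],
                       [8.0,  4.0]])

def estimate_so2(pa_amplitudes, extinction=EXTINCTION):
    """Estimate blood oxygen saturation from photoacoustic amplitudes at two
    wavelengths by linear spectral unmixing: the PA signal at each wavelength
    is modeled as a linear mix of HbO2 and Hb absorption, so we solve
    E @ [HbO2, Hb] = PA and report sO2 = HbO2 / (HbO2 + Hb)."""
    hbo2, hb = np.linalg.solve(np.asarray(extinction, float),
                               np.asarray(pa_amplitudes, float))
    return hbo2 / (hbo2 + hb)
```

Tracking how fast this sO2 estimate falls in each sealed cubby then gives the per-cell oxygen-consumption rate.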

In a new paper, Wang and his co-authors show that SCM-PAM represents a huge improvement in the ability to assess the OCR of cancer cells. Using individual oxygen sensors to measure OCR limited researchers to analyzing roughly 30 cancer cells every 15 minutes. Wang's SCM-PAM improves that by two orders of magnitude and allows researchers to analyze around 3,000 cells in about 15 minutes.

"We have techniques to improve the throughput further by orders of magnitude, and we hope this new technology can soon help physicians make informed decisions on cancer prognosis and therapy," says Wang.

Credit: 
California Institute of Technology

Adults with mental health, substance disorders more likely subject to Medicaid work rules

A new research study has found that Medicaid enrollees with behavioral health and other chronic conditions are less likely to be working part or full time than those without these conditions, making it less likely they will meet the work requirements for Medicaid that some states have implemented or proposed.

Several states are pursuing reforms to their Medicaid programs that would require Medicaid enrollees to work a specified number of hours, look for a job, receive job training, and/or participate in community service in order to maintain their Medicaid coverage.

In an article published in the April issue of Health Affairs, researchers from the University of Kentucky, Johns Hopkins University and Emory University used data from the National Survey on Drug Use and Health to examine whether adults with serious mental illness, substance use disorders, and/or other health conditions are more likely to be subject to Medicaid work requirements compared to adults without any identified conditions.

Among Medicaid enrollees age 18 to 64, those with serious mental illness were less than half as likely to have worked part or full time (at least 20 hours) in the past week as those without any health conditions -- and would therefore be unlikely to meet work requirements. Medicaid enrollees with substance use disorders, and enrollees with comorbid serious mental illness and substance use disorders were also less likely to have worked part time in the past week than those without any identified health conditions. The authors excluded from the sample those who are typically not subject to work requirements including pregnant women, full-time students, and those receiving Supplemental Security Income.

Janet Cummings, PhD, associate professor of health policy and management at Emory University Rollins School of Public Health, is senior author of the study; Hefei Wen, PhD, assistant professor of health management and policy at the University of Kentucky is first author; and Brendan Saloner of Johns Hopkins Bloomberg School of Public Health is a co-author.

Cummings explains: "These data tell us that Medicaid enrollees with mental health disorders or substance use disorders are more likely to be affected by new Medicaid work requirements. If policymakers consider implementing Medicaid work requirements, it is crucial that they also take a hard look at how accessible mental health and substance use treatment are for Medicaid enrollees in their state. Many behavioral health providers do not accept Medicaid, and many enrollees face geographic barriers when trying to access services."

The authors describe other important considerations for Medicaid enrollees with mental health or substance use disorders who may be subject to work requirements. For example, policymakers may consider including language about exemptions from work requirements for some of those with behavioral health disorders. In addition, lawmakers may assess whether the Medicaid program in their state covers evidence-based treatments for behavioral health conditions. For example, a number of states do not cover some of the services recommended in the American Society of Addiction Medicine's continuum of care.

Cummings states, "Policymakers need to consider how these policies may affect Medicaid enrollees with mental health and substance use disorders. Do these individuals have access to the treatment they need? Can they get an exemption if they are unable to work? We need thoughtful consideration of these issues as proposals move forward in the legislative process."

Credit: 
Emory Health Sciences

Liquid crystals could help deflect laser pointer attacks on aircraft

image: Liquid crystals sandwiched between two 1-inch squares of glass scatter green and blue light on a wall when the cells are triggered by laser illumination (right panels).

Image: 
Daniel Maurer

ORLANDO, Fla., March 31, 2019 -- Aiming a laser beam at an aircraft isn't a harmless prank: The sudden flash of bright light can incapacitate the pilot, risking the lives of passengers and crew. But because attacks can happen with different colored lasers, such as red, green or even blue, scientists have had a difficult time developing a single method to impede all wavelengths of laser light. Today, researchers report liquid crystals that could someday be incorporated into aircraft windshields to block any color of bright, focused light.

The researchers will present their results today at the American Chemical Society (ACS) Spring 2019 National Meeting & Exposition. ACS, the world's largest scientific society, is holding the meeting here through Thursday. It features nearly 13,000 presentations on a wide range of science topics.

According to the Federal Aviation Administration, 6,754 laser strikes on aircraft were reported in 2017. "We were approached by collaborators in the aviation department at our university about the growing problem happening at airports across the world, where people were shooting lasers at planes during takeoff and landing, the critical phases of flight," says Jason Keleher, Ph.D., the project's principal investigator. Such attacks, which cause bright flashes of light in the cockpit, can distract pilots or inflict temporary or permanent visual damage, depending on the wavelength and intensity of the laser.

"We wanted to come up with a solution that didn't require us to completely re-engineer an aircraft's windshield, but instead adds a layer to the glass that harnesses the existing power system for windshield defrosting," says Daniel Maurer, an undergraduate student. Keleher and Maurer are at Lewis University.

Previous approaches, rather than being integrated into the windshield, have included pull-down windscreens or goggles that pilots don during takeoff and landing. However, these can be inconvenient because they require the flight crew to take these precautions whether or not they are actually being targeted. An even bigger problem is that these strategies work only for specific wavelengths of laser light. "They don't block everything," Maurer says. "They're usually targeted toward green lasers because those are used for the majority of the attacks."

To develop their new approach, the researchers took advantage of liquid crystals -- materials with properties between those of liquids and solid crystals that make them useful in electronic displays. The team placed a solution of liquid crystals called N-(4-methoxybenzylidene)-4-butylaniline (MBBA) between two 1-inch-square panes of glass. MBBA has a transparent liquid phase and an opaque crystalline phase that scatters light. By applying a voltage to the apparatus, the researchers caused the crystals to align with the electrical field and undergo a phase change to the more solid crystalline state.

The aligned crystals blocked up to 95 percent of red, blue and green beams, through a combination of light scattering, absorption of the laser's energy and cross-polarization. The liquid crystals could block lasers of different powers, simulating various distances of illumination, as well as light shone at different angles onto the glass.

In addition, the system was fully automatic: A photoresistor detected laser light and then triggered the power system to apply the voltage. When the beam was removed, the system turned off the power, and the liquid crystals returned to their transparent, liquid state. "We only want to block the spot where the laser is hitting the windshield and then have it quickly go back to normal after the laser is gone," Keleher notes. The rest of the windshield, which was not hit by the laser, would remain transparent at all times.
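The automatic trigger described above amounts to a threshold switch on the photoresistor reading. A minimal simulation of that control logic follows, with hysteresis (separate on and off thresholds) added as a plausible refinement the presentation does not specify; the readings and thresholds are in arbitrary illustrative units.

```python
def windshield_controller(light_levels, on_threshold=0.6, off_threshold=0.4):
    """Simulate the photoresistor-driven trigger: apply voltage (switching the
    liquid-crystal cell to its opaque, scattering state) when the reading
    rises above on_threshold, and release it when the reading falls below
    off_threshold. The two thresholds add hysteresis so readings hovering
    near a single cutoff don't make the cell flicker.
    Returns the voltage state (True = on) after each reading."""
    states, voltage_on = [], False
    for level in light_levels:
        if not voltage_on and level > on_threshold:
            voltage_on = True        # laser detected: switch to scattering state
        elif voltage_on and level < off_threshold:
            voltage_on = False       # laser gone: relax back to transparent
        states.append(voltage_on)
    return states
```

Feeding in a sequence where the reading spikes and then decays shows the cell latching on during the attack and releasing once the beam is removed.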

Now that the researchers have shown that their approach works, they plan to scale it up from 1-inch squares to the size of an entire aircraft windshield. Initial results have shown that a sensor grid pattern on 2-inch squares of glass will respond only to the section of glass that is illuminated. The team is also testing different types of liquid crystals to find even more efficient and versatile ones that return to the transparent state more quickly once the laser is removed.

Credit: 
American Chemical Society

Stunning discovery offers glimpse of minutes following 'dinosaur-killer' Chicxulub impact

LAWRENCE -- A study to be published Monday in the Proceedings of the National Academy of Sciences offers a scientific first: a detailed snapshot of the terrible moments right after the Chicxulub impact -- the most cataclysmic event known to have befallen life on Earth.

At a site called Tanis in North Dakota's Hell Creek Formation, a team of paleontologists based at the University of Kansas unearthed a motherlode of exquisitely preserved animal and fish fossils -- creatures that lived in and around a deeply chiseled river connected to the ancient Western Interior Seaway -- that were killed suddenly in events triggered by the Chicxulub impact.

The fossils were crammed into a "rapidly emplaced high-energy onshore surge deposit" along the KT boundary that contained associated ejecta and iridium impactite associated with the impact about 66 million years ago -- an impact that eradicated about 75 percent of Earth's animal and plant species.

"A tangled mass of freshwater fish, terrestrial vertebrates, trees, branches, logs, marine ammonites and other marine creatures was all packed into this layer by the inland-directed surge," said lead author Robert DePalma, a KU doctoral student in geology who works in the KU Biodiversity Institute and Natural History Museum. "Timing of the incoming ejecta spherules matched the calculated arrival times of seismic waves from the impact, suggesting that the impact could very well have triggered the surge."

DePalma, who discovered the fossil motherlode, said the find outlines how the impact could have devastated areas very far from the crater quite rapidly.

"A tsunami would have taken at least 17 or more hours to reach the site from the crater, but seismic waves -- and a subsequent surge -- would have reached it in tens of minutes," he said.

DePalma and his colleagues describe the rushing wave that shattered the Tanis site as a "seiche."

"As the 2011 Tohoku earthquake in Japan showed us, seismic shaking can cause surges far from the epicenter," he said. "In the Tohoku example, surges were triggered nearly 5,000 miles away in Norway just 30 minutes after impact. So, the KT impact could have caused similar surges in the right-sized bodies of water worldwide, giving the first rapid 'bloody nose' to those areas before any other form of aftermath could have reached them."

According to KU researchers, even before the surge arrived, Acipenseriform fish (sturgeon) found at the site already had inhaled tiny spherules ejected from the Chicxulub impact.

"The fish were buried quickly, but not so quickly they didn't have time to breathe the ejecta that was raining down to the river," said co-author David Burnham, preparator of vertebrate paleontology at the KU Biodiversity Institute. "These fish weren't bottom feeders, they breathed these in while swimming in the water column. We're finding little pieces of ejecta in the gill rakers of these fish, the bony supports for the gills. We don't know if some were killed by breathing this ejecta, too."

The number and quality of preservation of the fossils at Tanis are such that Burnham dubs it the "lagerstätte" of the KT event -- paleontologist-speak for a landmark sedimentary deposit with exceptionally intact specimens. He said this is especially true as the fish are cartilaginous, not bony, and are less prone to fossilization.

"The sedimentation happened so quickly everything is preserved in three dimensions -- they're not crushed," Burnham said. "It's like an avalanche that collapses almost like a liquid, then sets like concrete. They were killed pretty suddenly because of the violence of that water. We have one fish that hit a tree and was broken in half."

Indeed, the Tanis site contains many hundreds of articulated ancient fossil fish killed by the Chicxulub impact's aftereffects and is remarkable for the biodiversity it reveals alone.

"At least several appear to be new species, and the others are the best examples known of their kind," DePalma said. "Before now, fewer than four were known from the Hell Creek, so the site was already magnificently significant. But we quickly recognized that the surrounding sediment was deposited by a sudden, massive rush of water, and that the surge was directed inland, away from an ancient nearby seaway. When we noticed asteroid impact debris within the sediment and a compact layer of KT boundary clay resting on top of it from the long-term fallout, we realized that this unusual site was right at the KT boundary."

According to Burnham, the fossil trove fills a void in scientific knowledge with vivid new detail.

"We've understood that bad things happened right after the impact, but nobody's found this kind of smoking-gun evidence," he said. "People have said, 'We get that this blast killed the dinosaurs, but why don't we have dead bodies everywhere?' Well, now we have bodies. They're not dinosaurs, but I think those will eventually be found, too."

DePalma said his find provides spectacular new detail to what is perhaps the most important event to ever affect life on Earth.

"It's difficult not to get choked up and passionate about this topic," he said. "We look at moment-by-moment records of one of the most notable impact events in Earth's history. No other site has a record quite like that. And this particular event is tied directly to all of us -- to every mammal on Earth, in fact. Because this is essentially where we inherited the planet. Nothing was the same after that impact. It became a planet of mammals rather than a planet of dinosaurs.

"As human beings, we descended from a lineage that literally survived in the ashes of what was once the glorious kingdom of the dinosaurs. And we're the only species on the planet that has ever been capable of learning from such an event to the benefit of ourselves and every other organism in our world."

Credit: 
University of Kansas

Gastrointestinal complaints in children could signal future mental health problems

A Columbia University study has found that adversity early in life is associated with increased gastrointestinal symptoms in children that may have an impact on the brain and behavior as they grow to maturity.

The study was published online March 28 in the journal Development and Psychopathology.

"One common reason children show up at doctors' offices is intestinal complaints," said Nim Tottenham, a professor of psychology at Columbia and senior author on the study. "Our findings indicate that gastrointestinal symptoms in young children could be a red flag to primary care physicians for future emotional health problems."

Scientists have long noted the strong connection between the gut and brain. Previous research has shown that up to half of adults with irritable bowel syndrome (IBS) report a history of trauma or abuse, roughly twice the prevalence among patients without IBS.

"The role of trauma in increasing vulnerability to both gastrointestinal and mental health symptoms is well established in adults but rarely studied in childhood," said study lead author Bridget Callaghan, a post-doctoral research fellow in Columbia's psychology department. In addition, she said, animal studies have demonstrated that adversity-induced changes in the gut microbiome - the community of bacteria in the body that regulates everything from digestion to immune system function - influence neurological development, but no human studies have demonstrated the same.

"Our study is among the first to link disruption of a child's gastrointestinal microbiome triggered by early-life adversity with brain activity in regions associated with emotional health," Callaghan said.

The researchers focused on development in children who experienced extreme psychosocial deprivation due to institutional care before international adoption. Separation of a child from a parent is known to be a powerful predictor of mental health issues in humans. That experience, when modeled in rodents, induces fear and anxiety, hinders neurodevelopment and alters microbial communities across the lifespan.

The researchers drew upon data from 115 children adopted from orphanages or foster care at or before approximately age 2, and from 229 children raised by a biological caregiver. The children with past caregiving disruptions showed higher levels of symptoms that included stomach aches, constipation, vomiting and nausea.

From that sample of adoptees, the researchers then selected eight participants, ages 7 to 13, from the adversity-exposed group and another eight who had been raised by their biological parents. Tottenham and Callaghan collected behavioral information, stool samples and brain images from all the children. They used gene sequencing to identify the microbes present in the stool samples and examined the abundance and diversity of bacteria in each participant's fecal matter.

The children with a history of early caregiving disruptions had distinctly different gut microbiomes from those raised with biological caregivers from birth. Brain scans of all the children also showed that brain activity patterns were correlated with certain bacteria. For example, the children raised by their parents had greater gut microbiome diversity, which was linked to activity in the prefrontal cortex, a region of the brain known to help regulate emotions.

"It is too early to say anything conclusive, but our study indicates that adversity-associated changes in the gut microbiome are related to brain function, including differences in the regions of the brain associated with emotional processing," says Tottenham, an expert in emotional development.

More research is needed, but Tottenham and Callaghan believe their study helps to fill in an important gap in the literature.

"Animal studies tell us that dietary interventions and probiotics can manipulate the gut microbiome and ameliorate the effects of adversity on the central nervous system, especially during the first years of life when the developing brain and microbiome are more plastic," Callaghan says. "It is possible that this type of research will help us to know if and how to best intervene in humans, and when."

Callaghan and Tottenham are currently working on a larger-scale study with 60 children in New York City to see if their findings can be replicated. They expect the results later this year.

Credit: 
Columbia University