Serotonin boosts neuronal powerplants protecting against stress

image: Serotonin acting on neuronal mitochondria.

Image: 
Vidita A. Vaidya and Ullas Kolthur-Seetharam

Mitochondria in neurons are the powerhouses that generate energy to execute cellular functions and regulate neuronal survival under conditions of stress. Collaborative research by the groups of Prof. Vidita Vaidya and Prof. Ullas Kolthur-Seetharam at TIFR, along with Dr. Ashok Vaidya at the Medical Research Centre, Kasturba Health Society, has demonstrated an unusual function for the neurotransmitter serotonin in the generation of new mitochondria--a process called mitochondrial biogenesis--in neurons, accompanied by an increase in cellular respiration and ATP, the energy currency of the cell.

These effects of serotonin involve the serotonin2A receptor and the master regulators of mitochondrial biogenesis, SIRT1 and PGC-1α. Serotonin reduces toxic reactive oxygen species in neurons, boosts antioxidant enzymes and buffers neurons from the damaging effects of cellular stress. This study (Fanibunda et al., 2019), appearing in the international journal PNAS, uncovers an unprecedented role for serotonin in energy production in neurons, directly impacting how neurons handle stress. Mitochondrial function in neurons is vital in determining how neurons cope with stress and the trajectory of aging.

This work provides exciting evidence that the neurotransmitter serotonin can directly influence neuronal powerplants, thus impacting the manner in which neurons grapple with stress. This work identifies novel drug targets for treating mitochondrial dysfunction in neurons, with therapeutic potential for neurodegenerative and psychiatric disorders.

Credit: 
Tata Institute of Fundamental Research

Are otters threatening amphibian populations?

image: Bone fragments, used in the identification of amphibian prey, from the faeces of the Eurasian otter.

Image: 
Dr. Balestrieri

The Eurasian otter typically eats fish, but amphibians, which are in global decline, are also part of its diet, especially when fish are scarce. In a Mammal Review study, researchers identified bones of amphibians in otter faeces from southern Italy to determine which types of amphibians are typically eaten. They also reviewed 64 studies of otter diet.

In the 64 studies, an average of 12 percent of prey items taken by otters were amphibians. Predation of amphibians increased with longitude and was highest in the Alpine biogeographical region in winter and spring. Also, 28 amphibian species (35 percent of European species) were eaten by otters.

In their analyses from southern Italy, the investigators identified 355 individuals belonging to at least seven amphibian taxa. The investigators also concluded that when feeding on frogs and toads, otters are more likely to take the noisy males than the quieter females.

The findings suggest that amphibians are a more significant part of the otter's diet than commonly perceived. While this may constitute a threat to small populations of endemic amphibians, the global decline of amphibians is also likely to have consequences for otter survival wherever fish resources have been depleted by overfishing and pollution.

"We knew that amphibians may represent a major food for otters in the Mediterranean area, but I admit we were amazed and impressed to discover how great the diversity of this resource could be," said corresponding author Dr. Alessandro Balestrieri, of the University of Milan, in Italy.

Credit: 
Wiley

NASA-NOAA satellite catches formation of Tropical Cyclone Lili

image: NASA-NOAA's Suomi NPP satellite passed over the Southern Indian Ocean and captured a visible image of Tropical Cyclone Lili on May 9, as it continued to linger north of Australia's Northern Territory.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS)/NOAA

NASA-NOAA's Suomi NPP satellite passed over the Southern Indian Ocean and captured a visible image of newly formed Tropical Cyclone Lili, located north of the coast of Australia's Northern Territory.

The Australian Bureau of Meteorology (ABM) issued a Strong Wind Warning for the following areas: Beagle Bonaparte Coast, North Tiwi Coast, Arafura Coast and Roper Groote Coast. There is no tropical cyclone warning currently in effect.

NASA-NOAA's Suomi NPP satellite passed over Lili on May 9 and the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument provided a visible image of the storm. The VIIRS image showed strong thunderstorms around the center of circulation and in a large band extending to the east of the storm. The satellite imagery shows a consolidating system in the Timor Sea.

At 11 a.m. EDT (1500 UTC) on May 9, the Joint Typhoon Warning Center (JTWC) noted that Lili had maximum sustained winds near 40 knots (46 mph/74 kph). Lili is centered near 9.1 degrees south latitude and 128.8 degrees east longitude, approximately 236 nautical miles north-northwest of Darwin, Australia, and has tracked south-southwestward.

Lili is expected to strengthen slightly in the next day before weakening as it moves in a westerly direction toward Timor.

Credit: 
NASA/Goddard Space Flight Center

Adverse childhood experiences negatively impact adults with lupus

Adults with lupus who report having had adverse childhood experiences (ACEs), such as abuse, neglect and household challenges, report higher disease activity, depression and poorer overall health compared to those without such experiences, according to a study by researchers at UC San Francisco.

"Our results support the notion that stress in the form of ACEs may be a factor in poor health in systemic lupus, both in disease development and in more severe outcomes," said lead author Kimberly DeQuattro, MD, a clinical fellow in rheumatology at UCSF. "These findings are a call to action to focus efforts on ACE prevention in childhood, as well as clinical and mental health interventions that foster resilience in adulthood."

The findings appear online May 9, 2019, in Arthritis Care & Research.

Systemic lupus erythematosus is an autoimmune disease in which the immune system attacks the body's own tissues, causing widespread inflammation and tissue damage in affected organs, and it can lead to chronic disability. Lupus is influenced by genetics and the environment, with stress acting as a potential trigger of disease onset and flares.

Studies have shown that a large percentage of adults have had adverse childhood experiences. According to the U.S. Centers for Disease Control and Prevention, ACEs have been linked to risky health behaviors, chronic health conditions, low life potential and early death. They also are believed to be a risk factor for autoimmune diseases such as lupus.

In the Arthritis Care & Research study, DeQuattro and her colleagues surveyed 269 lupus patients in the California Lupus Epidemiology Study (CLUES) who completed the ACE questionnaire, a 10-item survey covering abuse, neglect and household challenges. These patients were compared to 6,107 participants from the 2015 California Behavioral Risk Factor Surveillance System (BRFSS), which also includes an ACE questionnaire. The researchers then examined five patient-reported outcomes (lupus flares, damage, depression, physical function and quality of life) alongside three physician-assessed measures (lupus flares, damage and severity indices) made during an in-person study visit with the patients.

Overall, adverse childhood experience levels were similar for lupus patients and Behavioral Risk study respondents. In lupus patients, 63.2 percent self-reported at least one ACE, and 19.3 percent had at least four ACEs. ACEs were more prevalent in those who were older, female, Latino or African American, without college degrees, and with lupus nephritis (kidney inflammation from lupus).

In adjusted models, greater exposure to abuse, neglect and household challenges was associated with worse self-reported lupus activity, depression and health status. For example, those with four or more ACEs reported nearly double the disease activity scores of their non-ACE-exposed peers (13.1 points vs. 7.7). These findings were not significantly associated with physician-assessed lupus activity, damage or severity.

"This work in lupus patients supports more broadly the body of studies on adversity and trauma in childhood that have found a link between ACEs and health," DeQuattro said. "Our next steps are to look at other types of stress and trauma, how the body responds, and how they relate to lupus outcomes."

The researchers emphasize the need for prevention of adverse childhood experiences and promotion of safe, stable, nurturing relationships and environments for children. Clinicians also should regularly screen lupus patients for ACEs, along with depression and overall perceived health status, regardless of disease status, the researchers said. It may be more beneficial to screen near the time of diagnosis to identify those at higher risk for poor outcomes.

Credit: 
University of California - San Francisco

Childhood maltreatment linked to e-cigarette use during young adulthood

Young adults who experienced maltreatment during childhood are more prone to use e-cigarettes, according to a study published in The American Journal on Addictions.

In the study of 208 US individuals aged 18-21 years, childhood maltreatment was also related to negative urgency, or the tendency to act rashly when distressed, which was in turn associated with greater use of e-cigarettes. The study's authors noted that the impulsive nature of negative urgency may link childhood maltreatment to e-cigarette use as children get older.

"Many young adults who have experienced abuse or neglect in their childhood struggle with substance abuse. Our study looked at e-cigarette use specifically and found that an individual's childhood maltreatment experiences might play a role in their use of e-cigarettes during their transition to adulthood," said lead author Dr. Sunny H. Shin, of the Virginia Commonwealth University.

Credit: 
Wiley

'Good enough' parenting is good enough

image: Susan Woodhouse is an associate professor of counseling psychology at Lehigh University.

Image: 
Lehigh University

What really matters in caring for babies may be different than commonly thought, says Lehigh University researcher Susan S. Woodhouse, an expert on infant attachment. In new research, she finds that caregivers need only "get it right" 50 percent of the time when responding to babies' need for attachment to have a positive impact on a baby. Securely attached infants are more likely to have better outcomes in childhood and adulthood, and based on Woodhouse's potentially paradigm-shifting work, there is more than one way to get there, particularly for low socioeconomic-status families.

Woodhouse, an associate professor of counseling psychology, studied 83 low socioeconomic-status mothers and infants at ages 4.5 months, 7 months, 9 months and 12 months to observe and assess attachment. Infants and mothers in the study were racially and ethnically diverse, and infants were selected for high levels of temperamental irritability.

Her findings are detailed in "Secure Base Provision: A New Approach to Examining Links Between Maternal Caregiving and Infant Attachment," which appears in the journal Child Development, co-authored with Julie R. Scott of Pennsylvania State University, Allison D. Hepsworth of the University of Maryland School of Social Work, and Jude Cassidy of the University of Maryland.

The study scored mother-baby pairs based on a mother's responses to the infant while the baby was crying and not crying to assess the qualities of "secure base provision." This framework focuses on aspects of caregiving that tell an infant about the caregiver's availability to serve as a secure base, such as soothing to cessation of crying and providing a present and safe base from which to explore.

Researchers found that this framework significantly predicted infant attachment, and that babies learned their mothers were providing a secure base when mothers responded properly at least 50 percent of the time.

"The findings provide evidence for the validity of a new way of conceptualizing the maternal caregiving quality that actually works for low-income families," Woodhouse said.

What is Infant Attachment and Secure Base Provision?

Infant attachment is the bond infants form with their primary caregiver. A secure attachment allows babies to feel safe, which gives them both comfort in times of distress and the ability to explore, knowing they can return to their secure base when needed. Attachment is an infant's first bond with important caregivers and a critical phase in development, with a major impact on emotional and social development.

Numerous studies have shown the importance of secure infant attachment to developmental outcomes. But, for the past 30 years, the actual building blocks leading to attachment have been unresolved. Caregiver "sensitivity" - the ability to accurately interpret infant needs and to respond promptly and appropriately - was shown to be a key predictor of attachment. But studies showed sensitivity accounts for a surprisingly low percentage of variation in attachment, and has an even lower impact among families with low socioeconomic status.

"That's a real problem, because low-income babies face the most amount of risk, toxic stress and other factors that go along with being low income," Woodhouse said. Data suggest secure attachment may serve a protective function in children's socio-emotional development when in a context of high risk. Secure attachment is associated with better mental health outcomes in both childhood and adulthood - including less incidence of externalizing behaviors such as acting out and internalizing behaviors such as depression and anxiety - as well as greater school readiness.

"If we want to give advice to parents about what they can do to give their baby the best start in life, it would be really good to know what helps a baby to be secure," Woodhouse said.

Woodhouse's study seeks to address this critical gap in understanding what leads to secure attachment, through examining whether a new conceptualization of caregiving behavior, "secure base provision" - the degree to which a caregiver is able to meet an infant's needs on both sides of the attachment-exploration continuum - predicts attachment security in infants. It is the first time this conceptualization has been tested separately from sensitivity and as a predictor of infant attachment. The new way of conceptualizing caregiving focuses on the aspects of caregiving that theoretically should be most important to building infant attachment because of what an infant can learn from them about a caregiver's availability to serve as a secure base for the infant - both when the infant needs comforting and when the infant is focused on exploring.

Differences Between Secure Base Provision and Sensitivity

As frameworks, both sensitivity and secure base provision look at how caregivers perceive, interpret and appropriately respond to infant signals; and, in both, important infant signals occur at each end of the attachment-exploration continuum. But secure base provision looks only at certain key infant signals and more specific caregiver responses. It also focuses much less on prompt response and more on crying resolution (the ratio of infant crying episodes that end in chest-to-chest soothing until the infant is fully calmed, regardless of promptness).

Secure base provision also does not consider attunement to a baby's state and mood in a moment-by-moment manner, as the sensitivity framework does. "Attunement is not key because the focus is on what the infant learns about his or her ability to, in the end, recruit the caregiver when needed - even in the context of a fair degree of insensitive behavior," such as not picking up the baby right away, or saying, "Come on, don't cry," to the baby, the researchers said. "It is this infant learning about the availability of the caregiver to be recruited to provide a secure base (more often than not) that is central to the construct."

Specifically, secure base provision looks at the degree to which a parent, on average, soothes a crying infant to a fully calm and regulated state while in chest-to-chest contact. "It is at the end of each crying episode that the infant learns about whether, on average, the caregiver can be counted on to be available as the infant achieves a calm state or whether the infant typically must stop crying alone," the researchers said.

During infant exploration and other times when the infant is not distressed, the secure base provision approach focuses on whether the caregiver allows exploration to occur without terminating or interrupting it - for example, by making the baby cry through play that is too sudden or rough - and on "calm connectedness," which communicates the mother's ongoing availability if needed for regulation or protection: "I am here if you need me, and you can count on me."

In addition, there are behaviors that caregivers must not do, either when the baby needs comfort or during exploration, in order for secure base provision to occur. Specifically, caregivers must not frighten the baby or fail to protect the baby when real hazards are present, such as another child who is too rough.

Secure Base Provision 8 Times More Effective at Predicting Attachment

The study scored mother-baby pairs based on maternal responses to the infant during episodes of infant crying and maternal responses outside of infant crying episodes. A separate group in another lab also scored for the commonly used sensitivity framework.

The researchers found the new maternal caregiving concept of secure base provision correlated significantly with infant attachment security: mothers who had higher scores on secure base provision were more likely to have securely attached infants, with an effect eight times larger than that of sensitivity, based on a meta-analysis of findings for low socioeconomic-status families. This was true even after controlling for maternal sensitivity. They also found that "maternal sensitivity" did not significantly predict infant attachment security.

"What this paper tells us is that we need to change not only how we measure sensitivity, but how we are thinking about the caregiving behaviors that really matter," Woodhouse said. "What we found was that what really matters is not really so much that moment-to-moment matching between what the baby's cue is and how the parent responds. What really matters is in the end, does the parent get the job done - both when a baby needs to connect, and when a baby needs to explore?"

Research suggests that infants demonstrate statistical learning to identify complex underlying patterns in stimuli, the researchers said. "Thus, we expected that infants whom caregivers soothed from crying to calm in a chest-to-chest position for at least half of the observed episodes of infant crying would learn that, on average, they could trust their caregivers to provide a secure base," they said, which they found to be true.

Woodhouse calls the findings "paradigm shifting."

"It really is a different way of looking at the quality of parenting," she said. "It's looking at this idea of does the job get done in the end, and it allows us to see strengths in low-income parents that our previous ideas about sensitivity don't let us see."

Additional Dos and Don'ts

Researchers also noted a number of problematic behaviors by mothers while their babies were crying that disrupted the process of comforting the infant, such as turning the baby away from their chests before crying ended, rough handling, harsh verbal tones, verbal instructions not to cry, and verbally attributing negative characteristics to the baby. They also documented the presence or absence of frightening behavior, such as sudden looming into the baby's face or toward the baby, during crying episodes.

"If the mother did frightening things when the baby cried, like hard yelling or growling at the baby, or suddenly looming toward the baby's face while the baby was upset, even if it only happened one time, the baby would be insecure," Woodhouse said. "Similarly, if the mother did anything really frightening even when the baby wasn't in distress, like saying 'bye-bye' and pretending to leave, throwing the baby in the air to the point they would cry, failure to protect the baby, like walking away from the changing table or not protecting them from an aggressive sibling, or even what we call 'relentless play' - insisting on play and getting the baby worked up when it is too much - that also leads to insecurity."

Interestingly, overprotective-type behaviors, such as moms who don't let the baby explore more than an arm's length away, or interrupting or redirecting play (except for safety) also contributed to insecure baby attachment. "Some moms really had trouble allowing the baby to explore and were very insistent on the baby doing certain things or turning the baby's head to look at the mom," Woodhouse said. "In really intrusive parenting, if we saw that, the baby was insecure."

Applications for Parents and Practitioners, Across Cultures

One application of the findings is improving the effectiveness of intervention programs that aim to increase secure infant attachment. The results indicate that low socioeconomic-status mothers who do a better job of providing a secure base increase their infants' chances of developing a secure attachment from about 30% to 71%, while low-SES mothers who fail to provide a secure base decrease their infants' chances of developing a secure relationship from about 71% to 30%.

Knowing this can help those leading interventions to view caregiving behavior in a new way. For example, this framework allows them to shift focus from urging mothers to respond as promptly as possible to working with mothers to focus on relenting and ultimately picking up and soothing a crying infant in a chest-to-chest position until calm.

"Because low socioeconomic-status parents juggle multiple challenges associated with low socioeconomic status, it may be helpful for them to know that holding a crying infant until fully soothed, even 50% of the time, promotes security," the researchers said. "Such a message could help parents increase positive caregiving without raising anxiety regarding 'perfect parenting' or setting the bar so high as to make change unattainable in families that face multiple stressors."

Methods of engaging an infant in calm, regulating connectedness, such as being available for eye contact without actively making eye contact and carrying an infant on the hip during daily tasks, also promote secure attachment in the baby, they said.

Focusing on the secure base also avoids emphasizing the importance of parenting practices that are often associated with white, middle-class populations, such as moment-to-moment attunement, prompt responses, sweet tone of voice and affectionate verbal comments. The new approach "captures strengths that can be present in parents who may be under economic strain or who ascribe to 'no-nonsense parenting,'" the researchers said. This also makes the secure base provision approach potentially more culturally sensitive and likely to be accepted across diverse low socioeconomic-status families.

"Across cultures, social class and race, parents want the best for their kids," Woodhouse said, "so parents are excited to know about this when I talk with them." Clinicians such as psychologists, counselors, social workers, home visitors, Head Start programs, early child care providers and pediatricians will also find it a valuable lens, she said. "It has the potential to change intervention for agencies and practitioners, and I think that is really valuable," Woodhouse said.

The research isn't meant to contradict sensitivity as a framework, which remains useful, Woodhouse argues. The findings also aren't a challenge to attachment theory, which assumes that infants universally form attachments with familiar caregivers based on evolutionary pressures. "Attachment theory is a really important theory that has guided lots of research," she said. "(These findings) are about enriching, deepening and adding to the theory in ways that support applicability in diverse contexts."

Woodhouse clarifies that this is different from commonly understood tenets of "attachment parenting" in popular culture, such as co-sleeping, baby-wearing, breastfeeding or organic foods. "None of these things is inherently good or bad for attachment or a guarantee of having secure children - it's about how they are done," she said. "Moms get secure children in different ways. It is more of an attachment-informed perspective, that biologically babies do have certain needs - for safety, comfort and connection, exploration of their world - and how do we meet these needs? There is more than one way to get there."

For Woodhouse, the takeaway is two-fold:

"The first message gets at the core of getting the job done - supporting the baby in exploration and not interrupting it and welcoming babies in when they need us for comfort or protection," Woodhouse said. "The other part is that you don't have to do it 100 percent - you have to get it right about half of the time, and babies are very forgiving and it's never too late. Keep trying. You don't have to be perfect, you just have to be good enough."

Credit: 
Lehigh University

Fewer than half of British men and women have sex at least once a week

Fewer than half of men and women in Britain aged 16-44 have sex at least once a week, reveals a large study published by The BMJ today.

The data show a general decline in sexual activity in Britain between 2001 and 2012, with the steepest declines among the over 25s and those who are married or living together.

There is evidence that regular sexual activity is beneficial to health and wellbeing, but a recent decline has been seen in several high-income countries in the proportion of people who are sexually active, and how often they have sex.

Little is known about these trends in Britain and the lifestyle factors associated with them.

So to explore this further, researchers at the London School of Hygiene & Tropical Medicine used data from over 34,000 men and women aged 16 to 44 years in three successive waves of the British National Survey of Sexual Attitudes and Lifestyles (Natsals 1, 2 and 3) to measure changes in actual and preferred frequency of sex, and to examine factors associated with sexual activity.

The three surveys were completed in 1991, 2001 and 2012 and reported sexual activity included vaginal, anal, or oral sex with opposite or same-sex partners.

Overall, the data show declines in people having sex between 2001 and 2012. For example, the proportion reporting no sex in the past month increased from 23% to 29.3% among women and from 26% to 29.2% among men.

The proportion reporting sex 10 or more times in the past month also fell during this time, from 20.6% to 13.2% among women and from 20.2% to 14.4% among men.

Declines in levels of sexual activity were evident across all age groups for women, and for all but the 16-24 year old age group for men, but were largest among those aged 25 and over and those who were married or living together.

For instance, the average number of times that 35-44 year olds reported having sex in the past month fell from four to two among women and from four to three among men, and the odds of reporting sex 10 or more times in the past month halved.

Similarly, among men and women who were married or living together, reported sexual inactivity in the last month was higher, while the odds of reporting sex 10 or more times in the past month were roughly halved.

Declines of this magnitude were not seen among single people, suggesting that the trend towards lower sexual frequency overall is largely due to the decline among sexually active married or cohabiting couples, say the authors.

However, the data also show that about half of all women (50.6%) and almost two thirds of men (64.3%) said they would prefer to have sex more often, particularly those who were married or living together, which the authors say "merits concern."

People in better physical and mental health, and those who were fully employed and had higher incomes, reported having sex more often.

This is an observational study, and as such, can't establish cause. And because the data were self-reported, this may have influenced the results.

But the authors say that the changing norms around sex may affect both reported and actual sexual frequency. For example, the social pressure to over-report sexual activity may have eased, while gender equality means that women may now be less inclined to meet their partner's sexual needs irrespective of their own.

They also point out that the decline in sexual frequency appears to coincide with increasing use of social media (which has created diversions) and the global recession of 2008 (which may explain the decline both among men who are better off and those worse off).

However, given the age and marital status of the groups most affected, the "most compelling" explanation may relate to the stress and 'busyness' of modern life, such that work, family life and leisure are constantly juggled, they add.

"It is perhaps the wider implications of the decline in sexual frequency that may be more worrying," write the authors. "Should frequency of sexual contact serve as a barometer for more general human connectedness then the decline might be seen as signalling a disquieting trend. The decrease in sexual activity is interesting, as yet unexplained, and warrants further exploration," they conclude.

In a linked editorial, Dr Peter Leusink from Radboud University Medical Centre says that "as the authors point out, less frequent sexual activity is not necessarily a problem for individual health and wellbeing" and the "quantity and quality of sexual activity are not necessarily connected."

He adds "Healthcare professionals should be aware of the links between sexual health, general health, and social factors and should be alert to the possibility of sexual problems during discussions with patients. [These] findings should encourage both researchers and clinicians to start talking about sex."

Credit: 
BMJ Group

Paper wasps capable of behavior that resembles logical reasoning

image: A Polistes dominula paper wasp on a flower.

Image: 
Photo by Elizabeth Tibbetts.

ANN ARBOR--A new University of Michigan study provides the first evidence of transitive inference, the ability to use known relationships to infer unknown relationships, in a nonvertebrate animal: the lowly paper wasp.

For millennia, transitive inference was considered a hallmark of human deductive powers, a form of logical reasoning used to make inferences: If A is greater than B, and B is greater than C, then A is greater than C.

But in recent decades, vertebrate animals including monkeys, birds and fish have demonstrated the ability to use transitive inference.

The only published study that assessed transitive inference in invertebrates found that honeybees weren't up to the task. One possible explanation for that result is that the honeybee's small nervous system imposes cognitive constraints that prevent those insects from performing transitive inference.

Paper wasps have a nervous system roughly the same size--about one million neurons--as honeybees, but they exhibit a type of complex social behavior not seen in honeybee colonies. University of Michigan evolutionary biologist Elizabeth Tibbetts wondered if paper wasps' social skills could enable them to succeed where honeybees had failed.

To find out, Tibbetts and her colleagues tested whether two common species of paper wasp, Polistes dominula and Polistes metricus, could solve a transitive inference problem. The team's findings are scheduled for online publication May 8 in the journal Biology Letters.

"This study adds to a growing body of evidence that the miniature nervous systems of insects do not limit sophisticated behaviors," said Tibbetts, a professor in the Department of Ecology and Evolutionary Biology.

"We're not saying that wasps used logical deduction to solve this problem, but they seem to use known relationships to make inferences about unknown relationships," Tibbetts said. "Our findings suggest that the capacity for complex behavior may be shaped by the social environment in which behaviors are beneficial, rather than being strictly limited by brain size."

To test for TI, Tibbetts and her colleagues first collected paper wasp queens from several locations around Ann Arbor, Michigan.

In the laboratory, individual wasps were trained to discriminate between pairs of colors called premise pairs. One color in each pair was associated with a mild electric shock, and the other was not.

"I was really surprised how quickly and accurately wasps learned the premise pairs," said Tibbetts, who has studied the behavior of paper wasps for 20 years.

Later, the wasps were presented with paired colors that were unfamiliar to them, and they had to choose between the colors. The wasps were able to organize information into an implicit hierarchy and used transitive inference to choose between novel pairs, Tibbetts said.

"I thought wasps might get confused, just like bees," she said. "But they had no trouble figuring out that a particular color was safe in some situations and not safe in other situations."

So, why do wasps and honeybees--which both possess brains smaller than a grain of rice--perform so differently on transitive inference tests? One possibility is that different types of cognitive abilities are favored in bees and wasps because they display different social behaviors.

A honeybee colony has a single queen and multiple equally ranked female workers. In contrast, paper wasp colonies have several reproductive females known as foundresses. The foundresses compete with their rivals and form linear dominance hierarchies.

A wasp's rank in the hierarchy determines shares of reproduction, work and food. Transitive inference could allow wasps to rapidly make deductions about novel social relationships.

That same skill set may enable female paper wasps to spontaneously organize information during transitive inference tests, the researchers hypothesize.

For millennia, transitive inference was regarded as a hallmark of human cognition and was thought to be based on logical deduction. More recently, some researchers have questioned whether TI requires higher-order reasoning or can be solved with simpler rules.

The study by Tibbetts and her colleagues illustrates that paper wasps can build and manipulate an implicit hierarchy. But it makes no claims about the precise mechanisms that underlie this ability.

In previous studies, Tibbetts and her colleagues showed that paper wasps recognize individuals of their species by variations in their facial markings and that they behave more aggressively toward wasps with unfamiliar faces.

The researchers have also demonstrated that paper wasps have surprisingly long memories and base their behavior on what they remember of previous social interactions with other wasps.

Credit: 
University of Michigan

The Lancet: Targets to reduce harmful alcohol use are likely to be missed as global alcohol intake increases

Globally, alcohol intake increased from 5.9 litres of pure alcohol per adult per year in 1990 to 6.5 litres in 2017, and is predicted to rise further to 7.6 litres by 2030. This increase is likely driven by rising alcohol use in low- and middle-income countries as they become wealthier.

Between 2010 and 2017, the most notable increases in alcohol consumption occurred in India and Vietnam, compared with significant decreases in Azerbaijan, Russia, the UK, and Peru.

The world is not on track to achieve global targets to reduce harmful alcohol use, and the authors call for effective policy measures, such as the WHO best-buys including increasing taxation, restricting availability, and banning alcohol marketing and advertising, to be introduced globally.

Increasing rates of alcohol use suggest that the world is not on track to achieve targets against harmful alcohol use, according to a study, published in The Lancet, of alcohol intake in 189 countries between 1990 and 2017, with estimated intake up to 2030.

As a result of increased alcohol consumption and population growth, the total volume of alcohol consumed globally per year has increased by 70% (from 20,999 million litres in 1990 to 35,676 million litres in 2017). Intake is growing in low- and middle-income countries, while the total volume of alcohol consumed in high-income countries has remained stable.

The estimates suggest that by 2030 half of all adults will drink alcohol, and almost a quarter (23%) will binge drink at least once a month.

Alcohol is a major risk factor for disease, and is causally linked to over 200 diseases, in particular non-communicable diseases and injuries.

"Our study provides a comprehensive overview of the changing landscape in global alcohol exposure. Before 1990, most alcohol was consumed in high-income countries, with the highest use levels recorded in Europe. However, this pattern has changed substantially, with large reductions across Eastern Europe and vast increases in several middle-income countries such as China, India, and Vietnam. This trend is forecast to continue up to 2030 when Europe is no longer predicted to have the highest level of alcohol use," says study author Jakob Manthey, TU Dresden, Germany. [1]

He continues: "Based on our data, the WHO's aim of reducing the harmful use of alcohol by 10% by 2025 will not be reached globally. Instead, alcohol use will remain one of the leading risk factors for the burden of disease for the foreseeable future, and its impact will probably increase relative to other risk factors. Implementation of effective alcohol policies is warranted, especially in rapidly developing countries with growing rates of alcohol use." [1]

Monitoring alcohol use is part of several international programmes, including the WHO's Global Action Plan for the Prevention and Control of NCDs 2013-2020, the UN's Sustainable Development Goals, and the WHO's Global Strategy to Reduce the Harmful Use of Alcohol. These targets are based on per capita alcohol consumption in adults (the number of litres of pure alcohol consumed per person aged 15 years or more in a year taking into account recorded and unrecorded use, and tourism) [2].

The new study measured per capita alcohol consumption using data for 189 countries from 1990 to 2017 from the WHO and the Global Burden of Disease study. Over the same period, it also measured the prevalence of lifetime abstainers and of current drinkers (ie, people who drank alcohol at least once a year) using surveys from 149 countries, and of binge drinkers (drinking 60 g or more of pure alcohol in one sitting at least once within 30 days) using surveys from 118 countries. Using estimates of gross domestic product and the religious composition of the population, the results were modelled to create estimates for all 189 countries up to 2030.

In 2017, the lowest alcohol intakes were in North African and Middle Eastern countries (typically less than 1 litre per adult per year), while the highest intakes were in Central and Eastern European countries (in some cases more than 12 litres per adult per year). At the country level, Moldova had the highest alcohol intake (15 litres per adult per year), and Kuwait had the lowest (0.005 litres per person per year).

Globally, alcohol consumption is set to increase from 5.9 litres of pure alcohol per adult per year in 1990 to 7.6 litres in 2030. However, intake varied regionally. Between 2010 and 2017, consumption increased by 34% in southeast Asia (from 3.5 litres to 4.7 litres), with increases in India, Vietnam and Myanmar. In Europe [3], consumption fell by 12% (from 11.2 to 9.8 litres), mainly due to decreases in former Soviet republics such as Azerbaijan, Kyrgyzstan, Ukraine, Belarus, and Russia. Intake levels remained similar in African, American, and Eastern Mediterranean regions.

In the UK, consumption decreased from 12.3 litres in 2010 to 11.4 litres in 2017, compared to increases of 38% in India (from 4.3 to 5.9 litres). Over the same timescale, consumption increased slightly in the USA (9.3-9.8 litres) and in China (7.1-7.4 litres).

Globally, the prevalence of lifetime abstinence decreased from 46% in 1990 to 43% in 2017, while the prevalence of current drinking increased from 45% in 1990 to 47% in 2017, and the prevalence of heavy episodic drinking increased from 18.5% to 20%. However, the authors note that the changes in abstinence and heavy episodic drinking are not statistically significant.

They expect these trends to continue, estimating that by 2030, 40% of people will abstain from alcohol, 50% will drink alcohol, and almost a quarter (23%) will binge drink at least once a month.

They note that, globally and in most regions, the volume of alcohol consumed is growing faster than the number of drinkers (for example, alcohol per capita is expected to grow by 17.8%, from 6.5 to 7.6 litres, globally between 2018 and 2030, while the proportion of current drinkers is estimated to grow by just 5%, from 47.3% to 49.8%, over the same timeframe), meaning the average alcohol intake per drinker is forecast to increase. Increased alcohol intake per drinker not only results in a growing proportion of heavy episodic drinkers, but also inevitably leads to an increased alcohol-attributable disease burden.
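The per-drinker implication can be checked with a back-of-envelope calculation using the rounded figures quoted above (an illustrative check, not the study's model; small rounding differences from the published percentages are expected):

```python
# Figures quoted in the article (rounded).
per_capita_2018, per_capita_2030 = 6.5, 7.6   # litres pure alcohol per adult per year
drinkers_2018, drinkers_2030 = 0.473, 0.498   # share of adults who currently drink

# Average intake per drinker = per-capita intake / share of drinkers.
per_drinker_2018 = per_capita_2018 / drinkers_2018
per_drinker_2030 = per_capita_2030 / drinkers_2030

print(round(per_drinker_2018, 1), round(per_drinker_2030, 1))  # 13.7 15.3
```

Because per-capita consumption grows faster than the share of drinkers, intake per drinker rises from roughly 13.7 to 15.3 litres, which is the trend the authors describe.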

"Alcohol use is prevalent globally, but with clear regional differences that can largely be attributed to religion, implementation of alcohol policies, and economic growth. Economic growth seems to explain the global increase in alcohol use over the past few decades - for example, the economic transitions and increased wealth of several countries - in particular, the transitions of China and India - were accompanied by increased alcohol use. The growing alcohol market in middle-income countries is estimated to more than outweigh the declining use in high-income countries, resulting in a global increase," says Mr Manthey. [1]

The authors note some limitations, including that there is uncertainty around estimates of unrecorded alcohol consumption, in addition to scarcity of data in certain regions. In addition, drinking status estimates were based on surveys, where individuals often under-report their intake. Their estimates for 2018-2030 are based on economic conditions and religion only, and cannot take future policy changes or behaviour changes into account.

Writing in a linked Comment, Dr Sarah Callinan, La Trobe University, Australia, notes that the shift in alcohol consumption globally from high-income to lower income countries could lead to disproportionate increases in harm, as the harm per litre of alcohol is substantially higher in low-income and middle-income countries than in high-income countries. She says: "An increasingly robust evidence base supports use of key alcohol policy levers such as increasing price and restricting availability to curtail growing alcohol consumption beyond Europe and North America. However, this evidence comes largely from high-income countries, and the potential efficacy of such policies in lower-middle-income countries, where more than half of alcohol consumption is unrecorded, is likely to be limited without substantial reductions in unrecorded alcohol consumption (although previous studies show that unrecorded consumption tends to decline with economic development). Thus, although price or availability-based policies are important, strict restrictions on advertising and other promotional activities are crucial to slow the growing demand for alcohol in these countries. Similarly, rigorous drink-driving countermeasures are necessary so that increasing consumption does not lead to increases in road traffic injury. Supporting evidence-based policies outside high-income countries, despite anticipated strong industry resistance, will be a key task for public health advocates in the coming decades."

Credit: 
The Lancet

A moody gut often accompanies depression -- new study helps explain why

NEW YORK, NY (May 7, 2019)--For people with depression, gastrointestinal distress is a common additional burden, and a new study suggests that for some, the two conditions arise from the same glitch in neuron chemistry--low serotonin.

The study, conducted in mice, shows that a shortage of serotonin in the neurons of the gut can cause constipation, just as a serotonin shortage in the brain can lead to depression.

The study also found that a treatment that raises serotonin in the gut and the brain may alleviate both conditions.

WHY IT'S IMPORTANT

Up to a third of people with depression have chronic constipation, and a few studies report that people with depression rate their accompanying bowel difficulties as one of the biggest factors reducing their quality of life.

Severe constipation can obstruct the GI tract and cause serious pain. The condition leads to 2.5 million physician visits and 100,000 hospitalizations each year.

Though some antidepressants are known to cause constipation, medication side effects do not explain all cases.

"Ultimately, many patients with depression are faced with limited treatment options and have to suffer with prominent GI dysfunction," says study leader Kara Gross Margolis, MD, associate professor of pediatrics at Columbia University Vagelos College of Physicians and Surgeons.

BACKGROUND

Similarities between the gut and the brain suggest the two conditions may also share a common cause.

"The gut is often called the body's 'second brain,'" says Margolis. "It contains more neurons than the spinal cord and uses many of the same neurotransmitters as the brain. So it shouldn't be surprising that the two conditions could be caused by the same process."

Because low levels of serotonin in the brain have been linked to depression and serotonin is also used by neurons in the gut, the researchers studied mice to determine if a serotonin shortage also plays a role in constipation.

The mice used in the study carry a genetic mutation (linked to severe depression in people) that impairs the ability of neurons in the brain and the gut to make serotonin.

STUDY FINDINGS

The serotonin shortage in the gut, the researchers found, reduced the number of neurons in the gut, led to a deterioration of the gut's lining, and slowed the movement of contents through the GI tract.

"Basically, the mice were constipated," Margolis says, "and they showed the same kind of GI changes we see in people with constipation." (In previous studies, these same mice also showed depressive symptoms).

Encouragingly, an experimental drug treatment invented by two of the study's co-authors, Marc Caron, PhD, and Jacob Jacobsen, PhD, of Duke University, raised serotonin levels in the gut's neurons and alleviated constipation in the mice.

The treatment--slow-release drug-delivery of 5-HTP, a precursor of serotonin--works in part by increasing the number of GI neurons in adult mice.

WHAT THE FINDINGS MEAN

The discovery of this connection between a brain and a gastrointestinal disorder suggests that new 5-HTP slow-release therapies could treat related brain-gut conditions simultaneously.

The study is also one of the first to show that neurogenesis in the gut is possible and can correct abnormalities in the gut. "Though it's been known for many years that neurogenesis occurs in certain parts of the brain, the idea that it occurs in the gut nervous system is relatively new," Margolis says.

Neurogenesis may help treat other types of constipation. "We see a reduction of neurons in the GI tract with age, and that loss is thought to be a cause of constipation in the elderly," Margolis says. "The idea that we may be able to use slow-release 5-HTP to treat conditions that require the development of new neurons in the gut may open a whole new avenue of treatment."

An immediate-release version of 5-HTP is available as a supplement, but it has not been scientifically proven to work, and physiologically it should not, because it is too short-acting, Margolis says. 5-HTP is the immediate precursor of serotonin: once ingested, it is converted to serotonin, but that serotonin is rapidly inactivated before it can act effectively.

The slow-release version of 5-HTP used in the current study delivers 5-HTP continuously, which has been shown to overcome the limitations of the currently available immediate-release form.

WHAT'S NEXT

Clinical studies are already planned for testing a slow-release 5-HTP drug in people with treatment-resistant depression.

Planning for testing a slow-release 5-HTP drug in constipation is in progress.

Credit: 
Columbia University Irving Medical Center

Bacteria causing infections can be detected more rapidly

image: BacGO, a novel Gram-positive bacterial probe, originated from a fluorescent library carrying a boronic acid motif that binds to peptidoglycan on the cell wall. BacGO can be used to identify Gram-positive planktonic bacteria in mixed samples of high diversity and complexity.

Image: 
POSTECH

Two years ago, a group of infants died at a university hospital, and Gram-negative bacteria were found to be the cause of death. When stained with the Gram stain, a bacterial staining method in use since 1884, Gram-negative bacteria appear pink and Gram-positive bacteria appear violet. Bacteria that cause tetanus, pneumonia, and food poisoning are typically Gram-positive. Gram staining has long been the standard method for distinguishing bacteria, but it has notable drawbacks: it requires multiple procedural steps and experienced technical skill.

Prof. Young-Tae Chang, Dr. Nam Young Kang, Dr. Hwa-Young Kwon, and Xiao Liu of the Department of Chemistry at Pohang University of Science and Technology (POSTECH) developed a fluorescent probe, BacGO, that can detect Gram-positive bacteria precisely and promptly. They published their research in Angewandte Chemie, one of the most renowned journals in chemistry. The research team used bacterial sludge from wastewater for a demonstration experiment, successfully monitoring the proportions of bacteria during wastewater treatment, and confirmed a possible application to the clinical diagnosis of keratitis.

Gram staining was first developed by the Danish scientist Christian Gram in 1884 and has served ever since as the gold standard for classifying bacteria. However, it has several limitations. Because it uses a set of dyes such as crystal violet and safranin, the method can only be applied to fixed samples (a chemical process that kills the bacteria), not to live bacteria, and it requires multiple sequential staining steps. To overcome these issues, several fluorescent probes have been developed with better sensitivity than Gram staining; however, they show limited selectivity for Gram-positive bacteria and detect them slowly. The probes developed so far are therefore not suitable for universal bacterial discrimination, such as in sludge from wastewater, or for other work that requires rapid detection.

The research team focused on the polysaccharide within the peptidoglycan of Gram-positive cell walls and screened fluorescent molecules carrying a boronic acid motif, which binds polysaccharides, for their ability to detect Gram-positive bacteria. They ultimately developed a fluorescent probe that selectively stains only Gram-positive bacteria.

This fluorescent probe can selectively label a wide range of Gram-positive bacteria. Building on this result, the research team applied it to sludge from wastewater and to mice with bacterial keratitis. BacGO successfully monitored the proportion of bacteria in sludge during wastewater treatment, and the keratitis experiment showed that the new probe can diagnose bacterial infection very precisely, illustrating its potential for clinical diagnosis.

Professor Young-Tae Chang, who led the research team, expressed his anticipation: "BacGO is different from the commonly used Gram stain. With this new probe, we can monitor various live Gram-positive bacteria with a minimal staining process. It can not only replace earlier fluorescent probes for screening Gram-positive bacteria, which have many limitations, but can also be used in many different applications, such as monitoring wastewater and the clinical diagnosis of bacterial infections."

Credit: 
Pohang University of Science & Technology (POSTECH)

Excessive use of skin cancer surgery curbed with awareness effort

image: A macro and micro look at skin cancer: (left) A leg with a squamous cell carcinoma lesion and (right) a colorized photomicrograph of the cancer at the cellular level. Mohs micrographic surgery, the subject of a new paper from Johns Hopkins Medicine, is one of the most effective techniques for treating this type of malignancy.

Image: 
Lesion photo courtesy of Kelly Nelson, National Cancer Institute, and photomicrograph courtesy of Markus Schrober and Elaine Fuchs, The Rockefeller University

Sometimes a little gentle peer persuasion goes a long way toward correcting a large problem.

That's the message from researchers at Johns Hopkins Medicine and seven collaborating health care organizations, who report that a "Dear Colleague" performance evaluation letter successfully convinced physicians nationwide to reduce the amount of tissue they removed in a common surgical treatment for skin cancer, bringing their practice in line with a professionally recognized benchmark of good practice.

In a study published in the journal JAMA Dermatology, the researchers reported an immediate positive change in surgical behavior -- an improvement that was sustained for one year -- for 83 percent of the physicians notified that they were excising more-than-necessary amounts of tissue on a regular basis during Mohs micrographic surgery (MMS). The surgery is considered the most effective technique for treating many basal cell and squamous cell carcinomas, the two most common types of skin cancer.

"This study demonstrates the tremendous power of physicians within a specialty to create peer-to-peer accountability and of using that accountability to reduce unnecessary treatment and lower health care costs," says Martin A. Makary, M.D., Ph.D., senior author of the study, professor of surgery at the Johns Hopkins University School of Medicine and an authority on health care quality. He also serves as principal investigator of Improving Wisely, a national project to lower medical costs in the United States by implementing measures of appropriateness in health care.

The new study, part of the Improving Wisely effort, was supported by a grant from the Robert Wood Johnson Foundation.

MMS, developed by Frederic Mohs at the University of Wisconsin in the 1930s, is a specialized technique for the treatment of skin cancer, the most common malignancy in the United States at greater than 5.4 million cases annually. Performed as an outpatient procedure, MMS is designed so that the surgeon can methodically remove cancerous tissue on the surface and all of its "roots" -- extensions of the tumors that may exist under the skin or lie along blood vessels, nerves and cartilage.

The surgery is conducted in stages, with stage 1 involving the removal of the visible cancer and a thin layer of surrounding tissue. The excised sample is then cut into sections, stained and examined microscopically while the patient waits. If residual cancer is found, the surgeon can elect right then to remove more tissue in successive stages. The process is repeated as many times as necessary.

The American College of Mohs Surgery (ACMS) considers a surgeon's annual mean stages per MMS case to be the measure of quality and appropriateness for the technique. Using that metric, the organization defines physicians whose practices are two standard deviations or more beyond the overall average as outliers who are performing excessive stages in MMS procedures.
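The ACMS outlier rule described above amounts to flagging any surgeon whose annual mean stages per case sits at least two standard deviations above the overall average. A minimal sketch, using made-up numbers rather than ACMS data:

```python
# Hedged sketch of the two-standard-deviation outlier rule.
# The surgeon names and values below are invented for illustration.
from statistics import mean, stdev

stages_per_case = {
    "surgeon_a": 1.6, "surgeon_b": 1.7, "surgeon_c": 1.8,
    "surgeon_d": 1.7, "surgeon_e": 1.6, "surgeon_f": 2.9,
}

mu = mean(stages_per_case.values())
sigma = stdev(stages_per_case.values())          # sample standard deviation
cutoff = mu + 2 * sigma                          # "two SDs beyond the average"

outliers = [name for name, v in stages_per_case.items() if v >= cutoff]
print(outliers)  # ['surgeon_f']
```

In the study itself this calculation was run over Medicare Part B claims for 2,329 surgeons, with the mean stages per case computed from the stages recorded on each claim.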

Because previous studies suggest that MMS practices vary widely among surgeons, the study by Makary, his team and the ACMS had two aims: evaluate outlier practice patterns using a big-data approach and then, test whether a peer-to-peer notification could change the behavior of surgeons not meeting the appropriateness standard.

"This was an important goal because overuse of stages per case burdens patients with unnecessary and time-consuming surgical resections, and taxes the health care system with avoidable costs," says Christine Fahim, Ph.D., M.Sc., one of the study authors, a postdoctoral fellow at the Johns Hopkins University School of Medicine and the Johns Hopkins Bloomberg School of Public Health, and implementation and intervention design lead for Improving Wisely.

In their paper, the researchers describe how they used Medicare Part B claims to choose their study population of 2,329 U.S. surgeons who each performed more than 10 MMS procedures between Jan. 1 and Dec. 31, 2014. The claim forms included the number of stages done in each case, so individual and overall annual averages were easily calculated. Outliers and inliers (surgeons whose MMS performance was within the accepted range of appropriateness defined by the ACMS) were identified by their performances before they became part of the study population (as measured between Jan. 1, 2016, and Jan. 31, 2017).

The study population was then divided into four groups: (1) 53 outliers, each of whom would receive an intervention letter indicating his or her performance, and urging an improvement in practice, (2) 87 outliers, each of whom would not receive an intervention, (3) 992 inliers who would receive a straightforward performance evaluation letter, and (4) 1,197 inliers who would not receive a letter.

The intervention groups received their letters in February 2017. Each surgeon's MMS performance, defined as annual mean stages per case, was measured pre-intervention (between Jan. 1, 2016, and Jan. 31, 2017) and post-intervention (between March 1, 2017, and March 31, 2018).

The notified outlier group demonstrated a pre- to post-intervention decrease in mean stages per case from 2.55 to 2.31, with 44 of the 53 surgeons (83 percent) improving their MMS behavior. The non-notified outliers dropped from 2.56 to 2.46, with 69 percent making positive changes.

The researchers attribute the drop by non-notified outliers to two factors: an awareness campaign by ACMS around the time the intervention letters went out and possible communications between surgeons who received the letters and their colleagues who did not.

The performance of the inlier groups, as expected, remained statistically about the same.

The researchers also estimated that the relatively inexpensive ($150,000 or about $144 per surgeon) peer-to-peer intervention saved $11 million in Medicare costs during the study period.

"We observed an immediate and sustained improvement in quality with a simple intervention based on the spirit of physicians helping one another," Makary says. "The low cost to implement the program relative to the significant savings achievable suggests that this model could be applied to other areas of medicine with broad financial implications. More importantly, we found that even small improvements in a physician's performance can positively impact the many patients he or she treats."

Credit: 
Johns Hopkins Medicine

Banana disease boosted by climate change

image: Climate change has raised the risk of a fungal disease that ravages banana crops, new research shows.

Image: 
Dan Bebber

Climate change has raised the risk of a fungal disease that ravages banana crops, new research shows.

Black Sigatoka disease emerged from Asia in the late 20th century and has recently completed its invasion of banana-growing areas in Latin America and the Caribbean.

The new study, by the University of Exeter, says changes to moisture and temperature conditions have increased the risk of Black Sigatoka by more than 44% in these areas since the 1960s.

International trade and increased banana production have also aided the spread of Black Sigatoka, which can reduce the fruit produced by infected plants by up to 80%.

"Black Sigatoka is caused by a fungus (Pseudocercospora fijiensis) whose lifecycle is strongly determined by weather and microclimate," said Dr Daniel Bebber, of the University of Exeter.

"This research shows that climate change has made temperatures better for spore germination and growth, and made crop canopies wetter, raising the risk of Black Sigatoka infection in many banana-growing areas of Latin America.

"Despite the overall rise in the risk of Black Sigatoka in the areas we examined, drier conditions in some parts of Mexico and Central America have reduced infection risk."

The study combined experimental data on Black Sigatoka infections with detailed climate information over the past 60 years.

Black Sigatoka, which is virulent against a wide range of banana plants, was first reported in Honduras in 1972.

It spread throughout the region to reach Brazil in 1998 and the Caribbean islands of Martinique, St Lucia and St Vincent and the Grenadines in the late 2000s.

The disease now occurs as far north as Florida.

"While the fungus is likely to have been introduced to Honduras on plants imported from Asia for breeding research, our models indicate that climate change over the past 60 years has exacerbated its impact," said Dr Bebber.

The Pseudocercospora fijiensis fungus spreads via aerial spores, infecting banana leaves and causing streaked lesions and cell death when fungal toxins are exposed to light.

The study did not attempt to predict the potential effects of future climate on the spread and impact of Black Sigatoka. Other research suggests drying trends could reduce disease risk, but this would also reduce the availability of water for the banana plants themselves.

Credit: 
University of Exeter

Training for first-time marathon 'reverses' aging of blood vessels

Venice, Italy - 3 May 2019: Training for and completing a first-time marathon "reverses" ageing of major blood vessels, according to research presented today at EuroCMR 2019, a scientific congress of the European Society of Cardiology (ESC).1 The study found that older and slower runners benefit the most.

Study author Dr Anish Bhuva, a British Heart Foundation Fellow at University College London, UK, said: "Novice runners who trained for six months and completed their first marathon had a four-year reduction in arterial age and a 4 mmHg drop in systolic blood pressure. This is comparable to the effect of medication, and if maintained translates to approximately 10% lower risk of stroke over a lifetime."

A hallmark of normal ageing is stiffening of the blood vessels, which increases the risk of stroke and heart disease even in healthy people. Compared to their peers, lifelong athletes have biologically younger blood vessels. This study investigated whether training for a marathon could modify aortic stiffness even in novice runners.

The study included 139 healthy first-time marathon runners aged 21-69 years who were advised to follow a first-time finisher training programme and ran an estimated 6-13 miles (10-20 km) a week for six months ahead of completing the 2016 or 2017 London Marathon.2,3

Before they started training and two weeks after completing the marathon, participants had magnetic resonance imaging (MRI) and ultrasound scans of the heart and blood vessels, a fitness test, and measurements of blood pressure and heart rate. Biological age of the aorta was calculated at both time points.

After completing the marathon, aortic stiffness had reduced and the aorta was four years younger than before training. Older participants and those with longer marathon finish times had greater reductions in aortic stiffness after training. Reductions in aortic stiffness were independent of changes in blood pressure.

Dr Bhuva said: "You don't have to be an elite athlete to gain the benefits from marathon running, in fact the benefits appeared greatest in those who were older and slower. By completing training, and getting to the finish line, it is possible to rejuvenate the cardiovascular system of first-time marathon runners."

Fitness improved and heart rate dropped after training - both to a modest extent. "The minimal impact on these conventional markers of health suggests that study participants trained within their personal limits," said Dr Bhuva. "Aortic stiffness and blood pressure changed more than fitness and heart rate."

Dr Bhuva noted that participants had been running for less than two hours a week before marathon training and their finish times were slower than average, which was expected as it was their first race. "The study shows that the health gains of lifelong exercise start to appear after a relatively brief training programme," he said. "Training for a marathon can be a good motivator to keep active. Many people enjoy it and continue running, which should increase the likelihood of sustaining the benefits."

Professor Sanjay Sharma, medical director of the London Marathon and an author of the study, said: "The benefits of exercise on the heart and circulation are well established, and are associated with lower cardiovascular disease and mortality. Recent studies have shown that exercise may retard ageing of the cardiovascular system. Our study shows that a first-time marathon makes the cardiovascular system 'younger', so participants will reap these benefits whilst running for a good cause."

Credit: 
European Society of Cardiology

Cooperation among fishers can improve fish stock in coral reefs

image: A solitary fisher in Kenya.

Image: 
Tim McClanahan, Wildlife Conservation Society

Cooperation within a group of people is key to many successful endeavors, including scientific ones. According to a study published in Nature Communications, cooperation among competing fishers can boost fish stocks on coral reefs.

The study analyzed the social relationships among competing fishers, the species they collect, and the local reefs from which these species are extracted.

The results suggest that even though they are considered business rivals, fishers communicate and cooperate in addressing local environmental issues, which can lead to improvements in both the quality and quantity of fish in their local reefs. In the end, this cooperation could translate into further economic gain and more sustainable business, explains Orou Gaoue, assistant professor of ecology and evolutionary biology at the University of Tennessee, Knoxville, and coauthor of the study.

The research team was led by Michele Barnes, senior research fellow in the ARC Centre of Excellence for Coral Reef Studies at James Cook University in Australia.

For the study, Barnes and her team--which in addition to Gaoue included researchers from Conservation International, Lancaster University in the UK, and Stockholm University in Sweden--interviewed 648 fishers and gathered data on reef conditions across five coral reef fishing communities in Kenya.

They found that in places where fishers communicated with their competitors about the fishing gear they use, the locations where they fish, and fishing rules, there were more fish in the sea--and of higher quality.

"Relationships between people have important consequences for the long-term availability of the natural resources we depend on," Barnes says.

"Although this study is on coral reefs," says Gaoue, "the results are also relevant for terrestrial ecosystems where, in the absence of cooperation, competition for non-timber forest products can quickly lead to depletion even when locals have detailed ecological knowledge of their environment."

Credit: 
University of Tennessee at Knoxville