Culture

Faking emotions at work does more harm than good

The adage "Fake it until you make it" - the idea that someone can fake a positive attitude to elicit real-life benefits - often backfires when used with co-workers, according to a study led by a University of Arizona researcher.

Instead, researchers say, making an effort to actually feel the emotions you display is more productive.

Allison Gabriel, associate professor of management and organizations in the Eller College of Management, led a team that analyzed two types of emotion regulation that people use at work: surface acting and deep acting.

"Surface acting is faking what you're displaying to other people. Inside, you may be upset or frustrated, but on the outside, you're trying your best to be pleasant or positive," Gabriel said. "Deep acting is trying to change how you feel inside. When you're deep acting, you're actually trying to align how you feel with how you interact with other people."

The study surveyed working adults in a wide variety of industries including education, manufacturing, engineering and financial services.

"What we wanted to know is whether people choose to engage in emotion regulation when interacting with their co-workers, why they choose to regulate their emotions if there is no formal rule requiring them to do so, and what benefits, if any, they get out of this effort," Gabriel said.

Putting on a Happy Face

Gabriel says that when it comes to regulating emotions with co-workers, four types of people emerged from the study:

Nonactors, or those engaging in negligible levels of surface and deep acting;

Low actors, or those displaying slightly higher surface and deep acting;

Deep actors, or those who exhibited the highest levels of deep acting and low levels of surface acting; and,

Regulators, or those who displayed high levels of surface and deep acting.

In each study, nonactors made up the smallest group, with the other three groups being similar in size.

The researchers identified several drivers for engaging in emotion regulation and sorted them into two categories: prosocial and impression management. Prosocial motives include wanting to be a good co-worker and cultivating positive relationships. Impression management motives are more strategic and include gaining access to resources or looking good in front of colleagues and supervisors.

The team found that regulators, in particular, were driven by impression management motives, while deep actors were significantly more likely to be motivated by prosocial concerns. This means that deep actors are choosing to regulate their emotions with co-workers to foster positive work relationships, as opposed to being motivated by gaining access to more resources.

Faking It Versus Feeling It

"The main takeaway," Gabriel says, "is that deep actors - those who are really trying to be positive with their co-workers - do so for prosocial reasons and reap significant benefits from these efforts."

According to the researchers, those benefits include receiving significantly higher levels of support from co-workers, such as help with workloads and offers of advice. Deep actors also reported significantly higher levels of progress on their work goals and trust in their co-workers than the other three groups.

The data also showed that mixing high levels of surface and deep acting resulted in physical and mental strain.

"Regulators suffered the most on our markers of well-being, including increased levels of feeling emotionally exhausted and inauthentic at work," Gabriel said.

Lessons Learned

While some managers Gabriel spoke to during the course of her research still believe emotions have little to do with the workplace, the study results suggest there is a benefit to displaying positive emotions during interactions at work, she said.

"I think the 'fake it until you make it' idea suggests a survival tactic at work," Gabriel said. "Maybe plastering on a smile to simply get out of an interaction is easier in the short run, but long term, it will undermine efforts to improve your health and the relationships you have at work."

"In many ways," Gabriel added, "it all boils down to, 'Let's be nice to each other.' Not only will people feel better, but people's performance and social relationships can also improve."

Credit: 
University of Arizona

Fewer than half of US clinical trials have complied with the law on reporting results

January 2020 is the third anniversary of the implementation of the new US regulations that require clinical trials to report results within one year of completion (Final Rule of the FDA Amendments Act)--but compliance remains poor, and is not improving, with US Government sponsored trials most likely to breach.

Fewer than half (41%) of clinical trial results are reported promptly to the US trial registry, and 1 in 3 trials remain unreported, according to the first comprehensive study of compliance since new US regulations came into effect in January 2017.

The findings, published in The Lancet, indicate that trials with non-industry sponsors (such as universities, hospitals, and governments) are far more likely to breach the rules than trials sponsored by industry [1]--with US Government sponsored trials least likely to post results on time at the world's largest clinical trial registry, ClinicalTrials.gov.

It has been known for several decades that the results of clinical trials are often not fully reported. To improve public disclosure, and limit selective publishing of results, the US Food and Drug Administration Amendments Act (FDAAA) of 2007 requires sponsors of most US-regulated clinical trials to register and report results on ClinicalTrials.gov within 12 months of primary completion, irrespective of whether the results are positive or negative.

A subsequent 'Final Rule' to the Act took effect in January 2017. This introduced clearer reporting requirements, including fines of up to US$10,000 a day for non-compliance (now US$12,103, adjusted for inflation). National Institutes of Health (NIH) leaders said that the Final Rule would result in "rapid increases" in the percentage of trials registered and shared on the US registry [2].

The authors say that the high rates of non-compliance found in the new study likely reflect the lack of enforcement by regulators, and they call for trial sponsors to be held to account by the FDA.

"Patients and clinicians cannot make informed choices about which treatments work best when trial results are routinely withheld. Clinical trials are not abstract research projects: they are large, expensive, practical evaluations that directly impact on patient care by informing treatment guidelines and evidence reviews," says Dr Ben Goldacre from Oxford University, UK, who led the research. [3]

He continues: "Sponsors are breaching their legal obligations, but also their ethical obligations to the patients who generously participate in clinical trials. Our study has identified over 2,400 trials breaching the rules, but to our knowledge the FDA has never levied a single fine or other enforcement action, despite all the levers available to them. Compliance will only improve when action is taken." [3]

Non-reporting of clinical trial results has been well documented since the 1980s, especially those trials finding no evidence of effectiveness for the treatment being tested [4]. However, failing to disclose trial results threatens the integrity of the evidence base of all clinical medicine, breaches participants' trust, and wastes valuable research resources.

The first trials covered by the Final Rule were due to report in January 2018. To investigate the extent of compliance with these new reporting requirements, the researchers examined all 4,209 trials registered on ClinicalTrials.gov that were legally required to report results between March 2018 and September 2019. They also assessed trends in compliance, factors associated with compliance, and ranked individual sponsors according to their level of compliance.

Of the completed trials included in the study, around half (52%; 2,178) had non-industry sponsors, most involved a drug intervention (71%; 2,968), and most were solely conducted in the USA (71%; 3,000).

Analyses found that only 41% (1,722/4,209) of completed clinical trials reported results within the one-year legal deadline, whilst 36% (1,523/4,209) still had not been reported by September 16, 2019. Moreover, progress has stalled--the proportion of compliant trials has remained stable since July 2018. The median delay from completion to submitting results was 424 days--59 days longer than the legal reporting deadline of one year (figure 1).

Trials with an industry sponsor were much more likely to comply with the law than those with a non-industry or US Government sponsor (50% vs 34% vs 31% trials submitted in time). Better performance was also seen among sponsors with more experience of running large numbers of trials, when compared with those who have only ever run a very small number of projects (66% vs 21% trials submitted in time; table 3). Encouragingly, the authors say, this suggests that "research experience and robust internal governance processes can contribute to improved performance."

Further analyses estimate that had the law been strictly enforced, over US$4 billion in fines could have been collected up to the end of September 2019.
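As a rough illustration of how such potential fines accumulate, the sketch below multiplies each trial's overdue days by the maximum daily fine quoted above. The figures fed in are hypothetical (every trial assumed overdue by the study's median excess of 59 days); the actual US$4 billion estimate reflects the real distribution of delays.

```python
DAILY_FINE = 12_103  # maximum FDAAA fine, USD per day (inflation-adjusted figure above)

def potential_fines(overdue_days_per_trial, daily_fine=DAILY_FINE):
    """Sum the maximum fine accruable across a list of overdue trials.

    overdue_days_per_trial: days past the 12-month deadline, one entry per trial.
    """
    return sum(days * daily_fine for days in overdue_days_per_trial)

# Hypothetical example: 2,400 breaching trials, each overdue by the median
# excess of 59 days, would already accrue over US$1.7 billion in maximum fines.
example_total = potential_fines([59] * 2_400)
```

Even this deliberately conservative toy scenario lands in the billions, which is consistent with the authors' US$4 billion figure once longer delays are counted.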

"Over four decades since non-reporting of clinical trials was first reported, it is disappointing to see that we have only progressed to legislation being passed, and then largely ignored," says co-author Nicholas DeVito from the University of Oxford, UK. "The fact that the US Government cannot comply with its own laws is particularly concerning." [3]

He continues: "Until effective enforcement action is taken, public audit may help. We have established an openly accessible public website at fdaaa.trialstracker.net where fresh data on compliance with FDAAA will be posted every day, identifying each individual overdue trial, and compliance statistics for each individual sponsor. We hope this will help to incentivise sponsors, and provide useful targeted information for all those who aim to comply with the law." [3]

The authors note that they only examine the availability of results on ClinicalTrials.gov as required by the law, and not the quality of the results or their availability elsewhere.

Writing in a linked Comment, Dr Erik von Elm of the University of Lausanne in Switzerland, who was not involved in the study, points out that "any law is only as good as its enforcement", adding that, "if this rule were to be enforced, academic sponsors would probably make substantial efforts to reduce the number of non- or late-reported trials and to improve data quality. Training, auditing and incentive mechanisms could be overseen by dedicated staff. A senior "transparency officer" versed in trial conduct and reporting could take a proactive mentoring role and help investigators overcome barriers that currently prevent them from timely reporting of trial results in registries. If completeness of reporting was a criterion in individual academic evaluations, this could have a considerable "signalling effect" within the local research community."

Credit: 
The Lancet

New dog, old tricks? Stray dogs can understand human cues

If you have a dog, hopefully you're lucky enough to know that they are highly attuned to their owners and can readily understand a wide range of commands and gestures. But are these abilities innate or are they exclusively learned through training?

To find out, a new study in Frontiers in Psychology investigated whether untrained stray dogs could understand human pointing gestures.

The study revealed that about 80% of participating dogs successfully followed pointing gestures to a specific location despite having never received prior training. The results suggest that dogs can understand complex gestures simply by watching humans, which could have implications for reducing conflict between stray dogs and humans.

Dogs were domesticated 10,000-15,000 years ago, likely making them the oldest domesticated animals on the planet. Humans then bred dogs with the most desirable and useful traits so that they could function as companions and workers, leading to domesticated dogs that are highly receptive to human commands and gestures.

However, it was not clear whether dogs understand us through training alone, or whether this was innate. Can dogs interpret a signal, such as a gesture, without specific training, or even without having met the signaling person previously? One way to find out is to see whether untrained, stray dogs can interpret and react to human gestures.

Stray dogs are a common feature in cities around the world and particularly in many developing countries. While they may observe and occasionally interact with people, such dogs have never been trained, and are behaviorally "wild". Conflicts between stray dogs and humans are a problem and understanding how humans shape stray dog behavior may help alleviate this.

To investigate, Dr. Anindita Bhadra of the Indian Institute of Science Education and Research Kolkata, India, and colleagues studied stray dogs across several Indian cities. The researchers approached solitary stray dogs and placed two covered bowls on the ground near them. A researcher then pointed to one of the two bowls, either momentarily or repeatedly, and recorded whether the dog approached the indicated bowl. They also recorded the perceived emotional state of the dogs during the experiment.

Approximately half of the dogs did not approach either bowl. However, the researchers noticed that these dogs were anxious and may have had bad experiences with humans before. The dogs who approached the bowls were noted as friendlier and less anxious, and approximately 80% correctly followed the pointing signals to one of the bowls, regardless of whether the pointing was momentary or repeated. This suggests that the dogs could indeed decipher complex gestures.

"We thought it was quite amazing that the dogs could follow a gesture as abstract as momentary pointing," explained Bhadra. "This means that they closely observe the human, whom they are meeting for the first time, and they use their understanding of humans to make a decision. This shows their intelligence and adaptability."

The results suggest that dogs may have an innate ability to understand certain human gestures which transcends training. However, it should be noted that the shyer, more anxious animals tended not to participate, so future studies are needed to determine more precisely how an individual dog's personality affects their ability to understand human cues.

Overall, dogs may be more perceptive than we realize. "We need to understand that dogs are intelligent animals that can co-exist with us," said Bhadra. "They are quite capable of understanding our body language and we need to give them their space. A little empathy and respect for another species can reduce a lot of conflict."

Credit: 
Frontiers

Study finds disparity in critical care deaths between non-minority and minority hospitals

image: Disparities found in ICU deaths at hospitals with few minority patients vs those with a large number of minority patients.

Image: 
ATS

Jan. 17, 2020--While deaths steadily declined over a decade in intensive care units at hospitals with few minority patients, in ICUs with large numbers of minority patients there was less improvement, according to new research published online in the American Thoracic Society's American Journal of Respiratory and Critical Care Medicine. The disparity was most pronounced among critically ill African American patients.

In "Temporal Trends in Critical Care Outcomes in United States Minority Serving Hospitals," lead author John Danziger, MD, MPhil, and colleagues report on their analysis of nearly 1.1 million patients hospitalized at more than 200 hospitals that participated in the telehealth platform provided by Philips Healthcare from 2006-16.

In addition to studying mortality, the researchers looked at length of stay in the ICU and in the hospital. The data showed a similar pattern: improvement over the decade at non-minority serving hospitals and less improvement at minority serving hospitals.

"We wanted to know whether racial inequalities, previously described across a range of health care environments, extend into the highest level of care, namely the ICU," said Dr. Danziger, an assistant professor of medicine at Harvard Medical School and a physician at Beth Israel Deaconess Medical Center in Boston.

For the purposes of this study, the authors defined minority serving hospitals in two ways. The first defined such hospitals as having twice as many minority patients as expected based on the percentage of African American or Hispanic residents living in the region, according to the U.S. Census. The second defined a minority serving hospital as one in which more than 25 percent of ICU patients identify as African American or Hispanic. The two definitions yielded similar results.
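The two definitions can be sketched as a small classification rule. This is a simplification based only on the description above; the exact thresholds and comparisons used in the paper are assumptions here.

```python
def is_minority_serving(minority_share, regional_minority_share=None, definition=1):
    """Sketch of the study's two classification rules (details assumed).

    definition 1: the hospital's share of African American or Hispanic patients
    is at least twice the share of those groups among regional residents (Census).
    definition 2: more than 25% of the hospital's ICU patients identify as
    African American or Hispanic.
    """
    if definition == 1:
        return minority_share >= 2 * regional_minority_share
    return minority_share > 0.25

# Hypothetical hospital: 30% minority patients in a region that is 10% minority
# qualifies under definition 1; it also qualifies under definition 2.
```

The paper reports that both rules classify largely the same hospitals, which is why the results were similar under either definition.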

The study found that over a decade:

Nearly a third of critically ill African American patients and half of critically ill Hispanic patients were treated at only 14 of the surveyed hospitals.

ICU deaths declined steadily (about 2 percent each year) at non-minority hospitals, but stopped declining after the first few years at minority hospitals.

Lengths of stay in the ICU and the hospital were longer at minority hospitals than at non-minority hospitals.

African Americans treated at non-minority hospitals experienced a 3 percent decline in mortality each year, compared with no decline when treated at minority hospitals.

To avoid biasing results, the researchers took into account a range of variables, including age, gender, admission diagnosis, severity of illness and co-existing health problems. They found that minority serving hospitals tended to care for younger, but sicker patients.

The authors said that their study could not determine whether the health disparities they observed "reflect caring for an increasingly disadvantaged population" or differences in hospital resources. The researchers did find that patients at minority serving hospitals waited longer to be admitted to the ICU from the emergency room than patients treated at non-minority serving hospitals, suggesting that differences in resources contributed to the study's findings.

Still, said Dr. Danziger, "The observation that large numbers of critically ill minorities are cared for in poorer performing ICUs gives us an important target for focused research efforts and additional resources to help close the health care divide amongst different minorities in the United States."

Credit: 
American Thoracic Society

Psychology program for refugee children improves wellbeing

A positive psychology program created by researchers at Queen Mary University of London focuses on promoting wellbeing in refugee children. It is unusual in that it focuses on promoting positive outcomes, rather than addressing war trauma exposure.

This is the first positive psychology-based intervention to be systematically evaluated for use with refugee children.

The purpose of the intervention, known as Strengths for the Journey, is to build positive psychological resources in young refugees - such as positive emotions, character strengths, optimistic thinking, community and nature connectedness, hope, and mindfulness - in order to promote their wellbeing and resilience.

In a study, published in the journal Development and Psychopathology, the researchers show that the intervention improved the children's positive outcomes (wellbeing, optimism, and self-esteem) and reduced their depressive symptoms.

Specifically, it more than doubled participants' ratings of their wellbeing and optimism and led to dramatic reductions in depressive symptoms.

Dr Sevasti Foka, lead author of the study from the Department of Biological and Experimental Psychology at Queen Mary University of London, said: "The key finding of the study is that the Strengths for the Journey intervention seems to be quite effective. Our results suggest that short, inexpensive positive psychology interventions such as Strengths for the Journey can lead to real improvements in refugee children's mental health and wellbeing, even when those children are experiencing the many challenges of living in a refugee camp."

The program is delivered over a seven-day period and was evaluated in refugee camps in Lesvos, Greece, with 72 children ranging from 7 to 14 years old. The children were predominantly displaced from Syria and Afghanistan.

Over a million refugees have arrived by sea in Greece in the last four years, almost half of whom are under 18 years old*. Upon arrival in Greece, many are placed in camps with limited access to school and mental health services, and report high rates of attempted suicide, panic attacks, anxiety, and aggressive outbursts**.

The intervention is one of very few programs to be developed specifically for use with kids living in refugee camps. Refugee camps are quite a different context from resettlement or settlement in communities and are very challenging places for children to live.

The researchers suggest that the intervention should be expanded to a larger group of refugee children in Greece - and potentially those living in refugee camps elsewhere - because it has real potential to improve their mental health and wellbeing.

Isabelle Mareschal, an author of the study from Queen Mary University of London, said: "It seems like child refugees living in low-resource settings like refugee camps would benefit from Strengths for the Journey or other short positive psychology interventions that promote resilience."

In the study, the researchers ran a pilot evaluation using a wait-list controlled trial design to see whether the intervention improved children's mental health and wellbeing. This involved providing the program to one group of children and then providing it to the next group a little later in order to compare outcomes while still providing treatment for all participants.

The researchers improved on most previous work in three ways: by running a controlled trial, by focusing on children living in refugee camps, and by looking at positive outcomes rather than just psychopathology.

Dr Kristin Hadfield, the corresponding author of the study from Queen Mary University of London, added: "In addition to the intervention being different, our evaluation of it is also different. Very few interventions for use with refugees are rigorously evaluated. NGOs and governments spend a lot of money attempting to improve refugee children's outcomes through various programmes, but many do not actually check whether these are effective. We have compared changes in the kids who took part in Strengths for the Journey with changes in kids living in the camps who did not take part to see whether those who took part in Strengths for the Journey did better."

Credit: 
Queen Mary University of London

Climate may play a bigger role than deforestation in rainforest biodiversity

image: By measuring characteristics like ear, foot, and tail size in species like Euryoryzomys russatus, researchers can quantify functional diversity in large rainforests.

Image: 
Photo courtesy of Ricardo S. Bovendorp

"Save the rainforests" is a snappy slogan, but it doesn't tell the full story of how complicated it is to do just that. Before conservationists can even begin restoring habitats and advocating for laws that protect land from poachers and loggers, scientists need to figure out what's living, what's dying, and which patterns explain why. Tackling these questions--in other words, finding out what drives a region's biodiversity--is no small task.

The better we measure what's in these rainforests, the more likely we are to find patterns that inform conservation efforts. A new study in Biotropica, for instance, crunched numbers on a behemoth dataset on small mammals in South America and found something surprising in the process: that climate may affect biodiversity in rainforests even more than deforestation does.

Noé de la Sancha, a scientist at the Field Museum in Chicago, professor at Chicago State University, and the paper's lead author, stresses that changing how we measure biodiversity can uncover patterns like these.

"When we think about biodiversity, we usually think about the number of species in a particular place--what we call taxonomic diversity," says de la Sancha. "This paper aims to incorporate better measures of biodiversity that include functional and phylogenetic diversity."

Functional diversity looks at biodiversity based on the roles organisms play in their respective ecosystems. Rather than simply counting the species in an area, scientists can use categories--"Do these mammals primarily eat insects, or do they primarily eat seeds?" and "Do they only live on the forest floor, or do they live in trees?" as well as quantitative characters like weight and ear, foot, and tail size, for instance--to determine and quantify how many different ecological roles a habitat can sustain.

Meanwhile, phylogenetic diversity looks at how many branches of the animal family tree are represented in a given area. By this measure, a patch of land consisting almost entirely of closely-related rodents would be considered far less diverse than another that was home to a wide genetic range of rodents, marsupials, and more--even if the two patches of land had the same number of species.
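To illustrate how these three measures can disagree for the same community, here is a toy sketch. All species other than Euryoryzomys russatus are invented, and the metrics are crude stand-ins for the continuous trait-based and branch-length-based measures real studies use.

```python
# A hypothetical four-species small-mammal community, scored three ways.
# Functional diversity here is just the count of distinct (diet, stratum) roles;
# phylogenetic diversity is a crude count of higher lineages represented.

community = {
    # species:              (lineage,     diet,      stratum)
    "Euryoryzomys russatus": ("rodent",    "seeds",   "ground"),
    "rodent_sp2":            ("rodent",    "seeds",   "ground"),
    "rodent_sp3":            ("rodent",    "insects", "ground"),
    "marsupial_sp1":         ("marsupial", "insects", "arboreal"),
}

taxonomic = len(community)                                             # 4 species
functional = len({(diet, stratum) for _, diet, stratum in community.values()})  # 3 roles
phylogenetic = len({lineage for lineage, _, _ in community.values()})  # 2 lineages
```

A community of four closely related, ecologically similar rodents would score the same taxonomically (4) but lower on the other two axes, which is exactly the distinction the paragraphs above draw.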

By applying these approaches to data on all known small mammal species and those species' characteristics, scientists are able to see the forest for the trees, uncovering patterns they would not have found using any single dimension of diversity alone.

This is how de la Sancha and his co-authors found, based on functional and phylogenetic measures, that while deforestation causes local extinctions, climate-related variables had more of an effect on small mammal biodiversity patterns across the entire forest system.

In other words, if a section of rainforest was cut down, some of the animals living there might disappear from that area, while the same species living in intact patches of rainforest could survive. And, the researchers found, even if a species disappears from one area, different species that play a similar role in the ecosystem tend to replace them in other forest patches and other parts of the forest system. Meanwhile, changes to the climate may have big, sweeping effects on a whole rainforest system. This study found that BIO9, a bioclimatic variable measuring mean temperature of the driest quarter--more simply put, how hot the forest is in its least rainy season--affects biodiversity across the whole forest system.

Knowing these climate variables play a role in rainforest health can be concerning. This study and others provide strong evidence of climate change's effects on large ecosystems, underlining the urgency of studying and protecting habitats like the Atlantic Forest, the South American forest system at the center of the study.

"We still have so much that we don't know about so many of these species, which underlines the necessity for more fieldwork," de la Sancha says. "Once we have more specimens, we can improve how we quantify functional diversity and our understanding of why these small mammals evolved the way they did. From there, we can keep better track of biodiversity in these areas, leading to improved models and conservation strategies down the line."

Still, with only 9-16 percent of the Atlantic Forest's original habitat space remaining, this study lends a silver lining to an otherwise grim narrative about the effects of human activity on rainforests.

"I think this gives us a little bit of hope. As long as we have forest--and we need to have forest still--we can maintain biodiversity on a large scale," de la Sancha says. "As long as we don't wipe it all out, there's good evidence to show that we can maintain biodiversity, at least for small mammals, and the ecosystem services these critters provide."

Credit: 
Field Museum

Observational study explores fish oil supplements, testicular function in healthy young men

What The Study Did: An observational study of nearly 1,700 young healthy Danish men looked at how fish oil supplements were associated with testicular function as measured by semen quality and reproductive hormone levels. Limitations of this study include a lack of information on the actual concentration of omega-3 fatty acids in the fish oil supplements self-reported by the men. Researchers suggest randomized clinical trials are needed.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Author: Tina Kold Jensen, Ph.D., University of Southern Denmark, Odense, and coauthors.

(10.1001/jamanetworkopen.2019.19462)

Editor's Note: The article includes funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Study traces evolution of acoustic communication

image: Frogs, like mammals, originated as predominantly nocturnal animals, but maintained the ability to communicate acoustically after switching to being active during the day.

Image: 
Peter Trimming/Creative Commons

Imagine taking a hike through a forest or a stroll through a zoo where not a sound fills the air, other than the occasional chirp from a cricket. No birds singing, no tigers roaring, no monkeys chattering, and no human voices, either. Acoustic communication among vertebrate animals is such a familiar experience that it seems impossible to imagine a world shrouded in silence.

But why did the ability to shout, bark, bellow or moo evolve in the first place? In what is likely the first study to trace the evolution of acoustic communication across terrestrial vertebrates, John J. Wiens of the University of Arizona and Zhuo Chen, a visiting scientist from Henan Normal University in Xinxiang, China, traced the evolution of acoustic communication in terrestrial vertebrates back to 350 million years ago.

The authors assembled an evolutionary tree for 1,800 species showing the evolutionary relationships of mammals, birds, lizards and snakes, turtles, crocodilians, and amphibians going back 350 million years. They obtained data from the scientific literature on the absence and presence of acoustic communication within each sampled species and mapped it onto the tree. Applying statistical analytical tools, they tested whether acoustic communication arose independently in different groups and when; whether it is associated with nocturnal activity; and whether it tends to be preserved in a lineage.

The study, published in the open-access journal Nature Communications, revealed that the common ancestor of land-living vertebrates, or tetrapods, did not have the ability to communicate through vocalization - in other words, using their respiratory system to generate sound as opposed to making noise in other ways, such as clapping hands or banging objects together. Instead, acoustic communication evolved separately in mammals, birds, frogs and crocodilians in the last 100-200 million years, depending on the group. The study also found that the origins of communication by sound are strongly associated with a nocturnal lifestyle.

This makes intuitive sense because once light is no longer available to show off visual cues such as color patterns to intimidate a competitor or attract a mate, transmitting signals by sound becomes an advantage.

Extrapolating from the species in the sample, the authors estimate that acoustic communication is present in more than two-thirds of terrestrial vertebrates. While some of the animal groups readily come to mind for their vocal talents - think birds, frogs and mammals - crocodilians as well as a few turtles and tortoises have the ability to vocalize.

Interestingly, the researchers found that even in lineages that switched over to a diurnal (active by day) lifestyle, the ability to communicate via sound tends to be retained.

"There appears to be an advantage to evolving acoustic communication when you're active at night, but no disadvantage when you switch to being active during the day," Wiens said. "We have examples of acoustic communication being retained in groups of frogs and mammals that have become diurnal, even though both frogs and mammals started out being active by night hundreds of millions of years ago."

According to Wiens, birds kept using acoustic communication even after becoming, for the most part, diurnal. Many birds sing at dawn, as every birdwatcher can attest. Although speculative, it is possible that this "dawn chorus" behavior is a remnant of the nocturnal ancestry of birds.

In addition, the research showed that acoustic communication appears to be a remarkably stable evolutionary trait. In fact, the authors raise the possibility that once a lineage has acquired the ability to communicate by sound, the tendency to retain that ability might be more stable than other types of signaling, such as conspicuous coloration or enlarged, showy structures.

In another unexpected result, the study revealed that the ability to vocalize does not appear to be the driver of diversification - the rate at which a lineage evolves into new species - that it has long been believed to be.

To illustrate this finding, Wiens pointed to birds and crocodilians: Both lineages have acoustic communication and go back roughly 100 million years, but while there are close to 10,000 bird species known, the list of crocodilians doesn't go past 25. And while there are about 10,000 known species of lizards and snakes, most go about their lives without uttering a sound, as opposed to about 6,000 mammalian species, 95% of which vocalize.
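
The bird-versus-crocodilian comparison can be made concrete with a back-of-envelope net diversification rate under a simple pure-birth model, r = ln(N) / t. This is a standard textbook estimator, not the study's actual method, and the figures come straight from the numbers quoted above:

```python
import math

# Net diversification rate under a pure-birth model: r = ln(N) / t,
# where N is the number of living species and t is the clade age in
# millions of years. Back-of-envelope illustration only.
def diversification_rate(n_species, age_myr):
    return math.log(n_species) / age_myr

birds = diversification_rate(10_000, 100)  # ~0.092 new species per Myr
crocs = diversification_rate(25, 100)      # ~0.032 new species per Myr
```

Despite both lineages communicating acoustically for roughly 100 million years, the implied rates differ by almost a factor of three, which is the pattern that vocalization alone cannot explain.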

"If you look at a smaller scale, such as a few million years, and within certain groups like frogs and birds, the idea that acoustic communication drives speciation works out," Wiens said, "but here we look at 350 million years of evolution, and acoustic communication doesn't appear to explain the patterns of species diversity that we see."

The authors point out that their findings likely apply not only to acoustic communication, but also to other evolutionary traits driven by the ecological conditions known to shape the evolution of species. While it had been previously suggested that ecology was important for signal evolution, it was thought to apply mostly to subtle differences among closely related species.

"Here, we show that this idea of ecology shaping signal evolution applies over hundreds of millions of years and to fundamental types of signals, such as being able to communicate acoustically or not," Wiens said.

Credit: 
University of Arizona

Internet use reduces study skills in university students

video: Research conducted at Swansea University and the University of Milan has shown that students who use digital technology excessively are less motivated to engage with their studies, and are more anxious about tests.

Image: 
Roberto Truzoli, Caterina Vigano, Paolo Gabriele Galmozzi, Phil Reed

Research conducted at Swansea University and the University of Milan has shown that students who use digital technology excessively are less motivated to engage with their studies, and are more anxious about tests. This effect was made worse by the increased feelings of loneliness that use of digital technology produced.

Two hundred and eighty-five university students, enrolled on a range of health-related degree courses, participated in the study. They were assessed for their use of digital technology, their study skills and motivation, anxiety, and loneliness. The study found a negative relationship between internet addiction and motivation to study. Students reporting more internet addiction also found it harder to organise their learning productively, and were more anxious about their upcoming tests. The study also found that internet addiction was associated with loneliness, and that this loneliness made study harder.
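
The "negative relationship" reported here is the kind of result a Pearson correlation captures. The sketch below computes one on invented questionnaire scores; it illustrates the statistic, not the study's actual data or analysis pipeline:

```python
from statistics import mean

# Pearson correlation coefficient between two score lists.
def pearson_r(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    varx = sum((a - mx) ** 2 for a in x)
    vary = sum((b - my) ** 2 for b in y)
    return cov / (varx * vary) ** 0.5

addiction = [10, 25, 40, 55, 70]   # hypothetical internet-addiction scores
motivation = [80, 70, 55, 45, 30]  # hypothetical study-motivation scores
r = pearson_r(addiction, motivation)  # strongly negative
```

A value of r near -1 indicates that higher addiction scores go with lower motivation scores, the direction of the relationship the study found.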

Professor Phil Reed of Swansea University said: "These results suggest that students with high levels of internet addiction may be particularly at risk from lower motivations to study, and, hence, lower actual academic performance."

About 25% of the students reported that they spent over four hours a day online, with the rest indicating that they spent between one and three hours a day. The main uses of the internet for the student sample were social networking (40%) and information seeking (30%).

Professor Truzoli of Milan University said: "Internet addiction has been shown to impair a range of abilities such as impulse control, planning, and sensitivity to rewards. A lack of ability in these areas could well make study harder."

In addition to the links between levels of internet addiction and poor study motivation and ability, internet addiction was found to be associated with increased loneliness. The results indicated that loneliness, in turn, made studying harder for the students.

The study suggests that loneliness plays a large role in positive feelings about academic life in higher education. The poorer social interactions that are known to be associated with internet addiction make loneliness worse, and, in turn, impact on motivation to engage in a highly social educational environment such as a university.

Professor Reed added: "Before we continue down a route of increasing digitisation of our academic environments, we have to pause to consider if this is actually going to bring about the results we want. This strategy might offer some opportunities, but it also contains risks that have not yet been fully assessed."

Credit: 
Swansea University

Rich rewards: Scientists reveal ADHD medication's effect on the brain

image: The fMRI machine located in the IDOR imaging suite. fMRI is used to indirectly detect increased neuronal activity in an area of the brain by measuring higher levels of oxygenated to deoxygenated blood.

Image: 
Andressa Dias Lemos, IDOR

Attention-deficit hyperactivity disorder (ADHD) is a neurobiological disorder characterized by symptoms of hyperactivity, inattention and impulsivity. People with the condition are often prescribed a stimulant drug called methylphenidate, which treats these symptoms. However, scientists do not fully understand how the drug works.

Now, researchers at the Okinawa Institute of Science and Technology Graduate University (OIST) have identified how certain areas of the human brain respond to methylphenidate. The work may help researchers understand the precise mechanism of the drug and ultimately develop more targeted medicines for the condition.

Previous research suggests that people with ADHD have different brain responses when anticipating and receiving rewards, compared to individuals without ADHD. Scientists at OIST have proposed that in those with ADHD, neurons in the brain release less dopamine - a 'feel-good' neurotransmitter involved in reward-motivated behavior - when a reward is expected, with dopamine neurons firing more when a reward is given.

"In practice, what this means is that children, or even young adults, with ADHD may have difficulty engaging in behavior that doesn't result in an immediate positive outcome. For example, children may struggle to focus on schoolwork, as it may not be rewarding at the time, even though it could ultimately lead to better grades. Instead, they get distracted by external stimuli that are novel and interesting, such as a classmate talking or traffic noises," said Dr Emi Furukawa, first author of the study and a researcher in the OIST Human Developmental Neurobiology Unit, led by Professor Gail Tripp.

Scientists believe that methylphenidate helps people with ADHD maintain focus by influencing dopamine availability in the brain. Therefore, Dr Furukawa and her colleagues set out to examine how the drug affects a brain region called the ventral striatum, which is a vital component of the reward system and where dopamine is predominantly released.

"We wanted to take a look at how methylphenidate affects the ventral striatum's responses to reward cues and delivery," said Furukawa.

The study, which was recently published in the journal Neuropharmacology, was jointly conducted with scientists at D'Or Institute for Research and Education (IDOR) in Rio de Janeiro, Brazil. The collaboration allowed the researchers to combine expertise across multiple disciplines and provided access to IDOR's functional magnetic resonance imaging (fMRI) facility.

Delving into the brain

The researchers used fMRI to measure brain activity in young adults with and without ADHD as they played a computer game that simulated a slot machine. The researchers scanned individuals in the ADHD group on two separate occasions - once when they took methylphenidate and another time when they took a placebo pill. Each time the reels of the slot machine spun, the computer also showed one of two cues, either the Japanese character み (mi) or そ (so). While familiarizing themselves with the game before being scanned, the participants quickly learned that when the slot machine showed み, they often won money, but when the slot machine showed そ, they didn't. The symbol み therefore acted as a reward-predicting cue, whereas そ acted as a non-reward-predicting cue.

The researchers found that when individuals with ADHD took the placebo, neuronal activity in the ventral striatum was similar in response to both the reward-predicting and the non-reward-predicting cue. However, when they took methylphenidate, activity in the ventral striatum increased only in response to the reward cue, showing that they were now able to more easily discriminate between the two cues.
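
The cue-discrimination result can be pictured as a simple contrast: the mean striatal response to the reward cue minus the response to the non-reward cue. All numbers below are invented for illustration; this is a conceptual sketch, not the study's fMRI pipeline:

```python
from statistics import mean

# Cue-discrimination contrast: mean response to reward-predicting trials
# minus mean response to non-reward-predicting trials.
def cue_contrast(reward_trials, nonreward_trials):
    return mean(reward_trials) - mean(nonreward_trials)

# Hypothetical % signal change per trial
placebo = cue_contrast([0.5, 0.6, 0.4], [0.5, 0.4, 0.6])    # near zero
medicated = cue_contrast([0.9, 1.0, 1.1], [0.2, 0.3, 0.1])  # positive
```

A contrast near zero means the two cues evoke indistinguishable responses (the placebo pattern), while a clearly positive contrast means the reward cue stands out (the methylphenidate pattern).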

The researchers also explored how neuronal activity in the striatum correlated with neuronal activity in the medial prefrontal cortex - a brain region involved in decision-making that receives information from the outside world and communicates with many parts of the brain, including the striatum.

When the individuals with ADHD took the placebo instead of methylphenidate, neuronal activity in the striatum correlated strongly with activity in the prefrontal cortex at the exact moment the reward was delivered and the participants received money from the slot machine game. The researchers therefore believe that in people with ADHD, the striatum and the prefrontal cortex communicate more actively, which may underlie their increased sensitivity to rewarding external stimuli. In participants who took methylphenidate, this correlation was low, as it was in people without ADHD.

The results implicate a second neurotransmitter, norepinephrine, in the therapeutic effects of methylphenidate. Norepinephrine is released by a subset of neurons common in the prefrontal cortex. Researchers speculate that methylphenidate might boost levels of norepinephrine in the prefrontal cortex, which in turn regulates dopamine firing in the striatum when rewards are delivered.

"It's becoming clear to us that the mechanism by which methylphenidate modulates the reward response is very complex," said Furukawa.

Tailoring New Therapies for ADHD

Despite the complexity, the scientists believe that further research could elucidate methylphenidate's mechanism of action, which could benefit millions of people worldwide.

Pinning down how methylphenidate works may help scientists develop better therapies for ADHD, said Furukawa. "Methylphenidate is effective but has some side effects, so some people are hesitant to take the medication or give it to their children," she explained. "If we can understand what part of the mechanism results in therapeutic effects, we could potentially develop drugs that are more targeted."

Furukawa also hopes that understanding how methylphenidate impacts the brain could help with behavioral interventions. For example, by keeping in mind the difference in brain responses when children with ADHD anticipate and receive rewards, parents and teachers could instead help children with ADHD stay focused by praising them frequently and reducing the amount of distracting stimuli in the environment.

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University

Rethinking interactions with mental health patients

New research overturns the belief that people with severe mental illness are incapable of effective communication with their psychiatrist, showing that they are able to work together with their psychiatrist to achieve better outcomes for themselves.

"Interviews are a critical part of assessing people suffering from thought disorder (TD), and deciding what the best therapy is for them," says Professor Cherrie Galletly from the Adelaide Medical School, University of Adelaide.

"Clinical interactions with people suffering with severe mental illness can be challenging, especially if the patient has disordered communication."

Published in the journal Australian Psychiatry, the study analysed 24 routine clinical interviews between psychiatrists and inpatients suffering from TD, with a mean age of just under 30 years.

"The study, the first of its kind, examined the expertise with which psychiatrists conducted clinical interviews of people suffering from TD, and the shared goals that were accomplished," says Professor Galletly.

"When interviewing people with TD psychiatrists need to adopt a mindset that the information the patient provides in that particular moment is, for them, meaningful, truthful, relevant and clear.

"They have to piece together snippets of information in order to create and interpret meaning and build respectful relationships by inviting patients to share their perspectives no matter how disordered or delusional their responses appear."

Thought disorder is common in psychotic disorders. The thoughts and conversation of people suffering from TD appear illogical and lacking in sequence and may be delusional or bizarre in content.

In 2010, 0.3% of Australians aged 18-64 years had a psychotic illness, with men aged 25-34 experiencing the highest rate (0.5%).

"Patients are positioned as active participants by psychiatrists who adopt a non-confrontational, non-judgemental approach, convey support and safety, and ask open-ended questions, which allows the patient to engage, feel listened to, and work with the psychiatrist towards a shared understanding," says Professor Galletly.

"Findings from this study of sample interviews between psychiatrists and their patients highlight the need to rethink the notion that patients experiencing TD are incapable of communicating productively with the people trying to help them.

"Psychiatrists use transactional, relational and interactional techniques when they are talking to patients with thought disorder, which go beyond techniques normally employed in clinical interviews.

"Experienced psychiatrists undertake meaningful interviews with these patients, who in turn respond in ways that belie the notion that effective communication is not possible.

"The findings from this research can be used to develop training resources for clinicians who work with people with psychotic disorders."

Credit: 
University of Adelaide

Molecules move faster on a rough terrain

image: 3D rendering of polymer chains near the asperities of a rough substrate. Faster molecules are shown in warmer colors.

Image: 
© ULB

Roughness, the presence of irregularities on a surface, is commonly associated with slower motion and stickiness. This is true at different length scales: at human size (1 meter), it takes longer to walk along a path that goes up and down than along a flat road. At the scale of smaller objects (1/100 - 1/1000 meter), Italians use pasta shapes with a rough surface, e.g. rigatoni, because they hold tomato sauce and cheese better. Until now, however, no experiment had been able to test whether the behavior of molecules follows the same trend observed at the human scale.

Now, writing in Physical Review Letters, Cristian Rodriguez-Tinoco and a team of the Université libre de Bruxelles (ULB) Faculty of Sciences led by Simone Napolitano show that large molecules actually move faster in the proximity of rougher surfaces at the nanometric scale. Their experiments clearly demonstrate that the common belief that surface irregularities help molecules stick to a surface is wrong. When the scale of the surface roughness - the average distance between the tiny hills and valleys on the surface of a material - is reduced to a few nanometers (1 nm = one billionth of a meter), molecules of P4ClS, a type of polymer, start to move faster.

Detecting molecular motion is not easy: molecules move fast (up to 1 million steps per second or more) and their displacements are too small to be observed with microscopes. Performing such experiments on a rough surface is even more complicated because of its uneven character and the difficulty of adjusting the size and distribution of the surface irregularities. The ULB team was able to form rough surfaces of aluminum by evaporating the metal in a controlled way. To measure how fast molecules move, the researchers applied weak electric fields and recorded how quickly the molecules responded to the stimulus.

Surprisingly, the team noticed that molecules near a rough substrate behave as if they were surrounded by fewer neighbors, which explains why they speed up instead of slowing down. This trend is in stark disagreement with the predictions of computer simulations, which suggested that molecules move slower near a rough wall. Contrary to what is assumed in simulations, polymer molecules do not like sitting near a rough substrate. Because of the way these molecules arrange themselves in space, they prefer to move away from the asperities. The few molecules that remain near the asperities form fewer contacts with the wall, enjoy more free volume and, consequently, move faster.

By sharing their results with a group of theoreticians at Dartmouth College (USA) led by Jane Lipson, the ULB team has been able to find a strong link between the way hills and valleys are organized on a rough surface and how molecules move. The theoreticians have shown that a very small change in the free volume around a molecule induces a tremendous boost in mobility, and the predictions of their calculations are in excellent agreement with the experiments.
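
The exponential sensitivity of mobility to free volume is often illustrated with the textbook Doolittle relation, where the relaxation time depends on free volume f as tau = A * exp(B / f). The sketch below uses arbitrary parameter values and is only meant to show how a small free-volume change produces a large speed-up, as the theoreticians describe:

```python
import math

# Doolittle relation: relaxation time tau = A * exp(B / f), where f is
# the fractional free volume. A and B are arbitrary illustrative values.
def relaxation_time(f, A=1e-12, B=1.0):
    return A * math.exp(B / f)

slow = relaxation_time(0.025)  # molecules with less free volume
fast = relaxation_time(0.028)  # ~12% more free volume
speedup = slow / fast          # over an order of magnitude faster
```

Even a 12% increase in free volume shortens the relaxation time by well over a factor of 50 in this toy calculation, which is the kind of "tremendous boost" the paragraph above refers to.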

This paper shows that the current way we think about interfaces is not valid. The newly observed molecular trend thus has a huge impact at the level of fundamental science, and the work of the ULB team could be exploited in a large number of applications. For almost a decade, several research groups have shown that the properties of many thin coatings - such as flow, the ability to retain or repel water, and the speed at which crystals form - depend on the number of contacts between the film and its supporting substrate. Until now, modifying this number required changing the type of molecules at the interface, often involving complex chemical reactions. The new findings show that it is possible to tailor the performance of nanomaterials by simply changing the roughness of the surface. This method hence allows controlling the polymer layer without touching it, as if by remote control!

Credit: 
Université libre de Bruxelles

Artificial intelligence to improve resolution of brain magnetic resonance imaging

video: Researchers of the ICAI Group -- Computational Intelligence and Image Analysis -- of the University of Malaga (UMA) have designed an unprecedented method that is capable of improving brain images obtained through magnetic resonance imaging using artificial intelligence.

Image: 
University of Malaga

Researchers of the ICAI Group -Computational Intelligence and Image Analysis- of the University of Malaga (UMA) have designed an unprecedented method that is capable of improving brain images obtained through magnetic resonance imaging using artificial intelligence.

This new model manages to increase image quality from low resolution to high resolution without distorting the patients' brain structures, using a deep learning artificial neural network - a model based on the functioning of the human brain - that "learns" this process.
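
The task the UMA network learns is mapping a low-resolution image to a higher-resolution one. A real deep model learns this mapping from data; as a minimal, model-free sketch of the task itself, here is a plain bilinear upscaler in NumPy, the classical baseline that learned super-resolution aims to beat (illustrative only, not the UMA method):

```python
import numpy as np

# Bilinear upscaling: the classical interpolation baseline that learned
# super-resolution models are compared against.
def bilinear_upscale(img, factor):
    h, w = img.shape
    H, W = h * factor, w * factor
    ys = np.linspace(0, h - 1, H)          # target-row positions in source
    xs = np.linspace(0, w - 1, W)          # target-column positions
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

low = np.array([[0.0, 1.0],
                [1.0, 0.0]])               # tiny "low-resolution" image
high = bilinear_upscale(low, 2)            # 4x4 upscaled image
```

Interpolation like this blurs fine structure; a trained network instead learns, from pairs of low- and high-resolution scans, to restore plausible detail without distorting anatomy.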

"Deep learning is based on very large neural networks, and so is its capacity to learn, reaching the complexity and abstraction of a brain," explains researcher Karl Thurnhofer, main author of this study, who adds that, thanks to this technique, the identification task can be performed on its own, without supervision - an identification effort that the human eye would not be capable of.

Published in the scientific journal Neurocomputing, this study represents a scientific breakthrough, since the algorithm developed by the UMA yields more accurate results in less time, with clear benefits for patients. "So far, the acquisition of quality brain images has depended on the time the patient remained immobilized in the scanner; with our method, image processing is carried out later on the computer", explains Thurnhofer.

According to the experts, the results will enable specialists to identify brain-related pathologies, like physical injuries, cancer or language disorders, among others, with increased accuracy and definition, because image details are finer, thus avoiding the need for additional tests when diagnoses are uncertain.

Nowadays, the ICAI Group of the UMA, led by Professor Ezequiel López, co-author of this study, is a benchmark for neurocomputing, computational learning and artificial intelligence. The Professors of the Department of Computer Science and Programming Languages Enrique Domínguez and Rafael Luque, as well as researcher Núria Roé-Vellvé, have also participated in this study.

Credit: 
University of Malaga

Human-caused biodiversity decline started millions of years ago

image: Leopard, by Hans Ring, Naturfotograferna.

Image: 
Hans Ring, Naturfotograferna

The human-caused biodiversity decline started much earlier than researchers used to believe. According to a new study published in the scientific journal Ecology Letters, the process was not started by our own species but by some of our ancestors.

The work was done by an international team of scientists from Sweden, Switzerland and the United Kingdom.

The researchers point out in the study that the ongoing biological diversity crisis is not a new phenomenon, but represents an acceleration of a process that human ancestors began millions of years ago.

"The extinctions that we see in the fossils are often explained as the results of climatic changes, but the changes in Africa within the last few million years were relatively minor, and our analyses show that climatic changes were not the main cause of the observed extinctions," explains Søren Faurby, researcher at Gothenburg University and the main author of the study.

"Our analyses show that the best explanation for the extinction of carnivores in East Africa is instead direct competition for food with our extinct ancestors," adds Daniele Silvestro, computational biologist and co-author of the study.

Carnivores disappeared

Our ancestors have been common throughout eastern Africa for several million years, and during this time there were multiple extinctions, according to Lars Werdelin, co-author and expert on African fossils.

"By investigating the African fossils, we can see a drastic reduction in the number of large carnivores, a decrease that started about 4 million years ago. At about the same time, our ancestors may have started using a new way of getting food, called kleptoparasitism," he explains.

Kleptoparasitism means stealing recently killed animals from other predators. For example, when a lion steals a dead antelope from a cheetah.

The researchers are now proposing, based on fossil evidence, that human ancestors stole recently killed animals from other predators. This would lead to starvation of the individual animals and over time to extinction of their entire species.

"This may be the reason why most large carnivores in Africa have developed strategies to defend their prey - for example, by carrying the prey up a tree, as we see leopards doing. Other carnivores have instead evolved social behavior, as we see in lions, who among other things work together to defend their prey," explains Søren Faurby.

Humans today affect the world and the species that live in it more than ever before.

"But this does not mean that we previously lived in harmony with nature. Monopolization of resources is a skill we and our ancestors have had for millions of years, but only now are we able to understand and change our behavior and strive for a sustainable future. 'If you are very strong, you must also be very kind,'" concludes Søren Faurby, quoting Astrid Lindgren's books about Pippi Longstocking.

Credit: 
University of Gothenburg

The core of massive dying galaxies already formed 1.5 billion years after the Big Bang

image: The red galaxy at the center is a dying galaxy at 12 billion years ago. Astronomers measured the motion of stars in the galaxy and found that the core of the galaxy is nearly fully formed.

Image: 
NAOJ/M. Tanaka

The most distant dying galaxy discovered so far, more massive than our Milky Way - with more than a trillion stars - has revealed that the 'cores' of these systems had already formed 1.5 billion years after the Big Bang, about 1 billion years earlier than previous measurements indicated. The discovery will add to our knowledge of the formation of the Universe more generally, and may prompt a revision of the computer models astronomers use as one of their most fundamental tools. The result, obtained in close collaboration with Masayuki Tanaka and his colleagues at the National Observatory of Japan, is now published in two works in the Astrophysical Journal Letters and the Astrophysical Journal.

What is a "dead" galaxy?

Galaxies are broadly categorized as dead or alive: dead galaxies are no longer forming stars, while alive galaxies are still bright with star formation activity. A 'quenching' galaxy is a galaxy in the process of dying -- meaning its star formation is significantly suppressed. Quenching galaxies are not as bright as fully alive galaxies, but they are not as dark as dead galaxies. Researchers use this spectrum of brightness as the first line of identification when observing galaxies in the Universe.

The farthest dying galaxy discovered so far reveals remarkable maturity

A team of researchers of the Cosmic Dawn Center at the Niels Bohr Institute and the National Observatory of Japan recently discovered a massive galaxy that was already dying 1.5 billion years after the Big Bang, the most distant of its kind. "Moreover, we found that its core seems already fully formed at that time," says Masayuki Tanaka, the author of the letter. "This result pairs up with the fact that, when these dying gigantic systems were still alive and forming stars, they might not have been that extreme compared with the average population of galaxies," adds Francesco Valentino, assistant professor at the Cosmic Dawn Center at the Niels Bohr Institute and author of an article on the past history of dead galaxies that appeared in the Astrophysical Journal.

Why do galaxies die? - One of the biggest and still unanswered questions in astrophysics

"The suppressed star formation tells us that a galaxy is dying, sadly, but that is exactly the kind of galaxy we want to study in detail to understand why it dies," continues Valentino. One of the biggest questions that astrophysics still has not answered is how a galaxy goes from being star-forming to being dead. For instance, the Milky Way is still alive and slowly forming new stars, but not too far away (in astronomical terms), the central galaxy of the Virgo cluster - M87 - is dead and completely different. Why is that? "It might have to do with the presence of a gigantic and active black hole at the center of galaxies like M87," Valentino says.

Earth based telescopes find extremes - but astronomers look for normality

One of the problems in observing galaxies in this much detail is that the telescopes available now on Earth are generally able to find only the most extreme systems. However, the key to describe the history of the Universe is held by the vastly more numerous population of normal objects. "Since we are trying hard to discover this normality, the current observational limitations are an obstacle that has to be overcome."

The James Webb Telescope (JWST) represents hope for better data material in the near future

The new James Webb Space Telescope, scheduled for launch in 2021, will be able to provide astronomers with data at a level of detail that should make it possible to map exactly this "normality". The methods developed in close collaboration between the Japanese team and the team at the Niels Bohr Institute have already proven successful, given the recent result. "This is significant, because it will enable us to look for the most promising galaxies from the start, when JWST gives us access to much higher quality data," Francesco Valentino explains.

Combining observations with the tool - the computer models of the Universe

What has been found observationally is not too far away from what the most recent models predict. "Until very recently, we did not have many observations to compare with the models. However, the situation is in rapid evolution, and with JWST we will have valuable larger samples of 'normal' galaxies in a few years. The more galaxies we can study, the better we are able to understand the properties or situations leading to a certain state - if the galaxy is alive, quenching or dead. It is basically a question of writing the history of the Universe correctly, and in greater and greater detail. At the same time, we are tuning the computer models to take our observations into account, which will be a huge improvement, not just for our branch of work, but for astronomy in general," Francesco Valentino explains.

Credit: 
University of Copenhagen