Culture

Haloperidol does not prevent delirium or improve survival rates in ICU patients

Prophylactic use of the drug haloperidol does not help to prevent delirium in intensive care patients or improve their chances of survival. There is therefore no longer any reason to administer the drug as a preventive measure to reduce the burden of delirium. This was revealed by a three-year, large-scale study among 1,800 patients in 20 Dutch ICUs, headed by Radboud university medical center. The results of the world's largest research project into delirium prevention in the ICU were published on February 20 in the Journal of the American Medical Association (JAMA).

Acute confusion, or delirium, occurs in approximately one third to half of all patients in the intensive care unit (ICU), and has serious short-term and long-term consequences. Patients who develop delirium need mechanical ventilation for a longer time, and their stay in the ICU and in the hospital is also longer. Patients with delirium are also more likely to die than patients without delirium. If a patient develops delirium, the drug haloperidol is often used to treat it.

Large-scale research

There were indications that haloperidol could be effective not only to treat, but also to prevent delirium. A large-scale trial, headed by Mark van den Boogaard from the Radboud university medical center, was conducted in 20 Dutch ICUs to investigate if prophylactic use of haloperidol could reduce delirium and its consequences. A total of 1,800 ICU patients with a high risk of delirium were included in this trial and received a low dose of haloperidol, or a placebo. This trial, funded by ZonMw (the Netherlands Organisation for Health Research and Development), is worldwide the largest trial in this field.

Mortality

As mortality rates among patients with delirium are higher, the researchers investigated whether prophylactic haloperidol would reduce mortality, as well as delirium and its sequelae.

No difference

The conclusions of this trial were crystal clear: prophylactic therapy with haloperidol did not affect any of the endpoints being studied. Principal investigator Mark van den Boogaard: "This large-scale study shows indisputably that use of prophylactic haloperidol in ICU patients has no beneficial effects whatsoever. These findings will lead to fewer unnecessary drugs being prescribed to ICU patients."

Head of the research, Professor Peter Pickkers: "The scope of the study and the fact that the results are so unambiguous make the message from our research abundantly clear: there is absolutely no point in administering haloperidol to ICU patients as a preventive measure."

Credit: 
Radboud University Medical Center

Perceptions of God make Democrats more conservative, Republicans more liberal in some ways

Republicans who believe that God is highly engaged with humanity are like Democrats -- more liberal -- when it comes to social and economic justice issues, according to a Baylor University study.

"Partisanship explains only so much. Images of God reveal deep moral perspectives that affect the ways in which Americans understand justice, so much so that they can blur the lines of partisan politics," said researcher Robert Thomson, Ph.D., Postdoctoral Research Fellow at Rice University.

The study -- "God, Party and the Poor: How Politics and Religion Interact to Affect Economic Justice Attitudes" -- is published in the journal Sociological Forum.

In previous research, Thomson and co-author Paul Froese, Ph.D., Baylor professor of sociology, found Republicans and Democrats who believe God is highly judgmental tend to agree about issues of retributive justice, such as capital punishment.

"Liberals with a 'strict father' image of God are more inclined to support harsher criminal punishments and military solutions to foreign conflicts because they adhere to a theology of retribution and just deserts," Froese said. "It appears that Americans who see God as wrathful are quicker to support policies which seek an eye-for-an-eye outcome."

In the new study, Froese and Thomson found that Republicans who view God as actively involved in the world tend to support more generous welfare policies, in opposition to their party's platform.

"Conservatives who feel close to God tend to go to church more and volunteer more, but are also more likely to want help from the government to take care of the poor," Froese said. "Republicans with a distant God tend to be less compassionate."

Froese and Thomson used data from the 2007 wave of the Baylor Religion Survey, a national cross-sectional survey developed by Baylor Institute for Studies of Religion and administered by the Gallup Organization. The sample size was 1,588 respondents, excluding atheists because they did not have an image of God to compare with those of other respondents. The group's makeup included 41 percent Republicans, 37 percent Democrats and 22 percent Independent.

Respondents were asked:

Whether the federal government should (1) distribute wealth more evenly and (2) improve the standard of living for ethnic minorities, with responses on each ranging from "strongly agree" to "strongly disagree," as well as "undecided." The research found that 50.3 percent affirmed distributing wealth more evenly, while 49.6 percent affirmed improving the standard of living for ethnic minorities.

How important it is to (1) actively seek social and economic justice and (2) take care of the sick and needy if one wishes to be a good person, selecting from answers ranging from "not important" to "very important." Results showed that 39.1 percent affirmed "seek justice," while 62 percent affirmed care for the sick and needy.

What traits God possesses, with the options being: distant, ever present, removed from the world, concerned with the world's well-being, concerned with personal well-being, directly involved in worldly affairs and directly involved in personal affairs.

Additionally, respondents were asked to rate how religious they were on a four-point scale, and how frequently they attended religious services, with answers ranging from zero ("Never") to 8 ("Several times a week").

Researchers noted that typically, Republicans are consistently and distinctly more conservative on both issues of social justice and retributive justice than Democrats. Put simply, conservatism predicts negative views towards social justice, specifically (1) distributing wealth more evenly, (2) improving the standard of living for ethnic minorities, (3) seeking social and economic justice and (4) taking care of the sick and needy.
Conservatism also predicts positive attitudes towards retributive justice, specifically, (1) keeping the death penalty, (2) expanding authority to fight terrorism, (3) punishing criminals more harshly and (4) affirming the importance of serving in the military.

While the GOP opposes efforts to distribute wealth more evenly through taxation and welfare programs, some Republicans feel a personal obligation to assist in nongovernmental ways, researchers said. Because Republicans are more likely than Democrats to be active Christians who affiliate with a church, they are more likely to donate time and money to charity than more secular Americans.

"Republicans with a deeply engaged God are consistently liberal on issues of social justice," Froese said. "And Democrats with a highly judgmental God are consistently conservative on issues of retributive justice."

Credit: 
Baylor University

Rediscovered Andy Warhol interview explores pop art and queerness

A new paper in the Oxford Art Journal examines the significance of a newly discovered recording of Andy Warhol's famous 1963 interview with Gene Swenson, published in ARTnews under the heading "What is Pop Art?" The printed interview omitted a large part of the recording, which actually starts with the question "What do you say about homosexuals?" Warhol's early and explicit on-the-record statements about Pop's relationship to homosexuality were suppressed from publication.

Author Jennifer Sichel discovered the original cassette recording of the interview, which contains discussion of Warhol's views on homosexuality that was removed from the final printed version. Her paper asks how the editorial decision to redact sections from the printed version affected the subsequent reception of both Swenson's and Warhol's work.

In 1963, as part of an ARTnews series titled "What is Pop Art? Answers from 8 painters," the art critic Gene Swenson conducted a defining interview with Andy Warhol. It was in this interview that Warhol first declared "I think everybody should be a machine" and "I think everybody should like everybody"- utterances that have, over the years, sustained many of the most rigorous arguments about Pop, Postmodernism, and Warhol's practice.

Despite the fact that the printed interview was taken to define Warhol's world view, the discovery of the original cassette recording shows that the final printed version was a heavily edited rendering of the original conversation. Most strikingly, Swenson begins the interview by asking Warhol, "What do you say about homosexuals?" - a theme that runs through much of the interview. However, this question, along with every subsequent reference to homosexuality, was expunged from the published interview.

Although it's unclear why the material was removed, ample evidence survives documenting the many protracted battles Swenson waged against publishers, curators, and institutions over their willingness to suppress disruptive social, political, and queer content during the sixties.

This new evidence both supports many of the important arguments scholars have advanced over the past two decades to establish and understand Warhol's queerness, and restores the significance of Swenson's battle to discuss disruptive social, political, and queer content despite its suppression in 1963.

"During a dissertation research trip in March 2016, I found the unknown tape-recording of Andy Warhol's defining early interview with Gene Swenson," said Sichel. "Warhol's famous statements 'everybody should be a machine' and 'everybody should like everybody' were uttered in direct response to Swenson's probing questions about homosexuality. Elevating Swenson alongside Warhol, I argue for the importance of both of their divergent queer practices, and in particular, of Swenson's loud, angry demand to show up and protest even when resistance feels futile. This opens a new approach to the study of Pop Art focused on archival evidence of what actually happened on the ground, and on the queer practices that grew up in response to the movement's dominant trends."

Credit: 
Oxford University Press USA

There may be a better way to reduce hospital readmission rates

A recent study published in Health Education Research suggests that lay-health workers may be able to significantly reduce readmissions rates to hospitals for high risk patients following surgery.

A lay-health worker is someone who has received some training to promote health or to carry out some health-care services and who acts as a link between formal health services and patients, especially those at high risk. In the United States, lay-health worker programs first emerged as part of the Great Society domestic policies in the 1960s.

Lay-health workers, who perform specific, delineated tasks, like assisting during medical appointments and providing access to transportation, can be deployed much faster than more highly trained health professionals. Often, they can improve patient experiences through culturally sensitive, community-based health services. They also serve as a resource for patients attempting to obtain health education or navigate the healthcare system.

Patients are uncertain and vulnerable when discharged after a long hospitalization. These patients shift from being dependent and complacent while hospitalized to having significant responsibilities, which can potentially affect their risks for readmission. Roughly 20% of all Medicare fee-for-service patients are readmitted within 30 days of hospital discharge, costing the healthcare system an estimated $17 billion annually. The majority of these readmissions are avoidable.

The aim of this study was to reduce 30-day hospital readmission rates in a community hospital in Kentucky using lay-health workers to assess and assist hospitalized high-risk patients. The study was conducted in St. Claire Regional Medical Center in Morehead, KY. Hospitalized patients (men and women over 18 years old of any racial/ethnic group and admitting diagnosis) at high-risk of a 30-day readmission to the hospital were targeted for the study. This group was identified as high-risk given their medical history and health problems.

This study was designed to assess the implementation of a lay-health model for assisting high-risk patients with their post-discharge social needs. Outcome measures included 30-day hospital readmissions rates during a 4-month baseline period compared with a 6-month post-implementation period.

The study involved assessing enrolled patients and developing a personalized social needs plan for each (e.g. transportation and community resource identification), with post-discharge follow-up calls. There was a 47.7% relative reduction in 30-day hospital readmission rates during the period studied. Simple regression analyses demonstrated a 56% decrease in the odds of being readmitted within 30 days. After adjusting for education, transportation cost and anxiety symptoms, the decrease in odds among those exposed to the lay-health program was 77%.
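As a back-of-the-envelope illustration of how an odds ratio maps onto the "percent decrease in odds" figures quoted above, here is a short sketch. The readmission rates below are hypothetical, chosen only to reproduce a roughly 56% drop; they are not the study's data.

```python
def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

def pct_odds_decrease(p_control, p_treated):
    """Percent decrease in odds for the treated group relative to control."""
    odds_ratio = odds(p_treated) / odds(p_control)
    return (1 - odds_ratio) * 100

# Hypothetical 30-day readmission rates (NOT the study's raw data):
baseline_rate = 0.20   # roughly the Medicare figure cited earlier
program_rate = 0.099   # chosen so the unadjusted odds drop by about 56%

print(round(pct_odds_decrease(baseline_rate, program_rate), 1))
```

Note that a percent decrease in odds is not the same as a percent decrease in the readmission rate itself; the two diverge as the baseline rate grows.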

"We have the potential of impacting one's overall health if we can assist with those social determinants, such as paying bills and having access to fresh food, much more so than what we can do through traditional medicine that occurs in clinics and hospitals," said the paper's lead researcher, Roberto Cardarelli. "Our dilemma is that our healthcare system does not pay for such services and we continue to see marginalized populations keep coming back to hospitals in an acute crisis."

Credit: 
Oxford University Press USA

Early results from clinical trials not all they're cracked up to be, shows new research

ROCHESTER, Minn. -- When people are suffering from a chronic medical condition, they may place their hope on treatments in clinical trials that show early positive results. However, these results may be grossly exaggerated in more than 1 in 3 early clinical trials, reports a new study led by Mayo Clinic and published today in Mayo Clinic Proceedings.

"This phenomenon of exaggerated early results was present in a whopping 37 percent of the studies we reviewed," says Fares Alahdab, M.D., lead author of the study and a research fellow in Mayo Clinic's Evidence-Based Practice Center. "Physicians and patients should be cautious about new or early clinical trial evidence. Exaggerated results could lead to false hope as well as possibly harmful effects."

Dr. Alahdab and the team of researchers led by the center's director, M. Hassan Murad, M.D., reviewed thousands of research articles from the top 10 general medical journals, as ranked by impact factor. (Impact factors are an internationally accepted ranking system for scientific journals.) They collected 70 meta-analysis articles published during an 8½-year period (Jan. 1, 2007, to June 23, 2015) that included the results of 930 clinical trials.

They discovered that the first or second studies (in terms of publication year) reported an effect that was 2.67 times larger than what eventually was shown when subsequent trials were published. The team focused its research on clinical trials evaluating drugs or devices for use in treating chronic conditions -- those conditions that generally persist over time, including cancer, stroke, heart disease, diabetes and kidney disease.
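The 2.67-fold figure is simply a ratio of effect sizes. A toy illustration of the comparison (the effect estimates below are made up for the example, not taken from the study):

```python
# Hypothetical effect sizes (e.g. relative risk reductions) for one treatment.
early_trial_effect = 0.80                       # reported by the first trial
subsequent_effects = [0.35, 0.25, 0.30, 0.30]   # later, typically larger trials

# Naive (unweighted) pooling of the later trials; a real meta-analysis
# would weight each trial, e.g. by inverse variance.
pooled_later = sum(subsequent_effects) / len(subsequent_effects)

exaggeration_ratio = early_trial_effect / pooled_later
print(round(exaggeration_ratio, 2))
```

With these illustrative numbers the earliest trial overstates the eventual pooled effect by the same order of magnitude the study reports.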

"Often, patients are living with more than one chronic condition, and they and their doctors watch for research about new treatments," says Dr. Murad. "They need to be aware that the effect seen in earlier trials may not bear out over time and may be much more modest."

"Some people may think this is an anti-innovation message," Dr. Murad says. "To the contrary, we welcome new treatments. We just want people to know that the benefit seen in real practice, when treatments are given to people with various comorbidities and in different settings, may be smaller than what was seen in the earliest clinical trials."

The research team also hopes that colleagues publishing early clinical trial results will consider the possibility of the Proteus phenomenon -- the name often given to early exaggeration followed by moderation over time -- and temper their language to avoid providing false optimism.

Credit: 
Mayo Clinic

Scientists poised to win the race against rust disease and beyond

In a race to prevent and control rust disease epidemics, scientists have positioned themselves to better understand how rust fungi infect crops and evolve virulence. After using the latest genome sequencing technologies to understand how rust fungi adapt to overcome resistance in crop varieties, scientists from the University of Minnesota, the USDA-ARS Cereal Disease Laboratory, the Australian National University, Commonwealth Scientific and Industrial Research Organisation (CSIRO) and the University of Sydney are releasing results with two publications in mBio, a journal by the American Society of Microbiology.

Damage inflicted by rust fungi is a significant constraint in food production across the globe. Cereals and many other important crops such as coffee, sugarcane, and soybean are impacted by these devastating pathogens. Traditional approaches using fungicides can be expensive and present environmental and health costs. Genetic resistance in crop development is often the best disease management strategy to prevent rust outbreaks. However, genetic resistance in crop varieties is frequently defeated by the emergence of new rust strains, turning what used to be a disease resistant plant variety to one that is completely vulnerable. The joint US and Australian research team has now generated the first haplotype-resolved genome sequences for the rust fungi causing oat crown rust and wheat stripe rust diseases, two of the most destructive pathogens in oat and wheat, respectively.

"Like humans, rust fungi contain two copies of each chromosome, which makes their genetics much more complicated than other types of fungi," said Assistant Professor Melania Figueroa from the University of Minnesota. Figueroa co-led the sequencing effort for the oat crown rust fungus P. coronata f. sp. avenae along with Shahryar Kianian, research leader at the USDA-ARS Cereal Disease Laboratory and adjunct professor at the University of Minnesota. "A key advance of this work is that for the first time, separate genome assemblies were generated reflecting both of the two chromosome copies in the rust."

In parallel, Postdoctoral Fellow Benjamin Schwessinger and Professor John Rathjen at the Australian National University applied similar approaches to develop an improved genome assembly of the stripe rust fungus, P. striiformis f. sp. tritici. By working together the two teams were able to combine their techniques and knowledge to achieve these breakthroughs much more rapidly than by working alone.

These studies represent a breakthrough in plant pathology as they now show how genetic diversity between the two chromosome copies can influence the emergence of new virulent pathogen strains.

Both studies uncovered a surprisingly high level of diversity between the two copies, suggesting that such variation likely serves as the basis for rapidly evolving new rust strains. "Reports from growers facing yield losses due to oat crown rust occur during most cropping seasons, and the genome assemblies of this pathogen will help us understand the evolution of this pathogen and the means to develop more resistant crops," said Kianian, who coordinates annual rust surveys in the US in order to monitor the pathogen population in oat-growing areas. The oat crown rust genomics study compared two strains from North Carolina and South Dakota with different virulence profiles, which were obtained in 2012 as part of the routine USDA-ARS Rust Surveys.

The first author of this publication, Marisa Miller, is the recipient of a prestigious USDA-NIFA Postdoctoral Fellowship and recently embarked on a study comparing the genomic composition of oat crown rust strains collected in 1990 and 2015. "In the last 25 years, the population of oat crown rust has gained additional virulences, and we would like to understand how this has occurred. Miller's work is essential to answering this question," commented Figueroa.

"Oat crown rust is one of the most rapidly evolving rust pathogens," explained University of Minnesota Adjunct Professor Peter Dodds of CSIRO Agriculture and Food. "So this work will really help understand how new rust diseases like the highly destructive Ug99 race of wheat stem rust can overcome resistance in crops."

The publications describing the work in the oat crown rust and wheat stripe rust pathogens, both released in the current issue of mBio, will serve as a framework for future studies of virulence evolution in these pathogens as well as for applying similar approaches to the rust fungi causing many other major crop diseases.

Credit: 
University of Minnesota

Largest study of its kind finds alcohol use biggest risk factor for dementia

TORONTO, February 20, 2018 - Alcohol use disorders are the most important preventable risk factors for the onset of all types of dementia, especially early-onset dementia. This is according to a nationwide observational study of over one million adults diagnosed with dementia in France, published in The Lancet Public Health journal.

This study looked specifically at the effect of alcohol use disorders, and included people who had been diagnosed with mental and behavioural disorders or chronic diseases that were attributable to chronic harmful use of alcohol.

Of the 57,000 cases of early-onset dementia (before the age of 65), the majority (57%) were related to chronic heavy drinking.

The World Health Organization (WHO) defines chronic heavy drinking as consuming, on average, more than 60 grams of pure alcohol per day for men (4-5 Canadian standard drinks) and 40 grams (about 3 standard drinks) per day for women.
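Converting the WHO gram thresholds into standard drinks is simple arithmetic; a sketch, assuming the commonly cited figure of about 13.45 g of pure alcohol per Canadian standard drink (the exact gram content varies slightly by source):

```python
# Approximate pure-alcohol content of one Canadian standard drink, in grams.
GRAMS_PER_CANADIAN_DRINK = 13.45

def drinks(grams_alcohol):
    """Number of Canadian standard drinks in a given mass of pure alcohol."""
    return grams_alcohol / GRAMS_PER_CANADIAN_DRINK

print(round(drinks(60), 1))  # WHO heavy-drinking threshold for men
print(round(drinks(40), 1))  # WHO heavy-drinking threshold for women
```

The men's 60 g threshold lands between 4 and 5 drinks and the women's 40 g threshold at about 3, matching the figures in the text.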

As a result of the strong association found in this study, the authors suggest that screening, brief interventions for heavy drinking, and treatment for alcohol use disorders should be implemented to reduce the alcohol-attributable burden of dementia.

"The findings indicate that heavy drinking and alcohol use disorders are the most important risk factors for dementia, and especially important for those types of dementia which start before age 65, and which lead to premature deaths," says study co-author and Director of the CAMH Institute for Mental Health Policy Research Dr. Jürgen Rehm. "Alcohol-induced brain damage and dementia are preventable, and known-effective preventive and policy measures can make a dent into premature dementia deaths."

Dr. Rehm points out that on average, alcohol use disorders shorten life expectancy by more than 20 years, and dementia is one of the leading causes of death for these people.

For early-onset dementia, there was a significant gender split. While the overall majority of dementia patients were women, almost two-thirds of all early-onset dementia patients (64.9%) were men.

Alcohol use disorders were also associated with all other independent modifiable risk factors for dementia onset, such as tobacco smoking, high blood pressure, diabetes, lower education, depression, and hearing loss. This suggests that alcohol use disorders may contribute to the risk of dementia in many ways.

"As a geriatric psychiatrist, I frequently see the effects of alcohol use disorder on dementia, when unfortunately alcohol treatment interventions may be too late to improve cognition," says CAMH Vice-President of Research Dr. Bruce Pollock. "Screening for and reduction of problem drinking, and treatment for alcohol use disorders need to start much earlier in primary care." The authors also noted that only the most severe cases of alcohol use disorder - ones involving hospitalization - were included in the study. This could mean that, because of ongoing stigma regarding the reporting of alcohol-use disorders, the association between chronic heavy drinking and dementia may be even stronger.

Credit: 
Centre for Addiction and Mental Health

How chemistry can improve bargain hot cocoa (video)

image: Reactions offers kitchen chemistry hacks to improve bargain hot cocoa mix: https://youtu.be/7M105LuTJvo.

Image: 
The American Chemical Society

WASHINGTON, Feb. 20, 2018 -- Nobody really likes bargain hot cocoa powder. It's lumpy, it's too thin and it leaves scummy residue behind. But premium hot cocoa mix is too expensive for some imbibers. Fortunately, Reactions is here with some easy kitchen chemistry hacks to turn cheap cocoa mix into a satisfying cold weather pick-me-up: https://youtu.be/7M105LuTJvo.

Credit: 
American Chemical Society

Climate change, evolution, and what happens when researchers are also friends

image: This is an illustration of an eco-evolutionary feedback loop perpetuated by evolution of carbon cycling traits, in response to climate change.

Image: 
Grey Monroe et al./Colorado State University

What happens when six graduate students in different fields, who happen to be friends, put their heads together on an emerging issue in climate change?

They get published in a major journal.

The Colorado State University researchers, whose studies cut across three colleges, three departments and the Graduate Degree Program in Ecology, urge more of this type of collaboration in a new paper in Trends in Ecology and Evolution. Their paper, which addresses how climate change is affecting the evolution of organisms, underscores the need for evolutionary, ecosystem and climate scientists to work together to better understand eco-evolutionary feedback dynamics. They pose the question of whether evolution of plants, animals and other organisms altered by climate change will ultimately help, or hurt, the planet's current warming trend.

Evolutionary biologist Grey Monroe, lead author and a student in the Graduate Degree Program in Ecology, said the idea for the paper came from a course called Ecosystem Ecology, taught by biology professor Joe von Fischer. In a final paper for the course, Monroe explored evolution's impact on broad ecosystem processes like nitrogen and carbon cycling. Wanting to learn more, he approached other graduate students with similar interests for help fleshing out ideas.

"Ultimately, we were all interested in understanding the role evolution might play in shaping the future of carbon cycling, and how it might affect atmospheric carbon dioxide and climate change," Monroe said.

Colleen Webb, director of the Graduate Degree Program in Ecology and professor in biology, noted that it is highly unusual for graduate students to publish without a faculty member as a co-author. The paper's novel approach and dissemination by a major ecological journal speaks to the CSU graduate experience, she said.

The paper is a comprehensive literature review that explains, via synthesized analysis of published research, how evolution interacts with the environment and how it affects the global carbon cycle. The carbon cycle is the constant movement of carbon through various ecosystems. For example, atmospheric carbon dioxide is absorbed by plants through photosynthesis. Some of that carbon is deposited into the soil through the plants' roots, or animals eat the plants and exhale the carbon dioxide back into the atmosphere.

For many decades, humans have inserted themselves into these natural processes by adding excess carbon dioxide into ecosystems through agriculture, burning fossil fuels, and other activities; thus the era we live in is known as the "Anthropocene," or the geological age dominated by human activity.

In their paper, the researchers delved into the idea that climate change in the Anthropocene is directly affecting natural selection. As global temperatures increase, precipitation patterns change, oceans acidify, and all those changes work together to alter selection pressures on many organisms, sometimes over just one or two generations.

Their paper points to the possibility that these altered states of evolution could either accelerate or mitigate climate change through a feedback loop. For example, if plants evolve larger root systems in response to prolonged drought, they may deposit more carbon into the soil, thus increasing rates of carbon sequestration.

As another example, scientists have studied selection pressures experienced by phytoplankton - photosynthetic algae that deposit carbon on the ocean floor. In some cases, these algae evolve higher photosynthetic rates in response to climate change-induced pressures. They also evolve traits that make them sink more quickly to the ocean floor, pointing to a net increase in ocean carbon flux.

"There is a growing interest in eco-evolutionary feedback loops," Monroe said. "Our paper fits into this conceptual framework. We feel there is room for research in this area, to provide more empirical consideration for how evolution will affect the future trajectory of atmospheric carbon."

After all, the carbon cycle is dominated by organisms, and photosynthesis globally moves 20 times more carbon than humans do.

"That was one of the reasons we felt that the magnitude of evolutionary changes could be significant," Monroe said.

Credit: 
Colorado State University

How to get the most out of foreign investment

Researchers at the Higher School of Economics (HSE University) have revealed that Russian companies need to invest in the development of intellectual resources in order to maximize the benefits from partners in developed countries. The results of the study have been published in the journal Knowledge Management Research & Practice.

For the most part, theoretical research in the field of international trade tends to focus on the importance of foreign direct investment, not only as an additional source of financing, but also as an important element in the transformation and improvement of the quality of human capital, the mechanism of knowledge-sharing between companies and the adaptation of new technologies. This should ultimately lead to an increase in a company's competitiveness. However, empirical evidence is extremely varied and contradictory, especially with regard to companies located in emerging economies. Moreover, recent studies, in particular Taglioni and Winkler (2016), have shown that the possible advantages of foreign direct investment do not appear automatically. Rather, they are strongly tied to the company's ability to perceive potential benefits.

Two studies by HSE researchers Anna Bykova, Carlos Hardon and Felix Iturriaga have analyzed data on Russian public companies and the impact of foreign investment. 'We found that, in general, the presence of foreign ownership does not affect the performance of companies; it only has a positive effect during periods of economic recession, or strengthens the positive effect for export-oriented companies,' says Anna Bykova. 'This surprised us and seemed rather strange, especially considering the number of companies with foreign ownership in our study. It occurred to us that foreign investments may not be 'useful' to everyone - that they may only be useful for companies with a certain resource endowment. In other words, for there to be a positive effect, there need to be what we call 'mediators', or intermediaries, between foreign investment and the company's performance. This hypothesis was tested using the concept of dynamic capabilities developed by D. Teece in 1997. According to our assumptions, dynamic capabilities play this mediating role.'

'Dynamic capabilities' are a company's ability to constantly create, expand and modify its resources depending on environmental conditions and company strategy (Helfat et al., 2007).

Data from 1,096 Russian public companies from 2004 to 2014 were used as the study sample. The data were collected from open sources by the International Laboratory of Intangible-driven Economy at HSE in Perm. The method of partial least squares structural equation modeling (PLS-SEM) was applied, using the SmartPLS 3.0 software. The distinctive feature of this method is that it enables researchers to evaluate the functional relationships between unobservable variables, such as dynamic capabilities, and observable variables, such as foreign investment and company performance; unobservable variables are represented in the model by observable indicators. The method has certain advantages, including less strict requirements on the normality of the data distribution and the possibility of using secondary information. PLS-SEM is therefore used increasingly in studies in the field of strategic management.

'Unlike previous studies relying on PLS-SEM, our model used open data, rather than data from surveys, to measure dynamic capabilities,' says Anna Bykova, research associate at the International Laboratory of Intangible-driven Economy at HSE in Perm.
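The core of the partial least squares approach is to extract latent scores that link two blocks of observed indicators. The following is a minimal sketch of that step, a single latent component estimated with the classic NIPALS iteration on synthetic data; it is an illustration of the PLS idea only, not the full PLS-SEM model or the SmartPLS implementation, and all variable names and data are invented.

```python
import numpy as np

def pls_one_component(X, Y, n_iter=100, tol=1e-10):
    """Extract one PLS latent component via the NIPALS iteration.

    X, Y: centered data matrices (n_samples x n_features).
    Returns latent scores t (for the X block) and u (for the Y block);
    their correlation reflects the strength of the X -> Y relationship.
    """
    u = Y[:, [0]].copy()            # initialize with the first Y indicator
    for _ in range(n_iter):
        w = X.T @ u                 # X weights
        w /= np.linalg.norm(w)
        t = X @ w                   # X latent scores
        c = Y.T @ t                 # Y weights
        c /= np.linalg.norm(c)
        u_new = Y @ c               # Y latent scores
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    return t, u

# Toy setup: one latent driver produces both "investment" indicators (X)
# and "performance" indicators (Y), as in a latent-variable model.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
X = latent @ rng.normal(size=(1, 3)) + 0.1 * rng.normal(size=(200, 3))
Y = latent @ rng.normal(size=(1, 2)) + 0.1 * rng.normal(size=(200, 2))
X -= X.mean(axis=0)
Y -= Y.mean(axis=0)

t, u = pls_one_component(X, Y)
r = abs(np.corrcoef(t.ravel(), u.ravel())[0, 1])
print(f"latent score correlation: {r:.3f}")  # high when a shared factor exists
```

Because the scores are built from whichever observed indicators carry the shared signal, the method tolerates non-normal data, which is one reason it suits secondary, open-source data of the kind used here.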

The results show that dynamic capabilities play a full mediating role, not the partial one the authors had assumed. This confirms previous studies that found no statistically significant direct effect of foreign direct investment. Furthermore, the results show that the transformation of foreign investment into added economic value occurs either through the transformation of the company's internal resources or through the use of dynamic capabilities.

Researchers identified three types of dynamic capabilities which are most relevant for analyzing the interaction between recipient companies from developing economies and investing companies in developed countries. The first type is absorption capability, or a company's ability to learn, absorb and implement the knowledge and technology it receives from foreign partners. The second type is adaptive capability, or a company's ability to transform internal resources according to the requirements of foreign investors; this is especially important for companies in emerging markets. The third type is communicative capability, or a company's ability to effectively detect and introduce knowledge about higher market standards, technologies for working with buyers and suppliers, and ways of increasing customer loyalty.

In analyzing the impact of different types of dynamic capabilities, the authors found that communicative capability has the greatest potential to create value: 52.2% of the total effect is due to this type. For operational efficiency, measured in terms of the profitability of total assets, the greatest effect is due to adaptive capability, responsible for 50% of the overall effect. The findings enable us to construct various strategies for long-term and short-term company development.
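Percentages like the 52.2% and 50% quoted above are shares of a total mediated effect. In a mediation model, each mediator's indirect effect is the product of the path into it (a) and the path out of it (b), and under full mediation the total effect is the sum of these products. The sketch below shows the arithmetic with made-up path coefficients; none of the numbers come from the study.

```python
# Hypothetical path coefficients (NOT from the study): effect of foreign
# investment on each capability (a), and of each capability on value (b).
paths = {
    "communicative": {"a": 0.60, "b": 0.58},
    "absorption":    {"a": 0.45, "b": 0.40},
    "adaptive":      {"a": 0.50, "b": 0.30},
}

# Under full mediation the direct path is ~0, so the total effect is the
# sum of the indirect (mediated) paths a_i * b_i.
indirect = {name: p["a"] * p["b"] for name, p in paths.items()}
total = sum(indirect.values())

for name, eff in sorted(indirect.items(), key=lambda kv: -kv[1]):
    share = 100 * eff / total
    print(f"{name:13s} indirect effect {eff:.3f} = {share:.1f}% of total")
```

With these invented coefficients the communicative path dominates, mirroring the kind of decomposition the study reports for value creation versus operational efficiency.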

The authors plan to continue their study of the mediating role of dynamic capabilities, taking into account industry specificity, including the mediating effect of the industry, in the model which has been developed and verified.

Credit: 
National Research University Higher School of Economics

Can you eat cells? Computer model predicts which organisms are capable of phagocytosis

image: Phagocytosis, or cell eating, is a key feature of eukaryotic cells that is not present in the archaea or bacteria. Current genomic data provides the key to this and other complex processes with hints at how they may have originated.

Image: 
© AMNH/S. Thurston

A team of American Museum of Natural History researchers has created a computational model capable of predicting whether or not organisms have the ability to "eat" other cells through a process known as phagocytosis. The model may be a useful tool for large-scale microbe surveys and provides valuable insight into the evolution of complex life on Earth, challenging ideas put forward in recent studies. The model and researchers' findings are featured in a paper published today in the journal Nature Ecology & Evolution.

"Phagocytosis is a major mechanism of nutritional uptake in many single-celled organisms, and it's vital to immune defenses in a number of living things, including humans," said co-author Eunsoo Kim, an associate curator in the American Museum of Natural History's Division of Invertebrate Zoology. "But lesser known is the idea that phagocytosis dates back some 2 to 3 billion years and played a role in those symbiotic associations that likely started the cascading evolution toward the more diverse and complex life we see on the planet today. Our research provides some hints as to how phagocytosis first arose."

Prokaryotes, a group that includes bacteria and archaea, are microscopic, mostly single-celled organisms with relatively simple internal structure. Eukaryotes, the group that comprises animals, plants, fungi, and protists (representing assemblages of diverse, unrelated lineages such as amoebozoans and green algae), have generally larger cells that are filled with a number of internal components including the nucleus, where DNA is stored, and energy-generating organelles called mitochondria. The prevailing theory holds that about 2 to 3 billion years ago a single bacterium merged with an unrelated prokaryotic microbe, leading to the evolution of mitochondria, a key feature of eukaryotic cells. This merger only happened once, and some scientists suggest that the process involved cellular "engulfment." Yet there are no known prokaryotes capable of phagocytosis today. So under what circumstances did the trait arise?

To investigate this long-standing question, the research team used genetic patterns common to phagocytic cells to build a computational model that uses machine learning to predict whether an organism feeds via phagocytosis.

"There's no single set of genes that are strongly predictive of phagocytosis because it's a very complicated process that can involve more than 1,000 genes, and those genes can greatly vary from species to species," said lead author John Burns, a research scientist in the Museum's Sackler Institute for Comparative Genomics. "But as we started looking at the genomes of more and more eukaryotes, a genetic pattern emerged, and it exists across diversity, even though it's slightly different in each species. That pattern is the core of our model, and we can use it to very quickly and efficiently predict which cells are likely to be 'eaters' and which are not."

Because many eukaryotes, such as some green algae, can be "picky", only eating under certain conditions, it can be hard to determine whether they are capable of phagocytosis simply by watching them under a microscope. This new model can help microbiologists make a quick assessment, and it may also be valuable for use in large DNA sequencing projects that involve multiple unknown microbial species, for example, surveying sea water samples.
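The idea of scoring a genome against a learned gene-family pattern can be sketched in a few lines. The published model learns weights over 1,000+ variable gene families with machine learning; everything below, including gene names, weights and the threshold, is invented purely to illustrate presence/absence scoring.

```python
# Hypothetical phagocytosis signature: gene family -> learned weight.
SIGNATURE = {
    "actin_regulator_A": 0.9,
    "membrane_remodeler_B": 0.8,
    "engulfment_gtpase_C": 0.7,
    "lysosome_fusion_D": 0.6,
}
THRESHOLD = 1.5  # score above which we call a genome an "eater"

def predict_eater(genome_gene_families):
    """Score a genome by summing weights of signature genes it carries."""
    score = sum(w for gene, w in SIGNATURE.items()
                if gene in genome_gene_families)
    return score, score >= THRESHOLD

# Toy genomes as sets of gene families.
amoeba = {"actin_regulator_A", "membrane_remodeler_B",
          "engulfment_gtpase_C", "ribosome_core"}
bacterium = {"ribosome_core", "cell_wall_synthase"}

for name, genome in [("amoeba", amoeba), ("bacterium", bacterium)]:
    score, eater = predict_eater(genome)
    print(f"{name}: score {score:.1f} -> {'eater' if eater else 'non-eater'}")
```

Because a genome can be scored without culturing or observing the organism, this kind of classifier slots naturally into large sequencing surveys of unknown microbes, the use case the authors describe.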

The researchers applied the model to a group of "eukaryote-like" microbes called Asgard archaea. In recent years, some researchers have proposed that Asgard microbes might be living relatives of long-sought-after microbes that merged with the bacterium that became mitochondria. But the researchers' new work finds that these microbes most likely do not use phagocytosis. They also used the gene set developed as part of the model-building process to look more closely at the ancient lineages of archaea and bacteria, and found that, as a whole, neither group on its own has the genes required for phagocytosis. While the new work eliminates one scenario for the birth of mitochondria, that Asgard archaea engulfed a bacterium, many other options remain.

"When you tease apart the components of these predictive genes, some have roots in archaea, some have roots in bacteria, and some are only unique to eukaryotes," Kim said. "Our data are consistent with the hypothesis that our cells are a chimera of archaeal and bacterial components, and that the process of phagocytosis arose only after that combination took place. We still have a lot of work to do in this field."

Credit: 
American Museum of Natural History

Noise from ships scares porpoises

image: Signe Sveegaard and Siri Elmegaard from the Department of Bioscience, Aarhus University, fit electronic tags on a porpoise caught in a pound net.

Image: 
Photo: Lis Bach, AU

Porpoises communicate with each other using sounds. Therefore, they are highly sensitive to noise, such as ship noise. And the Danish belts and sounds are some of the most heavily trafficked waters in the world.

An international research team led by researchers from Aarhus University successfully fitted electronic tags on the backs of seven porpoises with the aim of gaining insight into how ship noise disturbs their normal living conditions. The groundbreaking data are published today in the journal Proceedings of the Royal Society B with postdoc Danuta Wisniewska as main author.

A swimming employee

For many years, the research team headed by Senior Researcher Jonas Teilmann, Department of Bioscience, Aarhus University, has cooperated closely with Danish pound net fishermen as porpoises are sometimes accidentally caught in pound nets.

A quick phone call to the researchers and their speedy response ensure that the team can be at the site within a few hours. The porpoise is lifted out of the net and into the fishing boat, where its sex, size and health are determined before the new collaborative partner is fitted with an electronic tagging device by means of suction cups.

The electronic tags record the sounds from the animals and noise from ships and allow the researchers to see when the porpoises feed and the depth at which they stay. The extensive data make it possible, for the first time, to see how ship noise affects the movement and feeding of the porpoises.

Lunch is disturbed

"When the ship noise exceeds a certain level, the porpoises stop feeding. At very high sound levels the animals dive to the bottom and move quickly along it, and they cease emitting the biosonar clicking sounds that they use when searching for food," says Jonas Teilmann.

The porpoise is Denmark's smallest whale and researchers have long been concerned about how its living conditions are influenced by the intensified human activity at sea.

"Our measurements show that the porpoises do respond to heavy ship noise. It is still too early to say, though, what this means to the well-being of the porpoises, their production of offspring and, in the long term, their survival," says Professor Peter Teglberg Madsen, Aarhus University, who is one of the researchers behind the sensational findings.

Credit: 
Aarhus University

Brain size of human ancestors evolved gradually over 3 million years

image: Modern human brains (left) are more than three times larger than our closest relatives, chimpanzees (right).

Image: 
Andrew Du, UChicago

Modern humans have brains that are more than three times larger than our closest living relatives, chimpanzees and bonobos. Scientists don't agree on when and how this dramatic increase took place, but new analysis of 94 hominin fossils shows that average brain size increased gradually and consistently over the past three million years.

The research, published this week in the Proceedings of the Royal Society B, shows that the trend was caused primarily by evolution of larger brains within populations of individual species, but the introduction of new, larger-brained species and extinction of smaller-brained ones also played a part.

"Brain size is one of the most obvious traits that makes us human. It's related to cultural complexity, language, tool making and all these other things that make us unique," said Andrew Du, PhD, a postdoctoral scholar at the University of Chicago and first author of the study. "The earliest hominins had brain sizes like chimpanzees, and they have increased dramatically since then. So, it's important to understand how we got here."

Du began the work as a graduate student at the George Washington University (GW). His advisor, Bernard Wood, GW's University Professor of Human Origins and senior author of the study, gave his students an open-ended assignment to understand how brain size evolved through time. Du and his fellow students, who are also co-authors on the paper, continued working on this question during his time at George Washington, forming the basis of the new study.

"Think about the entrance to a building. You can reach the front door by walking up a ramp, or you can take the steps," Wood said. "The conventional wisdom was that our large brains had evolved because of a series of step-like increases, each one making our ancestors smarter. Not surprisingly, the reality is more complex, with no clear link between brain size and behavior."

"The moral is this: When you don't understand something, ask a bunch of bright and motivated students to figure it out," he said.

Du and his colleagues compared published research data on the skull volumes of 94 fossil specimens from 13 different species, beginning with the earliest unambiguous human ancestors, Australopithecus, from 3.2 million years ago, to pre-modern species, including Homo erectus, from 500,000 years ago, when brain size began to overlap with that of modern-day humans.

The researchers saw that when the species were counted at the clade level, or groups descending from a common ancestor, the average brain size increased gradually over three million years. Looking more closely, the increase was driven by three different factors: primarily evolution of larger brain sizes within individual species populations, but also the addition of new, larger-brained species and extinction of smaller-brained ones. The team also found that the rate of brain size evolution within hominin lineages was much slower than evolutionary rates observed today, although why this discrepancy exists is still an open question.

The study quantifies for the first time when and by how much each of these factors contributes to the clade-level pattern. Du said he likens it to how a football coach might build a roster of bigger, stronger players. One way would be to make all the players hit the weight room to bulk up. But the coach could also recruit new, larger players and cut the smallest ones.

"That's exactly what we see going on in brain size," he said. "The dominant process is like the players hitting the gym. They're evolving larger brains within a population. But we also see speciation events adding larger-brained daughter species, or recruiting bigger players, and we see extinction, or cutting the smallest players too."
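The three processes can be made concrete with a tiny numeric sketch of how each one shifts a clade-level average. The brain sizes below are invented round numbers, not the fossil measurements from the study.

```python
# Clade-level mean brain size shifts via three processes (toy numbers):
# within-lineage evolution, speciation adding a larger-brained daughter
# species, and extinction removing the smallest-brained species.
def clade_mean(sizes):
    """Average brain size (cc) across all species currently in the clade."""
    return sum(sizes.values()) / len(sizes)

sizes = {"species_A": 450.0, "species_B": 500.0, "species_C": 600.0}
print(f"starting mean: {clade_mean(sizes):.1f} cc")

# 1) Within-lineage evolution: every population's brain grows a bit.
sizes = {sp: s * 1.10 for sp, s in sizes.items()}
print(f"after within-lineage growth: {clade_mean(sizes):.1f} cc")

# 2) Speciation: a larger-brained daughter species joins the clade.
sizes["species_D"] = 800.0
print(f"after speciation: {clade_mean(sizes):.1f} cc")

# 3) Extinction: the smallest-brained species drops out.
del sizes["species_A"]
print(f"after extinction: {clade_mean(sizes):.1f} cc")
```

Each step raises the clade-level mean, which is why a smooth average trend can emerge from a mix of within-species change and species turnover.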

Credit: 
University of Chicago Medical Center

How people cope with difficult life events fuels development of wisdom, study finds

image: Carolyn Aldwin is director of the Center for Healthy Aging Research in the College of Public Health and Human Sciences at Oregon State University.

Image: 
Oregon State University

CORVALLIS, Ore. - How a person responds to a difficult life event such as a death or divorce helps shape the development of their wisdom over time, a new study from Oregon State University suggests.

For many, the difficult life event also served to disrupt their sense of personal meaning, raising questions about their understanding of their world. These disruptions ultimately lead to the development of new wisdom, said Carolyn Aldwin, director of the Center for Healthy Aging Research in the College of Public Health and Human Sciences at OSU.

"The adage used to be 'with age comes wisdom,' but that's not really true," said Aldwin, an expert on psychosocial factors that influence aging. "Generally, the people who had to work to sort things out after a difficult life event are the ones who arrived at new meaning."

The findings were just published in the Journals of Gerontology: Series B. The paper's lead author is Heidi Igarashi, who worked on the research as part of the dissertation for her doctorate at OSU; co-author is Michael R. Levenson of OSU.

The goal of the study was to better understand how wisdom develops in the context of adversity such as death of a loved one, divorce, health crisis, or loss of a job. Understanding how people cope with adversity and develop wisdom provides insight into healthy aging, Aldwin said.

"What we're really looking at is 'when bad things happen, what happens?'" Aldwin said. "The event can become a catalyst for changes that come afterward."

Igarashi reviewed interviews with 50 adults ages 56 to 91 who had experienced one or more significant difficult life events. The participants were asked to identify a specific difficult or challenging life event, describe how they coped, and describe whether the experience changed their outlook or actions in life.

"One thing that stood out right away is that, when asked to think about a difficult life event or challenge, people had an answer right away," Aldwin said. "Difficult times are a way people define themselves."

The researchers found that people responded to the difficult life situations in three ways. For one group of respondents, 13 in all, the difficult life event led to little or no questioning of meaning in their life. Some of the people in this group simply accepted the event as something that could not be changed, while the remainder described using their intelligence, self-control and planning to solve problems related to the event.

The smallest group, five participants, indicated that the difficult life event helped them clarify a specific value or belief that had not previously been articulated.

The majority of the participants - 32 - indicated that the difficult life event disrupted their personal meaning and prompted them to reflect on themselves, their fundamental beliefs and their understanding of the world.

"For these folks, the event really rocked their boat and challenged how they saw life and themselves," Aldwin said.

Further analysis showed that a person's social environment helped to shape their responses to the difficult life event. These social interactions included: enlisting help from others during the difficult time; unsolicited emotional support from family, friends or strangers; being held or holding, particularly among people sharing a difficult life event such as a loss; receiving unwanted support; comparing one's reaction to the event with the reactions of others; seeking expert advice; seeking out others with similar experiences; making new connections; and learning from society at large.

The researchers found that some of these social supports and interactions influenced a person's development of wisdom. Those who received unsolicited emotional support, for example, developed wisdom around compassion and humility. Seeking others with similar experiences exposed some participants to new ideas and interactions, supporting deeper exploration of their new sense of self.

"It mattered whether a participant was expected to adjust to the event quickly and 'get back to life,' or whether they were encouraged to grow and change as a result of the event," Igarashi said. "The quality of the social interactions really makes a difference."

The findings provide new insight into the role of social support and interaction in developing wisdom, she said. The challenge for now is to determine how best to ensure that people are accessing the social supports they need to cope and grow from significant life challenges.

"Typically, the type of social support you get is the kind you ask for and allow, and there is no 'one size fits all' approach," Igarashi said. "But being open to the resources in your social network, or seeking out things like grief support groups may be worth exploring."

Credit: 
Oregon State University

African Americans with atrial fibrillation at significantly higher risk for stroke compared to Caucasians with the disease

PHILADELPHIA--African Americans with atrial fibrillation (AF) - a quivering or irregular heartbeat that can lead to a host of dangerous complications - have a significantly higher risk of stroke than Caucasians with the condition, according to new research published today in Heart Rhythm by researchers from the Perelman School of Medicine at the University of Pennsylvania. The new findings build on previous studies examining the impact of race on the risk of developing AF, which is linked to blood clots, stroke, heart failure and other complications. It is well reported that African Americans have a lower risk of developing AF as compared to Caucasians, but until now, there was little data on the additional risks that come with AF for each race.

"To date, large phase-3 clinical trials that have evaluated the safety and efficacy of novel blood-thinner medications for AF have enrolled very few African American participants, which left us with little data about risks for this patient population," said the study's senior author Rajat Deo, MD, MTR, an associate professor of Cardiovascular Medicine in the Perelman School of Medicine at the University of Pennsylvania. "Now that we have identified that African Americans with AF are at higher risk for stroke, we can more proactively monitor and treat these patients before an adverse event occurs."

In the United States alone, the Centers for Disease Control and Prevention estimates that between three and six million Americans have AF - a number that is expected to rise as the population ages.

Researchers created the Penn Atrial Fibrillation Free Study, a centralized pool of patient data from across the University of Pennsylvania Health System, which comprised 56,835 patients without a history of atrial fibrillation or a remote history of stroke. Of these, 3,507 developed AF; this inception cohort was used for the current study. The study design was unique in that researchers had a time point representing the initial diagnosis of atrial fibrillation. This approach provided an opportunity to examine the risk of stroke during a six-month period prior to a formal, clinical diagnosis of atrial fibrillation. No prior study had examined stroke risk in this period before a diagnosis of atrial fibrillation.

These data demonstrate the potential importance of early detection of atrial fibrillation. Of the 538 strokes observed in the inception cohort, 254 occurred prior to AF diagnosis, and in many cases the stroke was the presenting condition for newly diagnosed AF patients. Compared to Caucasians, African Americans had an increased risk of stroke both before and after atrial fibrillation diagnosis. As such, certain anticoagulation medications could help mitigate this risk of stroke, especially if AF is detected early.

"These findings indicate that a more rigorous effort to identify individuals from the community with atrial fibrillation may result in a reduction in overall stroke burden," said Deo. "Evolving mobile and wearable technologies are providing individuals the opportunity to acquire cardiac rhythm data. Our findings highlight a need to make these technologies available to diverse populations and a need to understand how to process this data once received in order to enhance the care of our diverse populations."

Credit: 
University of Pennsylvania School of Medicine