Culture

How moral obligation drives protest

Researchers have long studied the motives that inspire people to join collective action. Three factors have received particular attention: anger at perceived social injustice; belief in the efficacy of collective action; and politicised identity. In 2008, these factors informed a predictive model of collective action: the Social Identity Model of Collective Action (SIMCA). New studies have recently prompted a team of scholars, including an HSE researcher, to incorporate two additional factors into the existing model: ideology and moral obligation. A survey of protesters in Spain was used to test the impact of these two new factors.

Background

In 1997, American researchers found that, regardless of their agenda, mass protests have several things in common: all of them prioritise collective rights over individual rights; each comes at a cost determined by the type of society and the timing of a given protest; and protest has become a normal political phenomenon in modern democracies.

For a long time now, scholars have reflected on what drives people to take to the streets in collective action. An important milestone was the book Talking Politics (1992) by American sociologist William Gamson, in which the author pinpointed three main reasons for political mobilisation: perceived injustice; anticipated efficacy of collective action; and the search for political identity. This theory gave rise to a new model, SIMCA, or the Social Identity Model of Collective Action, initially developed in 2008 to reflect the interrelation of factors one needs to consider when predicting collective action. The model's viability has been empirically confirmed. However, researchers have continued to look for other reasons, in addition to the three mentioned above, that prompt people to join collective action.

Axiological Theory

In 2017, a group of researchers from the University of Santiago de Compostela (Spain), in collaboration with Dmitry Grigoryev of the HSE International Laboratory of Socio-Cultural Research, proposed the hypothesis that axiological components such as ideology and moral obligation may also play a role in people's engagement in collective action.

According to the researchers, one's choice of whether or not to join a protest is ideological by nature. First, it reflects one's attitude towards the existing social system. Second, political protesters and counter-protesters are often attracted to radical left-wing or right-wing ideologies. As a simplified example, after the French Revolution, right-wing traditionalists sought to maintain the status quo and defend the political hierarchy and traditions, while left-wingers pushed for change to achieve social, economic and political equality.

Furthermore, certain ideological motives can be explained by System Justification Theory (SJT).

The SJT postulates that most people are motivated to defend, support and justify social, economic and political systems on which they are dependent. Familiar institutions, mechanisms, ideas and practices, or the status quo, are perceived as natural and inevitable and, therefore, legitimate.

Moreover, the researchers argue that politics are inseparable from ethical or moral issues; therefore, moral obligation is a key axiological component in political engagement. The underlying theory is Kant's categorical imperative, stating that one must always obey the moral law regardless of the consequences. It is well known that sometimes people act as their morality dictates, often ignoring the costs of their actions. The study authors' concept of moral obligation covers five aspects:

moral obligation itself;
autonomy;
personal satisfaction (if one's conduct conforms to a moral obligation);
discomfort (if one's conduct does not conform to a moral obligation);
sacrifice.

Since a moral norm determines what is considered right and wrong conduct, a moral obligation can be defined as an incentive to comply with the given moral norm.

To test the viability of their model, the researchers conducted two studies, both of which took place in Spain.

Testing the New Model

The first study was held during a political protest in Madrid in May 2017. The researchers surveyed a sample of participants in a rally organised by left wing parties to push for a vote of no confidence against the Prime Minister of Spain, following numerous reported incidents of corruption in his right-leaning People's Party.

The survey sample included 270 active protesters and 261 non-protesters, who were at the scene of the protest at the time, but did not take part directly in the rally.

The respondents were asked to fill out questionnaires measuring each of the model's five variables.

An analysis of the responses revealed that moral obligation was indeed a key factor in the protesters' choice to join the rally, followed by identification with the protesting group's values and beliefs. These were the main variables distinguishing protester from non-protester responses. The proposed model correctly classified respondents in 87% of cases.
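
To make the 87% figure concrete, here is a minimal sketch of how such a classification model can be scored against survey data. All variable names, weights, and respondent scores below are illustrative assumptions, not the study's actual data or fitted model (the authors fit a statistical model to real survey responses):

```python
import math

def predict(scores, weights, threshold=0.5):
    """Logistic-style prediction: 1 = protester, 0 = non-protester."""
    z = sum(w * s for w, s in zip(weights, scores))
    p = 1.0 / (1.0 + math.exp(-z))
    return 1 if p >= threshold else 0

# Five model variables: injustice, efficacy, identity, ideology,
# moral obligation (standardised survey scores, centred around 0).
# Weights are hypothetical; moral obligation is weighted highest,
# mirroring the study's finding.
weights = [0.4, 0.2, 0.9, 0.5, 1.2]

respondents = [  # (scores, actual behaviour: 1 = protested)
    ([0.5, 0.1, 1.0, 0.3, 1.5], 1),
    ([0.2, 0.0, -0.5, 0.1, -1.0], 0),
    ([1.0, 0.5, 0.8, 0.6, 0.9], 1),
    ([-0.3, 0.2, -0.8, -0.4, -0.7], 0),
]

# Classification accuracy = share of respondents the model labels correctly.
correct = sum(predict(s, weights) == label for s, label in respondents)
accuracy = correct / len(respondents)
print(f"correctly classified: {accuracy:.0%}")
```

On the real sample of 531 respondents, the same accuracy calculation yielded the reported 87%.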

Unconventional Protest

A second study, also published in 2017, sought to explain people's intentions to join a protest, rather than their actual participation, as in the first case. With this in mind, the researchers tested their model on potential protesters. They surveyed a total of 607 people about their intentions to take part in protests to defend Spain's healthcare system.

An analysis of the responses once again revealed the major role played by moral obligation, which turned out to be the main factor determining the intention to join non-conventional protest actions, followed by ideological orientation: people with stronger left-wing views were more ready to take to the streets.

Conclusions

The context of every collective action is unique. Protests have different objectives, different types of organisers, and take place in different historical periods, not to mention that the countries where protests occur have distinctive cultural characteristics. These contextual factors can modify and fine-tune the pattern of interactions between the model's variables. Therefore, differences in research findings can reflect the role of context and the fact that the model can work somewhat differently each time while following the same logic.

'Potential protesters must identify with a certain group, share a certain perspective on the situation, feel a sense of dissatisfaction, deprivation and negative emotions (most often anger), and also perceive a moral obligation to defend their position despite the possible costs and negative consequences. As it turns out, believing that they can change the situation is not always necessary,' noted co-author Dmitry Grigoryev.

He maintains that the five aforementioned factors, which motivate people to join collective action, have practical applications and have been used by both governments and political opposition -- the former seeking to reduce social tension, the latter to mobilise people to join protests. Using Russia as an example, we can observe that pro-government institutions focus on justifying the existing social system, while the opposition emphasises social injustice and encourages political identity.

Credit: 
National Research University Higher School of Economics

A new way to make valuable chemicals

image: Feng Jiao, an associate professor of chemical and biomolecular engineering at the University of Delaware, is a leader in the field of carbon capture and utilization.

Image: 
Photo illustration by Joy Smoker

In an effort to develop sustainable solutions to humanity's energy needs, many scientists are studying carbon capture and utilization -- the practice of using excess carbon dioxide in the atmosphere or from point sources, instead of fossil fuels, to synthesize chemicals used to make everyday products, from plastics to fuels to pharmaceuticals.

Feng Jiao, an associate professor of chemical and biomolecular engineering at the University of Delaware, is a leader in the field of carbon capture and utilization. Now, he and his colleagues have made a new discovery that could further advance carbon capture and utilization and extend its promise to new industries.

In the journal Nature Chemistry, Jiao and collaborators from the California Institute of Technology, Nanjing University (China), and Soochow University (China) describe how they formed carbon-nitrogen bonds in an electrochemical carbon monoxide reduction reaction, which led to the production of high-value chemicals called amides. These substances are useful in a variety of industries, including pharmaceuticals.

The team is the first to do this. "Now, starting with carbon dioxide as a carbon source, we can expand to a variety of products," said Jiao, the associate director for UD's Center for Catalytic Science and Technology (CCST).

Ingenuity that began at UD

The science behind these findings is electrochemistry, which utilizes electricity to produce chemical change. In previous research efforts, Jiao developed a special silver catalyst, which converts carbon dioxide to carbon monoxide. Next, he wanted to further upgrade carbon monoxide into multi-carbon products useful in the production of fuels, pharmaceuticals and more.

"In the field of electrochemical carbon dioxide conversion, we were stuck with only four major products we can make using this technology: ethylene, ethanol, propanol, and, as we reported just a couple months ago in Nature Catalysis, acetate," said Jiao.

Nitrogen is the secret ingredient to unlock the potential of the system. The team used an electrochemical flow reactor that is typically fed with carbon dioxide or carbon monoxide, but this time they put in both carbon monoxide and ammonia, a compound that contains nitrogen. The nitrogen source interacts with the copper catalyst at the electrode-electrolyte interface, leading to the formation of carbon-nitrogen (CN) bonds. This process allowed the team to synthesize chemicals that had never before been made in this way, including amides, which can be used in pharmaceutical synthesis. Many pharmaceutical compounds contain nitrogen, and "this actually provides a unique way to build large molecules which contains nitrogen from simple carbon and nitrogen species," said Jiao.

At a meeting of the American Chemical Society, Jiao shared some of his preliminary findings with William A. Goddard III, principal investigator at the Joint Center for Artificial Photosynthesis at Caltech. Goddard, a world-leading expert in using quantum mechanics to determine the reaction mechanisms and rates of such electrocatalytic processes, was very excited about this unexpected discovery and immediately set his team to work on it. Tao Cheng in the Goddard lab found that the new carbon-nitrogen bond coupling was an offshoot of the mechanism that had been determined for the production of ethylene and ethanol, suggesting that Jiao might be able to couple bonds other than C-N.

"Through a close collaboration with Prof. Goddard, we learned quite a lot in terms of how this carbon-nitrogen bond formed on the surface of the catalyst," said Jiao. "This gave us important insights on how we can design even better catalysts to facilitate some of these kinds of chemical reactions."

The implications of this work could be far-ranging.

"This has the significant impact down the road, I think, to partially address carbon dioxide emission issues," said Jiao. "Now we can actually utilize it as the carbon feedstock to produce high-value chemicals."

Credit: 
University of Delaware

Memory loss, dementia an understudied yet widespread phenomenon among Chinese Americans

The U.S. Chinese population is growing - and graying - rapidly. From 2000 to 2010, the Chinese American population aged 65 and over grew at a rate four times higher than the overall U.S. older adult population. As of 2016, 14% of the approximately four million Chinese Americans were aged 65 and older. As this population ages, its members are increasingly susceptible to memory loss and often lack the necessary supports for healthy aging, according to four new Rutgers studies published in the Journal of the American Geriatrics Society.

The studies are the first to extensively study memory loss and dementia in the context of immigration, gender, psychological distress, education, social engagement, and oral health among older Chinese Americans. The papers explore the risk factors and offer recommendations for healthcare providers to overcome the linguistic, cultural, and socioeconomic barriers facing this vulnerable population.

"Research has found that as many as 34% of American adults aged 60 and older are cognitively impaired," said XinQi Dong, director of Rutgers University's Institute for Health, Health Care Policy and Aging Research. " Further, more than one in four people aged 65 and older will be diagnosed with dementia in their lifetime. Although these facts are known to apply to the general American population, there is little information about Alzheimer's disease and related dementias or the contributing factors among Asian Americans."

For the studies, researchers administered multiple cognitive function tests to 2,713 Chinese Americans aged 60 and older. They measured the effects of immigration and gender, psychological distress, education level and social engagement, and oral health on the three domains of cognitive function - global cognition, episodic memory, and working memory.

Key findings:

Older Chinese American women experience higher rates of cognitive impairment.

Depression, chronic conditions, and disability are associated with cognitive decline.

Lower education levels increase the risk of cognitive impairment. A one-year increase in education decreases the risk by 25%.

Difficulties performing functional and instrumental activities of daily living are predictive of the risk of developing cognitive impairment.

Higher levels of perceived stress are associated with poorer episodic memory, perceptual speed, and working memory in older Chinese American adults.

Stress negatively affects the brain and cognitive functioning throughout life.

Older Chinese Americans who face increased stressors from linguistic and cultural barriers have poorer cognitive functioning and faster cognitive decline.

41.5% of Asian Americans reported not receiving annual oral health examinations, which is linked to decreased quality of life, depression, hypertension, poor cognition, and cognitive decline.

"Chinese older adults are confronting significant life challenges and health disparities due to multiple social, structural, cultural and linguistic barriers," said Dong. "A thorough understanding of Chinese Americans' cognitive risk factors is necessary to guide the development of policy and interventions to delay the onset of memory loss," Dong continued.

"Researchers, social workers, and policymakers must collaborate to reduce health disparities and expand access to culturally-sensitive care for older Chinese American adults," he continued. "Developing programs and interventions that reduce stress, increase activity engagement, and improve quality of care, is necessary to limit memory loss among the aging U.S. Chinese population."

Credit: 
Rutgers University

Researchers identify properties of cells that affect how tissue structures form

Researchers have found that changing the mechanical properties of individual cells disrupts their ability to remain stable, profoundly affecting their health and the health of the tissue they form.

In the September issue of the journal Current Biology, Virginia Commonwealth University researchers, in collaboration with researchers from the University of Florida, identify a fundamental factor in maintaining a stable multicellular structure: the LINC complex.

The LINC complex -- a group of proteins known as the linker of nucleoskeleton and cytoskeleton -- anchors the nucleus to the cell. The study, using 3D culture models of cell clusters known as acini, suggests that mechanically disrupting the LINC complex destabilizes the clusters.

"If these main connections that help anchor the nucleus are disrupted, the cells try their best to compensate in order to keep things normal," said Vani Narayanan, a doctoral candidate in the Department of Biomedical Engineering in the VCU College of Engineering and lead co-author of the study. "Unfortunately, this compensation becomes over-compensation. Various proteins that should ideally remain in nominal amounts within the cell for proper cellular function get unregulated, causing rapid movement of cells within the acinus, abnormal cell division and migration and, hence, the system collapses."

Problems with cells such as epithelial cells -- which separate the body from the outside environment and provide barriers between different areas inside organs, such as the liver, and are critical for tissue and organ function -- are linked to defective wound healing and the development and progression of diseases such as cancer.

"The mechanics of the cells affect how tissue structures form," said Daniel Conway, Ph.D., associate professor in the Department of Biomedical Engineering and co-author of the study. "If you change the mechanical properties of individual cells, they lose the ability to form more complex tissue or to stay organized. ... The structures actually aren't that stable, [therefore] disruptions in physical connections between or within cells can result in these structures collapsing."

Credit: 
Virginia Commonwealth University

Survey reveals skyrocketing interest in marijuana and cannabinoids for pain

CHICAGO - Millennials lead the escalating interest in marijuana and cannabinoid compounds for managing pain - with older generations not far behind - and yet most are unaware of potential risks. Three-quarters (75%) of Americans who expressed interest in using marijuana or cannabinoids to address pain are under the impression they are safer or have fewer side effects than opioids or other medications, according to a nationwide survey commissioned by the American Society of Anesthesiologists (ASA) in conjunction with September's Pain Awareness Month.

More than two-thirds of those surveyed said they have used or would consider using marijuana or cannabinoid compounds - including cannabidiol (CBD) and tetrahydrocannabinol (THC) - to manage pain. Nearly three-quarters of millennials fall in that category, with 37% noting they have used them for pain. Two-thirds of Gen Xers and baby boomers expressed interest, with 25% of Gen Xers and 18% of baby boomers saying they have used them for pain.

"As experts in managing pain, physician anesthesiologists are concerned about the lack of research regarding the safety and effectiveness of marijuana and cannabinoids," said ASA President Linda J. Mason, M.D., FASA. "The good news is that until the research is completed and we fully understand the risks and potential benefits, physician anesthesiologists today can develop a personalized plan for patients' pain drawing from effective alternatives such as non-opioid medications and other therapies, including injections, nerve blocks, physical therapy, radio waves and spinal cord stimulation."

ASA members express concern that patients in pain are unaware marijuana and cannabinoids may not be safer than other medications, that they can have side effects - ranging from excessive sleepiness to liver damage - and more importantly that these products are not regulated or monitored for quality.

Misunderstandings about Marijuana and Cannabinoid Safety and Oversight

Results of the nationwide survey of adults 18 or older confirm physician anesthesiologists' concerns. When respondents who said they have used or would consider using marijuana or cannabinoids were asked why, the majority said they believe them to be safer than opioids (62%) or to have fewer side effects than other medications (57%).

Marijuana and cannabinoids currently are in uncharted territory with no way for people to know exactly what they are purchasing. Even though it is widely available, CBD is not regulated. The U.S. Food and Drug Administration (FDA) has approved only one prescription version of CBD for patients with one of two rare forms of epilepsy. (No form of marijuana is approved by the FDA and the federal government considers it a controlled substance and illegal). Thirty-three states and Washington, D.C. have legalized marijuana in some form (for recreational or medical use) but all set their own regulations, which vary widely. Further, studies have shown that no matter what the label says, the actual ingredients may differ, and may contain dangerous synthetic compounds, pesticides and other impurities.

Yet the survey results reflect a significant misunderstanding of that reality. Among all surveyed (including those who said they would never use marijuana or cannabinoids):

Only a little more than half (57%) believe more research is needed;

More than one-third (34%) don't feel the need to discuss using these products with their doctor;

Nearly three out of five (58%) think they have fewer side effects than other medications;

Nearly half (48%) think they know what they are getting with marijuana or cannabinoids; and

40% believe CBD sold at grocery stores, truck stops, health food stores or medical marijuana dispensaries is approved by the FDA. The younger the generation, the more likely they were to believe that is the case.

The ASA recently endorsed two bills that seek to expand research on CBD and marijuana: H.R. 601, the Medical Cannabis Research Act of 2019 and S. 2032, the Cannabidiol and Marihuana Research Expansion Act.

Alternative pain management options

People in pain looking for alternatives to opioids should know there are other options besides marijuana or cannabinoids. For example, only 13% of respondents said they have used or would consider using marijuana or cannabinoids because no other type of pain management works for them. Physician anesthesiologists and other pain management specialists can work with people in pain to develop a safe, effective pain management plan that doesn't include opioids, marijuana or cannabinoids.

Credit: 
American Society of Anesthesiologists

Coating developed by Stanford researchers brings lithium metal battery closer to reality

image: Lead authors and Ph.D. students David Mackanic, left, and Zhiao Yu with their battery tester at right. Yu holds a dish of already tested cells that they call "the battery graveyard."

Image: 
Mark Golden

Hope has been restored for the rechargeable lithium metal battery - a potential battery powerhouse relegated for decades to the laboratory by its short life expectancy and occasional fiery demise while its rechargeable sibling, the lithium-ion battery, now rakes in more than $30 billion a year.

A new coating could make lightweight lithium metal batteries safe and long lasting, a boon for development of next-generation electric vehicles. (Image credit: Shutterstock)

A team of researchers at Stanford University and SLAC National Accelerator Laboratory has invented a coating that overcomes some of the battery's defects, described in a paper published Aug. 26 in Joule.

In laboratory tests, the coating significantly extended the battery's life. It also dealt with the combustion issue by greatly limiting the tiny needlelike structures - or dendrites - that pierce the separator between the battery's positive and negative sides. In addition to ruining the battery, dendrites can create a short circuit within the battery's flammable liquid. Lithium-ion batteries occasionally have the same problem, but dendrites have made rechargeable lithium metal batteries a non-starter to date.

"We're addressing the holy grail of lithium metal batteries," said Zhenan Bao, a professor of chemical engineering, who is senior author of the paper along with Yi Cui, professor of materials science and engineering and of photon science at SLAC. Bao added that dendrites had prevented lithium metal batteries from being used in what may be the next generation of electric vehicles.

The promise

Lithium metal batteries can hold at least a third more power per pound than lithium-ion batteries and are significantly lighter because they use lightweight lithium metal for the negatively charged end rather than heavier graphite. If they were more reliable, these batteries could benefit portable electronics from notebook computers to cell phones, but the real pay dirt, Cui said, would be for cars. The biggest drag on electric vehicles is that their batteries spend about a fourth of their energy carrying themselves around. That gets to the heart of EV range and cost.

Lead authors and PhD students David Mackanic, left, and Zhiao Yu in front of their battery tester. Yu is holding a dish of already tested cells that they call the "battery graveyard." (Image credit: Mark Golden)

"The capacity of conventional lithium-ion batteries has been developed almost as far as it can go," said Stanford PhD student David Mackanic, co-lead author of the study. "So, it's crucial to develop new kinds of batteries to fulfill the aggressive energy density requirements of modern electronic devices."

The team from Stanford and SLAC tested their coating on the negatively charged end - called the anode - of a standard lithium metal battery, which is where dendrites typically form. Ultimately, they combined their specially coated anodes with other commercially available components to create a fully operational battery. After 160 cycles, their lithium metal cells still delivered 85 percent of the power that they did in their first cycle. Regular lithium metal cells deliver about 30 percent after that many cycles, rendering them nearly useless even if they don't explode.
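
A back-of-the-envelope way to read those retention figures is to convert them into an implied average per-cycle retention, assuming fade compounds uniformly from cycle to cycle (a simplification; real capacity fade is rarely that regular):

```python
# If retention after n cycles is R, the implied average per-cycle
# retention is R**(1/n), under the assumption of uniform compounding.

def per_cycle_retention(total_retention, cycles):
    return total_retention ** (1.0 / cycles)

coated  = per_cycle_retention(0.85, 160)   # coated cell: 85% after 160 cycles
regular = per_cycle_retention(0.30, 160)   # regular cell: 30% after 160 cycles

print(f"coated:  {coated:.4%} retained per cycle")
print(f"regular: {regular:.4%} retained per cycle")
```

Under this simplification, the coated cells lose roughly 0.1% of capacity per cycle versus about 0.75% for regular lithium metal cells, which is why the gap compounds so dramatically by cycle 160.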

The new coating prevents dendrites from forming by creating a network of molecules that deliver charged lithium ions to the electrode uniformly. It prevents unwanted chemical reactions typical for these batteries and also reduces a chemical buildup on the anode, which quickly devastates the battery's ability to deliver power.

"Our new coating design makes lithium metal batteries stable and promising for further development," said the other co-lead author, Stanford PhD student Zhiao Yu.

The group is now refining their coating design to increase capacity retention and testing cells over more cycles.

"While use in electric vehicles may be the ultimate goal," said Cui, "commercialization would likely start with consumer electronics to demonstrate the battery's safety first."

Credit: 
Stanford University

Hiring committees that don't believe in gender bias promote fewer women

Is gender bias in hiring really a thing?

Opinions vary, but a new study by a UBC psychologist and researchers in France reveals that hiring committees that denied it was a problem were less likely to promote women.

"Our evidence suggests that when people recognize women might face barriers, they are more able to put aside their own biases," said Toni Schmader, a UBC psychology professor and Canada Research Chair in social psychology. "We don't see any favourability for or against male or female candidates among those committees who believe they need to be vigilant to the possibility that biases could be creeping in to their decision-making."

The study was unique in that findings were based on actual decisions made by 40 hiring committees in France, charged with filling elite research positions at the French National Centre for Scientific Research (CNRS) for two consecutive years. Past research in this area has relied mostly on hypothetical scenarios, such as presenting a large sample of participants with identical resumés bearing either male or female names and asking who they would hire. By contrast, the decisions made during this study had real impact on scientists' careers.

With cooperation from the CNRS, the researchers were able to first measure how strongly hiring committee members associated men with science. They did this using an "implicit association test" that flashes words on a computer screen and measures how quickly participants are able to assign those words to a particular category. People who make a strong association between men and science have to think a bit longer, and react more slowly, when challenged to pair female-related words with science concepts.
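
The reaction-time logic described above can be sketched in a few lines. This is a simplified illustration with made-up latencies, not the study's data or scoring pipeline; real IAT scoring (e.g. the Greenwald et al. D-score) also handles error trials, outlier latencies, and block structure:

```python
from statistics import mean, stdev

def iat_effect(compatible_ms, incompatible_ms):
    """D-like score: latency difference divided by pooled variability.

    compatible_ms:   reaction times (ms) pairing science with male words
    incompatible_ms: reaction times (ms) pairing science with female words
    A positive score means the participant was slower on the
    science/female pairing, i.e. a stronger implicit science = male
    association.
    """
    pooled_sd = stdev(compatible_ms + incompatible_ms)
    return (mean(incompatible_ms) - mean(compatible_ms)) / pooled_sd

# Illustrative latencies for one hypothetical participant.
science_male   = [620, 650, 600, 640, 615]   # "compatible" block
science_female = [780, 820, 760, 800, 790]   # "incompatible" block

print(round(iat_effect(science_male, science_female), 2))
```

Because the score is built from millisecond-level latency differences rather than self-report, it captures associations that respondents cannot easily conceal, which is the point made below.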

Both men and women on the hiring committees tended to show the science = male association, which is difficult to hide in such a test.

"There's research suggesting that you can document a 'think science, think male' implicit association showing up with kids as early as elementary school," Schmader said. "We learn associations from what we see in our environment. If we don't see a lot of women who are role models in science, then we learn to associate science more with men than women."

These implicit associations are distinct from people's explicit beliefs about women in science. In a separate survey that asked panellists directly whether women in science careers are impacted by such things as discrimination and family constraints, some hiring committees minimized those issues. Others acknowledged them.

When the researchers compared these implicit and explicit beliefs with the actual hiring outcomes, they learned that committees attuned to the barriers women face were more likely to overcome their implicit science/male associations when selecting candidates for the job. Among committees that believed "science isn't sexist," those which implicitly associated science more with men promoted fewer women. The difference was especially pronounced in Year 2 of the study, when committee members would have been less conscious of the fact that their selections were being studied.

The findings show that awareness and acknowledgement of the barriers women face might be key to making sure implicit biases don't affect hiring decisions. They also point to the importance of educating hiring committees about gender bias and how to guard against it, Schmader said.

The study was published today in Nature Human Behaviour.

Credit: 
University of British Columbia

Even scientists have gender stereotypes ... which can hamper the career of women researchers

However convinced we may be that science is not just for men, the concept of science remains much more strongly associated with masculinity than with femininity in people's minds. This automatic bias, which had already been identified among the general public, also exists in the minds of most scientists, who are not necessarily aware of it. And, in certain conditions, it may lead to otherwise careful scientific evaluation committees putting women at a disadvantage during promotion rounds involving men and women researchers. These are the findings of a study conducted by behavioural scientists from the Social and cognitive psychology laboratory (CNRS/Université Clermont Auvergne), the Laboratory of Cognitive Psychology (CNRS/Aix-Marseille Université), and the University of British Columbia (Canada), with the support of the CNRS Mission for the place of women. The study is published in the journal Nature Human Behaviour on 26 August 2019.

Women remain underrepresented in scientific research: at the French National Centre for Scientific Research (CNRS), across all disciplines, the average percentage of female researchers is 35%. And the higher the scientific research position, the more this percentage declines. Several reasons have been cited to explain these disparities: differences in levels of motivation, self-censorship ... but is discrimination also part of the story?

To find out, scientists in social and cognitive psychology studied 40 evaluation committees tasked with evaluating applications for research director positions at the CNRS over a period of two years. This is the first time that a research institution has carried out such a scientific study of its practices in the course of an annual nationwide competition covering the entire scientific spectrum.

This study shows that, from particle physics to the social sciences, most scientists, whether male or female, associate "science" and "masculine" in their semantic memory (the memory of concepts and words). This stereotype is implicit, which is to say that most often it is not detectable at the level of discourse. And it is equivalent to that observed among the general population.

Yet does this implicit stereotype have consequences on the decisions made by evaluation committees? Yes, when committees deny or minimise the existence of bias against women. Here, this is the case for around half of the committees. In these committees, the stronger the implicit stereotypes, the less often women are promoted. In contrast, when committees acknowledge the possibility of bias, implicit stereotypes, however strong they may be, have no influence.

Even if disparities between men and women in science have multiple causes and start at school (as the same authors have shown in other publications), this study indicates for the first time the existence of implicit gender stereotypes among male and female researchers across all disciplines - stereotypes that can harm the careers of women scientists.

Since 2019, at the instigation of the CNRS Mission for the place of women, members of evaluation committees have been invited to participate in training sessions on gender stereotypes and each committee has appointed a reference person in charge of gender equality issues. However, the authors of the study emphasise that, in order to be fully effective, this process must be accompanied by other measures aiming, on the one hand, to enlighten committee members on the exact conditions in which implicit stereotypes influence their decisions, and, on the other, to explain strategies likely to control this influence.

Credit: 
CNRS

Stable home lives improve prospects for preemies

image: Cynthia E. Rogers, MD, an associate professor of child psychiatry at Washington University School of Medicine in St. Louis, visits 4-day-old Abel Stanart in the newborn intensive care unit (NICU) at St. Louis Children's Hospital. Rogers' team has found that as premature babies grow, their mental health may be related less to medical challenges they face after birth than to the environment the babies enter once they leave the NICU.

Image: 
Matt Miller

As they grow and develop, children who were born at least 10 weeks before their due dates are at risk for attention-deficit/hyperactivity disorder (ADHD), autism spectrum disorder and anxiety disorders. They also have a higher risk than children who were full-term babies for other neurodevelopmental issues, including cognitive problems, language difficulties and motor delays.

Researchers at Washington University School of Medicine in St. Louis who have been trying to determine what puts such children at risk for these problems have found that their mental health may be related less to medical challenges they face after birth than to the environment the babies enter once they leave the newborn intensive care unit (NICU).

In a new study, the children who were most likely to have overcome the complications of being born so early and who showed normal psychiatric and neurodevelopmental outcomes also were those with healthier, more nurturing mothers and more stable home lives.

The findings are published Aug. 26 in The Journal of Child Psychology and Psychiatry.

"Home environment is what really differentiated these kids," said first author Rachel E. Lean, PhD, a postdoctoral research associate in child psychiatry. "Preterm children who did the best had mothers who reported lower levels of depression and parenting stress. These children received more cognitive stimulation in the home, with parents who read to them and did other learning-type activities with their children. There also tended to be more stability in their families. That suggests to us that modifiable factors in the home life of a child could lead to positive outcomes for these very preterm infants."

The researchers evaluated 125 five-year-old children. Of them, 85 had been born at least 10 weeks before their due dates. The other 40 children in the study were born full-term, at 40 weeks' gestation.

The children completed standardized tests to assess their cognitive, language and motor skills. Parents and teachers also were asked to complete checklists to help determine whether a child might have issues indicative of ADHD or autism spectrum disorder, as well as social or emotional problems or behavioral issues.

It turned out the children who had been born at 30 weeks of gestation or earlier tended to fit into one of four groups. One group, representing 27% of the very preterm children, was found to be particularly resilient.

"They had cognitive, language and motor skills in the normal range, the range we would expect for children their age, and they tended not to have psychiatric issues," Lean said. "About 45% of the very preterm children, although within the normal range, tended to be at the low end of normal. They were healthy, but they weren't doing quite as well as the more resilient kids in the first group."

The other two groups had clear psychiatric issues such as ADHD, autism spectrum disorder or anxiety. A group of about 13% of the very preterm kids had moderate to severe psychiatric problems. The other 15% of children, identified via surveys from teachers, displayed a combination of problems with inattention and with hyperactive and impulsive behavior.

The children in those last two groups weren't markedly different from other kids in the study in terms of cognitive, language and motor skills, but they had higher rates of ADHD, autism spectrum disorder and other problems.

"The children with psychiatric problems also came from homes with mothers who experienced more ADHD symptoms, higher levels of psychosocial stress, high parenting stress, just more family dysfunction in general," said senior investigator Cynthia E. Rogers, MD, an associate professor of child psychiatry. "The mothers' issues and the characteristics of the family environment were likely to be factors for children in these groups with significant impairment. In our clinical programs, we screen mothers for depression and other mental health issues while their babies still are patients in the NICU."

Rogers and Lean believe the findings may indicate good news because maternal psychiatric health and family environment are modifiable factors that can be targeted with interventions that have the potential to improve long-term outcomes for children who are born prematurely.

"Our results show that it wasn't necessarily the clinical characteristics infants faced in the NICU that put them at risk for problems later on," Rogers said. "It was what happened after a baby went home from the NICU. Many people have thought that babies who are born extremely preterm will be the most impaired, but we really didn't see that in our data. What that means is in addition to focusing on babies' health in the NICU, we need also to focus on maternal and family functioning if we want to promote optimal development."

The researchers are continuing to follow the children from the study.

Credit: 
Washington University School of Medicine

Multiple-birth infants have higher risk of medical mix-ups in NICU

New York, NY (August 26, 2019)--Multiple-birth infants had a significantly higher risk of wrong-patient order errors compared with singletons in neonatal intensive care units (NICUs), according to a new study by researchers at Columbia University Vagelos College of Physicians and Surgeons and NewYork-Presbyterian Hospital. The higher error rate was due to misidentification between siblings within sets of twins, triplets, or quadruplets.

The study was published online today in the journal JAMA Pediatrics.

Background

Each year, more than 135,000 twins, triplets, and higher-order multiples are born in the United States.

Many are delivered prematurely and require over a month of treatment in a neonatal intensive care unit (NICU).

Wrong-patient order errors--orders for medications, tests, and procedures that are inadvertently written for the wrong patient--are more common in NICUs than in general pediatric care units.

"The challenge for NICU staff is that newborns lack distinguishing physical characteristics," says lead author Jason Adelman, MD, MS, assistant professor of medicine at Columbia University Vagelos College of Physicians and Surgeons and chief patient safety officer at NewYork-Presbyterian/Columbia University Irving Medical Center. "This is compounded by the use of medical equipment that can obscure infants' physical features and the convention of identifying babies in the NICU with similar temporary names, such as Babyboy or Babygirl."

Adelman's prior research, published in Pediatrics, found that giving babies more distinctive names--such as Wendysboy--reduced the risk of wrong-patient order errors in the NICU by more than 36%.

Based on that study, The Joint Commission, which accredits healthcare organizations, now requires hospitals to adopt the new more distinct naming convention, along with other measures to minimize wrong-patient errors among newborns. "However, we suspected that the new naming convention may not protect multiples because siblings have the same last name and near-identical first names--for example, Wendysboy1 and Wendysboy2--placing them at risk of being mistaken for each other," says Adelman.

Study Details

To better understand the risk for multiples, the researchers analyzed more than 1.5 million electronic orders placed for 10,819 infants in six NICUs within two New York City hospital systems. Both systems were using the new naming convention.

The risk of making a wrong-patient order error was estimated using the Wrong-Patient Retract-and-Reorder (RAR) Measure, an algorithm devised by Adelman to identify near misses--orders incorrectly placed for one patient that are subsequently canceled and reordered for the intended patient. Although these near-miss errors are caught before causing harm, they occur frequently and follow the same pathway as errors that reach patients, making them a robust outcome for safety intervention studies.

What the Study Found

The risk of wrong-patient order errors was nearly doubled for multiples compared with singletons, the study found. Among multiples, errors between siblings--rather than unrelated infants--accounted for the excess risk. The risk grew with the number of siblings receiving care in the NICU: an error occurred in 1 in 7 sets of twins and in 1 in 3 sets of triplets and quadruplets.

The findings were consistent across study sites despite differences in patient populations and electronic health record systems.

"Our study suggests that the safeguards now commonly used to protect against medical errors in the NICU setting are not sufficient to prevent misidentification and medical errors among multiple-birth infants," Adelman says.

What's Next

Adelman's team is currently developing and testing a novel system for identifying newborns in the NICU.

"In the meantime, hospitals may be able to reduce the risk of errors among multiples in the NICU by using the newborns' given names when available or pseudonyms, and by switching from the temporary name to the given name as soon as it becomes available," says Adelman. "Clinicians may also consider encouraging parents--especially those expecting multiples--to select names or pseudonyms that can be used at birth."

Credit: 
Columbia University Irving Medical Center

Study finds big increase in ocean carbon dioxide absorption along West Antarctic Peninsula

image: Morning sunlight reflects off sea ice along the West Antarctic Peninsula.

Image: 
Drew Spacht/The Ohio State University

Climate change is altering the ability of the Southern Ocean off the West Antarctic Peninsula to absorb carbon dioxide, according to a Rutgers-led study, and that could magnify climate change in the long run.

The study, led by scientists at Rutgers University-New Brunswick, is published in the journal Nature Climate Change.

The West Antarctic Peninsula is experiencing some of the most rapid climate change on Earth, featuring dramatic increases in temperatures, retreats in glaciers and declines in sea ice. The Southern Ocean absorbs nearly half of the carbon dioxide - the key greenhouse gas linked to climate change - that is absorbed by all the world's oceans.

"Understanding how climate change will affect carbon dioxide absorption by the Southern Ocean, especially in coastal Antarctic regions like the West Antarctic Peninsula, is critical to improving predictions of the global impacts of climate change," said lead author Michael Brown, an oceanography doctoral student in the Center for Ocean Observing Leadership in the Department of Marine and Coastal Sciences at the School of Environmental and Biological Sciences.

The study tapped an unprecedented 25 years of oceanographic measurements in the Southern Ocean and highlights the need for more monitoring in the region.

The research revealed that carbon dioxide absorption by surface waters off the West Antarctic Peninsula is linked to the stability of the upper ocean, along with the amount and type of algae present. A stable upper ocean provides algae with ideal growing conditions. During photosynthesis, algae remove carbon dioxide from the surface ocean, which in turn draws carbon dioxide out of the atmosphere.

From 1993 to 2017, changes in sea ice dynamics off the West Antarctic Peninsula stabilized the upper ocean, resulting in greater algal concentrations and a shift in the mix of algal species. That's led to a nearly five-fold increase in carbon dioxide absorption during the summertime. The research also found a strong north-south difference in the trend of carbon dioxide absorption. The southern portion of the peninsula, which to date has been less impacted by climate change, experienced the most dramatic increase in carbon dioxide absorption, demonstrating the poleward progression of climate change in the region.

The results also demonstrate the often counterintuitive impacts of climate change. The scientists hypothesize that upper ocean stability off the West Antarctic Peninsula may ultimately decrease in the coming decades as sea ice continues to decline. Once sea ice reaches a critically low level, there won't be enough of it to prevent wind-driven mixing of the upper ocean, or to supply a sufficient amount of stabilizing meltwater. And that could result in reduced carbon dioxide absorption in the Southern Ocean over the long run.

A decrease in the ocean's ability to absorb carbon dioxide could lead to more warming worldwide by allowing more of the heat-trapping gas to remain in the atmosphere.

Credit: 
Rutgers University

New research predicts stability of mosquito-borne disease prevention

image: The team discovered a gene within A. aegypti that is likely involved in Wolbachia's ability to interfere with viral replication, shedding light on a possible mechanism for the blocking trait.

Image: 
Dan Lesher, Penn State

UNIVERSITY PARK, Pa.--To reduce transmission of dengue to humans, scientists have introduced Wolbachia bacteria to A. aegypti mosquitoes. Now a team of international researchers has found that Wolbachia's ability to block virus transmission may be maintained by natural selection, alleviating concern that this benefit could diminish over time.

"The results of our study have significant implications for the continued use of Wolbachia as a biocontrol agent for this disease," said Elizabeth McGraw, professor and Huck Scholar in Entomology at Penn State, and formerly a professor at Monash University.

The team also discovered a gene within A. aegypti that is likely involved in Wolbachia's ability to interfere with viral replication, shedding light on a possible mechanism for the blocking trait.

According to McGraw, Wolbachia bacteria occur in many insects, including some mosquitoes, but they do not naturally occur in A. aegypti, the mosquito species that transmits dengue virus. The bacteria were introduced to A. aegypti in 2011 and since have been released in trials in several countries, including Australia, Brazil, Colombia, Vietnam and Indonesia, to assess the ability of the bacterium to control the spread of disease.

To investigate the stability of Wolbachia as a virus blocker within A. aegypti, McGraw and her colleagues used artificial selection in the laboratory to select for high and low dengue blocking abilities in A. aegypti. Next, they sequenced the genomes of these two groups, identified areas that differed and measured the insects' fitness--or ability to survive and reproduce. The team's results appear today (Aug. 26) in Nature Microbiology.

"There has been concern that dengue virus could evolve an ability to sneak past Wolbachia or that the insects themselves could evolve resistance to Wolbachia," said McGraw. "We found that mosquitoes exhibiting better blocking had increased fitness, at least under idealized conditions in the laboratory, suggesting the potential for natural selection to maintain blocking."

By comparing the genomes of high and low dengue blockers, the team identified a specific gene--AAEL023845, which encodes a cadherin protein--that likely is involved in virus blocking.

"We found that much of the genetic variation determining Wolbachia's anti-viral effect was within a mosquito gene controlling cell-to-cell adhesion," said Suzanne Ford, postdoctoral researcher at the University of Oxford and former postdoctoral scholar in entomology at Penn State. "Increased expression of this gene was associated with increased Wolbachia-mediated protection against viruses."

According to McGraw, a previous theory suggested that Wolbachia, as a foreign substance within mosquitoes, could be triggering the insects' immune systems, which in turn, could suppress virus activity. Another theory suggested that Wolbachia could be competing with viruses for nutrients or physical space within mosquitoes.

"Our data do not support either of these theories," said McGraw. "Instead, our results suggest that the cadherin gene may affect cell-to-cell signaling or movement of viruses within cells, altering the virus's ability to enter cells, replicate within them and then exit."

McGraw noted that researchers in the field have long struggled with the concept of Wolbachia-mediated blocking--what it is and how it works.

"Our findings give us a new direction to study with respect to the basis of Wolbachia's blocking of dengue virus," she said.

Ford added that the research is important to understanding the long-term stability of Wolbachia as a biocontrol agent against mosquito-borne viruses.

"By defining this mechanism and understanding how selection might act upon it," said Ford, "we will be better able to predict how effective Wolbachia will be as a biocontrol agent."

Credit: 
Penn State

Parasitic worms infect dogs, humans

image: Flinders University researcher Mira Beknazarova tested more than 270 dog faeces samples collected in remote Australian communities.

Image: 
Flinders University

A human infective nematode found in remote northern areas of Australia has been identified in canine carriers for the first time.

Flinders University environmental health researchers, with experts in the USA, have found a form of the soil-borne Strongyloides worm in faeces collected from dogs.

Strongyloidiasis, caused by several species of Strongyloides, is estimated to infect up to 370 million people around the world, mainly in places with poor sanitation in developing or disadvantaged communities.

The infection in the gut remains hidden, but can cause abdominal pain, diarrhoea and bloating. Some people may experience nausea, vomiting, weight loss, weakness or constipation, but doctors may not test for the condition. In chronic infections, skin and chest symptoms can persist, yet the condition often goes undetected.

In the Australian study, 273 dog faeces samples and four human samples were screened using real-time PCR; 47 dog and four human DNA samples were then amplified by conventional PCR and sequenced. The complete set of amplified sequence variants (ASVs) was then analysed.

"Ultimately, we were able to confirm for the first time that potentially zoonotic S. stercoralis populations are present in Australia and suggest that dogs might represent a potential reservoir of human strongyloidiasis in remote Australian communities," says Flinders PhD candidate Mira Beknazarova.

"Our study was able to independently support previous reports of at least two genetically distinct groups of S. stercoralis; one infecting both dogs and humans and another group that is specific to dogs," she says.

Flinders University environmental health senior lecturer Dr Kirstin Ross, who regularly consults with remote Indigenous communities in Australia, says the study does not show direct transmission from dogs to humans, or vice-versa, but builds on the need for further investigations into preventative measures including better sanitation.

"Much needs to be done if you consider the risk to older and younger residents of Indigenous and non-Indigenous communities in northern Australia who are at risk of infection," says Dr Ross, who has studied strongyloidiasis for many years.

"The latest results strongly support moves to treat both humans and dogs for the infection," she says.

The Australian study is similar to a separate study in Cambodia where Strongyloides was found in dog faeces.

The infective form of the worm, the larva, lives in soil that has been contaminated by the faeces of an infected person. If a person comes in contact with this soil, the larvae can burrow through the skin and make their way to the lungs and then the gut, where they eventually become adult worms.

Testing and treating for the larvae in human blood or faeces is often delayed by the slow onset of symptoms.

Credit: 
Flinders University

Identification of all types of germ cell tumors

Germ cell tumors constitute a diverse group of rare tumors, which occur in the testes, ovaries and also in other places. Some germ cell tumors exist prenatally, while others present during or after puberty. The majority are benign. Malignant germ cell tumors most frequently appear in the testicles of adolescent and young adult men. Fortunately, these are treatable; even when metastasized, survival rates are over 80 per cent.

Various germ cell tumors

As the name suggests, germ cell tumors develop from germ cells: sperm cells, oocytes and their progenitors. When an oocyte is fertilized, the development of an embryo is initiated. The fertilized oocyte has the potential to become any type of cell. Early in development, cells lose this ability, and the number of cell types they may become is restricted. 'However, in a new individual there need to be new germ cells that have the potential to become all cell types', explains prof. dr. Leendert Looijenga (Princess Máxima Center and Erasmus MC). 'These cells carry the potential danger of becoming germ cell tumors by spontaneously starting embryonal development.'

Different molecular mechanisms are needed to prevent spontaneous embryonal development by germ cells, since this would result in tumors. Sometimes things go awry in one of these protective mechanisms. 'Depending on which mechanism is affected, until recently we distinguished five types of germ cell tumors', says Looijenga. Through prolonged studies, Looijenga and his colleague prof. dr. Wolter Oosterhuis (Erasmus MC) gained new insights and expanded the number of tumor types to seven. They described their findings in the journal Nature Reviews Cancer.

Expansion of the tumor types

One of the newly described categories is type 0. 'This is in fact an underdeveloped Siamese twin', says Oosterhuis. 'At the points of attachment of the Siamese twins, a second embryo manifests as a germ cell tumor. The baby is born with the tumor.'

Type 0 and the five other, previously described types differ from other types of cancer in that they do not arise as a result of mutations in the DNA, but are caused instead by failure of the mechanism that should prevent spontaneous embryonal development.

Germ cell tumor type VI is the only type that is caused by mutations in the DNA. This very seldom occurs spontaneously in the body; more often, adult cells are genetically modified in the laboratory for therapeutic purposes, for instance in regenerative medicine. 'These cells may only be used safely when we can prevent the development of germ cell tumor type VI', states Looijenga.

The new insight into the development of germ cell tumors, and the classification based upon it, constitutes a solid foundation for future epidemiologic, fundamental and clinical research. Moreover, the classification is highly valuable in direct patient care. 'Treatment sensitivity and also resistance can be mapped more reliably. This may contribute to improved treatments in the future,' says Looijenga.

Credit: 
Princess Máxima Center for Pediatric Oncology

An innovative new diagnostic for Lyme disease

image: Adult deer tick, Ixodes scapularis. Ticks infected with the bacterial pathogen B. burgdorferi can transmit the pathogen to humans, resulting in Lyme disease.

The illness, which may not be immediately detected, causes fever, headache, muscular issues and fatigue. If untreated, rheumatologic, cardiac and neurological complications can result. Currently, there is no vaccine against the illness.

Image: 
Scott Bauer. PD-USGov-USDA-ARS

When researchers examined the mitochondrial DNA of Ötzi, a man entombed in ice high in the Tyrolean Alps some 5,300 years ago, they made a startling discovery. Secreted within the tangles of the ice man's genetic code was evidence he'd been infected with a bacterial pathogen, Borrelia burgdorferi. Ötzi is the first known case of Lyme disease.

Today, Lyme disease is a mounting health concern, with estimates of over 300,000 cases in the US annually. The illness, which produces a constellation of symptoms, has been notoriously tricky to diagnose.

In new research, Joshua LaBaer, executive director of the Biodesign Institute at Arizona State University and his colleagues describe an early detection method for pinpointing molecular signatures of the disease with high accuracy.

Effective treatments for infectious disease usually require early identification of the suspected pathogen or a diagnostic signal of its presence in fluids or tissues, known as a biomarker. This goal, however, has proven elusive in the case of Lyme disease. Existing methods for diagnosis are imprecise and often require days or weeks for test results. Currently, there is no vaccine against the disease.

The new technique, which uses a multi-platform approach to isolate candidate biomarkers for B. burgdorferi, may be applicable to a broad range of infectious diseases. The study uncovers six potential biomarkers that may be used in combination to make an accurate and early identification of Lyme disease.

While the research was conducted in an animal model of the disease and will require final validation in humans, the study results suggest the method may offer a powerful new approach to identifying this enigmatic illness, as well as other diagnostically challenging infectious diseases.

"This study represents a growing new paradigm for medical diagnostics," LaBaer says. "Increasingly, we have realized that the old approach of measuring a single analyte as an indicator for all infected individuals does not work for many pathogens. Ironically, the old approach was partly established with another spirochete, Treponema pallidum (syphilis), where a single antibody was sufficient to identify most cases. But we now see that many pathogens are not so simple because different patients sometimes respond differently and require a multi-platform approach like this to find multiple markers. A single test would miss too many patients."

In addition to directing the Biodesign Institute, LaBaer is the director of the Biodesign Virginia G. Piper Center for Personalized Diagnostics. He is joined by Biodesign colleagues D. Mitchell Magee and Lusheng Song.

David P. AuCoin of the Department of Microbiology and Immunology, Reno School of Medicine devised and oversaw the study and is the paper's corresponding author.

The research recently appeared in the journal Frontiers in Cellular and Infection Microbiology.

Lurking in the underbrush

For a variety of reasons, Lyme disease, a potentially serious illness transmitted by blacklegged ticks, has been on an alarming ascent. Warmer winters have allowed ticks to become active earlier in the season and to colonize new regions, while changing patterns of land use have facilitated the reproduction and spread of these arachnids.

In the northeastern, mid-Atlantic, and north-central United States, Lyme disease is spread by the blacklegged tick or deer tick, Ixodes scapularis, whereas the Western blacklegged tick, Ixodes pacificus, spreads the disease on the Pacific Coast.

Depending on their species and stage of development, blacklegged ticks can feed on birds, reptiles or amphibians, but they acquire the bacterial pathogen responsible for Lyme from mammals, when they feed on the blood of infected hosts.

Late spring and summer present the highest risk for transmission of Lyme disease. During these months, young ticks in their nymphal stage are active and in search of blood, which is essential to their growth and survival.

Ticks in the nymphal stage are tiny--about the size of a poppy seed--and often difficult to detect. Unlike many blood-feeding insects, ticks are not capable of flying or jumping. Instead, they rest patiently on the tips of grasses and shrubs in a position known as "questing", attaching their lower legs to leaves or grass while keeping their upper legs outstretched in search of a suitable host. When the host passes, the waiting tick climbs aboard and begins feeding on the host's blood.

The saliva of ticks is a complex cocktail of ingredients, including analgesics that help prevent detection of the bite as well as chemicals inducing vasoconstriction, anticoagulants, anti-inflammatory histamines and immune-modulating agents.

Tracking infection

In humans, a bullseye-shaped rash known as erythema migrans often follows infection with the bacterium and appears at the site of the tick bite. In some cases, however, no rash occurs. Following infection, a variety of symptoms may ensue, including fever, headache, muscular issues and fatigue.

Generally, Lyme disease can be successfully treated with a course of antibiotics lasting several weeks, provided the disease is detected early enough. This is where the challenge arises. Often, a diagnosis of Lyme only occurs well after symptoms appear, and the peculiar variety of Lyme indicators results in frequent misdiagnosis. If left untreated, the effects of Lyme disease can become long-lasting and severe and may include rheumatologic, cardiac and neurological complications.

Unlike many other pathogens, the bacterial spirochete responsible for Lyme disease cannot easily be grown in the lab, making proper diagnosis dependent on clinical assessment, rather than on laboratory culturing of bacteria.

Presently, there is no consistently reliable signature of infection or biomarker for Lyme disease. Instead, existing assays test for the immune system's production of antibodies to the bacteria, a time-consuming and inexact method requiring specialized laboratory analysis and several weeks for proper identification.

Detection of Lyme disease based on patient immune response has several drawbacks:

The time required to produce an antibody response following infection can delay treatment by several weeks.

Not all infected patients register as positive in such tests, requiring further testing for confirmation.

Finally, such tests may not distinguish between new infections and those previously treated.

Such delays and inaccuracies are a serious drawback for Lyme detection, as they can allow the disease to disseminate throughout the body, with potentially dangerous consequences. Further, existing analyses can easily miss cases of Lyme disease when antibody production is below detectable limits, or produce false positives in uninfected cases.

The new study attempts to establish reliable biomarkers for Lyme disease with a multi-platform approach. Methods of both direct and indirect detection were used, including mass spectrometry analysis and protein microarrays, which can capture the binding of specific disease antigens present in blood.

The use of multiple platforms enabled the discovery of six candidate biomarkers for Lyme disease across overlapping samples and tests. Direct detection of proteins is performed using mass spectrometry (MS). The indirect diagnostic method ferrets out microbial antigens of B. burgdorferi in blood or, potentially, in urine, using a step known as immunoprecipitation. This biomarker-enrichment technique allows researchers to pinpoint proteins circulating during infection at levels below those detectable by MS--a crucial advantage, particularly during the disease's earliest phases.

Further validation will be needed before the new technique can find its way into clinical use, but a successful biomarker test for Lyme disease would be an impressive feat and may open a new path for diagnosing a broad range of infectious diseases that resist detection by conventional means. Indeed, the multiplatform technique outlined in the new study is currently being used to explore antigenic biomarkers for a pair of diagnostically challenging infectious pathogens, Francisella tularensis and Burkholderia pseudomallei.

Credit: 
Arizona State University