Culture

Fonts in campaign communications have liberal or conservative leanings

image: Political signs in front yard

Image: Kristi Blokhin/iStock.

Yard signs for a local politician captured the curiosity of Katherine Haenschen.

"I was driving through the region and noticed the same campaign was using a different font on signs in rural areas than on the signs in town," said Haenschen, an assistant professor in the Department of Communication. "I thought, why would this candidate be using multiple fonts?"

Haenschen, an expert in political messaging, and Daniel Tamul, also an assistant professor in the Department of Communication, transformed the question into a captivating research project.

Their resulting study, "What's in a Font?: Ideological Perceptions of Typography," examines the potential impact on voters if fonts carry political attributes.

Haenschen and Tamul reached the following key conclusions through the study:

Individuals perceive fonts to have liberal or conservative leanings.

The more people view a font as aligned with their ideology, the more they favor it.

Fonts in the serif category -- those with small lines or strokes attached to the ends of their letterforms -- are viewed as more conservative than fonts in the sans serif group, though differences exist within font families.

"This research is of interest to anyone who cares about political communications, and the results have clear implications for political campaign professionals," said Haenschen. "When you're choosing a candidate's visual identity, you need to consider how people perceive that font."

The findings came from two survey experiments. The first varied typeface classification (serif or sans serif) and typeface style (regular, bold, italic).

A total of 987 survey participants read the phrase "the quick brown fox jumped over the lazy dog" presented in each typeface style and in two typefaces representing the serif and sans serif categories: Times New Roman and Gill Sans.

The respondents then rated each typeface as liberal or conservative and answered several demographic questions related to their political ideology, party affiliation, age, gender, and race.
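As a rough illustration of the analysis this design implies (toy numbers, not the study's data; the 7-point scale and all values below are assumptions), one can compare mean ideology ratings across typefaces:

```python
from statistics import mean, stdev

# Hypothetical ratings on a 7-point scale (1 = very liberal,
# 7 = very conservative), one value per toy respondent.
ratings = {
    "Times New Roman (serif)": [5, 6, 4, 5, 7, 5, 6, 4],
    "Gill Sans (sans serif)":  [3, 4, 2, 3, 4, 3, 2, 4],
}

for typeface, scores in ratings.items():
    print(f"{typeface}: mean={mean(scores):.2f}, sd={stdev(scores):.2f}")
```

In the pattern of results the study reports, serif typefaces such as Times New Roman would land on the conservative side of such a scale and sans serifs on the liberal side.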

During the second experiment, Haenschen and Tamul used a wider range of typefaces, including multiple typefaces within the same font family.

Participants read a phrase or a name written in one of two serifs (Jubilat or Times New Roman), one of two sans serifs (Gill Sans or Century Gothic), and one display font (Sunrise, Birds of Paradise, or Cloister Black Light).

The researchers said they chose the Jubilat font because it was used in Sen. Bernie Sanders' 2016 presidential bid, and Century Gothic because of its close approximation to the Gotham font used by former President Barack Obama during his 2008 campaign.

Overall, results showed that typefaces, typeface classifications, and typeface styles are perceived to have different ideological leanings, and partisanship moderates ideological perception.

Haenschen emphasized that the exploratory study suggests many avenues for further research.

"This study shows that font plays a role in American political communication, conveying ideology through the anatomy of its letterforms," said Haenschen. "Through this research, we lay the groundwork for future studies that may identify relationships between fonts and persuasive outcomes in political communication."

Credit: 
Virginia Tech

Study shows effects of Chinese divorce law on women's wellbeing

In 2011, China's Supreme Court dealt a blow to the property rights of women by ruling that family homes purchased before marriage automatically belong to the registered buyer upon divorce, historically the husband.

Previously, under China's 1980 Marriage Law, marital houses were considered joint property. While gender-neutral in its language, the 2011 ruling seemed likely to advantage men over women, since most family homes in China are deeded to husbands, who by custom are expected to provide a house as a prerequisite for marriage. The new interpretation, which overruled two previous judicial rulings strengthening women's property rights, raised concerns that China was regressing on gender equality.

In a new study, Yale sociologist Emma Zang examined the consequences of the 2011 judicial interpretation on the wellbeing of men and women. Published in the Journal of Marriage and Family, it found that while the judicial interpretation initially diminished women's wellbeing by depriving them of property rights and economic autonomy, the negative effects weakened over the long term.

Zang's analysis showed that couples began adapting to the reform through arrangements more in line with Chinese tradition mandating that married couples share property equally. She found, for example, that couples circumvented the ruling by transferring ownership to their children.


"The effects of the legal change are more complicated than people thought," said Zang, an assistant professor of sociology at Yale. "It's not a simple case of men benefitting and women being harmed. Rather, couples are adapting to protect each other's wellbeing while adhering to China's 'bride price' custom, which calls on husbands to provide family homes but share their property equally with their wives."

Zang's analysis is based on data from the 2010, 2012, 2014, and 2016 waves of the China Family Panel Studies, a nationally representative longitudinal survey conducted by the Institute of Social Science Survey at Peking University, which collects information at the individual, family, and community levels and examines social and economic changes. The time frame allowed Zang to analyze people's wellbeing from before the judicial ruling through five years after it went into effect.

She identifies four socioeconomic factors that drove the court's decision to amend China's divorce law. First, people were acquiring unprecedented wealth amid the booming Chinese economy, leading to the establishment of property laws in 2007. Second, housing prices increased more than 10% annually beginning in 2003, meaning people were investing more than ever in homes. Third, divorce rates started climbing, raising concerns among men's families about losing household wealth through divorce. Finally, social media started reporting divorce cases involving property disputes, which made people question the credibility of the court system.


The court ruling had potentially profound effects in a country of about 1.3 billion people with marriage and home-ownership rates of 73% and 90%, respectively. The decision potentially affected up to 890 million individuals, Zang explained.

She found that, in the short term, the reform significantly harmed women's wellbeing. Her analysis showed that the change caused a 1-point decrease on a 5-point life satisfaction scale for 1 in 15 married women, a substantial outcome given the vast number of individuals the reform potentially affected, Zang said. The negative effect was particularly large for women at a high risk of divorce, she said. She determined that the ruling had no significant effects, positive or negative, on men's wellbeing.

In the long run, Zang found that couples adapted to lessen the reform's impacts on wives. About 9% of people in the study added the wives' names to the deeds, more than 3% transferred ownership from husband to wife, and 29.5% transferred ownership to their children. Overall, about 42% of individuals surveyed changed ownership status of their homes within five years of the ruling.

Despite these adaptations, the change to the divorce law has lingering costs, Zang said.

Most women did not fully regain their property rights, she said. "The reform also has led to women doing more housework, which leaves them less leisure time. The study demonstrates that policymakers must consider that purportedly gender-neutral policies, like the change to China's divorce law, can create gendered consequences. Social norms and cultural context must be taken into account when pursuing these kinds of reforms."

Credit: 
Yale University

Increasing opportunities for sustainable behavior

To mitigate climate change and safeguard ecosystems, we need to make drastic changes in our consumption and transport behaviors. A new IIASA study shows how even minor changes to available infrastructure can trigger tipping points in the collective adoption of sustainable behaviors.

Decades of research in social and ecological psychology, cognitive science, ecology, and cultural evolution have shown that human behavior is influenced by our environments, habits, skills, and attitudes. These behaviors, in turn, alter our environments and can be socially learned and transmitted. It is however less clear how all these processes work together to shape the evolution of sociocultural and socioecological systems. Understanding this is important given that we need radical, systemic change in human behaviors and cultures to reach the Sustainable Development Goals, mitigate climate change, and safeguard the ecosystems that serve as our life-support systems.

In a new study published in the journal One Earth, IIASA researchers explored how collective behavior patterns emerge systemically as a product of personal, social, and environmental factors. Using an agent-based model--a computational method for simulating interactions between individuals and environments--the study illustrates how personal aspects like attitudes and habits, social networks, and available infrastructure shape the way sustainable behaviors are collectively adopted. The study especially emphasizes the environmental aspect by examining how changes in opportunities to behave sustainably--such as increases in the number of bicycle lanes in a city--affect the adoption of sustainable behaviors like cycling. The researchers used Copenhagen, a city known for its well-developed cycling culture, as a case study. The model was empirically validated by modeling the evolution of cycling and driving patterns in the city.

The results show that even linear increases in opportunities for pro-environmental behaviors--in Copenhagen's case, adding more bicycle friendly infrastructure--can have much larger effects on the adoption of sustainable behaviors than often assumed. This is because when the environment makes it easier for someone to adopt a certain behavior, this not only has an effect on the individual's own habits, but the behavior can also be copied and learned by others. In Copenhagen specifically, a rapid increase in the proportion of cyclists in the city since the 1990s has been attributed not only to the emergence of a cycling culture, but also to heavy investment into cycling infrastructure.
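The dynamic described here, in which a linear increase in infrastructure yields a nonlinear jump in adoption, can be sketched with a toy threshold model (a minimal illustration, not the authors' empirically validated Copenhagen model; the Normal(0.8, 0.15) threshold distribution is an assumption): each agent adopts cycling once the current share of adopters, plus the push from infrastructure, exceeds its personal threshold.

```python
from statistics import NormalDist

def equilibrium_adoption(infrastructure, steps=1000):
    """Granovetter-style threshold model: iterate
    share -> F(share + infrastructure), where F is the CDF of the
    agents' adoption thresholds. Social feedback (each adopter lowers
    the effective barrier for others) can make the equilibrium jump."""
    thresholds = NormalDist(mu=0.8, sigma=0.15)
    share = 0.0
    for _ in range(steps):
        share = thresholds.cdf(share + infrastructure)
    return share

# A linear sweep of the infrastructure level produces a sudden tipping point:
for infra in (0.3, 0.4, 0.5, 0.6, 0.7):
    print(f"infrastructure={infra:.1f} -> adoption={equilibrium_adoption(infra):.3f}")
```

With these assumed parameters, adoption stays near zero below roughly infrastructure = 0.5 and saturates just above it, the kind of "phase transition" behavior the study describes.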

"The drastic effect that the physical environment, for instance cycling infrastructure, has on our collective behavior patterns is surprising. Even minor changes in the structure of the environment can trigger so-called 'tipping points' or 'phase transitions' in the collective adoption of sustainable behaviors like cycling. Reaching such tipping points is precisely what is needed to enact society-wide behavior change," explains study lead author Roope Kaaronen, who worked on the study as a participant of the 2019 IIASA Young Scientists Summer Program. Kaaronen is currently a PhD student at the University of Helsinki's Institute of Sustainability Science.

The authors point out that for large-scale behavior change to occur, it will simply not be enough to rely on changing people's attitudes or increasing environmental awareness.

"We need to understand how behavior patterns emerge from a systems perspective, and learn to locate the leverage points in these systems. Infrastructure that makes pro-environmental behaviors easy, the 'path of least resistance', is crucial in this regard and must form part of governments' action plans for sustainable urban planning and development," says study supervisor and coauthor Nikita Strelkovskii--a researcher in the IIASA Advanced Systems Analysis Program.

It is clear that we have to start designing our everyday environments in ways that make sustainable behaviors the default option and as easy as possible. According to the authors, many (if not most) European cities are currently struggling with this, and city planners are instead filling cities with shopping malls and hypermarkets that encourage unsustainable consumerist behavior. This study illustrates that changes in the action opportunities of everyday environments can act as leverage points, an important step in understanding how to instigate collective behavior change through urban policy and design.

Credit: 
International Institute for Applied Systems Analysis

Stanford researchers conduct census of cell surface proteins

image: Olfactory projection neurons (red) are visible against the outline of a Drosophila fly brain (blue).

Image: Liqun Luo

Stanford scientists have completed the first global census of diverse proteins sprouting tree-like from the outer membrane of a cell. These cell surface proteins govern how cells interact with one another and how they assemble into tissues and organs such as hearts and brains.

"Cell architecture is determined by cell-cell interactions, and these interactions are mediated by molecules on the cell surface. They're the business end of things," said Jiefu Li, a graduate student in the biology department at Stanford University.

Knowing what proteins are embedded on cell surfaces and what they do is a first step toward understanding how individual cells fashion themselves into intricate structures like organs--and how to prevent or treat diseases that occur when these molecular social cues are absent or misread.

"Our brains are made up of 100 billion neurons that make 100 trillion synaptic connections," said Liqun Luo, the Ann and Bill Swindells Professor in the School of Humanities and Sciences at Stanford. "How do brain cells know who to make connections with during development? That's a fundamental question in neurobiology."

Luo, Li, and their colleagues took an important new step toward answering this question in a study published online on January 16 in the journal Cell. In it the scientists report surveying cell surface proteins on a specific type of brain cell in fruit flies, whose neurons closely mimic our own.

By comparing the brains of young and adult flies, the scientists showed how the population of proteins changes over time as the insects mature and the needs of the cells change.

An agnostic approach

A coauthor on the study, Alice Ting, a professor of biology and genetics, said the big difference between this approach and previous efforts to understand cell surface proteins is that the group didn't know in advance what they were looking for.

In the past, the search for new cell surface proteins was heavily influenced by proteins that had been discovered before. If a cell surface protein was known to help similar neurons connect with one another, for example, scientists would explore other proteins in the same molecular family to see whether they had similar properties. This approach worked, but it limited the likelihood of encountering completely new cell surface proteins.

"What's cool here is that we're not looking only for proteins that resemble known cell surface proteins with the functions that we're interested in," said Ting, who is also a Chan Zuckerberg Biohub investigator. "We can go in without any prior assumptions and just see what turns up."

This powerful agnostic approach was made possible through "proximity labeling," a technique pioneered by Ting's group that involves genetically engineering proteins to mark their closest neighbors with a unique molecular tag.

"It's like spray painting," Ting said. "When spray painting, the paint distribution is going to be densest in the regions closest to your paint bottle."

Biotin

For the new study, Luo's lab employed proximity labeling to mark all of the cell surface proteins on a particular group of neurons in the brains of developing and adult fruit flies.

The scientists used a special strain of flies bred to produce a catalytic protein, or enzyme, borrowed from horseradish plants. This enzyme, called peroxidase, functions like an unopened can of spray paint and resides only on the surface of the particular kind of neuron in the fly brain that the scientists wanted to study.

At different points in the flies' development, the scientists removed the brains and did the molecular equivalent of triggering the spray cans to paint all the nearby proteins. This involved immersing the cells in chemical baths that caused the horseradish proteins to mark their cell surface neighbors with a molecule called biotin.

Next, the researchers broke apart the fly brains and their constituent neurons, and pulled out all the proteins that got sprayed - or, rather, that contained the biotin tag. "The beauty here is that you're only enriching the cell surface proteins and ignoring all of the proteins inside the cell," said Luo, who is also an investigator at the Howard Hughes Medical Institute.

This study was the first time the researchers used the technique in a living organism rather than in a petri dish. "To get this to actually work in vivo"--in intact brains of live organisms--"was quite a technical leap," Ting said.

Beyond the lamppost

In relatively few steps, the scientists were able to conduct a complete census of the more than 700 different kinds of protein that dot the surface of olfactory projection neurons in fruit flies. The team's haul included many cell surface proteins that had been painstakingly identified previously using other techniques.

Even more intriguing, their survey also turned up 20 new cell surface proteins in the brains of the developing flies, which subsequent experiments by the team revealed to be important for brain wiring.

"This shows that this approach is really unbiased," Ting said. "It allows you to look away from under the lamppost to find proteins that you may miss otherwise."

Credit: 
Stanford University - School of Humanities and Sciences

Inequality is bad for society, economic prosperity good

image: Prof. Dr. Jan Delhey, the study's first author, of the Chair for Macrosociology at the University of Magdeburg

Image: Harald Krieg

Rich countries vary a lot when it comes to health and social problems. A comparison of social ills ranging from intentional homicides to obesity rates in 40 rich societies shows that Asian and European countries fare much better than Anglophone and Latin American countries. The most problem-ridden countries are Trinidad and Tobago, Uruguay and the United States. The positive end of the list is headed by Japan, South Korea and Singapore, followed by Iceland, Norway and Switzerland. Germany ranks 15th just behind Austria. While economic inequality is associated with more social ills, economic prosperity dampens them.

These are the results of a study conducted by a team of sociologists at the Otto von Guericke University Magdeburg (OvGU) in Germany. Prof. Jan Delhey and Leonie Steckermeier (MA) investigated, for 40 high-income countries from all world regions, whether income inequality and national prosperity can help explain why some countries are more problem-ridden than others.

In a cross-national comparison, countries with a bigger income gap between rich and poor indeed have more social ills. Inequality is bad for society, as it goes along with weaker social bonds between people, which in turn make health and social problems more likely. At the same time, richer countries have fewer social ills. Economic prosperity goes along with stronger social bonds in society and thereby makes health and social problems less likely. "This is the main reason behind the geographic pattern we found, with social ills being more widespread in the Americas and the Anglophone New World countries, and less widespread in European and particularly Asian countries," explains Jan Delhey, first author of the research paper.

The good news is that in most countries social ills improved somewhat between 2000 and 2015, although it is difficult to pin down why. In Europe at least, rising prosperity seems to have led to better societies with fewer social ills, but for the non-European countries it remains unclear why levels of social ills changed. "This shows that other factors beyond income inequality and economic prosperity play a role in the development of social ills, too. Still, our results prompt scholars as well as the public to re-think the widespread negative image of contemporary society. In many countries, there is small progress towards a better society with fewer social ills," explains Leonie Steckermeier, co-author of the study.

The empirical analysis was based on a set of six social ills, namely low life expectancy, infant mortality, and obesity as health issues, and intentional homicides, teenage pregnancy, and imprisonment rate as social problems. The data were compiled from international sources such as the World Bank and the World Health Organization for the years 2000 to 2015. The structure of the compiled dataset allows comparison of health and social problems between countries and across time. The research was carried out as part of the project "Inequality, Status Anxiety and Social Ills" at the Chair for Macrosociology at the OvGU and funded by the German Science Foundation (DFG).
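A composite index built from indicators like these can be sketched as follows (toy numbers and equal weighting are assumptions for illustration, not the study's actual data or method): each indicator is oriented so that higher means worse, z-standardized across countries, and averaged.

```python
from statistics import mean, stdev

# Hypothetical values for six indicators, oriented so higher = worse:
# life-expectancy shortfall (years), infant mortality (per 1,000 births),
# obesity (%), homicides (per 100,000), teenage pregnancies (per 1,000),
# imprisonment (per 100,000).
data = {
    "CountryA": [2.0, 6.0, 30.0, 5.3, 21.0, 650.0],
    "CountryB": [0.5, 2.0, 4.0, 0.3, 4.0, 40.0],
    "CountryC": [1.0, 3.5, 15.0, 1.0, 7.0, 100.0],
}

def zscores(values):
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

countries = list(data)
columns = list(zip(*data.values()))              # one tuple per indicator
z_by_indicator = [zscores(list(col)) for col in columns]
index = {c: mean(z[i] for z in z_by_indicator) for i, c in enumerate(countries)}

for country, score in sorted(index.items(), key=lambda kv: kv[1]):
    print(f"{country}: {score:+.2f}")            # lower = fewer social ills
```

Standardizing before averaging keeps any one indicator (such as imprisonment, measured on a much larger scale) from dominating the composite.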

Credit: 
Otto-von-Guericke-Universität Magdeburg

The Blue Acceleration: Recent colossal rise in human pressure on ocean quantified

image: Global trends in use of the marine environment. Usage reached an inflection point around the turn of the new millennium.

Image: One Earth

Human pressure on the world's ocean accelerated sharply at the start of the 21st century and shows no sign of slowing, according to a comprehensive new analysis on the state of the ocean.

Scientists have dubbed the dramatic rise the "Blue Acceleration". The researchers from the Stockholm Resilience Centre, Stockholm University, synthesized 50 years of data from shipping, drilling, deep-sea mining, aquaculture, bioprospecting and much more. The results are published in the journal One Earth, 24 January.

The scientists say the largest ocean industry is the oil and gas sector, responsible for about one third of the value of the ocean economy. Sand and gravel are the ocean's most mined minerals, meeting demand from the construction industry. As freshwater becomes an increasingly scarce commodity, around 16,000 desalination plants have sprung up around the world in the last 50 years, with a steep rise since 2000, according to the analysis.

Lead author Jean-Baptiste Jouffray from the Stockholm Resilience Centre said, "Claiming ocean resources and space is not new to humanity, but the extent, intensity, and diversity of today's aspirations are unprecedented."

The industrialization of the ocean took off at the end of the last century, driven by a combination of technological progress and declining land-based resources.

"This Blue Acceleration is really a race for ocean resources and space, posing risks and opportunities for global sustainability."

The study highlights some positive human impacts. For example, the area protected from some exploitation has increased exponentially with a surge since 2000 that shows no signs of slowing. And offshore wind farm technology has reached commercial viability in this period allowing the world to reduce reliance on fossil fuels.

The authors conclude by calling for increased attention to who is driving the Blue Acceleration, what is financing it, and who is benefiting from it. The United Nations is embarking on a "decade of the ocean" in 2021. The scientists say this is an opportunity to assess the social-ecological impacts and manage ocean resources for long-term sustainability.

They highlight a high degree of consolidation in the seafood industry, oil and gas exploitation, and bioprospecting, with just a small handful of multinational companies dominating each sector. The team suggests that banks and other investors could adopt more stringent sustainability criteria for ocean investments.

Credit: 
Stockholm Resilience Centre

The highways of our brain

Researchers from the Netherlands Institute for Neuroscience (NIN) used a new technique to show how electrical impulses travel at high speed through the brain. It appears that myelin, the sheath around nerve fibers, creates a coaxial cable producing multiple waves of electrical potentials that travel in a more complicated manner than was envisioned earlier. These findings allow us to create better theories and tools to understand demyelinating diseases, including the most common neurological disorder, multiple sclerosis. The paper has been published in the prestigious scientific journal Cell.

The brain consists of around one hundred billion neurons. All these neurons have to communicate with each other. This happens by means of exchanging electrical impulses traveling at velocities of up to 360 km/h. "We know this requires the presence of myelin sheaths, consisting of multiple layers of fatty material wrapped around the nerve cell extensions. Myelin is often conceptualized as being an insulator that leads to the 'jumping' of electrical potentials along the cables that we could see as the 'highways of our brain', but the mechanisms of jumping were not understood. However, this research opens new avenues to understand the hardware of the brain in terms of how it computes with rapid signal transfer," says professor Maarten Kole.

12 nanometers

Together with researchers of the Max-Planck Institute (MPI) of Experimental Medicine (Göttingen, Germany), the researchers used electron microscopy to measure the distance between the nerve cell membrane and the insulating sheath, which turned out to be 12 nanometers, approximately 10,000 times thinner than a hair. Furthermore, the scientists of the NIN used a new technique to make electricity visible and took advantage of a supercomputer to calculate the specific properties of myelin sheaths. "All the findings together showed that instead of being an insulating sheath, myelin creates an additional layer like coaxial cables producing multiple waves of electrical potentials travelling in a more complicated manner than was envisioned earlier", Kole explains. These findings open new avenues to understand the hardware of how brains are computing with rapid signal transfer.

Multiple Sclerosis

This research also will help to better understand demyelinating diseases such as multiple sclerosis (MS). In patients with MS, myelin sheaths are broken down. This leads to an increasing degree of limitations that affect strength, balance and coordination, and thus the patient's mobility. In order to be able to cure and prevent MS, it is important to know the exact way the myelin sheath functions in order to predict what happens if it doesn't function as it should. "Our work now may provide reliable predictions of how impulses travel along the highways without myelin. This finding contributes to the understanding of the cellular changes occurring in MS," says Kole.

Credit: 
Netherlands Institute for Neuroscience - KNAW

Horror movies manipulate brain activity expertly to enhance excitement

image: Top ten scariest movies of the past century

Image: Lauri Nummenmaa

A Finnish research team has mapped neural activity in response to watching horror movies. The study, conducted at the University of Turku, identifies the top horror movies of the past 100 years and shows how they manipulate brain activity.

Humans are fascinated by what scares us, be it sky-diving, roller-coasters, or true-crime documentaries - provided these threats are kept at a safe distance. Horror movies are no different.

Whilst all movies have our heroes face some kind of threat to their safety or happiness, horror movies up the ante by having some kind of superhuman or supernatural threat that cannot be reasoned with or fought easily.

The research team at the University of Turku, Finland, studied why we are drawn to such things as entertainment. The researchers first established the 100 best and scariest horror movies of the past century (Table 1) and examined how they made people feel.

Unseen Threats Are Most Scary

Firstly, 72% of people reported watching at least one horror movie every six months, and the primary reason for doing so, besides the feelings of fear and anxiety themselves, was excitement. Watching horror movies was also an excuse to socialise, with many people preferring to watch horror movies with others rather than on their own.

People found horror that was psychological in nature and based on real events the scariest, and were far more scared by things that were unseen or implied rather than what they could actually see.

- This latter distinction reflects two types of fear that people experience: the creeping foreboding dread that occurs when one feels that something isn't quite right, and the instinctive response we have to the sudden appearance of a monster that makes us jump out of our skin, says principal investigator, Professor Lauri Nummenmaa from Turku PET Centre.

MRI Reveals How Brain Reacts to Different Forms of Fear

Researchers wanted to know how the brain copes with fear in response to this complicated and ever-changing environment. The group had people watch a horror movie whilst measuring neural activity in a magnetic resonance imaging scanner.

During those times when anxiety is slowly increasing, regions of the brain involved in visual and auditory perception become more active, as the need to attend to cues of threat in the environment becomes more important. After a sudden shock, brain activity is more evident in regions involved in emotion processing, threat evaluation, and decision making, enabling a rapid response.

However, these regions are in continuous talk-back with sensory regions throughout the movie, as if the sensory regions were preparing response networks as a scary event was becoming increasingly likely.

-Therefore, our brains are continuously anticipating and preparing us for action in response to threat, and horror movies exploit this expertly to enhance our excitement, explains Researcher Matthew Hudson.

Credit: 
University of Turku

Deciphering the sugar code

image: Researchers discover vaccine to strengthen the immune system of plants.

Image: Sruthi Sreekumar

Like animals and humans, plants possess a kind of immune system. It can, for example, recognize pathogenic fungi by the chitin in their cell walls, triggering disease resistance. Some fungi hide from the immune system by modifying some of the chitin building blocks, converting chitin into chitosan. Researchers at the University of Münster have now found that plants can react to a certain pattern in this chitosan, stimulating their immune system. They are already developing a chitosan-based plant immune stimulant to reduce the use of chemical pesticides in agriculture. Their results are published in JACS (Journal of the American Chemical Society).

Background

Chitosans, a class of polysaccharides, are among the most versatile and promising functional biopolymers. Chitosans can make plants resistant to diseases, promote their growth, and protect them from heat or drought stress. Under chitosan dressings, even large wounds can heal without scars, chitosan nanoparticles can transport drugs across the blood/brain barrier, and chitosans can replace antibiotics in animal fattening as antimicrobial and immunostimulating feed additives. But of course, chitosans are not miracle cures either. "There are many different chitosans, and for each individual application exactly the right one must be found to make it work. Until now, we knew far too little about their effects and how they can be used effectively. With our research, we have now come a step closer to this understanding," explains Prof Bruno Moerschbacher from the Institute for Biology and Biotechnologies of Plants at Münster University.

Chitosans consist of chains of different lengths of a simple sugar called glucosamine. Some of these sugar molecules carry an acetic acid molecule, others do not. Chitosans therefore differ in three factors: the chain length and the number and distribution of acetic acid residues along the sugar chain. For about twenty years, chemists have been able to produce chitosans of different chain lengths and with different amounts of acetic acid residues, and biologists have then investigated their biological activities. Thus, an understanding slowly developed of how these two factors influence the antimicrobial or plant-strengthening effect of chitosans. Such well-characterized chitosans, now called second-generation chitosans, are currently used as the basis for new chitosan-based products such as the plant biostimulant "Kitostim" which was developed based on the research results of the Münster team. It promotes growth and development of plants, and it strengthens them against disease and heat stress.

Bruno Moerschbacher suspected early on that the third structural factor, the distribution of acetic acid residues along the sugar chain, also plays a decisive role in determining biological activities. However, this hypothesis could not be tested for a long time because the acetic acid residues are randomly distributed in all chemically produced chitosans. As biochemists and biotechnologists, the members of his team have therefore used enzymes for the production of chitosans, i.e. the natural 'tools' involved in the biosynthesis of chitosan in chitosan-containing fungi. With their help, they have now succeeded in producing short chitosan chains, so-called oligomers, with a defined arrangement of acetic acid molecules, and tested their bioactivity.

For this test, the researchers used rice cells that they treated with chitosan oligomers to stimulate their immune system. When they used chitosan oligomers consisting of four sugar units (so-called tetramers) carrying only a single acetic acid residue, they found that the tetramer with the acetic acid residue at the first ('left-most') sugar unit (the so-called non-reducing end) had a strong immunostimulating effect, while the other three tetramers were less active or inactive. Thus, very clear differences in bioactivity were found between chitosans with the same chain length (four) and the same number of acetic acid residues (one) when they differed in the position of the acetic acid residue. The researchers led by Bruno Moerschbacher are currently testing the use of this tetramer as a kind of vaccine that stimulates the plants' natural immune system.

Outlook

Such a clear dependence of the bioactivity of a complex sugar on its molecular structure has almost never been observed before. The first and to date only example was human heparin, whose anticoagulant effect is based on a certain distribution of sulphuric acid residues along the sugar chain. It is now known that heparin achieves this effect by binding a coagulation factor to this specific binding site, thus inactivating it. On the basis of this knowledge, it has been possible to develop anticoagulants with precisely dosed effects and without side effects, which are a blessing for dialysis patients, among others. "It is now our hope that the precisely defined chitosans can be used in a similar way to enable, for example, scar-free wound healing under chitosan dressings," said Bruno Moerschbacher, whose research group is already collaborating with dermatologists and other biomedical experts.

Credit: 
University of Münster

Registry data -- of sufficient quality -- suitable for extended benefit assessment of drugs

Particularly in the case of accelerated drug approvals and drugs for rare diseases (orphan drugs), the evidence available at the time of market access is often insufficient for the early benefit assessment of drugs. Often, the studies are too short or no data on patient-relevant outcomes were collected. Comparisons with the German standard of care are also often lacking. In order to close such evidence gaps, in future, routine practice data are also to be included in early benefit assessments of drugs.

But how must the data be collected and processed so that they can be used by the Federal Joint Committee (G-BA) for benefit assessments in Germany? In order to answer this question, the G-BA commissioned the Institute for Quality and Efficiency in Health Care (IQWiG) to develop scientific concepts for the generation of routine practice data and their analysis for benefit assessments of drugs - especially with regard to the option of quantifying the added benefit of a new drug. According to the "Gesetz für mehr Sicherheit in der Arzneimittelversorgung" (GSAV, Law for More Safety in the Supply of Medicines), the G-BA may in future commission the collection of routine practice data on selected drugs to support the quantification of added benefit.

Summarizing the most important result of the IQWiG analysis, Jürgen Windeler, IQWiG's Director, notes: "Extensive analyses of the methodological literature and intensive discussions with registry operators and external statisticians have led us to the conclusion that, in the case of high-quality patient registries, it is possible to base studies on these registries and use the routine practice data collected for extended benefit assessments of drugs."

Such registry studies can be conducted either with or without randomization, but the high quality of the data is the decisive factor in both cases.

To support both individual registries and the German registry landscape as a whole in the collection of routine practice data, IQWiG drew on current national and international recommendations to compile criteria for data quality, and for ensuring data quality, in routine practice data collections for benefit assessments of drugs; it condensed these criteria to the essentials and organized them clearly and concisely. In addition, the rapid report provides registry operators, sponsors of registry studies, and health policy decision-makers with specific recommendations for action on how the collection of routine practice data in registries can be made usable for benefit assessments of drugs.

Focus on collection of routine practice data in registries

Routine practice data are data collected within the context of usual health care in patient populations that can receive the drug under assessment in the approved therapeutic indication. The data can be collected in studies with or without randomization.

In their rapid report, the IQWiG authors describe that the use of routine practice data for benefit assessments of drugs mandatorily requires a comparison between the new drug and the comparator therapy specified by the G-BA, which makes it necessary to conduct comparative studies. In general, four data collection tools are available for comparative studies: study-specific data collection as well as data collection from registries, electronic patient records, and claims data of health insurance funds.

The IQWiG authors are convinced that the collection and processing of routine practice data from electronic patient records and claims data from health insurance funds is currently not possible with regard to benefit assessments of drugs and will not be possible in the near future. This is mainly because the data quality in these sources is insufficient and important data are not collected. These problems cannot be solved in the short or medium term. In contrast, the assessment of disease-related patient registries yielded positive results.

Data quality of registries has improved

As the IQWiG authors note, of the data collection tools not primarily geared towards comparative studies, registries are most likely to offer the option of adapting the data collection requirements for these studies. This concerns both the specification of the necessary data and the data quality.

The authors also note that the question as to whether existing patient registries are currently suitable for the collection of routine practice data according to §35a Social Code Book (SGB V) cannot be answered in a general way. This depends on the respective registry and, above all, on the specific research questions posed. In the discussions with selected registry operators, however, it also became apparent that from a technical and organizational point of view, the registries are generally prepared to implement any necessary extensions of the data set.

Thomas Kaiser, Head of IQWiG's Drug Assessment Department, explains: "In recent years, the objectives and scope of documentation of registries have been extended. In particular, the increasing documentation of clinical information in registries that can be used to describe patient populations, interventions and outcomes for benefit assessments is an important step forward. For certain research questions, data on patient-reported outcomes should also be included in registries. This is already the case in some registries."

Benefit assessments always require fair comparisons

As emphasized by the IQWiG authors, if routine practice data are to be used in benefit assessments, it must be taken into account that the basis of any conclusion on the effects of interventions is a comparison. This is because only on the basis of a comparison is it possible to distinguish between "after intervention A" and "due to intervention A"; this distinction is necessary for a causal conclusion. A comparison is only meaningful if the starting conditions are fair (similarity of the groups in terms of prognostic factors). Ideally, this is achieved through randomization, i.e. the random allocation of study participants to the two study arms.

When studies are conducted without randomization, adjustment for interfering factors (confounders) is an essential part of the assessment. For this purpose, the relevant confounders - such as the severity of a concomitant disease or a genetic mutation - must be determined and documented in the data collection. The completeness and accuracy of the data on confounders are just as important as those of the other data. Depending on the research question and the data already available, it may therefore be less resource-intensive to conduct a study with randomization.

As the IQWiG authors note, in order to be able to use routine practice comparative studies for benefit assessments, it should already be ensured in the study planning phase that the study process and the data collected are of the necessary quality to produce interpretable results.

They therefore compiled a clear list of criteria to ensure that only data of sufficient quality are used. This list is divided into four categories: mandatory criteria for ensuring data quality; general criteria that are always relevant for registry studies used in benefit assessments of drugs; general criteria that, depending on the research question, are relevant for registry studies used in benefit assessments of drugs; and criteria whose degree of fulfilment is to be assessed in relation to the research question.

Thomas Kaiser notes: "In the context of the suitability testing of a specific registry, this list should be used to evaluate for the respective research question whether all necessary data have been collected or whether possible deficits can be corrected with reasonable effort in a registry-based study."

Without randomization, no more than a hint of an effect is conceivable

The smaller the expected differences in treatment effects, the more important a fair comparison becomes, that is, similarity of the groups with regard to the prognostic factors described above. From this, the IQWiG authors conclude that a conclusion on the benefit or harm of an intervention drawn from the effects observed in comparative studies without randomization is only meaningful if a certain effect size is exceeded. Otherwise, it cannot be excluded that the observed effect was caused not by the intervention but by confounders. Since, even in a good study, it cannot be excluded without randomization that unknown confounders influence the results, it is generally not possible to derive more than a hint of an effect from comparative studies without randomization.

According to IQWiG's analysis, whether it is possible to consider retrospective study designs depends on whether the available data sources contain the necessary data in the required quality. Thus, comparisons of patient populations receiving a new drug with patient populations comprising historical controls only appear realistic if the same data source is used for both (e.g. a disease-specific clinical registry).

Registry-based randomized trials as an option

In general, comparative studies with randomization always have a higher informative value than those without randomization. They remain the gold standard because quantification of the added benefit is more reliable. The IQWiG authors emphasize that, particularly after drug approval, routine practice comparative trials with randomization can - depending on the existing research question - also be conducted with a limited collection of data in "large simple trials". Conducting studies in registries has an additional potential to accelerate the studies and make them less complex and resource-intensive (registry-based comparative studies with randomization).

Jürgen Windeler, IQWiG's Director, concludes: "The generation of routine practice data and their analysis is potentially feasible in the near future - but for the time being, in addition to study-specific data collection, only via data collection from registries. We have documented which data must be available in the registries and in what quality. The registry operators were very open-minded in their discussions with us, so I expect that the first data from high-quality registries will soon be available for use in benefit assessments of drugs." In this context, Windeler also calls on politicians to act: "The conditions for high-quality registries could be better. This concerns both funding and the fact that there are different requirements for data protection in different German federal states."

Credit: 
Institute for Quality and Efficiency in Health Care

New research shows more people knowingly use fentanyl

Fentanyl use by people who use drugs has doubled since 2015, and two-thirds of people are aware they've taken it, finds new research out of British Columbia, the Canadian province that has experienced the highest number of illicit drug toxicity deaths as a result of the opioid crisis.

The findings point to the importance of taking comprehensive measures to reduce the risk for people who take the toxic opioid knowingly or unknowingly. The study, by the BC Centre for Disease Control (BCCDC) and the University of British Columbia, is based on 2018 survey data collected from people who visit harm reduction sites for supplies such as new needles and syringes or safer smoking equipment. The study, published this week in the International Journal of Drug Policy, provides valuable insights into fentanyl use that will inform efforts to reduce overdoses and deaths in B.C. and beyond.

"This research shows the majority of people who use fentanyl know they're doing so," says Dr. Jane Buxton, BCCDC epidemiologist, harm reduction lead and professor in the UBC School of Population and Public Health. "Making people who use drugs aware of the presence of fentanyl in the drug supply isn't enough; we need harm reduction services, substance use treatment, overdose prevention resources, and pharmaceutical alternatives to the toxic drug supply to reduce the devastating impact of fentanyl and its analogues on our communities."

Fentanyl is a synthetic opioid 50 to 100 times more toxic than morphine. According to preliminary data from the BC Coroners Service, fentanyl or its analogues such as carfentanil were found in 85 per cent of fatal overdose cases in 2019.

When fentanyl first appeared in the drug supply, many people took it unknowingly in the drugs they sought, such as heroin, counterfeit opioid tablets or other substances. At the time, B.C. saw a huge spike in deaths from the contaminated drug supply and instituted measures to prevent them, including distribution of kits containing the overdose-reversing drug naloxone, increased access to substitution treatment for opioid use disorder, and expansion of overdose prevention services and supervised consumption sites. Scientists and health care workers wanted to know whether people were still unaware they were taking fentanyl, so they repeated a study done in 2015, shortly after fentanyl entered the drug supply.

The study drew on data collected from 303 participants recruited from 27 harm reduction sites across B.C. The participants completed a brief survey on their drug use and provided a urine sample that researchers tested for fentanyl and other substances.

Sixty per cent of participants in 2018 had fentanyl detected in their urine; of these, 64 per cent knew they had taken fentanyl. The study done in 2015 found that 29 per cent of participants tested positive for fentanyl, with only 27 per cent aware that they'd used it.

Researchers do not fully understand the factors that contribute to people knowingly taking fentanyl, but the reasons are varied. Some people may use fentanyl because they are aware it is present in most of the illicit supply of opioids and therefore have no other choice, while some may prefer the experience of taking fentanyl regardless of other options.

"This research lays groundwork that will help us learn more about why fentanyl use is increasing," says Mohammad Karamouzian, PhD student at UBC's School of Population and Public Health and a lead author of the study. "These findings will also contribute to more effective messaging campaigns and harm reduction strategies to help reduce preventable deaths and support the health of people who use substances, their families, and their communities."

Quick facts

Other key findings from the study include:

Recent fentanyl use was more common in people living in urban settings.

People who used fentanyl were more likely to have also recently used heroin/morphine or crystal meth.

Self-reported cannabis use was associated with reduced fentanyl use.

The data for this study were collected at harm reduction sites that provide a range of services to people who use drugs including, but not limited to: condom distribution, needle and syringe distribution and take-home naloxone kits. There are approximately 375 of these facilities across B.C.

A study led by BCCDC in 2019 showed the rapid expansion of harm reduction services (e.g., take home naloxone, opioid agonist therapy and overdose prevention sites) in response to B.C.'s overdose crisis averted more than 3,000 overdose deaths during a 20-month period in 2016-2017.

Credit: 
University of British Columbia

Advancing frozen food safety: UGA evaluates environmental monitoring programs

Arlington, Va. - New research funded by the Frozen Food Foundation evaluates current environmental monitoring practices being implemented across the frozen food industry to prevent and control Listeria monocytogenes (Lm). The findings were published in the January 24, 2020, Journal of Food Protection®.

The University of Georgia (UGA) study used an anonymous survey tool to understand existing environmental monitoring programs across a variety of frozen food manufacturing facilities. Information from more than 45 frozen food facilities was collected and, while monitoring practices varied across the industry, the data indicated that facilities were predominantly testing for Listeria spp. in the environment (i.e., walls, floors, and drains).

"This study is part of the frozen food industry's commitment to better understand Listeria in frozen food facilities by reviewing current practices," said Frozen Food Foundation Executive Vice President Dr. Donna Garren. "This research helps implement the frozen food industry's science-based environmental monitoring programs to identify and reduce the risk of Lm that are available at AFFIFoodSafety.org."

The lead researcher, Dr. Mark Harrison, stated: "There is a need for facilities to review their sampling strategy, including the frequency and timing of sampling." He added, "Facilities should focus on looking for Lm at times and in places where they are most likely to find the pathogen for a realistic assessment."

"Lm is a challenge because of its ubiquity and ability to survive freezing," said Dr. Garren. "UGA's research will help the frozen food industry identify appropriate sampling locations, timing and frequency to address the potential risk of this pathogen."

This research will continue throughout 2020 as UGA analyzes quantitative environmental monitoring data aggregated from frozen food companies to serve as an important baseline for future assessments.

Credit: 
American Frozen Food Institute

30-year study identifies need of disease-modifying therapies for maple syrup urine disease

STRASBURG, PA - A new study analyzes 30 years of patient data and details the clinical course of 184 individuals with genetically diverse forms of maple syrup urine disease (MSUD), which is among the most volatile and dangerous inherited metabolic disorders. Researchers collected data on survival, hospitalization rates, metabolic crises, liver transplantation, and cognitive outcome. This represents the largest systematic study of MSUD, with regard to both cohort size and the duration of clinical follow-up. The study was a broad collaborative effort led by clinicians and researchers at the Clinic for Special Children (CSC) and will appear in Molecular Genetics and Metabolism.

Before the CSC's inception, one in three children born with MSUD died from neurological complications of the disease before 10 years of age, and the majority of survivors were permanently disabled. Three decades of innovation and clinical care by the CSC team have increased survival from 63% to 95% while hospitalization rates have decreased from 7 to just 0.25 hospital days per patient per year. Specific advances in management include new prescription formulas for children and adults as well as elective liver transplantation, a collaboration with the Hillman Center for Pediatric Transplantation (UPMC Children's Hospital of Pittsburgh) that has been 100% successful for 93 individuals transplanted since 2003.

Treatment of MSUD requires close monitoring of blood amino acid levels. A total of 13,589 amino acid profiles were generated by CSC's on-site clinical laboratory and the data were analyzed to determine the overall effectiveness of treatment. The authors conclude that although stringent dietary therapy maintains blood amino acid concentrations within acceptable limits, it is challenging to implement, especially for individuals older than 10 years of age, and does not fully prevent the cognitive and psychiatric disabilities caused by MSUD.

Eighty-two MSUD patients underwent IQ testing; younger patients tended to score higher, reflecting improvements in care over time. On average, MSUD patients scored 23% lower on IQ testing than their unaffected siblings, and, compared to the general population, the prevalence of affective illness (depression, anxiety, and panic disorder) was much higher among both MSUD patients and their unaffected siblings. Based on these observations, the authors conclude that despite advances in clinical care, MSUD remains a morbid and potentially fatal disorder, and there remains a critical unmet need for safer and more effective disease-modifying interventions, including gene replacement or editing therapies.

Credit: 
Clinic for Special Children

NIH study finds benefits of fetal surgery for spina bifida persist through school age

Children as young as 6 years old who underwent fetal surgery to repair a common birth defect of the spine are more likely to walk independently and have fewer follow-up surgeries, compared to those who had traditional corrective surgery after birth, according to researchers funded by the National Institutes of Health. Their study appears in Pediatrics.

The procedure corrects myelomeningocele, the most serious form of spina bifida, a condition in which the spinal column fails to close around the spinal cord. With myelomeningocele, the spinal cord protrudes through an opening in the spine and may block the flow of spinal fluid and pull the brain into the base of the skull, a condition known as hindbrain herniation.

In 2011, the Management of Myelomeningocele study, funded by NIH's Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), found that by 12 months of age, children who had fetal surgery required fewer surgical procedures to divert, or shunt, fluid away from the brain. By 30 months, the fetal surgery group was more likely to walk without crutches or other devices.

For the current study, NICHD-funded researchers re-evaluated children from the original trial when they were 6 to 10 years old. Of the 161 children who took part in the follow-up study, 79 had been assigned to prenatal surgery and 82 had been assigned to traditional surgery. Children in the prenatal surgery group walked independently more often than those in the traditional surgery group (93% vs. 80%). Those in the prenatal surgery group also had fewer shunt placements for hydrocephalus, or fluid buildup in the brain (49% vs. 85%), and fewer shunt replacements (47% vs. 70%). The group also scored higher on a measure of motor skills.

The two groups did not differ significantly in a test measuring communication ability, daily living skills, and social interaction skills.

"Prenatal surgery for myelomeningocele carries benefits and risks, compared to traditional postnatal surgery," said Menachem Miodovnik, M.D., of the NICHD Pregnancy and Perinatology Branch. "This study provides important information for physicians with patients who are considering prenatal surgery."

Credit: 
NIH/Eunice Kennedy Shriver National Institute of Child Health and Human Development

Lung microbiome may help predict outcomes in critically ill patients

image: Changes in the lung microbiome may help predict how well critically ill patients will respond to care, according to new research published online in the American Thoracic Society's American Journal of Respiratory and Critical Care Medicine.

Image: 
Michigan Medicine

Jan. 24, 2020--Changes in the lung microbiome may help predict how well critically ill patients will respond to care, according to new research published online in the American Thoracic Society's American Journal of Respiratory and Critical Care Medicine.

Specifically, according to the authors of "Lung Microbiota Predict Clinical Outcomes in Critically Ill Patients," patients with higher levels of lung bacteria one day after admission to the ICU had fewer ventilator-free days, a strong effect that was not explained by severity of critical illness or the presence of pneumonia.

The identity of lung microbiota - which bacteria were detected - was also predictive of ICU outcomes in these patients. Two bacteria normally found in the gut -- Lachnospiraceae and Enterobacteriaceae spp -- were common in the lung microbiome of patients who had worse ICU outcomes.

The presence of Enterobacteriaceae spp in the lung microbiome was also associated with acute respiratory distress syndrome, or ARDS, a life-threatening illness in which the lungs are severely inflamed.

Prior studies by this research team found that the immune function of patients with ARDS is highly variable and that the translocation of gut bacteria to the lungs may play a role in the development of ARDS. In another earlier study, the researchers showed that the lung microbiome in patients with idiopathic pulmonary fibrosis, or IPF, is also predictive of clinical outcomes.

The human microbiome comprises the genetic material of an estimated 100 trillion microbes. Bacteria are the biggest component of the microbiome, but it also includes viruses, fungi and archaea. Unlike the human genome, which is relatively static, the microbiome is altered, sometimes dramatically, by diet, disease and other factors. While the lungs have historically been considered sterile, in the past decade investigators have used DNA-based methods to reveal that the lungs contain diverse and dynamic communities of bacteria.

"We already knew that lung microbiota are altered in critically ill patients, and that this disruption is associated with altered lung immunity," said lead author Robert Dickson, MD, assistant professor of pulmonary and critical care medicine and microbiology and immunology at the University of Michigan. "What the current study tells us is that this disruption of lung microbiota is clinically meaningful. In otherwise similar patients, differences in lung bacteria help explain who recovers and who doesn't."

In their study of 91 critically ill patients, the researchers controlled for disease severity and for whether the patient had pneumonia, which would increase the number of bacteria in the lung microbiome. After taking these factors into account, the associations between ventilator-free days and the bacteria level and detection of gut-associated bacteria in the lung microbiome persisted.

The investigators are encouraged that the lung microbiome may represent a novel target for preventing and treating critical illness.

"The microbiome is something we can potentially manipulate, unlike other risk factors in the ICU," said senior author Lieuwe Bos, MD, PhD, a researcher in pulmonology and critical care and pulmonologist in training at Amsterdam University Medical Center. "We can't change our patients' genes or their chronic diseases, but we can potentially change their bodies' microbiota."

Study limitations include the fact that researchers could not control for medications, including antibiotics, that the patients may have taken before being admitted to the ICU. The researchers could not determine if the gut-associated bacteria found in some patients' lung microbiome had migrated from the lower gastrointestinal tract or whether they were found in the lungs because of aspiration, the accidental inhalation of food or liquid.

The investigators say the next step for the field will be determining whether modifying these lung bacteria influences patients' outcomes. They say this task will require both prospective human studies and animal models of critical illness.

"Predicting ICU outcomes is important, but what we really want is a target for therapy," Dr. Dickson said. "We need to figure out if the lung microbiome is something we can modify, either to prevent lung injury or to help it resolve faster."

Dr. Bos added that a "take-home" message of this study and the researchers' previous study on immune function in ARDS patients is that ARDS is a heterogeneous disease.

"The lungs of ARDS patients are not all alike," he said. "Knowing that immune function and the microbiome differ among these patients may not only help us predict our patients' outcomes but to change them for the better."

"This study adds to growing evidence that the lung microbiome plays a key role in lung disease," said James Kiley, PhD, director of the Division of Lung Diseases at the National Heart, Lung, and Blood Institute, part of the National Institutes of Health. "It's important that we continue to explore the microbiome and other factors that contribute to lung disease and clinical outcomes."

Credit: 
American Thoracic Society