
New potential therapy for fatty liver disease

In fatty liver disease, a person’s fat goes to the liver instead of to fat tissue, either because fat depots are absent, as in the rare genetic disease lipodystrophy, or because the depots are too full, as in people with obesity.

One third of these people will go on to develop nonalcoholic steatohepatitis, or NASH, an advanced form of fatty liver disease marked by progressive inflammation and scarring in the organ.

In 2002, Michigan Medicine endocrinologist Elif Oral, M.D., who had just moved from the National Institutes of Health, published her discovery that patients with severe lipodystrophy lack leptin, a hormone that helps curb appetite and control weight gain. When these patients were given supplemental leptin, serious metabolic abnormalities such as NASH improved substantially.

At U-M, Oral set out to further study the role of leptin, now in more common forms of NASH. Almost two decades later, her research team found that patients with NASH and relatively low leptin levels, whether from leptin deficiency or from partial lipodystrophy, can mobilize the extra fat out of their liver and help reverse their condition by undergoing leptin therapy.

“Familial partial lipodystrophy often accompanies NASH. It’s a rare, genetic condition in which patients lack fat in their extremities but retain fat in their upper body,” Oral explained. “I wanted to test the effect of leptin both in those with this rare condition and in those who present only with NASH, to see whether there would be a difference in therapeutic outcomes.”

This work, the first of its kind in humans, compiles research from three different studies. It shows that leptin is an important signal in regulating fat deposition in the liver, and that leptin therapy can reverse that deposition and the NASH that follows.

During the lifetime of the study, the manufacturer of leptin changed several times, posing substantial bureaucratic obstacles for the research team to overcome.

“I could’ve moved on to something easier to study, but I wanted to see this through. This is my life’s work,” Oral said. “I’m grateful for all of my collaborators and co-authors for sticking with me through it all.”

As outlined in the Cell Press journal Med, Oral and her team conducted two open-label trials studying nine male patients with NASH and relatively low leptin levels (less than 9 ng/ml) and 23 patients with both partial lipodystrophy and NASH. Both groups received leptin therapy in the form of metreleptin for one year.

The trials enrolled male patients because Oral found that 35-40% of the men who had their leptin levels measured had levels below the twenty-fifth percentile for their body weight, making them ideal study candidates.

“Not all NASH is created equal. There’s a vast distribution of leptin levels in this patient population,” Oral said. “High levels of leptin, seen in obesity, can actually be causative of NASH so it was important to carefully select trial participants for low levels.”

Blinded, paired liver biopsies showed that both groups had reduced liver fat and lower NASH scores after 12 months of leptin therapy. The patients also showed improvements in insulin sensitivity and body weight.

The findings apply only to leptin, but Oral thinks other molecules or treatments that activate leptin signaling in the body could be a focus of future studies aimed at widening the therapeutic window for these patients.

After obesity is established, there is little to gain by giving someone leptin. However, a patient in the early overweight state may benefit from leptin therapy, which has inspired the research team to study leptin as a preventive weight-control option in those at risk of crossing the obesity threshold and developing more fat in the liver.

“Although these results were encouraging, they justify a larger trial,” Oral said. “But there are no approved treatments for NASH of any form, so to have a therapeutic that can help at least a fraction of these patients is exciting.”

Credit: 
Michigan Medicine - University of Michigan

Feedback on cafeteria purchases helps employees make healthier food choices

BOSTON - Automated emails and letters that provide personalized feedback related to cafeteria purchases at work may help employees make healthier food choices. That's the conclusion of a new study that was led by investigators at Massachusetts General Hospital (MGH) and is published in JAMA Network Open.

As many adults spend half (and sometimes more) of their waking hours working, the workplace provides a unique opportunity to promote health with programs that target obesity, unhealthy diets, and other risk factors for chronic diseases and premature death.

Building on findings from previous studies, researchers designed the ChooseWell 365 clinical trial to test a 12-month automated, personalized behavioral intervention to prevent weight gain and improve diet in hospital employees. For the trial, 602 MGH employees who regularly used the hospital's cafeterias were randomized to an intervention group or a control group. For one year, participants in the intervention group received two emails per week that included feedback on their previous cafeteria purchases and offered personalized health and lifestyle tips. They also received one letter per month with comparisons of their purchases with those of their peers, as well as financial incentives for healthier purchases. Control participants received one letter per month with general healthy lifestyle information.
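To make the intervention mechanics concrete, here is a minimal sketch of how automated purchase feedback might be generated. This is not the study's actual software; the traffic-light labels, field names, and message wording are assumptions made for illustration.

```python
from collections import Counter

# Hypothetical weekly purchase log: (item, label) pairs. The green/
# yellow/red "traffic light" labels are an assumed stand-in for
# whatever healthfulness scoring the study's system used.
purchases = [("salad", "green"), ("soda", "red"), ("grilled chicken", "green"),
             ("chips", "red"), ("fruit cup", "green"), ("yogurt", "yellow")]

def feedback_email(purchases):
    """Compose a short personalized feedback message from one week of purchases."""
    counts = Counter(label for _, label in purchases)
    green_pct = 100 * counts["green"] / len(purchases)
    message = [f"This week, {green_pct:.0f}% of your cafeteria purchases were green (healthy) items."]
    if counts["red"]:
        message.append(f"Swapping one of your {counts['red']} red items for a "
                       "green item next week is an easy way to improve.")
    return "\n".join(message)

print(feedback_email(purchases))
```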

"This novel workplace strategy was completely automated and did not require that people take time away from work to participate, making it ideal for busy hospital employees," explains lead author Anne N. Thorndike, MD, MPH, an investigator in the Division of General Internal Medicine at MGH and an associate professor of Medicine at Harvard Medical School.

Participants in the intervention group increased their healthy cafeteria food purchases to a greater extent than participants in the control group. They also purchased fewer calories per day. These differences were observed during the one-year intervention as well as during a year of additional assessments. There were no differences between the groups in terms of weight change at 12 or 24 months, however.

"Few if any prior workplace studies have been able to make sustained changes in dietary choices of employees," says Thorndike. "This study provides evidence that food purchasing data can be leveraged for delivering health promotion interventions at scale."

Credit: 
Massachusetts General Hospital

RUDN University chemists created anti-hantavirus compounds 5 times more effective than existing drugs

image: RUDN University chemists and colleagues obtained a new class of terpene-based compounds that inhibit replication of the deadly Hantaan virus; in preliminary tests the substances were 5 times more effective than existing antiviral drugs.

Image: 
RUDN University

RUDN University chemists and their colleagues from Novosibirsk State University, the Novosibirsk Institute of Organic Chemistry and the State Research Center of Virology and Biotechnology VECTOR have obtained a new class of compounds that inhibit the replication of the deadly Hantaan virus, which affects blood vessels and internal organs in humans. The resulting substances were 5 times more effective than existing antiviral drugs. The results have been published in Bioorganic & Medicinal Chemistry Letters.

The Hantaan virus causes acute haemorrhagic fever with renal syndrome (HFRS). The disease is common in the Asian part of Russia, China, Korea, Finland, Sweden, and the countries of eastern and central Europe. The main reservoir and carrier of the virus is the striped field mouse. Humans can be infected through the skin or mucous membranes. The virus accumulates and replicates in the blood vessels, causing inflammation, and affects the internal organs, primarily the kidneys. The mortality rate varies by region from 1% to 10-15%. There are no standard treatment regimens for HFRS -- treatment is symptomatic. The efforts of many scientific groups are therefore focused on developing a cure, including synthesizing new antiviral drugs. RUDN chemists and their colleagues from Novosibirsk synthesized a new class of compounds based on readily available natural substances (terpenes), which in preliminary experiments suppressed the reproduction of the virus in cells 5 times more effectively than existing drugs.

"The study of antiviral drugs based on terpenes (hydrocarbons, which are found in large quantities in many plants and their essential oils), is very promising. We previously discovered a class of novel terpenoids active against the influenza virus, specifically camphor-based hydrazones. Now our goal is to find new drugs based on natural terpenes with specific activity to viruses causing HFRS", Fedor Zubkov, PhD, Associate Professor at RUDN department of Organic chemistry.

In previous studies, the chemists had synthesized N-acylhydrazone derivatives of camphor and fenchone that were active against smallpox and influenza viruses. One of these substances served as the starting point for this work. The new compounds were obtained from natural camphor and fenchone, which can be extracted from the essential oils and resin of conifer trees. Using these substances, the chemists obtained terpene compounds and combined them with a heterocyclic fragment. As a result, a broad library of structurally diverse compounds was obtained - with or without a double bond, with additional functional groups in the heterocyclic core, and so on. The composition of the resulting compounds was studied using 2D NMR spectroscopy.

The chemists tested the biological activity of the compounds on a pseudovirus system -- a biologically safe virus with the same glycoproteins on its surface as the hantavirus. Such a model allows scientists to evaluate bioactivity faster and more safely than using the real virus. The chemists compared the results with the effects of the broad-spectrum antiviral drugs Ribavirin and Triazavirin. Twelve of the compounds demonstrated antiviral activity; one of them was 5 times more effective than Ribavirin and Triazavirin. The chemists concluded that the key structural feature necessary for the compounds' effective action is the presence of an isoindole fragment attached to the terpene fragment.
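The release does not define "5 times more effective," but antiviral potency comparisons of this kind are commonly made by comparing half-maximal inhibitory concentrations (IC50). A minimal sketch with invented placeholder numbers, not data from the study:

```python
# Hypothetical IC50 values (the concentration needed to block half of
# viral replication); lower means more potent. These numbers are
# placeholders for illustration, not measurements from the paper.
ic50 = {"ribavirin": 100.0, "triazavirin": 95.0, "lead_compound": 20.0}  # µM

fold_vs_ribavirin = ic50["ribavirin"] / ic50["lead_compound"]
print(f"Lead compound is {fold_vs_ribavirin:.0f}x more potent than ribavirin")
```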

"This class of compounds does not prevent the virus from entering the cell, but it inhibits its intracellular replication. Therefore, we can conclude that the therapeutic target of the obtained terpene complexes is the Hantaan virus protein responsible for replication", Alexandra Antonova, student at RUDN Department of Organic Chemistry.

Credit: 
RUDN University

Space travel weakens our immune systems: Now scientists may know why

Microgravity in space perturbs human physiology and is detrimental for astronaut health, a fact first realized during early Apollo missions when astronauts experienced inner ear disturbances, heart arrhythmia, low blood pressure, dehydration, and loss of calcium from their bones after their missions.

One of the most striking observations from the Apollo missions was that just over half of astronauts became sick with colds or other infections within a week of returning to Earth. Some astronauts have even experienced re-activation of dormant viruses, such as the chickenpox virus. These findings stimulated studies on the effects of weak gravity, or "microgravity," on the immune system, which scientists have explored for decades through manned rocket launches, shuttle travel and space station stints, or by simulating space gravity in earthbound labs.

In the final study led by one of the first women astronauts, Millie Hughes-Fulford, PhD, researchers at UCSF and Stanford University have now shown that the weakening of an astronaut's immune system during space travel is likely due in part to abnormal activation of immune cells called regulatory T cells (Tregs).

Tregs normally are triggered to ramp down immune responses when infection no longer threatens, and they are important regulators of immune responses in diseases ranging from cancer to COVID-19. In microgravity conditions, however, the researchers found changes in Tregs that prepared them to go to work even before the immune system was challenged. When they stimulated human immune cells from blood samples under microgravity, using a chemical often employed in research to mimic a disease pathogen, they found that Tregs helped suppress the immune response that was triggered. This unanticipated discovery was published online June 7 in the journal Scientific Reports.

Hughes-Fulford became the first female payload specialist to orbit Earth with her experiments in 1991, and for decades, until her death due to leukemia in February, she studied the effects of microgravity on health, first with an emphasis on osteoporosis and later with a focus on the immune system. As a researcher at the San Francisco Veterans Affairs Medical Center and a UCSF faculty member long affiliated with the Department of Medicine, Hughes-Fulford mentored aspiring space scientists, including the co-principal investigators of this latest immunology study.

Jordan Spatz, PhD, a space scientist and UCSF medical student who became co-PI of the study after Hughes-Fulford's death, noted that as space travel becomes increasingly commercialized and more common, concerns over the health status of space travelers are likely to grow.

"Early in the space program, most astronauts were young and extremely healthy, but now they tend to have much more training and are older," Spatz said. "In addition, apart from astronauts, with the commercialization of space flight there will be many more older and less healthy individuals experiencing microgravity. From a space medical perspective, we see that microgravity does a lot of bad things to the human body, and we are hoping to gain the ability to mitigate some of the effects of microgravity during space travel."

The new study advanced earlier research led by Hughes-Fulford, confirming some of her previous findings from experiments in space and in simulated microgravity, while contributing additional molecular discoveries. Hughes-Fulford earlier had found weaker responses from T lymphocytes of the immune system, some of which attack specific pathogens directly and some of which help orchestrate the immune response.

"It's a double whammy," said co-PI Brice Gaudilliere, MD, PhD, an associate professor in the Department of Anesthesia at Stanford University School of Medicine. "There is a dampening of T lymphocyte immune activation responses, but also an exacerbation of immunosuppressive responses by Tregs." The researchers also found that "natural killer" lymphocytes were less active under simulated microgravity, while antibody-producing B cells appeared to be unaffected.

The researchers simulated microgravity in blood samples with a specialized, cylindrical, cell-culture vessel with motor-driven rotation, a long-established microgravity research tool, but the method of single-cell analysis was unique. The scientists identified individual immune cells by specific type and used metal tags and mass spectrometry to simultaneously detect and quantify dozens of proteins that play a role in immune function, in addition to confirming previously identified patterns of altered gene activation.
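As a rough illustration of what that single-cell readout looks like downstream (one row per cell, one column per metal-tagged protein), here is a hedged sketch; the cell types, markers, and values are invented for the example and are not the study's data.

```python
import pandas as pd

# Toy single-cell table: each row is one cell, each marker column is
# the measured abundance of one metal-tagged protein. Values invented.
cells = pd.DataFrame({
    "cell_type": ["Treg", "Treg", "NK", "NK", "B", "B"],
    "condition": ["microgravity", "control"] * 3,
    "CD25":  [9.1, 4.2, 0.3, 0.4, 0.2, 0.3],  # activation-associated marker
    "FoxP3": [7.8, 3.9, 0.1, 0.1, 0.0, 0.1],
})

# The core comparison behind "Tregs activate under microgravity":
# median marker levels per cell type, microgravity vs. control.
print(cells.groupby(["cell_type", "condition"]).median(numeric_only=True))
```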

Credit: 
University of California - San Francisco

Drop in convalescent plasma use at US hospitals linked to higher COVID-19 mortality rate

A new study from researchers at the Johns Hopkins Bloomberg School of Public Health and colleagues suggests that a slowdown in the use of convalescent plasma to treat hospitalized COVID-19 patients led to higher COVID-19 mortality during a critical period of this past winter's surge.

U.S. hospitals began treating COVID-19 patients with convalescent plasma therapy--which uses antibody-rich blood from recovered COVID-19 patients--in the summer of 2020 when doctors were looking to identify treatments for the emerging disease. By the spring of 2021, doctors in the United States had treated over 500,000 COVID-19 patients with convalescent plasma. The use of convalescent plasma started declining late in 2020 after several large clinical trials showed no apparent benefit.

The researchers' analysis suggests that the decline in convalescent plasma use might have led to more than 29,000 excess COVID-19 deaths from November 2020 to February 2021.

The study was published online June 4 in the journal eLife.

"Clinical trials of convalescent plasma use in COVID-19 have had mixed results, but other studies, including this one, have been consistent with the idea that it does reduce mortality," says study senior author Arturo Casadevall, MD, PhD, Alfred and Jill Sommer Professor and Chair of the Department of the Molecular Microbiology and Immunology at the Bloomberg School.

The study was done in collaboration with researchers at Michigan State University and the Mayo Clinic. Casadevall and colleagues observed that while plasma use was declining late last year, the reported COVID-19 patient mortality rate was rising. That led them to hypothesize that the two phenomena were related.

In the study, the researchers compared the number of units of plasma distributed to U.S. hospitals from blood banks, on a per patient basis, to the number of reported COVID-19 deaths per hospital admission across the country.

One finding was that while the total use of plasma peaked last December and January during the winter surge in new COVID-19 patients, the use per hospitalized patient peaked in early October 2020--just as deaths per COVID-19 hospital admission bottomed. Thereafter, in the wake of reports of negative results from clinical trials, use of plasma per hospitalized patient fell sharply--and deaths per COVID-19 hospital admission rose.

The researchers analyzed the relationship between these two datasets and found a strong negative correlation, higher use rate being associated with lower mortality and vice versa. They also grouped periods of plasma use into five "quintile" groupings from lowest-use weeks to highest, and found a graded relationship between less use and higher mortality.

A model the researchers generated to fit the data suggested that the COVID-19 case fatality rate decreased by 1.8 percentage points for every 10-percentage point increase in the rate of plasma use. That model implied that there would have been 29,018 fewer deaths, from November 2020 to February 2021, if the peak use rate of early October had held. Moreover, it suggested that the use of plasma on the whole, as limited as it was, prevented about 95,000 deaths through early March of this year.
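A back-of-the-envelope version of that model, using only the reported slope, shows how such a counterfactual death toll can be computed. The admissions and use rates below are hypothetical placeholders, not figures from the study:

```python
# Reported slope: case fatality falls 1.8 percentage points for every
# 10-percentage-point rise in the plasma use rate.
SLOPE = -1.8 / 10.0  # change in CFR (pp) per 1 pp change in use rate

def excess_deaths(admissions, actual_use_pp, peak_use_pp):
    """Extra deaths implied by using plasma below the peak rate."""
    cfr_rise_pp = SLOPE * (actual_use_pp - peak_use_pp)  # positive when use fell
    return admissions * cfr_rise_pp / 100.0

# Hypothetical week: 100,000 admissions, use at 25% vs. a 40% peak.
print(round(excess_deaths(100_000, 25, 40)))  # -> 2700 excess deaths
```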

The researchers analyzed, and then rejected, the possibility that several other factors could explain away the link between less plasma use and more mortality. These factors included changes in the average age of hospitalized patients, and the emergence of new variants of the COVID-19-causing coronavirus.

As for why some clinical trials found no benefit for plasma use, the researchers note in their paper that many of the clinical trials with negative results had used plasma--mainly considered an antiviral treatment--relatively late in the course of COVID-19, when patients may have been too ill to benefit, and when the disease is driven mainly by immune-related responses rather than the coronavirus itself.

Casadevall notes that convalescent plasma remains under FDA Emergency Use Authorization in the U.S., and that it is readily available. "We hope that physicians, policymakers, and regulators will consider the totality of the available evidence, including our findings, when making decisions about convalescent plasma use in individual COVID-19 patients," Casadevall says.

Credit: 
Johns Hopkins Bloomberg School of Public Health

Research advances one step closer to stem cell therapy for type 1 diabetes

image: This image shows functional beta cells made from human pluripotent stem cells. Insulin (red) and NKX6.1 (green) indicate two proteins produced by beta cells.

Image: 
Salk Institute

LA JOLLA--(June 7, 2021) Type 1 diabetes, which arises when the pancreas doesn't create enough insulin to control levels of glucose in the blood, is a disease that currently has no cure and is difficult for most patients to manage. Scientists at the Salk Institute are developing a promising approach for treating it: using stem cells to create insulin-producing cells (called beta cells) that could replace nonfunctional pancreatic cells.

In a study published on June 7, 2021, in the journal Nature Communications, the investigators reported that they have developed a new way to create beta cells that is much more efficient than previous methods. Additionally, when these beta cells were tested in a mouse model of type 1 diabetes, the animals' blood sugar was brought under control within about two weeks.

"Stem cells are an extremely promising approach for developing many cell therapies, including better treatments for type 1 diabetes," says Salk Professor Juan Carlos Izpisua Belmonte, the paper's senior author. "This method for manufacturing large numbers of safe and functional beta cells is an important step forward."

In the current work, the investigators started with human pluripotent stem cells (hPSCs). These cells, which can be derived from adult tissues (most often the skin), have the potential to become any kind of cell found in the adult body. Using various growth factors and chemicals, the investigators coaxed hPSCs into beta cells in a stepwise fashion that mimicked pancreatic development.

Producing beta cells from hPSCs in the lab is not new, but in the past the yields of these precious cells have been low. With existing methods, only about 10 to 40 percent of cells become beta cells. By comparison, techniques used to create nerve cells from hPSCs have yields of about 80 percent. Another issue is that if undifferentiated cells are left in the mix, they could eventually turn into another, unwanted kind of cell.

"In order for beta cell-based treatments to eventually become a viable option for patients, it's important to make these cells easier to manufacture," says co-first author Haisong Liu, a former member of the Belmonte lab. "We need to find a way to optimize the process."

To address the problem, the researchers took a stepwise approach to create beta cells. They identified several chemicals that are important for inducing hPSCs to become more specialized cells. They ultimately identified several cocktails of chemicals that resulted in beta cell yields of up to 80 percent.

They also looked at the ways in which these cells are grown in the lab. "Normally cells are grown on a flat plate, but we allowed them to grow in three dimensions," says co-first author Ronghui Li, a postdoctoral fellow in the Belmonte lab. Growing the cells in this way creates more shared surface area between the cells and allows them to influence each other, just as they would during human development.

After the cells were created, they were transplanted into a mouse model of type 1 diabetes. The model mice had a modified immune system that would not reject transplanted human cells. "We found that within two weeks these mice had a reduction of their high blood sugar levels into the normal range," says co-first author Hsin-Kai Liao, a staff researcher in the Belmonte lab. "The transplanted hPSC-derived beta cells were biologically functional."

The researchers will continue to study this technique in the lab to further optimize the production of beta cells. More research is needed to assess safety issues before clinical trials can be initiated in humans. The investigators say the methods reported in this paper may also be useful for developing specialized cells to treat other diseases.

Credit: 
Salk Institute

Study suggests no link between antiseizure drugs used in pregnancy and cognitive problems in babies

WHAT:

New findings published in JAMA Neurology suggest there is no difference in cognitive outcomes at age 2 among children of healthy women and children of women with epilepsy who took antiseizure medication during pregnancy. The findings are part of the large research project Maternal Outcomes and Neurodevelopmental Effects of Antiepileptic Drugs (MONEAD), which is a prospective, long-term study looking at outcomes in pregnant women with epilepsy and their children. The study was funded by the National Institute of Neurological Disorders and Stroke (NINDS), part of the National Institutes of Health.

This study reports findings from 382 children (292 children born to women with epilepsy and 90 born to healthy women) who were assessed for language development at age 2. The researchers also compared developmental scores with third trimester blood levels of antiseizure medication in these children.

Results suggest that children born to healthy women and those born to women with epilepsy do not show significant differences in language development scores at age 2. Neither was language development linked to third trimester blood levels of epilepsy medications. Most women with epilepsy in the study were taking lamotrigine and/or levetiracetam.

However, the study did find that those children born to mothers with the very highest levels of antiseizure medication in the blood during the third trimester did have somewhat lower scores on tests in the motor and general adaptive domains, which refer to skills related to self-care, such as feeding.

The children in this study will continue to be followed and will participate in additional cognitive tests through age 6. Results so far indicate that controlling epilepsy with these medications during pregnancy may be safe for babies.

Credit: 
NIH/National Institute of Neurological Disorders and Stroke

Largest-ever pre-adolescent brain activation study reveals cognitive function maps

image: New youth brain activation data from the ABCD Study demonstrate which brain regions are involved in a range of psychological processes, including cognitive control, reward processing, working memory and social/emotional function.

Image: 
Graphic image courtesy of Bader Chaarani, Ph.D., University of Vermont

Youth brain activation data from the largest longitudinal neuroimaging study to date provide valuable new information on the cognitive processes and brain systems that underlie adolescent development and might contribute to mental and physical health challenges in adulthood. The study was published online today in Nature Neuroscience.

Because of the notable brain, cognitive, and emotional maturation - and emergence of many mental health disorders - that occurs between the ages of 10 and 20, understanding neurodevelopment and how it is impacted by the numerous risk factors that emerge during this timeframe is a critical area of interest. However, to date, most human neuroimaging studies have focused on adult functioning.

The Adolescent Brain Cognitive Development (ABCD) Study, which launched in 2016, is a multisite, 10-year longitudinal study that has enrolled nearly 12,000 youth aged 9 to 10 at 21 research sites around the country.

These latest findings demonstrate which brain regions are involved in a range of important psychological processes, including cognitive control, reward processing, working memory, and social/emotional function.

Using functional magnetic resonance imaging (fMRI) technology, the researchers observed brain activation during a battery of three different tasks and identified how differences in the patterns of activity related to individual differences in these processes.

"This study - likely the biggest task activation paper ever - shows the brain regions activated by each task, how well they capture individual differences, and will likely serve as a baseline for all the subsequent papers that will track the kids as they age," says Hugh Garavan, Ph.D., professor of psychiatry at the University of Vermont, and a senior author on the study.

The brain maps aim to improve scientists' understanding of the psychological processes that put young people at higher risk for developing mental and physical health challenges and, by identifying the brain correlates of factors that influence development, can give guidance on which interventions could help improve outcomes.

"These brain activation maps and spatial reproducibility findings will serve as a gold standard for the neuroscientific community and could help inform study design," says Bader Chaarani, Ph.D., assistant professor of psychiatry at the University of Vermont and the study's first author.

The study's authors state that these brain activation maps will allow for "cross-sectional analyses of inter-individual and group differences," as well as "offer the potential for examining baseline predictors of future development and behavior and for quantifying changes in brain function that may arise from the numerous influences expected to affect development and behavior."

Credit: 
Larner College of Medicine at the University of Vermont

Chip mimicking bovine endometrium used in study of factors that can jeopardize pregnancy

image: The device was used for the first time to culture two maternal endometrial cell types, revealing the effects of alterations in glucose and insulin levels in the uterine environment.

Image: 
Tiago Henrique Camara de Bem

To investigate factors that can jeopardize pregnancy success in cattle, researchers at the University of São Paulo (USP) in Brazil used a kind of chip to mimic the environment of the endometrium, the tissue that lines the inside of the uterus. 

The study was conducted by biologist Tiago Henrique Camara de Bem, a postdoctoral fellow at the University of São Paulo’s School of Animal Science and Food Engineering (FZEA-USP), in collaboration with four researchers at the University of Leeds in the UK. Their findings are reported in an article in the journal Endocrinology.

The researchers focused on analyzing alterations in levels of insulin and glucose in maternal epithelial and stromal cells, and the possible consequences for initial development of the pregnancy. Epithelial cells are the most external in the endometrium. They interface with the lumen and are in direct contact with the embryo. Stromal cells are further inside, acting as support cells that guide epithelial cell growth, differentiation and development, among other functions.

The group discovered that high levels of glucose altered 21 protein-encoding genes in epithelial cells and 191 in stromal cells, as well as triggering quantitative changes in the protein secretome (proteins secreted in the culture medium, which in this case mimicked the endometrial fluid). “As we changed the amount of glucose and insulin in the culture medium, stressing the cells, we were able to switch genes on or off, determining whether they were or weren’t expressed,” Camara de Bem said. 

Changes in insulin levels altered the quantitative secretion of 196 proteins but resulted in limited alteration of gene transcription. “The key factor may be the protein composition of the uterine fluid, into which these cells secrete proteins that reach the embryo,” he explained. “We found this group of proteins to be associated with signaling pathways that play an important role in early pregnancy success in cattle, relating to metabolism, the cellular matrix and other factors. All these discoveries evidence a mechanism whereby maternal glucose and insulin alterations can affect uterine functioning.”

Camara de Bem was supported by FAPESP via a postdoctoral fellowship for a project conducted at the Molecular Morphophysiology and Development Laboratory under the supervision of Professor Flávio Vieira Meirelles, and a scholarship for a research internship abroad (BEPE). 

Stress

According to Camara de Bem, Brazil is a world leader in the production of bovine embryos, but nevertheless has a high rate of lost pregnancies. “A large proportion of our embryos are produced by in vitro fertilization. Oocytes are collected, matured, fertilized, cultured and transferred to synchronized recipients. However, 40% of pregnancies are lost in the third or fourth week,” he said, recalling that a bovine pregnancy lasts about nine months, as in humans. 

Reproductive success depends on a number of conditions. “Pregnancy is an interaction between the mother and the embryo that develops in the maternal uterus,” Camara de Bem said. “It involves cross-talk between the embryo’s cells and the mother’s. This communication is influenced by multiple processes. Pregnancy loss can occur when the communication isn’t right – when the embryo can’t signal its presence or the mother doesn’t recognize the developing embryo.”

Stress due to environmental or nutritional problems or even to the production process itself can lead to instability in maternal-embryo communication and disrupt the pregnancy, he continued. In the case of cattle, pregnancy in high-yield dairy cows is the main problem, with the initial post-partum period often involving metabolic stress due to a negative energy balance in the dam.

“Glucose, for example, is a basic substrate for cell metabolism,” Camara de Bem said. “Cells need glucose to perform their functions. Lactating cows undergo a metabolic challenge to produce milk. They consume a lot of energy because they need to maintain the basic functions of the organism as well as all the functions involved in milk production. The state of the mother’s metabolism significantly influences reproduction. Hence our focus on understanding the factors that cause metabolic stress in the environment that receives the embryo.” 

Endometrium on a chip

Camara de Bem stressed that the study was conducted in partnership with the group led by Niamh Forde, a professor at the University of Leeds’ Medical School and last author of the article. “She’s investigating maternal recognition of pregnancy in cattle. I’m interested in investigating the signals sent by the embryo to the mother. We thought it would be a good collaboration and had this idea of developing an ‘endometrium on a chip’ that could be used for multicellular culture, i.e. growing more than one type of cell from the endometrium,” he said. 

The chip resembles a histology slide, except that it is divided into chambers – compartments in which the scientists seeded two types of cell. The partitions are made of a porous membrane that enables information to be exchanged between the two cell types cultured in the different chambers but does not permit the cell types to switch positions. The device can be considered a commercial chip adapted to simulate an endometrium. 

“Epithelial cells were seeded in the upper chamber, stromal cells in the lower,” Camara de Bem said. “Both cell types are abundant in the endometrium. The upper chamber’s culture medium became enriched with factors produced and secreted by the epithelial cells, representing the endometrial secretome.”

The chip enabled the scientists to infuse the cells constantly with a culture medium. “We cultured the cells for three days, injecting medium the whole time [one microliter per minute for 72 hours] with three different concentrations of glucose or two different concentrations of insulin,” he said. “Nutrients were administered very slowly, in a flow mimicking the best medium physiology. This ensured that the cells were exposed to the same levels of glucose and insulin throughout the experiment.”
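That flow rate implies a small but steady total volume of medium over the three-day culture; a quick sanity check (treating the 1 µL/min figure as per channel, which is an assumption):

```python
# Total medium perfused at 1 microliter per minute for 72 hours.
flow_ul_per_min, hours = 1.0, 72
total_ml = flow_ul_per_min * 60 * hours / 1000  # µL -> mL
print(f"{total_ml:.2f} mL")  # 4.32 mL over the experiment
```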

Future

The method was innovative and had never been used before to mimic the bovine endometrium. Conventional cell culture is too simple to simulate all endometrial conditions. “The endometrium is three-dimensional, with several types of cells and glands producing factors and nutrients to maintain the pregnancy,” Camara de Bem recalled. “In vitro embryo culture using the traditional method is static and involves a single cell type in an environment that doesn’t reflect the richness of the animal organism. You can grow cells, transfer embryos to a recipient and produce healthy animals, but we set out to recreate the process in a manner that was as close as possible to the physiological reality.” 

Camara de Bem noted that his partners at the University of Leeds are developing other kinds of chip for embryo insertion. “The methodology opens up a wealth of opportunities, and in future we hope to be able to culture cells and embryos together in order to find out exactly what happens when there are changes in the medium and in communication with maternal cells. This is an opening for more applied research,” he said.

The group’s work also offers a potential model for the study of pregnancy in mammals, including humans. “Except for non-human primates, mice are the main model for studying humans. Placenta formation in mice is the most similar to the process in humans. On the other hand, unlike us, mice have many offspring. In cattle, placentation is very different from what it is in humans, but the gestational period is similar and cows also have only one offspring per pregnancy. There will never be an ideal model, because of the differences between species, but this can be one more model,” he said.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Climate change a bigger threat to landscape biodiversity than emerald ash borer

The emerald ash borer, an invasive beetle native to Southeast Asia, threatens the entire ash tree population in North America and has already changed forested landscapes and caused tens of billions of dollars in lost revenue to the ash sawtimber industry since it arrived in the United States in the 1990s. Despite the devastating impact the beetle has had on forests in the eastern and midwestern parts of the U.S., climate change will have a much larger and widespread impact on these landscapes through the end of the century, according to researchers.

"We really wanted to focus on isolating the impact of the emerald ash borer on biodiversity, forest composition, biomass and other factors," said Stacey Olson, program coordinator and legal assistant at Resources Legacy Fund. Olson completed the research as part of her master's thesis at Penn State. "We found that emerald ash borer and its impact on ash trees has serious implications for forest change at the site level, but at the broad landscape level, the climatic changes over the next century were much more important in terms of forest composition and species diversity."

The researchers used a forest simulation model to examine the effects of the emerald ash borer and climate change on a forested area of northeast Wisconsin through the year 2100. The area includes the Menominee Reservation. The Menominee Indian Tribe of Wisconsin has been sustainably harvesting timber from the forest for more than 150 years. The scientists reported their findings in the journal Ecosystems.

The model took into account how trees grow, disperse seeds, die and interact with disturbances such as climatic changes. It also accounted for emerald ash borer infestation and pre-emptive ash tree removal, an Indigenous forest management strategy.

"When we run the model, all of these components work simultaneously and interact with one another across the landscape and across time," said Olson, who was also an Environmental Scholar in the Earth and Environmental Systems Institute (EESI) at Penn State. "The model gives us a picture of what all these interacting disturbance and succession processes look like."

The researchers looked at moderate and high-level climate change scenarios based on a business-as-usual approach that fails to curb greenhouse gas emissions within the next decade. The landscape itself partly drove the team's decision to study these scenarios, Olson said. The forest sits in what scientists call a tension zone, where vegetation, soil type and climate variability can shift quickly as one moves across the landscape. In these areas, changes in climate can result in drastic changes on the ground.

The model showed a shift in the types of trees present in the study area by 2100. Northern hardwoods, like beech and birch trees, decreased by approximately 12%, to 35% of the total biomass of all species in the forest. Under climate change conditions, southern hardwoods, like black cherry trees, increased from 4% to 23% by mid-century and became the dominant species in the southern part of the study area.

The model showed that in some areas, the emerald ash borer would completely remove ash trees, clearing the way for other species to replace them. Ash, however, is not the dominant species in the forest, and its removal cannot account for the large shift in tree composition from northern to southern hardwoods. The researchers attributed this shift to climatic changes, such as warmer temperatures and periods of drought or water scarcity, identified in the models.

The study led by Olson is part of a larger project, Visualizing Forest Futures (ViFF), led by Erica Smithwick, distinguished professor of geography and EESI associate, and done in collaboration with the Menominee. The project combines Indigenous forest-management practices with cutting-edge modeling and visualization techniques to better understand the connections between human values and forests and how to sustainably manage forest resources.

"Our project seeks to guide decision-making about forest management strategies while also accounting for uncertainties in forest changes under future climates," Smithwick said. "Stacey's paper provides key information to help guide that process."

The team's research demonstrates the importance of focusing more resources on climate change mitigation rather than on a very specific, targeted threat like the emerald ash borer.

"Some of the research I found said that the ash will likely become functionally extinct within the next couple of decades," said Olson. "While this is clearly a serious problem that deserves attention, a big takeaway for me is that climate is an even larger driver of forest change over the next century, and addressing these challenges must be a top priority."

Credit: 
Penn State

Monoclonal antibody prevents HIV infection in monkeys, study finds

An experimental, lab-made antibody can completely prevent nonhuman primates from being infected with the monkey form of HIV, new research published in Nature Communications shows.

The results will inform a future human clinical trial evaluating leronlimab as a potential pre-exposure prophylaxis, or PrEP, therapy to prevent human infection from the virus that causes AIDS.

"Our study findings indicate leronlimab could be a new weapon against the HIV epidemic," said the study's lead researcher and co-corresponding author of this paper, Jonah Sacha, Ph.D., an Oregon Health & Science University professor at OHSU's Oregon National Primate Center and Vaccine & Gene Therapy Institute.

"The results of this pre-clinical study, targeting the HIV co-receptor CCR5, have the potential to be groundbreaking as we essentially have a tool that can mimic the genetic mutations of CCR5 that render some individuals immune to infection and have led in part to two cases of a cure of HIV," said the other co-corresponding author, Lishomwa Ndhlovu, M.D., Ph.D., a professor of immunology in medicine at Weill Cornell Medicine in New York.

Made by Vancouver, Washington-based CytoDyn, the monoclonal antibody blocks HIV from entering immune cells through a surface protein called CCR5. The injectable drug has already been studied in a Phase 3 trial as a potential treatment for people living with HIV when used in combination with standard antiretroviral medications. CytoDyn is in the process of submitting information to the FDA to request its approval for that use. This study, however, specifically examined preventing HIV infection to begin with.

Some PrEP drugs are already available, but they can lead to adverse side effects such as liver, heart and bone problems, and genetic mutations in HIV can make the virus resistant to them. Existing PrEP options typically require frequent use, such as a daily pill, or are infusions that must be given in a clinic. Leronlimab is designed to be a self-administered injection.

To study leronlimab's effectiveness as a potential PrEP drug, the research team created three groups of six rhesus macaques at OHSU's Oregon National Primate Research Center. Two groups received different doses of leronlimab, while the third served as a control that didn't receive the experimental drug.

Macaques that received the higher dose of 50 milligrams per kilogram of the animal's weight every other week were completely protected from the monkey form of HIV. In contrast, two of the animals that received the lower dose of 10 milligrams per kilogram per week became infected, and every animal in the control group became infected. Researchers concluded the low-dose group's partial protection was likely due to monkey immune responses against the human antibody.

Following this study's results, CytoDyn is planning to conduct an early clinical trial investigating leronlimab as a potential PrEP drug in people within the next year. Human doses would likely be lower than those given in this study, as rhesus macaque cells have more surface CCR5 protein than humans.

In the meantime, Sacha is already working to make leronlimab easier to use. He received a five-year, $3 million NIH grant in August 2020 to develop a concentrated, longer-lasting formulation of leronlimab that could allow it to be injected every three months. Less frequent injections can improve adherence to a drug regimen, and therefore improve drug effectiveness.

The research team dedicated this study to Timothy Ray Brown, who died Sept. 29, 2020, and was known as the Berlin patient for being the first person to be cured of HIV. While living in Berlin in 2007, Brown underwent a bone marrow transplant to treat his blood cancer. The procedure eliminated HIV in Brown because the transplanted bone marrow came from a donor with a rare mutation that eliminated the CCR5 gene, which makes the surface protein through which HIV enters cells. Sacha became friends with Brown after meeting him at an AIDS conference in 2015. Brown is a co-author on the paper and inspired the scientists working on this research.

Credit: 
Oregon Health & Science University

A breakthrough in the physics of blood clotting

image: First author Yueyi Sun inside Georgia Tech's Complex Fluids Modeling and Simulation lab, where she compares the experimental and simulated platelet-driven fibrin clot contraction process.

Image: 
Alexander Alexeev, Georgia Tech

Heart attacks and strokes -- the leading causes of death in human beings -- are fundamentally caused by blood clots in the heart and brain. Better understanding how the blood-clotting process works, and how to accelerate or slow clotting depending on the medical need, could save lives.

New research by the Georgia Institute of Technology and Emory University published in the journal Biomaterials sheds new light on the mechanics and physics of blood clotting through modeling the dynamics at play during a still poorly understood phase of blood clotting called clot contraction.

"Blood clotting is actually a physics-based phenomenon that must occur to stem bleeding after an injury," said Wilbur A. Lam, W. Paul Bowers Research Chair in the Department of Pediatrics and the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory. "The biology is known. The biochemistry is known. But how this ultimately translates into physics is an untapped area."

And that's a problem, argues Lam and his research colleagues, since blood clotting is ultimately about "how good of a seal can the body make on this damaged blood vessel to stop bleeding, or when this goes wrong, how does the body accidentally make clots in our heart vessels or in our brain?"

How Blood Clotting Works

The workhorses in stemming bleeding are platelets -- tiny 2-micrometer cells in the blood in charge of making the initial plug. The clot's scaffold is formed by fibrin, which acts as a glue that the platelets attach to and pull against. Blood clot contraction arises when these platelets interact with the fibrin scaffold. To demonstrate the contraction, the researchers embedded millions of platelets and fibrin in a 3-millimeter Jell-O mold of a LEGO figure to recreate a simplified version of a blood clot.

"What we don't know is, 'How does that work?' 'What's the timing of it so all these cells work together -- do they all pull at the same time?' Those are the fundamental questions that we worked together to answer," Lam said.

Lam's lab collaborated with Georgia Tech's Complex Fluids Modeling and Simulation group headed by Alexander Alexeev, professor and Anderer Faculty Fellow in the George W. Woodruff School of Mechanical Engineering, to create a computational model of a contracting clot. The model incorporates fibrin fibers forming a three-dimensional network and distributed platelets that can extend filopodia, or the tentacle-like structures that extend from cells so they can attach to specific surfaces, to pull the nearby fibers.

Model Shows Platelets Dramatically Reducing Clot Volume

When the researchers simulated a clot where a large group of platelets was activated at the same time, the tiny cells could only reach nearby fibrins because the platelets can extend filopodia that are rather short, less than 6 micrometers. "But in a trauma, some platelets contract first. They shrink the clot so the other platelets will see more fibrins nearby, and it effectively increases the clot force," Alexeev explained. Due to the asynchronous platelet activity, the force enhancement can be as high as 70%, leading to a 90% decrease of the clot volume.
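A stripped-down numerical sketch captures the intuition: count fiber attachments as a crude proxy for clot force, and let the network densify between activation waves so later platelets reach more fibers. The densification factor and all other numbers are invented for illustration, not taken from the paper's model:

```python
def captured_fibers(n_platelets, fiber_density, reach_um, waves):
    """Crude proxy for clot force: total fibers grabbed by platelets.
    After each activation wave, partial contraction densifies the
    network, so later waves find more fibers within reach."""
    total, density = 0.0, fiber_density
    per_wave = n_platelets // waves
    for _ in range(waves):
        total += per_wave * density * 2 * reach_um  # fibers within reach
        density *= 1.3  # assumed densification from partial contraction
    return total

sync = captured_fibers(200, 1.0, 6.0, waves=1)    # all platelets at once
async_ = captured_fibers(200, 1.0, 6.0, waves=4)  # staggered activation
print(f"asynchronous advantage: {100 * (async_ / sync - 1):.0f}%")
```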

"The simulations showed that the platelets work best when they're not in total sync with each other," Lam said. "These platelets are actually pulling at different times and by doing that they're increasing the efficiency (of the clot)."

This phenomenon, dubbed by the team asynchronous mechanical amplification, is most pronounced "when we have the right concentration of the platelets corresponding to that of healthy patients," Alexeev said.

Research Could Lead to Better Ways to Treat Clotting, Bleeding Issues

The findings could open medical options for people with clotting issues, said Lam, who treats young patients with blood disorders as a pediatric hematologist in the Aflac Cancer and Blood Disorders Center at Children's Healthcare of Atlanta.

"If we know why this happens, then we have a whole new potential avenue of treatments for diseases of blood clotting," he said, emphasizing that heart attacks and strokes occur when this biophysical process goes wrong.

Lam explained that fine tuning the contraction process to make it faster or more robust could help patients who are bleeding from a car accident or, in the case of a heart attack, make the clotting less intense and slow it down.

"Understanding the physics of this clot contraction could potentially lead to new ways to treat bleeding problems and clotting problems."

Alexeev added that their research could also lead to new biomaterials, such as a type of Band-Aid that could help augment the clotting process.

First author and Georgia Tech Ph.D. candidate Yueyi Sun noted the simplicity of the model and the fact that the simulations allowed the team to understand how the platelets work together to contract the fibrin clot as they would in the body.

"When we started to include the heterogeneous activation, suddenly it gave us the correct volume contraction," she said. "Allowing the platelets to have some time delay so one can use what the previous ones did as a better starting point was really neat to see. I think our model can potentially be used to provide guidelines for designing novel active biological and synthetic materials."

Sun agreed with her research colleagues that this phenomenon might occur in other aspects of nature. For example, multiple asynchronous actuators can fold a large net more effectively to enhance packaging efficiency without the need of incorporating additional actuators.

"It theoretically could be an engineered principle," Lam said. "For a wound to shrink more, maybe we don't have the chemical reactions occur at the same time -- maybe we have different chemical reactions occur at different times. You gain better efficiency and contraction when one allows half or all of the platelets to do the work together."

Building on the research, Sun hopes to examine more closely how a single platelet's force converts or is transmitted to the clot force, and how much force is needed to hold the two sides of a wound together from a thickness and width standpoint. Sun also intends to include red blood cells in the model, since they account for 40% of all blood and play a role in defining clot size.

"If your red blood cells are too easily trapped in your clot, then you are more likely to have a large clot, which causes a thrombosis issue," she explained.

Credit: 
Georgia Institute of Technology

Visualizing cement hydration on a molecular level

image: The high temporal and spatial resolution Raman imaging technique opens opportunities to answer millennia-old questions regarding cement chemistry. This high-resolution Raman image shows the hydration of alite (white) forming C-S-H (blue) and portlandite (red). Other components are belite (green) and calcite (yellow).

Image: 
Image courtesy of Franz-Josef Ulm, Admir Masic, Hyun-Chae Chad Loh, et al

The concrete world that surrounds us owes its shape and durability to chemical reactions that start when ordinary Portland cement is mixed with water. Now, MIT scientists have demonstrated a way to watch these reactions under real-world conditions, an advance that may help researchers find ways to make concrete more sustainable.

The study is a "Brothers Lumière moment for concrete science," says co-author Franz-Josef Ulm, professor of civil and environmental engineering and faculty director of the MIT Concrete Sustainability Hub, referring to the two brothers who ushered in the era of projected films. Likewise, Ulm says, the MIT team has provided a glimpse of early-stage cement hydration that is like cinema in Technicolor compared to the black and white photos of earlier research.

Cement in concrete contributes about 8 percent of the world's total carbon dioxide emissions, rivaling the emissions produced by most individual countries. With a better understanding of cement chemistry, scientists could potentially "alter production or change ingredients so that concrete has less of an impact on emissions, or add ingredients that are capable of actively absorbing carbon dioxide," says Admir Masic, associate professor of civil and environmental engineering.

Next-generation technologies like 3D printing of concrete could also benefit from the study's new imaging technique, which shows how cement hydrates and hardens in place, says Masic Lab graduate student Hyun-Chae Chad Loh, who also works as a materials scientist with the company Black Buffalo 3D Corporation.

Loh is the first author of the study, published in the ACS journal Langmuir, joining Ulm, Masic, and postdoc Hee-Jeong Rachel Kim.

Cement from the start

Loh and colleagues used a technique called Raman microspectroscopy to get a closer look at the specific and dynamic chemical reactions taking place when water and cement mix. Raman spectroscopy creates images by shining a high-intensity laser light on material and measuring the intensities and wavelengths of the light as it is scattered by the molecules that make up the material.

Different molecules and molecular bonds have their own unique scattering "fingerprints," so the technique can be used to create chemical images of molecular structures and dynamic chemical reactions inside a material. Raman spectroscopy is often used to characterize biological and archaeological materials, as Masic has done in previous studies of nacre and other biomineralized materials and ancient Roman concretes.
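A minimal sketch of that fingerprinting idea: represent each phase by a reference spectrum and assign each pixel to the phase its measured spectrum matches best. The peak positions below are rough stand-ins, not calibrated cement-phase data:

```python
import numpy as np

wavenumbers = np.linspace(100, 1200, 500)  # Raman shift axis, cm^-1

def band(center, width=15.0):
    """Idealized single Raman band (Gaussian) at a given shift."""
    return np.exp(-((wavenumbers - center) ** 2) / (2 * width ** 2))

# Reference "fingerprints" -- peak positions are illustrative stand-ins.
references = {"alite": band(840), "C-S-H": band(670), "portlandite": band(356)}

def classify(spectrum):
    """Assign a pixel to the phase whose reference it overlaps most."""
    return max(references, key=lambda name: float(np.dot(spectrum, references[name])))

# A noisy pixel dominated by a band near 670 cm^-1 maps to C-S-H.
pixel = band(670) + 0.1 * np.random.default_rng(0).normal(size=wavenumbers.size)
print(classify(pixel))  # -> C-S-H
```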

Using Raman microspectroscopy, the MIT scientists observed a sample of ordinary Portland cement placed underwater without disturbing it or artificially stopping the hydration process, mimicking the real-world conditions of concrete use. In general, one of the hydration products, called portlandite, starts as a disordered phase, percolates throughout the material, and then crystallizes, the research team concluded.

Before this, "scientists could only study cement hydration with average bulk properties or with a snapshot of one point in time," says Loh, "but this allowed us to observe all the changes almost continuously and improved the resolution of our image in space and time."

For instance, calcium-silicate-hydrate, or C-S-H, is the main binding ingredient in cement that holds concrete together, "but it's very difficult to detect because of its amorphous nature," Loh explains. "Seeing its structure, distribution, and how it developed during the curing process was something that was amazing to watch."

Building better

Ulm says the work will guide researchers as they experiment with new additives and other methods to reduce concrete's greenhouse gas emissions: "Rather than 'fishing in the dark,' we are now able to rationalize through this new approach how reactions occur or do not occur, and intervene chemically."

The team will use Raman spectroscopy as they spend the summer testing how well different cementitious materials capture carbon dioxide, Masic says. "Tracking this up to now has been almost impossible, but now we have the opportunity to follow carbonation in cementitious materials, which helps us understand where the carbon dioxide goes, which phases are formed, and how to change them in order to potentially use concrete as a carbon sink."

The imaging is also critical for Loh's work with 3D concrete printing, which depends on extruding concrete layers in a precisely measured and coordinated process, during which the liquid slurry turns into solid concrete.

"Knowing when the concrete is going to set is the most critical question that everyone is trying to understand" in the industry, he says. "We do a lot of trial and error to optimize a design. But monitoring the underlying chemistry in space and time is critical, and this science-enabled innovation will impact the concrete printing capabilities of the construction industry."

This work was partially supported by the scholarship program of the Kwanjeong Educational Foundation.

Credit: 
Massachusetts Institute of Technology

Stabilizing gassy electrolytes could make ultra-low temperature batteries safer

image: Artistic rendering of a battery separator that condenses gas electrolytes into liquid at a much lower pressure. The new separator improves battery safety and performance in the extreme cold by keeping more electrolyte, as well as lithium ions, flowing in the battery.

Image: 
Chen group

A new technology could dramatically improve the safety of lithium-ion batteries that operate with gas electrolytes at ultra-low temperatures. Nanoengineers at the University of California San Diego developed a separator—the part of the battery that serves as a barrier between the anode and cathode—that keeps the gas-based electrolytes in these batteries from vaporizing. This new separator could, in turn, help prevent the buildup of pressure inside the battery that leads to swelling and explosions.

“By trapping gas molecules, this separator can function as a stabilizer for volatile electrolytes,” said Zheng Chen, a professor of nanoengineering at the UC San Diego Jacobs School of Engineering who led the study.

The new separator also boosted battery performance at ultra-low temperatures. Battery cells built with the new separator operated with a high capacity of 500 milliamp-hours per gram at -40 C, whereas those built with a commercial separator exhibited almost no capacity. The battery cells still exhibited high capacity even after sitting unused for two months—a promising sign that the new separator could also prolong shelf life, the researchers said.

The team published their findings June 7 in Nature Communications.

The advance brings researchers a step closer to building lithium-ion batteries that can power vehicles in the extreme cold, such as spacecraft, satellites and deep-sea vessels.

This work builds on a previous study published in Science by the lab of UC San Diego nanoengineering professor Ying Shirley Meng, which was the first to report the development of lithium-ion batteries that perform well at temperatures as low as -60 C. What makes these batteries especially cold-hardy is that they use a special type of electrolyte called a liquefied gas electrolyte, which is a gas that is liquefied by applying pressure. It is far more resistant to freezing than a conventional liquid electrolyte.

But there’s a downside. Liquefied gas electrolytes have a high tendency to go from liquid to gas. “This is the biggest safety issue with these electrolytes,” said Chen. In order to use them, a lot of pressure must be applied to condense the gas molecules and keep the electrolyte in liquid form.

To combat this issue, Chen’s lab teamed up with Meng and UC San Diego nanoengineering professor Tod Pascal to develop a way to liquefy these gassy electrolytes easily without having to apply so much pressure. The advance was made possible by combining the expertise of computational experts like Pascal with experimentalists like Chen and Meng, who are all part of the UC San Diego Materials Research Science and Engineering Center (MRSEC).

Their approach makes use of a physical phenomenon in which gas molecules spontaneously condense when trapped inside tiny, nanometer-sized spaces. This phenomenon, known as capillary condensation, enables a gas to become liquid at a much lower pressure.

The team leveraged this phenomenon to build a battery separator that would stabilize the electrolyte in their ultra-low temperature battery—a liquefied gas electrolyte made of fluoromethane gas. The researchers built the separator out of a porous, crystalline material called a metal-organic framework (MOF). What’s special about the MOF is that it is filled with tiny pores that are able to trap fluoromethane gas molecules and condense them at relatively low pressures. For example, fluoromethane typically condenses under a pressure of 118 psi at -30 C; but with the MOF, it condenses at just 11 psi at the same temperature.
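That drop is in the direction the classical Kelvin equation predicts. As a rough sketch (a textbook relation, here assuming complete wetting of the pore wall - not a calculation from the paper):

\[
\ln\frac{p}{p_{\text{sat}}} = -\,\frac{2\,\gamma\,V_m}{r\,R\,T}
\]

where \(p\) is the pressure at which the gas condenses inside a pore of radius \(r\), \(p_{\text{sat}}\) is the bulk saturation pressure, \(\gamma\) is the liquid's surface tension, \(V_m\) its molar volume, \(R\) the gas constant, and \(T\) the temperature. Because \(r\) sits in the denominator, nanometer-scale pores like the MOF's push the condensation pressure far below the bulk value.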

“This MOF significantly reduces the pressure needed to make the electrolyte work,” said Chen. “As a result, our battery cells deliver a significant amount of capacity at low temperature and show no degradation.”

The researchers tested the MOF-based separator in lithium-ion battery cells—built with a carbon fluoride cathode and lithium metal anode—filled with fluoromethane gas electrolyte under an internal pressure of 70 psi, which is well below the pressure needed to liquefy fluoromethane. The cells retained 57% of their room temperature capacity at -40 C. By contrast, cells with a commercial separator exhibited almost no capacity with fluoromethane gas electrolyte at the same temperature and pressure.

The tiny pores of the MOF-based separator are key because they keep more electrolyte flowing in the battery, even under reduced pressure. The commercial separator, on the other hand, has large pores and cannot retain the gas electrolyte molecules under reduced pressure.

But tiny pores are not the only reason the separator works so well in these conditions. The researchers engineered the separator so that the pores form continuous paths from one end to the other. This ensures that lithium ions can still flow freely through the separator. In tests, battery cells with the new separator had 10 times higher ionic conductivity at -40 C than cells with the commercial separator.

Chen’s team is now testing the MOF-based separator on other electrolytes. “We are seeing similar effects. We can use this MOF as a stabilizer to adsorb various kinds of electrolyte molecules and improve the safety even in traditional lithium batteries, which also have volatile electrolytes.”

Credit: 
University of California - San Diego

Mandating vaccination could reduce voluntary compliance

image: Katrin Schmelz

Image: 
University of Konstanz

Citizen opposition to COVID-19 vaccination has emerged across the globe, prompting pushes for mandatory vaccination policies.  But a new study based on evidence from Germany and on a model of the dynamic nature of people's resistance to COVID-19 vaccination sounds an alarm: mandating vaccination could have a substantial negative impact on voluntary compliance.

Majorities in many countries now favor mandatory vaccination. In March, the government of Galicia in Spain made vaccinations mandatory for adults, subjecting violators to substantial fines. Italy has made vaccinations mandatory for care workers. The University of California and California State University systems announced in late April that vaccination would be required for anyone attending in the fall.

The research, published in this week's Proceedings of the National Academy of Sciences (PNAS), extends an earlier PNAS study by first author Katrin Schmelz, a psychologist and behavioral economist at the University of Konstanz, documenting that a major source of vaccine hesitancy is distrust of government. She found that enforced vaccinations reduce people's desire to be vaccinated, particularly among those with low levels of trust in public institutions.

In the new study, Schmelz and economist Samuel Bowles of the Santa Fe Institute draw on a large panel survey implemented in Germany during the first and second waves of the pandemic. Even though infections in Germany were 15 times more common during the second wave, the researchers observed increased opposition when they asked participants a hypothetical question about how they would respond if vaccination were legally required (the German government has publicly committed not to require vaccination). In contrast, there was a higher and undiminished level of support for the voluntary vaccination policy now in force.

The authors also draw on evidence from the dynamics of diffusion of novel products and technologies such as TVs and washing machines in the last century. They reason that as those who are hesitant or opposed to vaccination see that others are getting vaccinated, they might change their mind. Learning from others' vaccination decisions - "conformism" in psychology - means that even if initial vaccination hesitancy is substantial, as more become vaccinated it may be possible to get to a herd immunity target without mandating vaccines.

They also use experimental evidence from behavioral economics showing that explicit incentives, whether in the form of carrots or sticks, may crowd out intrinsic or ethical motives. Policies that aim to incentivize a desired behavior, such as getting vaccinated, can actually undercut individuals' sense of a moral or ethical obligation to do the right thing. 

This is evident in their data: mandating vaccination by law directly reduces the desire to be vaccinated. Their model also points to an adverse indirect effect: under enforcement, seeing others get vaccinated does less to win over the hesitant, because compelled uptake sends a weaker signal of genuine approval. Schmelz says, "How people feel about getting vaccinated will be affected by enforcement in two ways - it could crowd out pro-vaccine feelings, and reduce the positive effect of conformism if vaccination is voluntary."
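A toy adoption model makes the interplay of the two effects concrete. This is our illustrative sketch, not the authors' published specification; the parameters (base_willingness, conformity, crowding_out, signal_discount) and their values are invented for illustration:

```python
import numpy as np

def uptake_path(periods: int = 52,
                base_willingness: float = 0.60,
                conformity: float = 0.15,
                crowding_out: float = 0.15,
                signal_discount: float = 0.5,
                enforced: bool = False) -> np.ndarray:
    """Toy diffusion model of vaccine uptake (illustrative only).

    Willingness grows through conformism as people see others
    vaccinated. A mandate has two costs here: it crowds out some
    intrinsic willingness up front, and it discounts the conformism
    signal, since compelled uptake no longer signals genuine approval.
    """
    willing = base_willingness - (crowding_out if enforced else 0.0)
    signal = signal_discount if enforced else 1.0
    vaccinated = 0.05                 # initially vaccinated fraction
    path = [vaccinated]
    for _ in range(periods):
        # Conformism: observed uptake converts some of the hesitant.
        willing += conformity * signal * vaccinated * (1.0 - willing)
        # With limited enforcement capacity, actual uptake tracks
        # willingness, limited by a finite rollout rate.
        vaccinated = min(willing, vaccinated + 0.1 * (willing - vaccinated))
        path.append(vaccinated)
    return np.array(path)

voluntary = uptake_path(enforced=False)
mandated = uptake_path(enforced=True)
print(f"uptake after one year - voluntary: {voluntary[-1]:.2f}, "
      f"mandated: {mandated[-1]:.2f}")
```

With these made-up numbers the voluntary path ends well above the mandated one; the point is qualitative, matching the paper's warning that a mandate can lower the very compliance it is meant to secure.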

Bowles says this should be a caution to governments considering vaccine mandates: "Costly errors may be avoided if policymakers reflect carefully on the costs of enforcement. These could not only increase opposition to vaccination, but also heighten social conflict by further alienating citizens from the government or scientific and medical elites," he says. Nonetheless, he says government enforcement "may still be necessary if the number wishing to be vaccinated is insufficient to control the pandemic."

Schmelz concludes that "Our findings have broad policy applicability beyond COVID-19. There are many cases in which voluntary citizen compliance to a policy is essential because state enforcement capacities are limited, and because results may depend on the ways that the policies themselves alter citizens' beliefs and preferences," adding that "... examples include policies to promote lifestyle changes to reduce carbon footprints or to sustain tolerance and mutual respect in a heterogeneous society."

Credit: 
Santa Fe Institute