
A chemical embrace from the perfect host

video: KAUST chemists are developing a separation method for the petrochemical industry with high efficiency and low energy consumption.

Image: 
© 2020 KAUST; Heno Hwang

An industrial process that currently consumes vast amounts of energy in petrochemical plants around the globe could be replaced by an alternative process so efficient that it requires no heating or elevated pressure.

Niveen Khashab from KAUST's Advanced Membranes and Porous Materials Center and her colleagues have developed a new way to separate derivatives of benzene called xylenes.

"Discussions with industry to implement the technology are already underway," Khashab says.

Xylenes come in three different forms, known as isomers, which differ from each other only in the location of a single carbon atom. Xylenes are used in many large-scale applications, including in polymers, plastics and fibers, and as fuel additives, but many uses rely on just one of the three isomers. Because the isomers are so similar, their physical properties, such as boiling point, are very close, which makes separating them energetically expensive.

"Every year, the global energy costs of separating these isomers using distillation is about 50 gigawatts, enough to power roughly 40 million homes," says Gengwu Zhang, a postdoc in Khashab's team and first author of the study. "We wanted to develop a method with high efficiency and low energy consumption to separate and purify these isomers for the petrochemical industry."

To draw apart the xylene isomers, Khashab and her team used doughnut-shaped molecules called cucurbiturils, whose central cavity can host smaller guest molecules. The team showed that the cavity of cucurbit[7]uril is the ideal size for hosting the ortho-isomer of xylene. Using a process called liquid-liquid extraction, the team used cucurbit[7]uril to separate ortho-xylene from the other isomers. "We could separate ortho-xylene with more than 92 percent specificity after one extraction cycle," Zhang says. "Unlike previous methods, our method is performed under ambient temperature and pressure, which means very low energy consumption and easy operation," he adds.
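The single-cycle figure invites a back-of-the-envelope extrapolation. The sketch below is not from the study; it assumes, as a common idealization for staged separations, that each extraction cycle multiplies the ortho-to-other-isomer ratio by a constant separation factor:

```python
# Toy staged-extraction model (an assumption, not the paper's method):
# each liquid-liquid extraction cycle multiplies the ortho:other isomer
# ratio by a constant separation factor.
def purity_after_cycles(feed_purity, separation_factor, n_cycles):
    """Fraction of ortho-xylene after n idealized extraction cycles."""
    ratio = feed_purity / (1 - feed_purity)
    ratio *= separation_factor ** n_cycles
    return ratio / (1 + ratio)

# A 1:1:1 isomer feed (1/3 ortho) reaching 92% in one cycle implies a
# separation factor of about 23; a second cycle would then exceed 99%.
print(round(purity_after_cycles(1 / 3, 23, 1), 2))  # -> 0.92
```

Under these assumptions, repeating the cycle quickly drives the purity toward the levels industry requires.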

The whole process was designed to be readily adopted in the existing plants of petrochemical companies, Khashab explains. "Liquid-liquid extraction towers are already used in industry and so it is relatively easy to incorporate our material in this setup," she says. In addition, cucurbit[7]uril is inexpensive, commercially available or easily made, and highly stable compared to most porous materials.

"We have already shown that we can separate xylene from commercial oil samples at scales of up to 0.5 liters," Khashab adds. "We are in touch with Saudi Aramco to take this process to industrial implementation."

The team is also examining other applications for this separation process, Zhang says.

Credit: 
King Abdullah University of Science & Technology (KAUST)

Poor people experience greater financial hardship in areas where income inequality is greatest

image: In areas with the highest levels of income inequality, the poor are less likely to rely on their community for support due to shame or embarrassment, according to a study in Nature Human Behaviour.

Image: 
Egan Jimenez, Princeton University

PRINCETON, N.J.--While some are relying on friends and neighbors to help them get groceries, the poor may need to put themselves at risk for COVID-19 by venturing out on public transportation to get supplies. Depending on where they live, they may trust no one else to help out.

This is true in areas with the highest levels of income inequality, where the poor are less likely to rely on their community for support due to shame or embarrassment, according to a paper to be published in Nature Human Behaviour.

Look at New Haven, Connecticut, as an example. Part of the city is a wealthier University area, and the other part is primarily low-income. It would be rare, the research suggests, for someone from the lower-income areas to ask those in the University section for help -- especially now as the coronavirus continues to spread.

The findings illustrate why policymakers and researchers should move beyond a sole focus on helping low-income individuals and instead look at ways to develop stronger communities.

"If I'm poor, it exacerbates my need to rely on community, but what does it mean if I don't trust my community? It means that there is no way for me to get what I need without putting myself in danger. This can have disastrous long-term effects among the poor," said study lead author Jon Jachimowicz, assistant professor of business administration at Harvard University.

"Our work shows that hardship increases for low-income individuals by reducing their ability to rely on their community as a buffer against financial and other related difficulties," said co-author Elke Weber, Gerhard R. Andlinger Professor in Energy and the Environment and professor of psychology and public affairs at Princeton University's Woodrow Wilson School of Public and International Affairs. "This suggests that stimulus measures designed to address the economic and social fallout of the coronavirus should focus on reducing the existing income and wealth gap in our country."

The study was an interdisciplinary effort led by psychologists and economists using data analysis strategies across disciplines.

The team also included co-lead author Barnabas Szaszi of Eotvos Lorand University, Marcel Lukas of Heriot-Watt University, David Smerdon of the University of Cambridge, and Jaideep Prabhu of the University of Cambridge.

The researchers conducted eight studies looking at more than one million people across the United States, Australia, and Uganda. Their work included an instrumental variable analysis, lab experiments, online studies, and field work.

In the first four studies, the team established empirical support for their hypothesis that greater income inequality hits the poor the hardest. Their findings were as expected: Across all countries, the greater the economic inequality, the greater the financial hardship for those with the lowest incomes.

In the next four studies, they investigated the main driver behind this effect, finding strong evidence supporting their claim: Higher economic inequality weakens the perception of a community buffer, which is a key source of support for low-income people.

The researchers estimated this lack of support comes at a cost of $6,587. This means that a person earning $36,587 in New Haven, where there is greater income inequality, experiences the same financial hardship as someone making $30,000 in a more homogeneous income area like Princeton.
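The equivalence in that estimate is straightforward subtraction, which can be made explicit:

```python
# The study's estimated cost of losing the community buffer.
COMMUNITY_BUFFER_COST = 6587

def equivalent_income(income_in_high_inequality_area):
    """Income that 'feels the same' in a low-inequality area like Princeton."""
    return income_in_high_inequality_area - COMMUNITY_BUFFER_COST

print(equivalent_income(36587))  # -> 30000
```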

So, why do low-income people feel they can't ask community members for support? Many people hit by money problems worry about what others would think, so they don't ask for help. The researchers actually found that the higher the income inequality in an area, the more distrustful the poor are of those in their community.

Other factors also perpetuate cycles of poverty in these places. One is a person's need to display wealth for reasons of status through physical objects like a fancy car, which drives them further into debt. In other cases, people overwhelmed by destitution may resort to negative behaviors, like taking out payday loans to cover the bills, that only worsen their financial situation.

All of this supports the strengthening of local communities, the researchers said. Programs like the YMCA are extremely beneficial, and community investment funds could further empower towns with the greatest income disparity. Infrastructure also matters; a city's walkability can bond neighbors together. In light of COVID-19, stimulus bills could help address some of these issues, while financially helping the most vulnerable.

"At a time when the coronavirus crisis puts a premium on cooperation and community support, our policymakers need to be aware of the social and economic conditions that eat away at such support, especially for the most vulnerable among us, the poor," Weber said.

The results do not shed light on how economic inequality affects people at median income levels, so the researchers encourage further work in this area. It is possible that the availability of liquid assets and other kinds of wealth could help in times of need, the researchers said.

Credit: 
Princeton School of Public and International Affairs

A Martian mash up: Meteorites tell story of Mars' water history

In Jessica Barnes' palm is an ancient, coin-sized mosaic of glass, minerals and rocks as thick as a strand of wool fiber. It is a slice of Martian meteorite, known as Northwest Africa 7034 or Black Beauty, that was formed when a huge impact cemented together various pieces of Martian crust.

Barnes is an assistant professor of planetary sciences in the University of Arizona Lunar and Planetary Laboratory. She and her team chemically analyzed the Black Beauty meteorite and the infamous Allan Hills 84001 meteorite - controversial in the 1990s for allegedly containing Martian microbes - to reconstruct Mars' water history and planetary origins.

Their analysis, published today in Nature Geoscience, showed that Mars likely received water from at least two vastly different sources early in its history. The variability the researchers found implies that Mars, unlike Earth and the moon, never had an ocean of magma completely encompassing the planet.

"These two different sources of water in Mars' interior might be telling us something about the kinds of objects that were available to coalesce into the inner, rocky planets," Barnes said. Two distinct planetesimals with vastly different water contents could have collided and never fully mixed. "This context is also important for understanding the past habitability and astrobiology of Mars."

Reading the Water

"A lot of people have been trying to figure out Mars' water history," Barnes said. "Like, where did water come from? How long was it in the crust (surface) of Mars? Where did Mars' interior water come from? What can water tell us about how Mars formed and evolved?"

Barnes and her team were able to piece together Mars' water history by looking for clues in two types, or isotopes, of hydrogen. One hydrogen isotope contains one proton in its nucleus; this is sometimes called "light hydrogen." The other isotope is called deuterium, which contains a proton and a neutron in the nucleus; this is sometimes referred to as "heavy hydrogen." The ratio of these two hydrogen isotopes signals to a planetary scientist the processes and possible origins of water in the rocks, minerals and glasses in which they're found.
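Planetary scientists typically express such measurements in delta notation: the per-mil (parts-per-thousand) deviation of a sample's heavy-to-light hydrogen ratio from a standard. A minimal sketch, assuming the ocean-water (VSMOW) standard ratio of roughly one deuterium per 6,420 light-hydrogen atoms:

```python
# Delta-D notation: per-mil deviation of a sample's heavy-to-light
# hydrogen ratio from the ocean-water (VSMOW) standard of ~1:6,420.
R_STANDARD = 1 / 6420

def delta_d(r_sample):
    """Per-mil (parts-per-thousand) deviation from the standard ratio."""
    return (r_sample / R_STANDARD - 1) * 1000

print(delta_d(1 / 6420))  # unfractionated, Earth-like -> 0.0
print(delta_d(2 / 6420))  # twice as deuterium-rich    -> 1000.0
```

On this scale, heavily fractionated reservoirs such as Mars' atmosphere plot at strongly positive values, while Earth-like water sits near zero.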

Meteorite Mystery

For about 20 years, researchers have been recording the isotopic ratios from Martian meteorites, and their data were all over the place. There seemed to be little trend, Barnes said.

Water locked in Earth rocks is what's called unfractionated, meaning it doesn't deviate much from the standard reference value of ocean water - a 1:6,420 ratio of heavy to light hydrogen. Mars' atmosphere, on the other hand, is heavily fractionated - it is mostly populated by deuterium, or heavy hydrogen, likely because the solar wind stripped away the light hydrogen. Measurements from Martian meteorites - many of which were excavated from deep within Mars by impact events - ran the gamut between Earth and Mars' atmosphere measurements.

Barnes' team set out to investigate the hydrogen isotope composition of the Martian crust specifically by studying samples they knew originated from the crust: the Black Beauty and Allan Hills meteorites. Black Beauty was especially helpful because it's a mashup of surface material from many different points in Mars' history.

"This allowed us to form an idea of what Mars' crust looked like over several billions of years," Barnes said.

The isotopic ratios of the meteorite samples fell about midway between the value for Earth rocks and that of Mars' atmosphere. Comparing the researchers' findings with previous studies, including results from the Curiosity Rover, suggests that this was the case for most of Mars' 4-billion-plus-year history.

"We thought, ok this is interesting, but also kind of weird," Barnes said. "How do we explain this dichotomy where the Martian atmosphere is being fractionated, but the crust is basically staying the same over geological time?"

Barnes and her colleagues also grappled with trying to explain why the crust seemed so different from the Martian mantle, the rock layer that lies below.

"If you try and explain this fairly constant isotopic ratio of Mars' crust, you really can't use the atmosphere to do that," Barnes said. "But we know how crusts are formed. They're formed from molten material from the interior that solidifies on the surface."

"The prevailing hypothesis before we started this work was that the interior of Mars was more Earthlike and unfractionated, and so the variability in hydrogen isotope ratios within Martian samples was due to either terrestrial contamination or atmospheric implantation as it made its way off Mars," Barnes said.

The idea that Mars' interior was Earthlike in composition came from one study of a Martian meteorite thought to have originated from the mantle - the interior between the planet's core and its surface crust.

However, Barnes said, "Martian meteorites basically plot all over the place, and so trying to figure out what these samples are actually telling us about water in the mantle of Mars has historically been a challenge. The fact that our data for the crust was so different prompted us to go back through the scientific literature and scrutinize the data."

The researchers found that two geochemically different types of Martian volcanic rocks - enriched shergottites and depleted shergottites - contain water with different hydrogen isotope ratios. Enriched shergottites contain more deuterium than the depleted shergottites, which are more Earth-like, they found.

"It turns out that if you mix different proportions of hydrogen from these two kinds of shergottites, you can get the crustal value," Barnes said.
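That mixing argument can be sketched as a simple two-endmember balance. The delta-D values below are placeholders for illustration, not the paper's measured numbers:

```python
# Two-endmember mixing sketch (illustrative values, not the paper's):
# the crustal hydrogen-isotope value falls between the deuterium-rich
# "enriched" and the Earth-like "depleted" shergottite reservoirs.
def mixed_delta_d(delta_enriched, delta_depleted, f_enriched):
    """Linear mix with fraction f_enriched of water from the enriched source."""
    return f_enriched * delta_enriched + (1 - f_enriched) * delta_depleted

# A 50:50 mix of hypothetical endmembers at +2000 and 0 per mil:
print(mixed_delta_d(2000, 0, 0.5))  # -> 1000.0
```

A strictly correct mass balance would also weight each endmember by its water content; the linear form above just shows why an intermediate crustal value points to two distinct reservoirs.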

She and her colleagues think that the shergottites are recording the signatures of two different hydrogen - and by extension, water - reservoirs within Mars. The stark difference hints to them that more than one source might have contributed water to Mars and that Mars did not have a global magma ocean.

Credit: 
University of Arizona

Fast-fail trial shows new approach to identifying brain targets for clinical treatments

A first-of-its-kind trial has demonstrated that a receptor involved in the brain's reward system may be a viable target for treating anhedonia (or lack of pleasure), a key symptom of several mood and anxiety disorders. This innovative fast-fail trial was funded by the National Institute of Mental Health (NIMH), part of the National Institutes of Health, and the results of the trial are published in Nature Medicine.

Mood and anxiety disorders are some of the most commonly diagnosed mental disorders, affecting millions of people each year. Despite this, available medications are not always effective in treating these disorders. The need for new treatments is clear, but developing psychiatric medications is often a resource-intensive process with a low success rate. To address this, NIMH established the Fast-Fail Trials program with the goal of enhancing the early phases of drug development.

"The fast-fail approach aims to help researchers determine--quickly and efficiently--whether targeting a specific neurobiological mechanism has the hypothesized effect and is a potential candidate for further clinical trials," explained Joshua A. Gordon, M.D., Ph.D., director of NIMH. "Positive results suggest that targeting a neurobiological mechanism affects brain function as expected, while negative results allow researchers to eliminate that target from further consideration. We hope this approach will pave the way towards the development of new and better treatments for individuals with mental illnesses."

In this study, researcher Andrew D. Krystal, M.D. -- who began the research while at the Duke University School of Medicine, Durham, North Carolina, and is now at the University of California, San Francisco -- and colleagues report the first comprehensive application of this fast-fail approach. The researchers examined the kappa opioid receptor (KOR) as a possible neurobiological target for the treatment of anhedonia. Previous findings suggest that drugs that block the KOR, known as KOR antagonists, can affect reward-related brain circuits in ways that could improve reward processing and reverse anhedonia and associated symptoms.

The researchers conducted an eight-week double-blind, randomized placebo-controlled trial with 86 participants across six clinical sites in the United States. Participants were eligible if they were 21 to 65 years old, met the criteria for clinically significant anhedonia and the diagnostic criteria for a mood or anxiety disorder, and did not have other medical or psychiatric conditions. Participants were randomly assigned to receive either a 10 mg dose of the KOR antagonist JNJ-67953964 (previously CERC-501 and LY2456302) or an identical-looking placebo tablet. They received one dose daily over the eight-week trial.

To measure the effects of the KOR antagonist, the researchers examined the activation of the ventral striatum, a structure located in the middle of the brain that is involved in decision making, motivation, reinforcement, and reward. Participants completed a reward anticipation task while their brain activity was measured in a functional MRI scanner. During the task, participants saw a cue that signaled whether the upcoming trial might lead to monetary gain, monetary loss, or neither. In some trials, participants had an incentive to press a specific button, as they could gain money or avoid losing money by doing so. They completed the task once at the beginning and again at the end of the trial.

Relative to those who received the placebo, participants who received the KOR antagonist showed increased activation in the ventral striatum when anticipating monetary gain (versus no-incentive trials). Additional analyses indicated that participants who received the KOR antagonist also showed greater activation of the ventral striatum during anticipation of loss.

Exploratory analyses indicated that lower ventral striatum activation in anticipation of monetary gain at baseline was associated with greater change in activation over the course of the trial, and this correlation was strongest for those who received the KOR antagonist. According to the researchers, this finding suggests that baseline ventral striatal activation may have promise as a neurobiological marker that identifies participants who are most likely to respond to the KOR antagonist. Further analyses suggest that the KOR antagonist also had observable effects on secondary behavioral and self-report measures, including decreased anhedonia scores.

"Together, these findings demonstrate that the KOR antagonist had the hypothesized effect on brain circuits involved in reward and pleasure, establishing proof of mechanism," explained Dr. Krystal. "The results provide support for the usefulness and feasibility of fast-fail trials and--more specifically--for KOR antagonism as a potential target for drug development."

Further testing in larger trials will allow researchers to examine whether using KOR antagonism to engage the ventral striatum yields observable therapeutic effects on anhedonia and related clinical outcomes.

"This study was the first successful implementation of the fast-fail approach and it serves as a proof of principle of the viability of this methodology," says Mi Hillefors, M.D., Ph.D., acting deputy director of NIMH's Division of Translational Research. "We hope that the knowledge gained from the study will lead to more informative treatment trials in the future, contribute to the field of psychopharmacology, and reduce the risks typically associated with developing new psychiatric medications."

Credit: 
NIH/National Institute of Mental Health

Sturgeon genome sequenced

image: The Sterlet (Acipenser ruthenus) belongs to the sturgeons. Its genome is an important piece of the puzzle in understanding the ancestry of vertebrates.

Image: 
Andreas Hartl

Sometimes referred to as "the Methuselah of freshwater fish", sturgeons and their close relatives are very old from an evolutionary point of view. Fossils indicate that sturgeons date back 250 million years and have changed very little during this period, at least as far as their external appearance is concerned. So it is not surprising that Charles Darwin coined the term "living fossils" for them.

Scientists from the University of Würzburg and the Leibniz Institute of Freshwater Ecology and Inland Fisheries (IGB) with colleagues in Constance, France and Russia have now successfully sequenced the genome of the sterlet (Acipenser ruthenus), a relatively small species of sturgeon. They were able to show that the genetic material, too, has changed very little since the heyday of the dinosaurs. The scientists present the results of their work in the latest issue of the journal Nature Ecology and Evolution.

Ancestors of the vertebrates

"Sturgeon genomes are an important piece of the puzzle that helps us understand the ancestry of vertebrates. And this has been missing until now," Professor Manfred Schartl explains the reasons why scientists are interested in this fish species. Schartl is the lead author of the recently published study and has been senior professor at the Chair of Developmental Biochemistry at the University of Würzburg since this year. Sturgeons are among the oldest species on earth in terms of evolutionary history. They are the ancestors of more than 30,000 species of bony fish that occur today - and thus of more than 96 percent of all living fish species and about half of all known vertebrate species.

Schartl and his colleagues were able to show that sturgeons branched off onto their own evolutionary path at some point during the Upper Devonian or Carboniferous Period, about 345 million years ago. "Their external appearance has changed very little since that time, and this is also evident in their genetic material, the DNA," explains Dr. Du Kang, first author of the study and a research assistant at the Department of Biochemistry and Molecular Biology II at the University of Würzburg.

To verify this, the geneticists had to take a close look at the proteins encoded by the genes of the sterlet. And indeed, their calculations reveal that this so-called protein evolution has proceeded at a very slow pace. "The rate of protein evolution of the sterlet is similar to that of the coelacanth or of sharks - two fish species that have been roaming the oceans almost unchanged for more than 300 million years as well," says Dr. Matthias Stöck, an evolutionary biologist at the IGB.

Extensive genome change 180 million years ago

The sequence analysis revealed that the sterlet genome comprises 120 chromosomes, about 47,500 protein-coding genes and 1.8 billion base pairs. The researchers also showed that the sterlet duplicated its genome some 180 million years ago, leaving the species with four sets of chromosomes instead of the regular two - a state called tetraploidy in scientific jargon. The genome duplication does not come as a surprise: "Such processes have repeatedly had a major impact on the evolution of the vertebrate genome," says Manfred Schartl. The sturgeons' ancestors underwent "whole genome duplication" twice in their evolutionary history, and some species went through this process as many as three or four times.

What did surprise the scientists though was the fact that this duplication of the genome happened so far back in the long history of the sturgeon. "Over this long time span, we would have expected the genome to change more profoundly because in tetraploid organisms gene segments are often lost, silenced or acquire a new function over time," says Professor Axel Meyer, an evolutionary biologist at the University of Constance.

Genome uncertainty eliminated

The exact genomic state of sturgeons was long controversial among scientists. Some considered them polyploid, meaning that the genome was duplicated multiple times; others interpreted the sturgeon as a "functional diploid" - a species that first duplicated its genome to become tetraploid but then reduced its gene content again as it evolved. Although the chromosomes are still present in two pairs, they divide their tasks among themselves.

Now it's clear: "We have found out that the sterlet has not returned to a diploid state. Instead, it has retained an unexpectedly high degree of structural and functional polyploidy," says Manfred Schartl. This retention can be ascribed to the slow pace of molecular evolution of most fractions of the sterlet genome.

Genome duplication: A layperson might assume that this makes the job easier for scientists because everything is available in duplicate. In fact, it presents researchers with a major technical challenge. "This has made it extremely difficult to assemble and assign the small 'snippets of DNA' that modern genome sequencing methods provide us with," says Schartl. Using special procedures, however, the team was able to create "a very good reference genome and the first ever genome of an ancient fish" as part of an international research collaboration.

Genetic research to protect species

Gene sequencing is an important basis for protecting sturgeon species. "In the future, we will be able to determine the sex of the animals using genetic analyses which will greatly facilitate breeding. This will allow us to control reproduction and support the management of breeding populations. This is a milestone in our efforts to preserve these ancient species", says Dr. Jörn Gessner, the IGB's sturgeon expert.

About sturgeons

Sturgeons are native to subtropical, temperate and sub-Arctic rivers, lakes and coastlines of Eurasia and North America. Sturgeons are long-lived and reproduce late, typically not before the age of ten. In many sturgeon species, the adult fish repeatedly migrate from the sea into freshwater to spawn. They are highly sought after for their eggs - better known as caviar.

Because of habitat destruction, river fragmentation, marine pollution and 2,000 years of caviar production, most sturgeon species are now on the brink of extinction. Due to a ban on wild caviar trade, sturgeon aquaculture has become an important industry which can contribute to protecting wild populations by securing the market supply.

Credit: 
University of Würzburg

Exercise reduces caregiver's burden in dementia care

image: Exercise in older adults, even at an advanced stage of dementia, is an important strategy to maintain independence in everyday living and to promote quality of life.

Image: 
LVR, Matthias Jung

The research group "geriatric psychiatry in motion" of the German Sport University Cologne and the LVR-Hospital Cologne develops and evaluates exercise programmes for geriatric mental health care. The latest results, from a study in acute dementia care, indicate that a special exercise programme is not only effective for the patients themselves but also reduces the professional caregivers' burden caused by neuropsychiatric symptoms.

Short-bout exercise sessions of 20 minutes several times per day are key aspects of this 'exercise-carrousel' - a new exercise programme specially tailored for patients suffering from dementia, which has been developed and evaluated at the LVR-Hospital in Cologne. Throughout the day, the exercises are applied in small groups of patients - twice in the morning, twice in the afternoon. "With these recurrent activity and rest periods, we are not only trying to increase physical activity, but also aiming at stabilising their day-night rhythm" highlights Dr. Tim Fleiner, head of the research group. The novel exercise approach is feasible in the clinical setting - more than half of the patients are physically active for over 150 minutes per week, thus even meeting the recommendation for healthy older adults despite suffering from dementia.

With the same level of psychotropic medication, the patients show clinically relevant improvements in neuropsychiatric symptoms compared to a control group - in particular agitated behavior and lability improved.

As a special side effect, recently published findings show important improvements in the patient's environment: participating in the exercise-carrousel reduces the perceived burden of the patient's caregivers. "Reducing the burden of the patient's caregivers and their relatives is a key aspect in dementia care. That we can achieve an improvement for the patient and his/her environment through a special exercise programme is novel and important for the health care of older people" states PD Dr. Peter Haussermann, head physician of the Department of Geriatric Psychiatry at the LVR-Hospital Cologne.

An early online version of this paper detailing the findings has been published and is scheduled for publication in the March issue of the Journal of Alzheimer's Disease:
Fleiner, T., Dauth, H., Zijlstra, W., & Haussermann, P. (2020). A Structured Physical Exercise Program Reduces Professional Caregiver's Burden Caused by Neuropsychiatric Symptoms in Acute Dementia Care: Randomized Controlled Trial Results. Journal of Alzheimer's Disease: JAD.

Contact: Dr. Tim Fleiner, Institute of Movement and Sport Gerontology
t.fleiner@dshs-koeln.de - +49 221 4982 6144

Credit: 
German Sport University

Experimental AI tool predicts which COVID-19 patients develop respiratory disease

An artificial intelligence tool accurately predicted which patients newly infected with the COVID-19 virus would go on to develop severe respiratory disease, a new study found.

The work was led by NYU Grossman School of Medicine and the Courant Institute of Mathematical Sciences at New York University, in partnership with Wenzhou Central Hospital and Cangnan People's Hospital, both in Wenzhou, China.

Named "SARS-CoV-2," the new virus causes the disease called "coronavirus disease 2019" or "COVID-19." As of March 30, the virus had infected 735,560 patients worldwide. According to the World Health Organization, the illness has caused more than 34,830 deaths to date, more often among older patients with underlying health conditions. The New York State Department of Health has reported more than 33,700 cases to date in New York City.

Published online March 30 in the journal Computers, Materials & Continua, the study also revealed the best indicators of future severity, and found that they were not as expected.

"While work remains to further validate our model, it holds promise as another tool to predict the patients most vulnerable to the virus, but only in support of physicians' hard-won clinical experience in treating viral infections," says corresponding study author Megan Coffee, MD, PhD, clinical assistant professor in the Division of Infectious Disease & Immunology within the Department of Medicine at NYU Grossman School of Medicine.

"Our goal was to design and deploy a decision-support tool using AI capabilities - mostly predictive analytics - to flag future clinical coronavirus severity," says co-author Anasse Bari, PhD, a clinical assistant professor in Computer Science at the Courant institute. "We hope that the tool, when fully developed, will be useful to physicians as they assess which moderately ill patients really need beds, and who can safely go home, with hospital resources stretched thin."

Surprise Predictors

For the study, demographic, laboratory, and radiological findings were collected from 53 patients as each tested positive in January 2020 for the SARS-CoV-2 virus at the two Chinese hospitals. Symptoms were typically mild to begin with, including cough, fever, and stomach upset. In a minority of patients, however, severe symptoms developed within a week, including pneumonia.

The goal of the new study was to determine whether AI techniques could help to accurately predict which patients with the virus would go on to develop Acute Respiratory Distress Syndrome or ARDS, the fluid build-up in the lungs that can be fatal in the elderly.

For the new study, the researchers designed computer models that make decisions based on the data fed into them, with programs getting "smarter" the more data they consider. Specifically, the current study used decision trees that track series of decisions between options, and that model the potential consequences of choices at each step in a pathway.

The researchers were surprised to find that characteristics considered to be hallmarks of COVID-19, like certain patterns seen in lung images (e.g. ground glass opacities), fever, and strong immune responses, were not useful in predicting which of the many patients with initial, mild symptoms would go on to develop severe lung disease. Neither were age and gender helpful in predicting serious disease, although past studies had found men over 60 to be at higher risk.

Instead, the new AI tool found that changes in three features - levels of the liver enzyme alanine aminotransferase (ALT), reported myalgia, and hemoglobin levels - were most accurately predictive of subsequent, severe disease. Together with other factors, the team reported being able to predict risk of ARDS with up to 80 percent accuracy.
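A decision tree of the kind described above reduces to a series of threshold checks on patient features. The sketch below is purely illustrative: the thresholds, risk labels, and branching order are invented, not taken from the study's actual trained model, which combined these features with other factors.

```python
def predict_severity(alt, myalgia, hemoglobin):
    """Toy decision tree over the three highlighted features.

    All thresholds and labels here are hypothetical; the study's real
    model was trained on patient data, not hand-written rules.
    """
    if alt > 40:                # mildly elevated liver enzyme (U/L)
        if hemoglobin > 15.0:   # higher hemoglobin adds risk (g/dL)
            return "high risk"
        return "moderate risk"
    if myalgia:                 # reported deep muscle aches
        return "moderate risk"
    return "low risk"

print(predict_severity(48, True, 16.2))   # high risk
print(predict_severity(22, False, 13.5))  # low risk
```

Each path from the root to a leaf is one "series of decisions between options," which is what makes such models easy for clinicians to inspect.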

ALT levels - which rise dramatically as diseases like hepatitis damage the liver - were only a bit higher in patients with COVID-19, researchers say, but still featured prominently in prediction of severity. In addition, deep muscle aches (myalgia) were also more commonplace, and have been linked by past research to higher general inflammation in the body.

Lastly, higher levels of hemoglobin, the iron-containing protein that enables blood cells to carry oxygen to bodily tissues, were also linked to later respiratory distress. Could this be explained by other factors, such as unreported tobacco smoking, which has long been linked to increased hemoglobin levels? Of the 33 patients at Wenzhou Central Hospital interviewed on smoking status, the two who reported having smoked also reported that they had quit.

Limitations of the study, say the authors, included the relatively small data set and the limited clinical severity of disease in the population studied. The latter may be due in part to an as yet unexplained dearth of elderly patients admitted into the hospitals during the study period. The average patient age was 43.

"I will be paying more attention in my clinical practice to our data points, watching patients closer if they for instance complain of severe myalgia," adds Coffee. "It's exciting to be able to share data with the field in real time when it can be useful. In all past epidemics, journal papers only published well after the infections had waned."

Credit: 
NYU Langone Health / NYU Grossman School of Medicine

Risk of death from stroke falls by 24%

Thousands more patients each year are surviving strokes, as the risk of death and disability after a stroke fell significantly between 2000 and 2015, according to analysis by Guy's and St Thomas' researchers.

The study, published in PLOS Medicine, looked at data from south London patients who had an ischaemic stroke - one caused by a blood clot - between 2000 and 2015. The team attribute the reduced risk to improvements in care and medication.

After adjustments for population changes, the research showed that the risk of death from stroke fell by 24% over the 15-year period, with the one-year death rate dropping from 32.6% in 2000 to 20.15% in 2015. The one-year death rate is the proportion of patients who die within one year of their stroke. The risk of disability after a stroke fell by 23%, from 34.7% in 2000 to 26.7% in 2015.

With around 52,000 people having ischaemic strokes nationwide each year, the team arrived at a figure of 6,300 more patients annually surviving their stroke for over a year, and 3,200 fewer patients each year having a disability as a result of a stroke.

Stroke is a serious condition that occurs when blood supply to part of the brain is cut off. Stroke is the fourth single leading cause of death in England and Wales, and the third biggest cause of death in Scotland and Northern Ireland, with almost 38,000 people dying as a result of stroke in the UK in 2016.

The researchers used data from the South London Stroke Register, which collected data from patients in Lambeth and Southwark. They looked particularly at data from the 3,128 patients who had an ischaemic stroke. Previous analysis of the same register had shown that between 2000 and 2015 the rate of strokes in the area decreased by 43%.

The paper showed that the risk of death and disability had reduced for all genders, and for both black and white patients.

Dr Yanzhong Wang, Reader in Medical Statistics at King's College London and author of the study, said: "It's really positive news to see that for patients who do have a stroke, the risk of death and disability is decreasing. Alongside our previous work showing a reduction in the rate of strokes it shows that, although there is still more to do, trends are moving in the right direction.

"We think the change is due to improvements to the way we treat stroke, such as higher admission rates to hospital, increased use of CT and MRI scans, and more frequent treatment with thrombolytic and anticoagulant medications in the acute phase of stroke. We also believe that a shift towards patients having less severe strokes, perhaps caused by improved public health, could also play a role in the change.

"We're really grateful to all the patients who have taken part in the South London Stroke Register. Their valuable input is giving us an incredibly detailed understanding of stroke rates, which can help us treat and prevent strokes in the future."

The South London Stroke Register has been looking at the number of strokes recorded among the 350,000 people in south London since 1995. The register uses data sources including anonymous data from A&E records and data collected by specially trained doctors, nurses and field workers. The research was supported by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care South London at King's College Hospital NHS Foundation Trust and by the NIHR Biomedical Research Centre based at Guy's and St Thomas' NHS Foundation Trust and Kings College London.

Credit: 
National Institute for Health Research

Heart attack on a chip: Scientists model conditions of ischemia on a microfluidic device

video: HL-1 cardiomyocytes grown in the chip beat in unison at a regular rate under normal oxygen conditions

Image: 
Brian Timko, Tufts University

MEDFORD/SOMERVILLE, Mass. (March 30, 2020)--Researchers led by biomedical engineers at Tufts University invented a microfluidic chip containing cardiac cells that is capable of mimicking hypoxic conditions following a heart attack - specifically when an artery is blocked in the heart and then unblocked after treatment. The chip contains multiplexed arrays of electronic sensors placed outside and inside the cells that can detect the rise and fall of voltage across individual cell membranes, as well as voltage waves moving across the cell layer, which cause the cells to beat in unison in the chip, just as they do in the heart. After reducing levels of oxygen in the fluid within the device, the sensors detect an initial period of tachycardia (accelerated beat rate), followed by a reduction in beat rate and eventually arrhythmia which mimics cardiac arrest.

The research, published in Nano Letters, is a significant advance toward understanding the electrophysiological responses at the cellular level to ischemic heart attacks, and could be applied to future drug development. The paper was selected by the American Chemical Society as Editors' Choice, and is available with open access.

Cardiovascular disease (CVD) remains the leading cause of death worldwide, with most patients suffering from cardiac ischemia - which occurs when an artery supplying blood to the heart is partially or fully blocked. If ischemia occurs over an extended period, the heart tissue is starved of oxygen (a condition called "hypoxia"), which can lead to tissue death, or myocardial infarction. The changes in cardiac cells and tissues induced by hypoxia include changes in voltage potentials across the cell membrane, release of neurotransmitters, shifts in gene expression, altered metabolic functions, and activation or deactivation of ion channels.

The biosensor technology used in the microfluidic chip combines multi-electrode arrays that can provide extracellular readouts of voltage patterns, with nanopillar probes that enter the membrane to take readouts of voltage levels (action potentials) within each cell. Tiny channels in the chip allow the researchers to continuously and precisely adjust the fluid flowing over the cells, lowering the levels of oxygen to about 1-4 percent to mimic hypoxia or raising oxygen to 21 percent to model normal conditions. The changing conditions are meant to model what happens to cells in the heart when an artery is blocked, and then re-opened by treatment.

"Heart-on-a-chip models are a powerful tool to model diseases, but current tools to study electrophysiology in those systems are somewhat lacking, as they are either difficult to multiplex or eventually cause damage to the cells," said Brian Timko, assistant professor of biomedical engineering at Tufts University School of Engineering, and corresponding author of the study. "Signaling pathways between molecules and ultimately electrophysiology occur rapidly during hypoxia, and our device can capture a lot of this information simultaneously in real time for a large ensemble of cells."

When tested, the extracellular electrode arrays provided a two-dimensional map of voltage waves passing over the layer of cardiac cells, and revealed a predictable wave pattern under normal (21 percent) oxygen levels. In contrast, the researchers observed erratic and slower wave patterns when the oxygen was reduced to 1 percent.

The intracellular nanoprobe sensors provided a remarkably accurate picture of action potentials within each cell. These sensors were arranged as an array of tiny platinum tipped needles upon which the cells rest, like a bed of nails. When stimulated with an electric field, the needles puncture through the cell membrane, where they can begin taking measurements at single cell resolution. Both types of devices were created using photolithography - the technology used to create integrated circuits - which allowed researchers to achieve device arrays with highly reproducible properties.

The extracellular and intracellular sensors together provide information on the electrophysiological effects of a modeled ischemic attack, including a "time lapse" of cells as they become dysfunctional and then respond to treatment. As such, the microfluidic chip could form the basis of a high throughput platform in drug discovery, identifying therapeutics which help cells and tissues recover normal function more rapidly.

"In the future, we can look beyond the effects of hypoxia and consider other factors contributing to acute heart disease, such as acidosis, nutrient deprivation and waste accumulation, simply by modifying the composition and flow of the medium," said Timko. "We could also incorporate different types of sensors to detect specific molecules expressed in response to stresses."

This work was supported by grants from Tufts Collaborates, the Department of Defense (W81XWH-16-1-0304), the American Heart Association (Grant-in-Aid 16GRNT27760100) and the Tufts Summer Scholars Program. The research was performed at the Tufts Micro- and Nanofabrication Facility.

First author of the study was Haitao Liu, visiting scholar at Tufts University School of Engineering. Co-authors included Ning Hu, also a visiting scholar; Rotimi Bolonduro and Breanna Duffy, both PhD candidates; undergraduates Akshita Rao and Jie Ju; Zhaohui Huang of the School of Materials Science and Technology at China University of Geosciences; and Lauren Black, associate professor of biomedical engineering at Tufts.

Liu, H., Bolonduro, O.A., Hu, N., Ju, J., Rao, A.A., Duffy, B.M., Huang, Z., Black, L.D. and Timko, B.P. "Heart-on-a-chip model with integrated extra- and intra-cellular bioelectronics for monitoring cardiac electrophysiology under acute hypoxia" Nano Letters (9 March 2020); DOI: 10.1021/acs.nanolett.0c00076

Credit: 
Tufts University

Vericiguat improves outcomes in patients with worsening heart failure

Patients with worsening heart failure and reduced ejection fraction who received the investigational drug vericiguat had a significantly lower rate of cardiovascular death or heart failure hospitalization compared with those receiving a placebo, based on research presented at the American College of Cardiology's Annual Scientific Session Together with World Congress of Cardiology (ACC.20/WCC).

About 6.5 million U.S. adults have heart failure, a debilitating condition in which the heart becomes too weak to pump enough blood to the body's organs and tissues. Ejection fraction is a measure of the proportion of blood that is pumped out of the left ventricle with each heartbeat; a lower ejection fraction indicates a weaker heart. While treatments are available to manage heart failure symptoms, patients with reduced ejection fraction and a worsening condition--often marked by repeated visits to the hospital or need for intravenous diuretics--have limited options for stemming the disease's progression.

Vericiguat is a novel drug--known as a guanylate cyclase stimulator--that is designed to enhance cyclic guanosine monophosphate production, which is a pathway that is critical for normal cardiac and vascular function but not currently targeted by existing heart failure drugs. While the drug has been tested in smaller groups of patients in phase II trials, the phase III VICTORIA (Vericiguat Global Study In Subjects With Heart Failure With Reduced Ejection Fraction) trial represents the first time vericiguat has been evaluated in a large group of patients with worsening heart failure receiving optimal standard of care treatments for their condition.

With a median follow-up period of 10.8 months, data showed patients randomized to receive vericiguat had a 10% lower rate of cardiovascular death or heart failure hospitalization--the study's composite primary endpoint--than those taking a placebo. The difference favoring vericiguat appeared after about three months of treatment and persisted throughout the duration of the study. A secondary analysis revealed that those taking vericiguat had a significant reduction in heart failure hospitalizations and a possible reduction in cardiovascular death that was not statistically significant.

"For a group of patients with this form of high-risk heart failure, where other heart failure drugs have rarely been studied, vericiguat provides a significant novel addition to usual treatment," said Paul W. Armstrong, MD, cardiologist and distinguished university professor of medicine at the Canadian VIGOUR Centre, University of Alberta, and the study's lead author. "I think it's a gratifying result in high-risk heart failure patients that not only opens up a new avenue for them, but also a pathway for future discovery in cardiovascular heart disease."

VICTORIA researchers enrolled 5,050 patients treated at 600 medical centers in 42 countries. Participants had heart failure with an average ejection fraction of 30% and markedly elevated natriuretic peptide levels, factors that indicate they were at high risk for hospitalization or death. In addition, all patients had been hospitalized within the last six months or required intravenous diuretics within three months, indicative of worsening disease. Half of the patients were assigned to take 10 milligrams of vericiguat once daily and half were assigned to take a placebo. All patients received standard heart failure treatment throughout the course of the study, including angiotensin converting enzyme inhibitors, angiotensin receptor blockers or angiotensin receptor-neprilysin inhibitors, combined with beta-blockers and mineralocorticoid antagonists. About a third of patients had either an implantable cardioverter-defibrillator, biventricular pacemaker or both devices.

"This is a sick population that has a significant unmet need. The results of our study translate into a clinically meaningful absolute reduction in the primary endpoint," Armstrong said. "Because of the high rate of events [cardiovascular death or heart failure hospitalization] in this population, the absolute risk reduction of 4.2 per 100 patient-years means that you would need to treat about 24 patients for an average of one year in order to prevent one event."

The effect of vericiguat on the primary outcome was consistent across most of the 13 prespecified subgroups (including those receiving sacubitril/valsartan), except in those defined by advanced age and very elevated levels of NT-proBNP, a hormone associated with severe and worsening heart failure.

Vericiguat was generally well-tolerated and had few side effects. Patients taking vericiguat had a slightly increased incidence of symptomatic low blood pressure, or hypotension (which occurred in 9.1% of those taking vericiguat and 7.9% of those taking placebo), and fainting (which occurred in 4% of those taking vericiguat and 3.5% of those taking placebo), though these differences were not statistically significant.

Because approximately half of all patients with heart failure have preserved ejection fraction, a different form of the disease for which there are even fewer treatment options available, Armstrong said that a separate study is underway to investigate whether vericiguat offers benefits for those with heart failure with preserved ejection fraction. He added that future studies could help elucidate how vericiguat may compare to or complement other emerging heart failure treatments to improve the outlook for the most severely ill patients.

Heart failure generally worsens over time and is a contributing factor in about 1 in 8 deaths annually. Patients commonly suffer shortness of breath, fatigue and swelling, along with other symptoms that interfere with everyday activities.

Credit: 
American College of Cardiology

Researchers reverse muscle fibrosis from overuse injury in animals, hope for human trials

(Philadelphia, PA) - Overuse injuries - think muscle strains, tennis elbow, and rotator cuff tears - are a considerable problem in the United States, especially among young athletes. But while commonly associated with sports, overuse injuries - particularly those involving muscle strains - also affect significant numbers of workers whose jobs involve manual labor.

High-force, high-repetition movements, such as those involved in heavy lifting, create microinjuries in muscle fibers. Muscle tissue responds by making small repairs to the damaged fibers. But over time, with repetition of injury, healing capacity becomes overwhelmed, and microinjuries progress to fibrosis - the replacement of muscle tissue with connective tissue. Fibrosis ultimately weakens muscles and can put pressure on nerves, causing pain.

While long thought to be irreversible, new research by scientists at the Lewis Katz School of Medicine at Temple University (LKSOM) shows for the first time in animals that it may be possible to undo the damage caused by fibrosis and, in the process, restore muscle strength.

The findings, published online March 30 in The FASEB Journal, offer hope for people who have been unable to return to work because of an overuse injury.

"The accumulation of scar tissue from muscle fibrosis is the primary cause of muscle weakness that arises following overuse injury, also known as repetitive strain injury," explained Mary F. Barbe, PhD, Professor of Anatomy and Cell Biology and Professor in the Department of Physical Therapy at LKSOM.

Dr. Barbe and colleagues, including her primary collaborator Steven N. Popoff, PhD, the John Franklin Huber Chair and Professor of Anatomy and Cell Biology at LKSOM, found that this scarring process can be halted and even reversed by a drug known as FG-3019. FG-3019, which works by blocking the activity of a protein called CCN2, was recently approved by the U.S. Food and Drug Administration for the treatment of Duchenne muscular dystrophy.

Dr. Barbe's team carried out their investigation in a rat model of overuse injury in which animals were trained to do a task in a high-force, high-repetition manner for a reward. After 18 weeks, animals trained on the task developed muscle fibrosis characteristic of overuse injury.

The researchers then divided the animals into three groups: one that received no treatment, one that received an inactive "sham" treatment, and one that received FG-3019. The treatment period lasted six weeks, during which all animals were given a rest from the high-force, high-repetition task.

Following the six weeks of rest, analyses of muscle tissue showed that untreated animals and sham-treated animals had significantly elevated muscle levels of fibrosis-related proteins, including collagen and CCN2, which promotes the growth of connective tissue. By contrast, in FG-3019-treated animals, CCN2 and collagen levels were similar to levels in control rats that were not trained to perform the repetitive task. Fibrotic damage was also reversed in animals given FG-3019, and these animals showed significant improvements in grip and other tests of muscle strength.

"FG-3019 is already in clinical trials for other diseases involving fibrosis, including pulmonary fibrosis and kidney fibrosis," Dr. Barbe said. "Our work adds to the relevance of this drug in treating fibrotic diseases, with the novel application for muscle fibrosis associated with overuse injury."

Dr. Barbe and colleagues hope to pursue the use of FG-3019 in clinical trials in human patients.

"If we can successfully reverse muscle fibrosis in humans, we will be able to provide relief and help workers with overuse injury eventually return to their jobs."

Credit: 
Temple University Health System

Researchers discover potential boost to immunotherapy

Mount Sinai researchers have discovered a pathway that regulates special immune system cells in lung cancer tumors, suppressing them and allowing tumors to grow. The scientists also figured out how to interrupt this pathway and ramp up the immune system to prevent tumor formation or growth, offering a potential boost to immunotherapy, according to a study published in Nature in March.

Researchers analyzed human and mouse lung cancer lesions, specifically studying the highly specialized immune cells called dendritic cells, which are considered the generals of the immune system. Dendritic cells give other immune system cells, called T-cells, identifying information from tumors so the T-cells can recognize and fight the cancer. Certain genetic material in the tumors, however, tamps down the dendritic cells' function via this newly discovered immune regulatory pathway.

Scientists performed high-tech, single-cell sequencing and high-definition imaging on mouse and human tumors to study the dendritic cells' activity in lung cancer and adjacent noncancerous lung tissues. They identified a molecular pathway that dampens dendritic cells' ability to program T-cells to kill. This study also showed that reversing this pathway significantly improves tumor responses in animals.

Based on the findings, scientists are designing a clinical trial that they expect will enhance patients' response to an immunotherapy called checkpoint blockade, by adding a second therapy that blocks the immune regulatory pathway that decreases dendritic cells' function in tumors. Right now only about 20 percent of patients respond to checkpoint blockade therapies. The trial will be done in collaboration with Regeneron Inc.

"This study highlights the power of single-cell technologies to identify new therapeutic targets in cancer," said senior author Miriam Merad, MD, PhD, Director of the Precision Immunology Institute and Mount Sinai Professor in Cancer Immunology at the Icahn School of Medicine at Mount Sinai, Co-leader of the Cancer Immunology Program at The Tisch Cancer Institute at Mount Sinai, and Director of the Mount Sinai Human Immune Monitoring Center.

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

'Revita' improves blood glucose levels, liver metabolic health in type 2 diabetes

WASHINGTON--Patients with poorly controlled type 2 diabetes who underwent a novel, minimally invasive, endoscopic procedure called Revita® duodenal mucosal resurfacing (DMR) had significantly improved blood glucose (sugar) levels, liver insulin sensitivity, and other metabolic measures three months later, according to new data from the REVITA-2 study. These results, from a mixed meal tolerance test, have helped researchers verify the insulin sensitizing mechanism by which hydrothermal ablation of the duodenum improves blood sugar in patients with type 2 diabetes.

The research was accepted for presentation at ENDO 2020, the Endocrine Society's annual meeting, and will be published in a special supplemental section of the Journal of the Endocrine Society.

"This outpatient procedure is being studied as the first disease-modifying therapy for type 2 diabetes and has thus far been demonstrated to be safe, effective, and durable through at least two years of follow up," said study investigator, David Hopkins, MB.Ch.B. (M.D.), director of the Institute of Diabetes, Endocrinology and Obesity, King's Health Partners in London, U.K.

The REVITA-2 study, a randomized, sham-controlled clinical trial, explored the insulin-sensitizing mechanisms underlying the previously reported beneficial effects of the Revita DMR therapy, developed by Lexington, Mass.-based Fractyl Laboratories, sponsor of the study. The technology involves inserting a balloon catheter through the mouth and into the duodenum (upper small intestine) to precisely deliver thermal energy to the duodenal lining. Once treated, the damaged duodenal lining is flushed out, and a new mucosal layer begins to regenerate.

"The thickening of the duodenal mucosa (surface lining) occurs early in diabetes and may initiate changes in hormonal signaling that lead to insulin resistance, the main factor leading to type 2 diabetes," Hopkins said. "By resurfacing the mucosa with the Revita DMR technique and effectively resetting this signaling, we are able to demonstrate a reduction of the insulin resistance underlying diabetes and associated metabolic complications."

The researchers analyzed data from 70 patients (blinded: 35 treated; 35 sham) who had a meal tolerance test before and 12 weeks after a single endoscopic procedure. This test measures insulin secretion and insulin resistance by determining the insulin response to a "mixed meal," which is a nutritional drink with a fixed amount of glucose and protein similar to a typical meal. The patients' blood glucose levels were measured before the test (while fasting) and afterward.

The patients who received Revita DMR had a markedly improved glucose response, primarily driven by decreased fasting blood glucose levels, and had improved liver measures, suggesting better insulin action on the liver. While the average fasting glucose level dropped by 15 milligrams per deciliter (mg/dL) in the sham group, it fell by 41 mg/dL in the DMR-treated group. Hopkins said this suggests "a primary effect on insulin resistance."

"These findings confirm that the duodenum is an important therapeutic target for type 2 diabetes," Hopkins said. "The Revita treatment has the potential to transform the lives of patients who cannot adequately control their disease with drug therapies or who are interested in a non-drug treatment alternative that targets the root cause of metabolic disease."

Credit: 
The Endocrine Society

Where lions roam: West African big cats show no preference between national parks, hunting zones

West African lions are a critically endangered subpopulation, with an estimated 400 remaining and strong evidence of ongoing declines.

About 90% of these lions live in West Africa's largest protected area complex, the W-Arly-Pendjari. The WAP Complex includes five national parks and 14 hunting concessions across roughly 10,200 square miles in Burkina Faso, Niger and Benin.

Given that wildlife protection is one of the main purposes of a national park, you might expect West African lions to favor life inside park boundaries, rather than within the privately managed hunting concessions that surround the parks. After all, lions tend to shun people, and human pressures are higher in hunting areas than in the parks.

But a new University of Michigan-led camera survey of West African lions--believed to be the largest wildlife camera survey ever undertaken in West Africa and the first carried out within WAP Complex national parks and hunting concessions--found that West African lions show no statistically significant preference between the parks and trophy-hunting areas.

The findings, scheduled for publication March 30 in the Journal of Applied Ecology, have implications for conservation management of the remaining West African lions.

"Our results suggest habitat quality in national parks is inadequate, leading to a lack of preference in lions despite lower human pressures," said doctoral student Kirby Mills of U-M's Applied Wildlife Ecology (AWE) Lab, lead author of the study.

The researchers suspect that the lure of plentiful water, high-quality habitat and abundant prey on hunting properties outweigh the lions' natural avoidance of humans. Revenues from trophy hunting pay for enhanced infrastructure such as irrigation systems and solar-powered pumps at watering holes, as well as added patrol staff.

At the same time, under-resourced national parks struggle to deal with degraded wildlife habitat, poachers, inadequate staffing and displacement of wildlife by livestock, which are permitted within the parks.

"We recommend prioritizing the reduction of habitat degradation in the parks and increasing water availability to increase suitable habitat for lions and their prey," said Mills, who conducted the study for her master's thesis at the U-M School for Environment and Sustainability. "But at the same time, we recognize that management interventions at a large scale require economic resources unavailable to park managers in WAP, an incongruity prolific throughout the range of African lions."

The study's senior author is Nyeema Harris, an assistant professor in the U-M Department of Ecology and Evolutionary Biology and director of the AWE Lab. Harris designed the project and led the fieldwork with an international team that included government employees and students from Burkina Faso and Niger.

In the U-M-led study, 238 motion-activated digital cameras were deployed across 5,000 square miles in three WAP Complex national parks and 11 of the hunting concessions. The fieldwork was conducted from February through June in 2016, 2017 and 2018.

Some 1.7 million images were captured during that time, but West African lions triggered the shutter just 96 times, reflecting the critically endangered feline's scarcity. The cameras were programmed to rapid-fire three to five frames when triggered, so the total number of lion images is 360.

The camera data were used in two types of mathematical models--occupancy models and structural equation models (SEMs). The occupancy models allowed the researchers to calculate the probability that an animal used a given space, while the SEMs enabled them to disentangle the relative effects of environmental, ecological and anthropogenic factors influencing space use by West African lions.
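The core idea of an occupancy model is that a site can be occupied yet yield no detections, so occupancy probability (psi) and detection probability (p) must be estimated jointly. The sketch below fits a minimal single-season model by grid search over a likelihood of the standard (MacKenzie-style) form; the detection histories are invented, and real analyses like this study's add covariates for the environmental and human-pressure factors described.

```python
# Minimal single-season occupancy model, fit by grid search over
# occupancy probability (psi) and per-survey detection probability (p).
# Detection histories are invented: each row is one camera site over
# repeated survey occasions (1 = species photographed, 0 = not).
import math
from itertools import product

histories = [
    [0, 1, 0, 0],   # detected at least once -> certainly occupied
    [0, 0, 0, 0],   # never detected -> occupied-but-missed, or empty
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 0, 1, 0],
]

def log_likelihood(psi, p):
    ll = 0.0
    for h in histories:
        k, d = len(h), sum(h)
        if d > 0:
            # Occupied site: psi times the probability of this
            # particular pattern of detections and misses.
            ll += math.log(psi) + d * math.log(p) + (k - d) * math.log(1 - p)
        else:
            # All-zero history: occupied but never detected, or unoccupied.
            ll += math.log(psi * (1 - p) ** k + (1 - psi))
    return ll

grid = [i / 100 for i in range(1, 100)]
psi_hat, p_hat = max(product(grid, grid), key=lambda t: log_likelihood(*t))
print(psi_hat, p_hat)
```

Note that the estimated occupancy exceeds the naive proportion of sites with detections (3 of 5), because the model attributes some all-zero histories to imperfect detection rather than true absence.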

The researchers found that lion occupancy was largely driven by prey availability, which in turn was shaped by ecological and environmental variables--such as water availability and habitat diversity--that scored higher in hunting concessions than in national parks.

Contrary to the researchers' expectations, the WAP Complex lions showed no discernible preference between national parks and hunting zones. The U-M-led study provides the first estimate of West African lion occupancy using camera-trap data.

"We hypothesize that ecological cues indicating high-quality habitat, such as plentiful water and available prey, are mitigating the expected avoidance response to the increased human pressures and competitor activity in hunting concessions," Harris said.

"Because the lions rely heavily on prey, managers may be able to manipulate the distribution of prey within WAP to directly influence spatial distributions of lions and indirectly reduce human-lion conflict."

Stretching across three countries in the West African savanna belt, the WAP Complex is a UNESCO World Heritage site and is described by the U.N. agency as "a refuge for wildlife species that have disappeared elsewhere in West Africa or are highly threatened."

Trophy hunting is permitted in all of the WAP Complex concessions but is illegal in the five national parks and in Niger's Tamou game reserve, which is part of the protected area complex. The lions are known to feed on several species of antelope, as well as savanna buffalos and warthogs. Predators that compete with the lions for food include spotted hyenas and leopards.

West African lions are categorized as critically endangered in the International Union for Conservation of Nature's Red List of Threatened Species. In its 2015 assessment, the IUCN states that the West African lion subpopulation is estimated at just above 400 animals, with fewer than 250 mature individuals.

West African lions are smaller than, and genetically distinct from, other African lions. They form smaller prides, and the males have little to no mane.

"This population continues to decline," the IUCN assessment states. "Further deterioration of the last protected areas harbouring lions in West Africa will likely lead to the local extinction of the species."

Credit: 
University of Michigan

(Re)generation next: Novel strategy to develop scaffolds for joint tissue regeneration

image: A novel strategy for tissue regeneration that offers a better alternative to conventional tissue regeneration methods.

Image: 
Tokyo University of Science

Joint diseases, such as knee osteoarthritis, are common in the elderly population and severely impair their quality of life. Conventional treatments like artificial joint replacements offer temporary relief but come with several disadvantages, including limited functionality and the need for eventual replacement. A better solution is to find a way to promote tissue regeneration in joints: interpenetrating polymer network (IPN) hydrogels, when injected into joints, do exactly this by acting as scaffolds for the growth of new cells and mimicking the cellular environment. However, existing techniques to develop IPNs are tedious: they require the addition of chemicals via multiple steps, which limits their practical application. Thus, there is a need for better techniques that can make the process of tissue regeneration easier.

In a new study published in Chemistry of Materials, scientists from Japan, including Asst Prof Shigehito Osawa and Prof Hidenori Otsuka of Tokyo University of Science, found a new method for developing tissue regeneration scaffolds. Prof Otsuka explains, "Generally, the formation of IPN gels is a cytotoxic, multistep process: it involves constructing a network, followed by the addition of chemical reagents or subjecting them to external stimuli, such as temperature or changes in light irradiation, to form the other network. We wanted to create a novel scaffold using a one-step process, which could overcome the limitations of existing IPNs."

To begin with, the scientists wanted to find self-assembling compounds that could form independent 3D networks without interfering with each other. They began by selecting a peptide called RADA16, which--under physiological conditions--forms a network owing to electrostatic and hydrophobic interactions. Then, they turned to a biopolymer called chitosan (CH) and a compound called polyethylene glycol (PEG), which form networks with each other via chemical reactions. Because the mechanisms of network formation in RADA16 and CH/PEG were drastically different, the scientists speculated that these networks would not interfere with each other. By simply mixing the two compounds, they found that this was indeed true. Prof Otsuka explains, "We mixed the two materials, RADA16 and CH/PEG, and found that they successfully formed heterologous IPNs. Moreover, these IPNs did not interfere with each other, as it turns out that the RADA16 networks form first, followed by the slower assembly of CH/PEG networks."

Next, the researchers wanted to check whether the proposed IPN could effectively act as a scaffold to promote the growth of healthy chondrocytes (cells that produce cartilage). The scientists tested the scaffold using human cells and found that the cells were embedded uniformly in the hydrogel, effectively generating functional cartilage tissue. In fact, in mice, implanting human chondrocytes within the hydrogel scaffold led to cartilage formation over a period of 8 weeks, even surpassing the performance of conventional tissue scaffolds! The biggest advantage of this technique was that not only did it successfully regenerate cartilage tissue, it was also performed in just one step, or "pot," making it much simpler than existing techniques.

These findings could potentially overcome the limitations of tissue regeneration and pave the way for further applications such as drug delivery, diagnosis, and surface modification. Moreover, Prof Otsuka is optimistic that, owing to the ease of the technique, it can be produced domestically, which could lead to significant social and economic benefits. Prof Otsuka concludes, "Our research has opened doors to the use of regenerative medicine for autonomous cartilage generation as an alternative to artificial joints, leading to significant improvement in patients' quality of life and benefiting society overall."

Credit: 
Tokyo University of Science