Culture

Potential jurors favor use of artificial intelligence in precision medicine

video: Alexander Stremitzer discusses the legal implications for physicians of following artificial intelligence advice in this new video from The Journal of Nuclear Medicine.

Image: 
Video created by Alexander Stremitzer and Eidgenössische Technische Hochschule Zürich Center for Law and Economics.

Reston, Virginia--Physicians who follow artificial intelligence (AI) advice may be considered less liable for medical malpractice than is commonly thought, according to a new study of potential jury candidates in the U.S., published in the January issue of The Journal of Nuclear Medicine (JNM). The study provides the first data on physicians' potential liability for using AI in personalized medicine, which can often deviate from standard care.

"New AI tools can assist physicians in treatment recommendations and diagnostics, including the interpretation of medical images," remarked Kevin Tobia, JD, PhD, assistant professor of law at the Georgetown University Law Center, in Washington, D.C. "But if physicians rely on AI tools and things go wrong, how likely is a juror to find them legally liable? Many such cases would never reach a jury, but for one that did, the answer depends on the views and testimony of medical experts and the decision-making of lay juries. Our study is the first to focus on that last aspect, studying potential jurors' attitudes about physicians who use AI."

To determine potential jurors' judgments of liability, researchers conducted an online study of a representative sample of 2,000 adults in the U.S. Each participant read one of four scenarios in which an AI system provided a drug dosage treatment recommendation to a physician. The scenarios varied the AI recommendation (standard or nonstandard drug dosage) and the physician's decision (to accept or reject the AI recommendation). In all scenarios, the physician's decision subsequently caused harm to the patient.

Study participants then evaluated the physician's decision by assessing whether the treatment decision was one that could have been made by "most physicians" and "a reasonable physician" in similar circumstances. Higher scores indicated greater agreement and, therefore, lower liability.

Results from the study showed that participants used two different factors to evaluate physicians' utilization of medical AI systems: (1) whether the treatment provided was standard and (2) whether the physician followed the AI recommendation. Participants judged physicians who accepted a standard AI recommendation more favorably than those who rejected it. However, if a physician received a nonstandard AI recommendation, he or she was not judged as safer from liability by rejecting it.

While prior literature suggests that laypersons are very averse to AI, this study found that they are, in fact, not strongly opposed to a physician's acceptance of AI medical recommendations. This finding suggests that the threat of a physician's legal liability for accepting AI recommendations may be smaller than is commonly thought.

In an invited perspective on the JNM article, W. Nicholson Price II and colleagues noted, "Liability is likely to influence the behavior of physicians who decide whether to follow AI advice, the hospitals that implement AI tools for physician use and the developers who create those tools in the first place. Tobia et al.'s study should serve as a useful beachhead for further work to inform the potential for integrating AI into medical practice."

In an associated JNM article, the study authors were interviewed by Irène Buvat, PhD, and Ken Herrmann, MD, MBA, both leaders in the nuclear medicine and molecular imaging field. In the interview the authors discussed whether the results of their study might hold true in other countries, if AI could be considered as a type of "medical expert," and the advantages of using AI from a legal perspective, among other topics.

Credit: 
Society of Nuclear Medicine and Molecular Imaging

Why COVID-19 pneumonia lasts longer, causes more damage than typical pneumonia

'This effort truly represents a "moonshot" in COVID-19 research'

Scientists identify target to treat COVID pneumonia and reduce severity

Clinical trials with new experimental drug to begin early in 2021

Goal is to develop treatments that make COVID-19 no worse than a common cold

First comparison between immune mechanisms driving COVID-19 pneumonia with other pneumonias

CHICAGO --- Bacteria or viruses like influenza that cause pneumonia can spread across large regions of the lung over the course of hours. In the modern intensive care unit, these bacteria or viruses are usually controlled either by antibiotics or by the body's immune system within the first few days of the illness.

But in a study published in Nature on January 11, investigators at Northwestern Medicine show COVID-19 pneumonia is different.

Instead of rapidly infecting large regions of the lung, the virus causing COVID-19 sets up shop in multiple small areas of the lung. It then hijacks the lungs' own immune cells and uses them to spread across the lung over a period of many days or even weeks, like multiple wildfires spreading across a forest. As the infection slowly moves across the lung, it leaves damage in its wake and continuously fuels the fever, low blood pressure and damage to the kidneys, brain, heart and other organs in patients with COVID-19.

The severe complications of COVID-19 compared with other pneumonias might be related to the long course of disease rather than more severe disease, the study authors said.

This is the first study in which scientists analyzed immune cells from the lungs of COVID-19 pneumonia patients in a systematic manner and compared them to cells from patients with pneumonia from other viruses or bacteria.

Drug trial to treat newly discovered targets in COVID-19 pneumonia

As a result of the detailed analysis, researchers identified critical targets to treat severe SARS-CoV-2 pneumonia and lessen its damage. The targets are the immune cells: macrophages and T cells. The study suggests macrophages - cells typically charged with protecting the lung - can be infected by SARS-CoV-2 and can contribute to spreading the infection through the lung.

Northwestern Medicine will test an experimental drug to treat these targets in COVID-19 pneumonia patients in a clinical trial early in 2021. The drug to be tested quiets the inflammatory response of these immune cells, thus enabling initiation of the repair process in the injured lung.

Aim to make COVID-19 like a bad cold

"Our goal is to make COVID-19 mild instead of severe, making it comparable to a bad cold," said study co-senior author Dr. Scott Budinger, chief of pulmonary and critical care medicine at Northwestern University Feinberg School of Medicine and Northwestern Medicine.

"This effort truly represents a 'moonshot' in COVID-19 research," said study co-senior author Dr. Richard Wunderink, professor of pulmonary and critical care medicine at Feinberg and medical director of Northwestern Medicine's ICU.

COVID-19 unlikely to completely disappear

COVID-19, like influenza, is unlikely to ever go away, even if much of the population is vaccinated, said co-senior author Dr. Ben Singer, assistant professor of pulmonary and critical care medicine at Feinberg and a Northwestern Medicine physician.

"Already, researchers at Northwestern and elsewhere are anticipating mechanisms by which this RNA virus, which mutates quickly, will evade current vaccines," Singer said. "This study will help us develop treatments to reduce the severity of COVID-19 in those who develop it."

Mortality in COVID-19 patients on ventilators lower than regular pneumonia patients

The study also revealed why mortality among patients on a ventilator for COVID-19 was lower than among patients on a ventilator for regular pneumonia. An intense conflagration in the lungs (regular pneumonia) carries a higher risk of death. Patients with COVID-19 pneumonia are sick for a long time, but the inflammation in their lungs is not as severe as in regular pneumonia.

"If patients with COVID-19 are carefully managed and the health care system isn't overwhelmed, you can get them through it," Budinger said. "These patients are very sick. It takes a really long time for them to get better. But if you have enough beds and health care providers, you can keep the mortality to 20%. When health systems are overwhelmed, mortality rates can double, up to 40%."

For the study, scientists performed a high-resolution analysis of the lung fluid of 86 COVID-19 patients on a ventilator and compared it with lung fluid from 256 patients on a ventilator who had other types of pneumonia. Because of safety concerns, only a handful of groups around the world have analyzed the immune response in the lungs of patients with COVID-19. As a result, important information about what was killing patients with severe COVID-19 was missing.

Northwestern scientists, studying pneumonia for years, poised for COVID lung research

The study performed at Northwestern Medicine is unique because Wunderink and colleagues had been studying pneumonia for years before the pandemic. As a result, when the COVID-19 pandemic hit, they were prepared to collect fluid from the lungs of these patients in a safe and systematic manner and compare it with fluid from other ICU patients with pneumonia, collected before the pandemic. This research infrastructure allowed them to show that pneumonia in patients with COVID-19 is different from other pneumonia and, more importantly, how it is different.

Scientists took cells from patients' lung fluid and looked at the RNA and the proteins those cells express, enabling them to identify how these immune cells drive inflammation.

"This level of resolution could never be achieved without directly sampling lung fluid," said study co-senior author Dr. Alexander Misharin, an assistant professor of pulmonary and critical care medicine at Feinberg and a Northwestern Medicine physician.

The complex nature of the study, in which samples from patients were analyzed with the most sophisticated technologies available in Northwestern's state-of-the art research labs, required the concerted effort of more than 100 researchers.

Credit: 
Northwestern University

Arecibo observatory helps find possible 'first hints' of low-frequency gravitational waves

ORLANDO, Jan. 11, 2021 - Data from Arecibo Observatory in Puerto Rico has been used to help detect the first possible hints of low-frequency disturbances in the curvature of space-time.

The results were presented today at the 237th meeting of the American Astronomical Society, which was held virtually, and are published in The Astrophysical Journal Letters. Arecibo Observatory is managed by the University of Central Florida for the National Science Foundation under a cooperative agreement.

The disturbances are known as gravitational waves, which ripple through space as a result of the movement of incredibly massive objects, such as black holes orbiting one another or the collision of neutron stars.

It's important to understand these waves as they provide insight into the history of the cosmos and expand researchers' knowledge of gravity past current limits of understanding.

Although gravitational waves stretch and squeeze the fabric of space-time, they don't directly affect humans: any change in the relative distances between objects would alter the height of a person by less than one one-hundredth the width of a human hair, says Joseph Simon, a postdoctoral associate in the Center for Astrophysics and Space Astronomy at the University of Colorado Boulder.

Simon, who presented the findings at the meeting today, is the paper's lead researcher and a member of the North American Nanohertz Observatory for Gravitational Waves, or NANOGrav, the team that performed the research.

NANOGrav is a group of more than 100 astronomers from across the U.S. and Canada whose common goal is to study the universe using low-frequency gravitational waves.

In 2015, NSF's Laser Interferometer Gravitational-Wave Observatory (LIGO) made the first direct observation of high-frequency gravitational waves using interferometry, a measurement method that uses the interference of electromagnetic waves.

The new findings made by NANOGrav researchers are unique because the astronomers found possible hints of low-frequency gravitational waves using radio telescopes; waves at these low frequencies cannot be detected by LIGO. Both frequency ranges are important for understanding the universe.

Key to the research were two NSF-funded instruments - Green Bank Telescope in West Virginia and Arecibo Observatory in Puerto Rico.

Arecibo Observatory, with its 1,000-foot-diameter dish, provided very precise data, while the Green Bank Telescope, which has much larger sky coverage, sampled a wider range of information needed to discriminate gravitational wave perturbations from other effects, Simon says.

"We time roughly half the pulsars with each telescope," he says. "Each telescope provides about half of our total sensitivity in a complementary way."

Although the researchers used Arecibo data for the study, they are no longer able to make observations with the telescope, which collapsed in December after its cables broke in August and November.

"It was a truly horrible day when the telescope collapsed," Simon says. "It feels like the loss of a good friend, and we are so saddened for our friends and colleagues in Puerto Rico. Going forward, we hope to increase the amount of time we use on the Green Bank Telescope to at least partially compensate for Arecibo's loss. Another large collecting area radio telescope must be built in the U.S. soon if we want this research area to flourish."

The researchers were able to detect possible hints of low-frequency gravitational waves by using the telescopes to study signals from pulsars, which are small, dense, rotating stars that send out pulses of radio waves at precise intervals toward Earth.

This regularity makes them useful in astronomical study, and they are often referred to as the universe's timekeepers.

Gravitational waves can interrupt this regularity, causing tiny deviations in the arrival times of pulsar signals on Earth, indicating that the position of the Earth has shifted slightly.

By studying the timing of the regular signals from many pulsars scattered over the sky at the same time, known as a "pulsar timing array," NANOGrav was able to detect minute changes in the Earth's position possibly due to gravitational waves stretching and shrinking space-time.
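The timing-residual idea behind a pulsar timing array can be sketched in a few lines. This is a toy illustration with hypothetical numbers, not NANOGrav data or code: a pulsar emits pulses at a fixed period, and a perturbation of the Earth's position makes pulses arrive slightly early or late relative to prediction.

```python
# Toy pulsar-timing-residual sketch (hypothetical numbers, not NANOGrav data).
period = 1.0          # pulse period in seconds (hypothetical)
n_pulses = 5

# Predicted arrival times, assuming nothing perturbs the Earth-pulsar distance
predicted = [i * period for i in range(n_pulses)]

# Observed arrivals, offset by a tiny simulated perturbation (seconds)
perturbation = [0.0, 1e-7, 2e-7, 1.5e-7, 0.5e-7]
observed = [p + d for p, d in zip(predicted, perturbation)]

# Timing residuals: observed minus predicted arrival time. A correlated
# pattern of residuals across many pulsars is the signature a pulsar
# timing array searches for.
residuals = [o - p for o, p in zip(observed, predicted)]
print(residuals)
```

In the real analysis, the same residual pattern must be compared across dozens of pulsars to separate a gravitational-wave signal from noise sources specific to each pulsar.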

NANOGrav was able to rule out some effects other than gravitational waves, such as interference from the matter in the solar system or certain errors in the data collection.

To confirm direct detection of a signature from low-frequency gravitational waves, NANOGrav researchers will have to find a distinctive pattern in the signals between individual pulsars. At this point, the signal is too weak for such a pattern to be distinguishable, according to the researchers.

Boosting the signal requires NANOGrav to expand its dataset to include more pulsars studied for even longer lengths of time, which will increase the array's sensitivity. In addition, pooling NANOGrav's data together with those from other pulsar timing array experiments, a joint effort by the International Pulsar Timing Array, may reveal such a pattern. The International Pulsar Timing Array is a collaboration of researchers using the world's largest radio telescopes.

At the same time, NANOGrav is developing techniques to ensure the detected signal could not be from another source. They are producing computer simulations that help test whether the detected noise could be caused by effects other than gravitational waves, in order to avoid a false detection.

"It is incredibly exciting to see such a strong signal emerge from the data," Simon says. "However, because the gravitational-wave signal we are searching for spans the entire duration of our observations, we need to carefully understand our noise. This leaves us in a very interesting place, where we can strongly rule out some known noise sources, but we cannot yet say whether the signal is indeed from gravitational waves. For that, we will need more data."

Benetge Perera, a scientist at Arecibo Observatory who is a specialist in using observations of pulsars for the detection of gravitational waves, says the research aims to open a new window in the spectrum of gravitational wave frequencies.

"A low-frequency gravitational wave detection would enhance our understanding of supermassive black hole binaries, galaxy evolution, and the universe," says Perera, who is also a member of NANOGrav.

He says that despite the collapse of Arecibo Observatory, there is still much archived data to pore through to continue to learn about gravitational waves.

"Arecibo was very important as its timing data provided about 50 percent of NANOGrav's sensitivity to gravitational waves," he says. "I want to ensure that the sensitive data we collected before Arecibo's collapse has the highest possible scientific impact."

Credit: 
University of Central Florida

Study identifies exposure to common food-borne pathogen linked to rare brain cancer

ATLANTA AND TAMPA, FLA. - JANUARY 11, 2021 - A new study suggests a link between Toxoplasma gondii (T. gondii) infection and the risk of glioma, a type of brain cancer, in adults. The report, appearing in the International Journal of Cancer, finds that people who have glioma are more likely to have antibodies to T. gondii (indicating that they have had a previous infection) than a similar group that was cancer free.

For the study, investigators led by James Hodge, JD, MPH, and Anna Coghill, PhD, examined the association between T. gondii antibodies, measured several years before the cancer was diagnosed, and the risk of developing a glioma. Study participants were from the American Cancer Society's Cancer Prevention Study-II (CPS-II) Nutrition Cohort and the Norwegian Cancer Registry's Janus Serum Bank (Janus). T. gondii is a common parasite most often acquired from undercooked meat, and it may lead to the formation of cysts in the brain. These results suggest that exposure to this common food-borne pathogen may be a modifiable risk factor for highly aggressive brain tumors in adults.
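An association like this one is typically quantified in a case-control design with an odds ratio. The sketch below uses hypothetical counts, not the paper's data, to show how the comparison works: the odds of prior infection among cases are divided by the odds among cancer-free controls.

```python
# Hedged odds-ratio sketch for a case-control comparison.
# All counts below are hypothetical illustrations, not the study's data.
cases_seropositive = 60      # glioma cases with T. gondii antibodies
cases_seronegative = 140
controls_seropositive = 40   # cancer-free controls with antibodies
controls_seronegative = 160

# Odds of prior infection among cases vs. among controls.
odds_cases = cases_seropositive / cases_seronegative
odds_controls = controls_seropositive / controls_seronegative

# An odds ratio above 1 suggests prior infection is more common
# among those who went on to develop glioma.
odds_ratio = odds_cases / odds_controls
print(round(odds_ratio, 2))
```

With these hypothetical counts the odds ratio is about 1.7; the published study reports its own effect sizes and confidence intervals.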

Although glioma is a relatively rare disease, it is a highly fatal cancer. Globally in 2018, there were an estimated 300,000 incident cases and 241,000 deaths due to brain and other nervous system cancers. The majority (80%) of malignant brain tumors are gliomas, for which the estimated five-year relative survival rate is a stark 5%.

The study notes that the association between T. gondii antibodies and glioma was similar in two demographically different groups of people: the CPS-II cases were approximately 70 years old at the time of blood draw, while those in the Janus cohort were approximately 40 years old.

"This does not mean that T. gondii definitely causes glioma in all situations. Some people with glioma have no T. gondii antibodies, and vice versa," notes Hodge.

"The findings do suggest that individuals with higher exposure to the T. gondii parasite are more likely to go on to develop glioma," said Coghill. "However, it should be noted that the absolute risk of being diagnosed with a glioma remains low, and these findings need to be replicated in a larger and more diverse group of individuals."

The authors note that, "if future studies do replicate these findings, ongoing efforts to reduce exposure to this common pathogen would offer the first tangible opportunity for prevention of this highly aggressive brain tumor."

Credit: 
American Cancer Society

First human culture lasted 20,000 years longer than thought

image: Freshly found artefact from Laminia, Senegal

Image: 
Eleanor Scerri

Fieldwork led by Dr Eleanor Scerri, head of the Pan-African Evolution Research Group at the Max Planck Institute for the Science of Human History in Germany, and Dr Khady Niang of the University of Cheikh Anta Diop in Senegal has documented the youngest known occurrence of the Middle Stone Age. This repertoire of stone flaking methods and the resulting tools includes distinctive ways of producing sharp flakes by carefully preparing nodules of rock, some of which were further shaped into tool forms known as 'scrapers' and 'points.' Middle Stone Age finds most commonly occur in the African record between around 300 thousand and 30 thousand years ago, after which point they largely vanish.

It was long thought that these tool types were replaced after 30 thousand years ago by a radically different, miniaturized toolkit better suited to diversified subsistence strategies and patterns of mobility across Africa. In a paper published in Scientific Reports this week, Scerri and colleagues show that groups of hunter-gatherers in what is today Senegal continued to use Middle Stone Age technologies associated with our species' earliest prehistory as late as 11 thousand years ago. This contrasts with the long-held view that humanity's major prehistoric cultural phases occurred in a neat and universal sequence.

The 'Last Eden'?

"West Africa is a real frontier for human evolutionary studies - we know almost nothing about what happened here in deep prehistory. Almost everything we know about human origins is extrapolated from discoveries in small parts of eastern and southern Africa," says Dr Eleanor Scerri, the lead author of the study.

To redress this gap in the data, Scerri and Niang put together a research program to explore different regions of Senegal, ranging from the country's desert edges to its forests and along different stretches of its major river systems, the Senegal and the Gambia. There they found multiple Middle Stone Age sites, all with surprisingly young dates.

"These discoveries demonstrate the importance of investigating the whole of the African continent, if we are to really get a handle on the deep human past," says Dr Khady Niang. "Prior to our work, the story from the rest of Africa suggested that well before 11 thousand years ago, the last traces of the Middle Stone Age - and the lifeways it reflects - were long gone."

Explaining why this region of West Africa was home to such a late persistence of Middle Stone Age culture is not straightforward.

"To the north, the region meets the Sahara Desert," explains Dr Jimbob Blinkhorn, one of the paper's authors. "To the east, there are the Central African rainforests, which were often cut off from the West African rainforests during periods of drought and fragmentation. Even the river systems in West Africa form a self-contained and isolated group."

"It is also possible that this region of Africa was less affected by the extremes of repeated cycles of climate change," adds Scerri. "If this was the case, the relative isolation and habitat stability may simply have resulted in little need for radical changes in subsistence, as reflected in the successful use of these traditional toolkits."

"All we can be sure about is that this persistence is not simply about a lack of capacity to invest in the development of new technologies. These people were intelligent, they knew how to select good stone for their tool making and exploit the landscape they lived in," says Niang.

An ecological, biological and cultural patchwork

The results fit in with a wider, emerging view that for most of humanity's deep prehistory, populations were relatively isolated from each other, living in subdivided groups in different regions.

Accompanying this striking finding is the fact that in West Africa, the major cultural shift to more miniaturized toolkits also occurred extremely late compared to the rest of the continent. For a relatively short time, populations using Middle Stone Age technologies lived alongside others using the more recently developed miniaturized toolkits, referred to as the 'Later Stone Age'.

"This matches genetic studies suggesting that African people living in the last ten thousand years lived in very subdivided populations," says Dr Niang. "We aren't sure why, but apart from physical distance, it may be the case that some cultural boundaries also existed. Perhaps the populations using these different material cultures also lived in slightly different ecological niches."

Around 15 thousand years ago, there was a major increase in humidity and forest growth in central and western Africa, which perhaps linked different areas and provided corridors for dispersal. This may have spelled the final end of humanity's first and earliest cultural repertoire and initiated a new period of genetic and cultural mixing.

"These findings do not fit a simple unilinear model of cultural change towards 'modernity'," explains Scerri. "Groups of hunter-gatherers embedded in radically different technological traditions occupied neighbouring regions of Africa for thousands of years, and sometimes shared the same regions. Long isolated regions, on the other hand, may have been important reservoirs of cultural and genetic diversity," she adds. "This may have been a defining factor in the success of our species."

Credit: 
Max Planck Institute of Geoanthropology

Megalodons gave birth to large newborns that likely grew by eating unhatched eggs in womb

image: Identified annual growth bands in a vertebra of the extinct megatooth shark Otodus megalodon along with hypothetical silhouettes of the shark at birth and death, each compared with size of typical adult human. The vertebral specimen is housed in the Royal Belgian Institute of Natural Sciences in Brussels

Image: 
DePaul University/Kenshu Shimada

A new study shows that the gigantic Megalodon or megatooth shark, which lived nearly worldwide roughly 15-3.6 million years ago and reached at least 50 feet (15 meters) in length, gave birth to babies larger than most adult humans.

This latest research shedding light on the reproductive biology, growth and life expectancy of Megalodon (formally called Otodus megalodon) appears in the international journal Historical Biology.

Although Otodus megalodon is typically portrayed as a super-sized, monstrous shark in novels and films such as the 2018 sci-fi film "The Meg," scientific data support a more modest but still impressive estimate of about 50 feet (15 meters) for the presently known largest individuals. The study indicates that, from the moment of birth, Megalodon was already a big fish, noted Kenshu Shimada, a paleobiologist at DePaul University in Chicago and lead author of the study. Co-authors are Matthew Bonnan, Stockton University, New Jersey; and Martin Becker and Michael Griffiths, William Paterson University, New Jersey.

"As one of the largest carnivores that ever existed on Earth, deciphering such growth parameters of O. megalodon is critical to understand the role large carnivores play in the context of the evolution of marine ecosystems," said Shimada.

Otodus megalodon has a rich fossil record, but, as with most other extinct sharks, its biology remains poorly understood because the cartilaginous fish is known primarily from its teeth. Nevertheless, some remains of gigantic vertebrae are known, said Shimada.

Large size at birth

Researchers used a CT scanning technique to examine incremental 'growth bands' putatively recorded annually (analogous to tree rings) in a Megalodon vertebral specimen housed in the Royal Belgian Institute of Natural Sciences in Brussels. Measuring up to about 6 inches (15 centimeters) in diameter, the vertebrae were previously estimated to have come from an individual about 30 feet (9 meters) in length, based on comparisons with vertebrae of modern great white sharks, according to the researchers.

CT images reveal the vertebrae to have 46 growth bands, meaning that the 9-meter Megalodon fossil died at age 46. By back-calculating its body length when each band formed, the research suggests that the shark's size at birth was about 6.6 feet (2 meters) in length, a result that suggests Megalodon gave live birth to possibly the largest babies in the shark world. These data also suggest that, like all present-day lamniform sharks, embryonic Megalodon grew inside its mother by feeding on unhatched eggs in the womb -- a practice known as oophagy, a form of intrauterine cannibalism.

"Results from this work shed new light on the life history of Megalodon, not only how Megalodon grew, but also how its embryos developed, how it gave birth and how long it could have lived," said co-author Becker.

Interestingly, 'early-hatched embryos' in the shark group called Lamniformes will begin to eat surrounding unhatched eggs and, at least in the present-day sandtiger shark, occasionally even feed on other hatched siblings for nourishment, the researchers noted. The outcome is that only a few embryos survive and develop, but each of them can become considerably large at birth.

Although likely energetically costly for the mother to raise such large embryos, newborns have an advantage because their large size reduces chances of being eaten by other predators, said Shimada.

"The information presented in this new paper and our other recent work demonstrating just how large Megalodon was relative to other sharks have greatly increased our understanding of Megalodon biology," said co-author Griffiths.

Added co-author Bonnan, "My students and I examine spiny dogfish shark anatomy in class and to think that a baby Megalodon was nearly twice as long as the largest adult sharks we examine is mind-boggling."

Relatively steady growth after birth

The study also shows that the shark grew without significant 'growth spurts,' at an average rate of about 6.3 inches (16 centimeters) per year, at least during the first 46 years of its life. This finding indicates that Megalodon was sufficiently large at birth (6.6 feet) to compete with other predators and to avoid being eaten. Further, a growth curve model based on the vertebrae suggests a life expectancy of at least 88-100 years, noted Shimada.
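The quoted figures can be sanity-checked with simple arithmetic, assuming one growth band per year and roughly linear growth (a simplification of the paper's growth-curve model):

```python
# Rough consistency check of the figures quoted above, assuming one
# growth band per year and roughly linear growth (a simplification).
length_at_birth_cm = 200     # ~6.6 ft (2 m) reported at birth
length_at_death_cm = 900     # ~30 ft (9 m) estimated for this individual
age_at_death_years = 46      # number of growth bands in the vertebra

average_growth_cm_per_year = (
    (length_at_death_cm - length_at_birth_cm) / age_at_death_years
)
print(round(average_growth_cm_per_year, 1))
```

The linear assumption gives roughly 15 centimeters per year, in line with the reported average of about 16 centimeters.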

Credit: 
Taylor & Francis Group

COVID-19: Online tool identifies patients at highest risk of deterioration

A new risk-stratification tool which can accurately predict the likelihood of deterioration in adults hospitalised with COVID-19 has been developed by researchers from the UK Coronavirus Clinical Characterisation Consortium (known as ISARIC4C).

Researchers say the online tool, made freely available to NHS doctors from today (Friday 8 January 2021), could support clinicians' decision making - helping to improve patient outcomes and ultimately save lives.

The tool assesses 11 measurements* routinely collected from patients, including age, gender and physical measurements (such as oxygen levels), along with some standard laboratory tests, and calculates a percentage risk of deterioration, known as the '4C Deterioration Score'.

This innovation, published in The Lancet Respiratory Medicine, builds on the Consortium's previous work developing the '4C Mortality Score' to predict the percentage risk of death from COVID-19 after admission to hospital. The '4C Mortality Score' is already recommended for use by NHS England** to guide anti-viral treatments (Remdesivir).

Doctors will now see both the '4C Deterioration Score' and the '4C Mortality Score' at the same time, using the same tool.

Co-senior and corresponding author, Professor Mahdad Noursadeghi (UCL Infection & Immunity), said: "Accurate risk-stratification at the point of admission to hospital will give doctors greater confidence about clinical decisions and planning ahead for the needs of individual patients.

"The addition of the new 4C Deterioration Score alongside the 4C Mortality Score will provide clinicians with an evidence-based measure to identify those who will need increased hospital support during their admission, even if they have a low risk of death."

The tool was developed using data from 74,944 individuals with COVID-19*** admitted to 260 hospitals across England, Scotland and Wales, between February 6 and August 26, 2020.

Using a multivariable logistic regression model (in which several measures are combined to predict an outcome), researchers tested the 11 measures (age, gender, physical measures and lab tests) against the large patient cohort to establish how, and to what degree, each measure affected the likelihood of deterioration.
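A multivariable logistic regression risk score of this kind can be sketched as follows. The predictor names and coefficients below are invented for illustration; the published 4C Deterioration model has its own fitted intercept and coefficients for its 11 predictors.

```python
import math

def deterioration_risk(features, coefficients, intercept):
    """Convert a patient's measurements into a percentage risk via
    logistic regression: risk = 100 / (1 + exp(-(b0 + sum(bi * xi))))."""
    linear = intercept + sum(coefficients[name] * value
                             for name, value in features.items())
    return 100.0 / (1.0 + math.exp(-linear))

# Hypothetical coefficients for illustration only.
coefs = {"age": 0.03, "oxygen_saturation": -0.08, "crp": 0.01}
risk = deterioration_risk(
    {"age": 70, "oxygen_saturation": 92, "crp": 50}, coefs, intercept=1.0)
```

The fitted coefficients encode how much each measure shifts the odds of deterioration, which is what the researchers mean by establishing "to what degree" each measure matters.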

Furthermore, researchers assessed how well the tool performed in nine NHS regions and found that it performed similarly well in each, suggesting that it is likely to be useful across the NHS. Importantly, the new risk score showed superior performance across the NHS compared with previous risk scores.

First author Dr Rishi Gupta (UCL Institute of Global Health) said: "The scale and wide geographical coverage of the ISARIC4C study across the country was critical to the development of this prediction tool. Our analysis provides very encouraging evidence that the 4C Deterioration tool is likely to be useful for clinicians across England, Scotland and Wales to support clinical decision-making."

The tool can potentially be incorporated into NHS Trusts' Electronic Health Record System - used to manage all patient care - so that risk scores are automatically generated for patients.

Researchers suggest that the tool could also be used in other countries for risk-stratification, but should first be evaluated to test its accuracy in these settings.

Credit: 
University College London

The Lancet: Most patients hospitalised with COVID-19 have at least one symptom six months after falling ill, Wuhan follow-up study suggests

Study of 1,733 patients first diagnosed in Wuhan (China) between January and May followed to June and September.

76% of COVID-19 patients have at least one symptom six months after symptom onset.

Fatigue or muscle weakness is the most common symptom, with sleep difficulties and anxiety or depression also frequently reported.

Lower antibodies against COVID-19 in patients six months after becoming ill compared with during acute infection raises concerns about the possibility of re-infection.

More than three quarters of COVID-19 patients have at least one ongoing symptom six months after initially becoming unwell, according to research published in The Lancet.

The cohort study, looking at long-term effects of COVID-19 infection on people hospitalised in Wuhan, China, reveals that the most common symptom to persist is fatigue or muscle weakness (63% of patients), with patients also frequently experiencing sleep difficulties (26%). Anxiety or depression was reported among 23% of patients.

Patients who were severely ill in hospital more often had impaired lung function and abnormalities detected in chest imaging - which could indicate organ damage - six months after symptom onset.

Levels of neutralising antibodies fell by more than half (52.5%) after six months in 94 patients whose immune response was tested at the peak of the infection, raising concerns about the possibility of being re-infected by the virus.

Little is known about the long-term health effects of COVID-19 as few follow-up studies have so far been carried out. Those that have been conducted looked only at a small number of cases over a short follow-up period (typically around three months after discharge).

Professor Bin Cao, from National Center for Respiratory Medicine, China-Japan Friendship Hospital and Capital Medical University, said: "Because COVID-19 is such a new disease, we are only beginning to understand some of its long-term effects on patients' health. Our analysis indicates that most patients continue to live with at least some of the effects of the virus after leaving hospital, and highlights a need for post-discharge care, particularly for those who experience severe infections. Our work also underscores the importance of conducting longer follow-up studies in larger populations in order to understand the full spectrum of effects that COVID-19 can have on people." [1]

The new study included 1,733 COVID-19 patients who were discharged from Jin Yin-tan Hospital in Wuhan, China, between January 7th and May 29th 2020. Patients had a median age of 57 years. Follow-up visits were done from June 16th to September 3rd, 2020, and the median follow-up time was 186 days.

All patients were interviewed face-to-face using questionnaires to evaluate their symptoms and health-related quality of life. They also underwent physical examinations, lab tests, and a six-minute walking test to gauge patients' endurance levels. 390 patients had further tests, including an assessment of their lung function. In addition, 94 patients whose blood antibody levels were recorded at the height of the infection as part of another trial received a follow-up test.

At follow-up, 76% of patients (1,265/1,655) reported at least one ongoing symptom. Fatigue or muscle weakness was reported by 63% (1,038/1,655), while 26% (437/1,655) had sleep difficulties and 23% (367/1,733) experienced anxiety or depression.

Of the 390 patients who underwent additional testing, 349 completed the lung function test (41 were unable to complete the test due to poor compliance). Patients with more severe illness commonly had reduced lung function, with 56% (48/86) of those at severity scale 5-6 (who required ventilation) experiencing diffusion impairment - reduced flow of oxygen from the lungs to the bloodstream. For patients at severity scale 4 (who required oxygen therapy) and those at scale 3 (who did not require oxygen therapy) the figures were 29% (48/165) and 22% (18/83), respectively.

Patients with more severe disease performed worse in the six-minute walking test (which measures the distance covered in six minutes), with 29% of those at severity scale 5-6 walking less than the lower limit of the normal range, compared with 24% for those at scale 3, and 22% for scale 4.

The authors also found that some patients went on to develop kidney problems post-discharge. As well as the lungs, COVID-19 is known to affect other organs, including the kidney. Lab tests revealed that 13% (107/822) of patients whose kidney function was normal while in hospital had reduced kidney function in follow-up.

Follow-up blood antibody tests from 94 patients after six months revealed that levels of neutralising antibodies were 52.5% lower than at the height of infection. The authors say this raises concerns about the possibility of COVID-19 re-infection.

As the number of participants with antibody test results both at acute phase and follow-up was limited, larger samples are needed in future to clarify how levels of antibodies against the virus change over time. Further work is also needed to compare differences in outcomes between inpatients and outpatients, as patients with mild COVID-19 symptoms who stayed in temporary Fangcang shelter hospitals were not included in the study.

Impaired lung function and exercise capacity observed in the study cannot be directly attributed to COVID-19 as baseline data for these are unavailable. Due to the way the data was analysed, it also was not possible to determine if symptoms reported during follow-up were persistent following the infection, worsened after recovery, or occurred post-discharge.

Writing in a linked Comment, Monica Cortinovis, Norberto Perico, and Giuseppe Remuzzi, from the Istituto di Ricerche Farmacologiche Mario Negri IRCCS, Italy, who were not involved in the study, remark on the uncertainty regarding the possible long-term impacts of COVID-19 on health, saying: "Unfortunately, there are few reports on the clinical picture of the aftermath of COVID-19. The study by Huang and colleagues in The Lancet is therefore relevant and timely."

Echoing the study authors' calls for further research, they add: "Even though the study offers a comprehensive clinical picture of the aftermath of COVID-19 in hospitalised patients, only 4% were admitted to an intensive care unit (ICU), rendering the information about the long-term consequences in this particular cohort inconclusive. Nonetheless, previous research on patient outcomes after ICU stays suggests that several COVID-19 patients who were critically ill while hospitalised will subsequently face impairments regarding their cognitive and mental health and/or physical function far beyond their hospital discharge."

Credit: 
The Lancet

New tech helping cancer patients manage symptoms

image: Hands at a computer

Image: 
Stevepb/pixabay

Hundreds of cancer patients have benefitted from using computer algorithms to manage their symptoms and improve their wellbeing in a unique UK trial.

The early stage colorectal, breast or gynecological cancer patients took part in the trial of the eRAPID system, developed by the University of Leeds, which allowed them to report online symptoms from home and receive instant advice on whether to self-manage or seek medical attention.

Patients reported better symptom control and physical wellbeing in the early weeks of treatment, with the system preventing symptom deterioration in about 9% of patients after 12 weeks. Patients reported more confidence in managing their health at the end of their four-month trial period.

The results demonstrate that improvements to patients' physical wellbeing can be achieved in a cost-effective way without increasing clinicians' workload.

It is the first such trial to offer automated advice, and one of only a few to focus mainly on early-stage patients whose treatment aims to cure the cancer.

Programme lead Professor Galina Velikova, at the Leeds Institute of Medical Research at St James's, University of Leeds, and the Leeds Cancer Centre, Leeds Teaching Hospitals NHS Trust, said: "Rising numbers of cancer patients are receiving a range of anti-cancer treatments which means patients are living longer and require longer periods of care and monitoring.

"Remote online monitoring options have the potential to be a patient-centred, safe and effective approach to support patients during cancer treatment and manage the growing clinical workload for cancer care."

Dr Kate Absolom, University Academic Fellow in the Leeds Institute of Medical Research at St James's and the Leeds Institute of Health Sciences at the University of Leeds, said: "The encouraging results from this study will help pave the way for future development and refinement of these interventions in broader cancer settings. The COVID-19 pandemic highlighted the need for, and sped a shift towards, technology-enabled care, so these study results are very timely."

Cancer patients can experience a range of symptoms, which can be caused by the cancer itself, by other conditions, or by side effects from chemotherapy and other treatments, which are sometimes life threatening and require emergency hospitalisation.

Symptoms can significantly lower patients' quality of life. Better monitoring and management can improve treatment delivery and reduce patients' physical distress.

Funded by the National Institute for Health Research (NIHR), the eRAPID trial set out to establish whether symptom control could be improved using automated advice, to try to improve patients' wellbeing. It included 508 patients aged 18 to 86 who were starting chemotherapy at Leeds Cancer Centre. All patients received their usual care, with 256 receiving the eRAPID system as additional care.

Participants answered a set of cancer-specific questions through an online symptom report once a week, or when new symptoms emerged, over the 18-week study period. Using symptom severity grades, a computer algorithm designed by the researchers and clinicians scored all the responses and determined the advice patients received.

Questions covered pain levels, nausea, spending time in bed and not meeting family needs. Participants received immediate advice on symptom management or a prompt to contact the hospital. Symptom reports were immediately displayed in the patients' electronic records, and email alerts for severe symptom reports were sent directly to clinicians.
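The tiered response described above can be sketched as a simple severity-to-action mapping. The grade scale and thresholds here are assumptions for illustration, not the trial's actual algorithm.

```python
def triage(symptom_grades):
    """Map self-reported symptom severity grades (1 = mild .. 4 = severe)
    to an action, mirroring the tiered logic described for eRAPID.
    Thresholds are illustrative, not the trial's actual rules."""
    worst = max(symptom_grades.values())
    if worst >= 4:
        return "alert_clinician"      # severe: email alert sent to care team
    if worst == 3:
        return "contact_hospital"     # serious: prompt patient to call
    return "self_management_advice"   # mild/moderate: instant advice
```

Because most reported symptoms are mild or moderate, a rule set like this routes the bulk of reports to instant self-management advice while reserving clinician alerts for the rare severe cases, consistent with the trial's figure of under 1% emergency alerts.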

A total of 3,314 online reports were completed, reporting 18,867 individual symptoms - an average of 13 reports per patient. Emergency alerts were sent 29 times (under 1%), while serious symptoms not requiring immediate medical attention were reported on 461 occasions (14%). More than 80% of self-reported symptoms triggered self-management advice, providing a cost-effective solution with better outcomes for patients.

Clinical benefits in patients' physical wellbeing were seen particularly at the early period of treatment, between weeks 6 and 12, when challenges in controlling side-effects are expected.

The immediate advice increased patient confidence in managing the mild and moderate treatment-related symptoms, which can significantly impact patients' quality of life and ability to continue treatment.

And trial data showed no increase in hospital workload, no differences in chemotherapy delivery, and no compromise of patient safety.

Julia Brown, Professor of Clinical Trials Research and Director of the University of Leeds Institute of Clinical Trials Research, in the School of Medicine, said: "This study provides timely and important evidence that remote real-time monitoring of cancer patients, particularly essential during pandemic conditions, is feasible and can improve patients' physical wellbeing."

Credit: 
University of Leeds

Latina mothers, often essential workers, report COVID-19 took toll

image: Financial and psychological toll: The pandemic has forced more than 50 percent of the families in a regional study to make economic cutbacks.

Image: 
UC Davis

More than half of Latina mothers surveyed in Yolo and Sacramento counties reported making economic cutbacks in response to the pandemic shutdown last spring -- saying they bought less food and missed rent payments. Even for mothers who reported receiving the federal stimulus payment during this time, these hardships were not reduced, University of California, Davis, researchers found in a recent study.

"Latino families are fighting the pandemic on multiple fronts, as systemic oppression has increased their likelihood of contracting the virus, having complications from the virus and having significant economic hardship due to the virus," said Leah C. Hibel, associate professor of human development and family studies at UC Davis and lead author of the study. "These factors are likely to have a significant psychological toll on these families."

The study was published Jan. 7 in the journal Traumatology. Researchers administered surveys to 70 Latina mothers March 18 through June 5 of last spring, after the "shelter-in-place" orders went into effect in California in response to the COVID-19 pandemic.

The survey sample consisted of low-income Latina mothers in Yolo and Sacramento counties; 92 percent of the families included an essential worker (either the mother or her partner), researchers said. The survey respondents were identified through an earlier UC Davis study on Mexican-origin families living in the region. Yolo and Sacramento counties are in Northern California, which has a higher cost of living than most of the country but a relatively high level of social services available, researchers added.

Researchers said that although it has been reported that the stimulus checks may have kept some from falling below the poverty level, cutbacks due to other economic factors still had an effect on these low-income families.

"In other words, though the stimulus may have prevented some families from falling below the poverty line, our analyses suggest that many low-income families are still facing significant financial hardship," researchers said in the study. "This hardship appears to be placing families on a trajectory toward hunger and eviction."

Less food, higher stress

Mothers who engaged in cutbacks reported significantly higher levels of stress, depression, and anxiety. Further, receiving the federal stimulus money administered to those whose income was less than $99,000 a year (through the Coronavirus Aid, Relief, and Economic Security Act) was not associated with lower cutbacks, stress, depression or anxiety. Of those surveyed, 65 percent had reported receiving their checks by the time the survey was administered.

Mothers' depressive symptoms were assessed through survey questions administered by phone. Most were in English, but in some cases, questions were asked in Spanish by native-speaking interviewers. Assessed on a five-point answer scale, mothers were asked such questions as:

"How stressed are you because of the virus outbreak?"

"How often have you felt or experienced depressed mood?"

"How often have you felt panicky?"

Mothers were also asked to indicate whether or not they made any of 12 listed cutbacks (food, rent, cutting back on air conditioning, etc.) in previous weeks because of the pandemic. Of these mothers, 52 percent reported being forced to make economic cutbacks, and they reported higher stress, depressive symptoms and anxiety than those who reported not cutting back.

Researchers noted that the immediate economic impact of the pandemic on low-income Latina mothers' well-being suggests that alleviating families' economic hardship might benefit mothers psychologically. Though the researchers did not find that the stimulus payment buffered the economic or psychological impacts, they suggest the stimulus money was simply not enough.

"Without additional local, state or federal aid, the pandemic is likely to cause severe hardship marked by homelessness, hunger and mental illness. Additional recurring monthly stimulus payments could be a lifeline for families who are struggling to make ends meet," the researchers wrote.

Credit: 
University of California - Davis

UCF engineering and biology researchers collaborate to aid coral reef restoration

image: A uniaxial compression test is performed on Acropora cervicornis coral skeleton.

Image: 
Mahmoud Omer, University of Central Florida

ORLANDO, Jan. 8, 2021 - Florida's threatened coral reefs have a more than $4 billion annual economic impact on the state's economy, and University of Central Florida researchers are zeroing in on one factor that could be limiting their survival - coral skeleton strength.

In a new study published in the journal Coral Reefs, UCF engineering researchers tested how well staghorn coral skeletons withstand the forces of nature and humans, such as impacts from hurricanes and divers.

The researchers subjected coral skeletons to higher stresses than those caused by ocean waves, says Mahmoud Omer, a doctoral student in UCF's Department of Mechanical and Aerospace Engineering and study co-author. "Under normal wave and tide regimes, a staghorn coral's skeleton will resist the physical forces exerted by the ocean waves. However, anthropogenic stressors such as harmful sunscreen ingredients, elevated ocean temperature, pollution and ocean acidification will weaken the coral skeleton and reduce its longevity."

Florida's coral reefs generate billions of dollars in local income, provide more than 70,000 jobs, protect the state's shorelines from storms and hurricanes, and support a diverse ecosystem of marine organisms, according to the U.S. National Oceanic and Atmospheric Administration.

The study uncovered a unique safety feature of the staghorn coral skeleton: Its porous design keeps the coral from being instantly crushed by an impact.

When the UCF engineers subjected skeleton samples to increasing stress, pores relieved the applied load and temporarily stalled further cracking and structural failure. The pores would "pop-in" and absorb some of the applied mechanical energy, thus preventing catastrophic failure. Although this ability has been shown in other coral species, this is the first time it's been demonstrated in staghorn coral.

"For the first time, we used the tools of mechanical engineering to closely examine the skeletons of a critically endangered coral raised in a coral nursery," says John Fauth, an associate professor in UCF's Department of Biology. "We now know more about the structure and mechanical performance of the staghorn coral skeleton than any other coral in the world. We can apply this knowledge to understand why staghorn coral restoration may work in some areas, but where their skeletons may fail due to human and environmental challenges in others."

The results provide baseline values that can be used to judge if nursery-reared staghorn coral have skeletons strong enough for the wild and to match them to areas with environmental conditions that best fit their skeleton strength.

Staghorn coral gets its name from the antler-like shape of its branches, which create an intricate underwater habitat for fish and reef organisms. It primarily is found in shallow waters around the Florida Keys, Puerto Rico, the U.S. Virgin Islands, and other Caribbean islands but has declined by more than 97 percent since the 1980s.

Although restoration efforts using transplanted, nursery-reared coral are underway, scientists continue working to increase their success rate.

Understanding coral skeleton structures also could inform development of skeletal structure replacements for humans, says Nina Orlovskaya, an associate professor in UCF's Department of Mechanical and Aerospace Engineering. "Our findings are of high importance for development of novel and superior biostructures, which can be used as bone graft substitutes," Orlovskaya says. "Coral skeleton structures could be either chemically converted or 3D printed into bio-compatible calcium phosphate ceramics that one day might be directly used to regenerate bones in humans."

In addition to compression tests, the researchers analyzed mechanical properties and spectral and fluidic behavior. The spectral analysis used Raman microscopy, which allowed the researchers to map the effects of compression at the microscopic level in the coral skeleton.

Fluidic behavior analysis revealed that vortices formed around the coral colony helped it to capture food and transport respiratory gases and wastes.

The coral skeletons studied were from Nova Southeastern University's coral nursery, about one mile off the coast of Ft. Lauderdale, in Broward County, Florida. The corals were deceased and were from colonies that failed or were knocked loose during a storm.

Credit: 
University of Central Florida

Large study finds higher burden of acute brain dysfunction for COVID-19 ICU patients

image: Brenda Pun, DNP, RN, Vanderbilt University Medical Center

Image: 
Vanderbilt University Medical Center

COVID-19 patients admitted to intensive care in the early months of the pandemic were subject to a significantly higher burden of delirium and coma than is typically found in patients with acute respiratory failure. Choice of sedative medications and curbs on family visitation played a role in increasing acute brain dysfunction for these patients.

That's according to an international study published Jan. 8 in The Lancet Respiratory Medicine, led by researchers at Vanderbilt University Medical Center in coordination with researchers in Spain.

The study, which is by far the largest of its kind to date, tracks the incidence of delirium and coma in 2,088 COVID-19 patients admitted before April 28, 2020, to 69 adult intensive care units across 14 countries.

ICU delirium is associated with higher medical costs and greater risk of death and long-term ICU-related dementia. Seminal studies at VUMC over the past two decades have spurred widespread interest in ICU delirium research, and the resulting body of evidence has come to inform critical care guidelines endorsed by medical societies in several countries. These guidelines include well calibrated pain management with prompt discontinuation of analgesics and sedatives, daily spontaneous awakening trials, daily spontaneous breathing trials, delirium assessments throughout the day, early mobility and exercise, and family engagement.

Some 82% of patients in this observational study were comatose for a median of 10 days, and 55% were delirious for a median of three days. Acute brain dysfunction (coma or delirium) lasted for a median of 12 days.

"This is double what is seen in non-COVID ICU patients," said VUMC's Brenda Pun, DNP, RN, co-first author on the study with Rafael Badenes MD, PhD, of the University of Valencia in Spain. The authors cite a previous large, multi-site ICU study, also led by VUMC, where acute brain dysfunction lasted a median of five days, including four days of coma and one day of delirium.

The authors note that COVID-19 disease processes could predispose patients to a higher burden of acute brain dysfunction. But they also note that a number of patient care factors, some of them related to the pressures the pandemic placed on health care, also appear to have played a significant role.

The study appears to show a reversion to outmoded critical care practices, including deep sedation, widespread use of benzodiazepine infusions (benzodiazepine is a nervous system depressant), immobilization, and isolation from families. The authors find that, where COVID-19 is concerned, there has been an apparent widespread abandonment of newer clinical protocols that are proven to help ward off the acute brain dysfunction that stalks many critically ill patients.

"It is clear in our findings that many ICUs reverted to sedation practices that are not in line with best practice guidelines," Pun said, "and we're left to speculate on the causes. Many of the hospitals in our sample reported shortages of ICU providers informed about best practices. There were concerns about sedative shortages, and early reports of COVID-19 suggested that the lung dysfunction seen required unique management techniques including deep sedation. In the process, key preventive measures against acute brain dysfunction went somewhat by the boards."

Using electronic health records, investigators were able to closely examine patient characteristics, care practices and findings from clinical assessments. Some 88% of patients tracked in the study were invasively mechanically ventilated at some point during hospitalization, 67% on the day of ICU admission. Patients receiving benzodiazepine sedative infusions were at 59% higher risk of developing delirium. Patients who received family visitation (in-person or virtual) were at 30% lower risk of delirium.

"There's no reason to think that, since the close of our study, the situation for these patients has changed," said one of the study's senior authors, Pratik Pandharipande, MD, MSCI, professor of Anesthesiology.

"These prolonged periods of acute brain dysfunction are largely avoidable. Our study sounds an alarm: as we enter the second and third waves of COVID-19, ICU teams need above all to return to lighter levels of sedation for these patients, frequent awakening and breathing trials, mobilization and safe in-person or virtual visitation."

Credit: 
Vanderbilt University Medical Center

Including unhealthy foods may diminish positive effects of an otherwise healthy diet

Eating a healthy diet, such as the Mediterranean diet, has a positive impact on health, but little is known about the effects of including unhealthy foods in an otherwise healthy diet. Now researchers at Rush University Medical Center have reported diminished benefits of a Mediterranean diet among those with high frequency of eating unhealthy foods. The results of their study were published in Alzheimer's & Dementia: The Journal of the Alzheimer's Association on Jan. 7.

"Eating a diet that emphasizes vegetables, fruit, fish and whole grains may positively affect a person's health," said Puja Agarwal, PhD, a nutritional epidemiologist and assistant professor in the Department of Internal Medicine at Rush Medical College. "But when it is combined with fried food, sweets, refined grains, red meat and processed meat, we observed that the benefits of eating the Mediterranean part of the diet seem to be diminished."

A Mediterranean diet is associated with slower rates of cognitive decline in older adults.

The observational study included 5,001 older adults living in Chicago who were part of the Chicago Health and Aging Project, an evaluation of cognitive health in adults over the age of 65 conducted from 1993 to 2012. Every three years, the study participants completed a cognitive assessment questionnaire that tested basic information processing skills and memory, and they filled out a questionnaire about the frequency with which they consumed 144 food items.

The researchers analyzed how closely each of the study participants adhered to a Mediterranean diet, which includes daily consumption of fruit, vegetables, legumes, olive oil, fish, potatoes and unrefined cereals, plus moderate wine consumption. They also assessed how much each participant followed a Western diet, which included fried foods, refined grains, sweets, red and processed meats, full-fat dairy products and pizza. They assigned scores of zero to five for each food item to compile a total Mediterranean diet score for each participant along a range from zero to 55.
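A per-item scoring matrix of this kind can be sketched as follows. The food items, serving cutoffs and the mapping to 0-5 points are illustrative assumptions, not the study's actual instrument; the study's total ranged from zero to 55 across its items.

```python
def med_diet_score(servings, healthy_cutoffs):
    """Assign 0-5 points per food item based on reported weekly servings
    relative to an item-specific cutoff, then sum to a total diet score.
    Items and cutoffs are illustrative, not the study's instrument."""
    score = 0
    for item, weekly in servings.items():
        cutoff = healthy_cutoffs[item]
        # Consumption closer to the healthy target earns more points,
        # capped at 5 per item.
        score += min(5, int(5 * min(weekly / cutoff, 1.0)))
    return score
```

Summing capped per-item points is what lets one participant's adherence be compared to another's on a single 0-55 scale, even though the underlying questionnaire covers many different foods.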

The researchers then examined the association between Mediterranean diet scores and changes in participants' global cognitive function, episodic memory and perceptual speed. Participants who most closely adhered to the Mediterranean diet, while also limiting foods that are part of the Western diet, had slower cognitive decline over the years of follow-up, whereas participants who ate more of the Western diet saw no slowing of cognitive decline from the healthy food components.

There was no significant interaction between age, sex, race or education and the association with cognitive decline at either high or low levels of Western diet foods. The study also included models adjusting for smoking status, body mass index and other potential variables such as cardiovascular conditions, and the findings remained the same.

"Western diets may adversely affect cognitive health," Agarwal said. "Individuals who had a high Mediterranean diet score compared to those who had the lowest score were equivalent to being 5.8 years younger in age cognitively."

Agarwal said that the results complement other studies showing that a Mediterranean diet reduces the risk of heart disease, certain cancers and diabetes and also support previous studies on Mediterranean diet and cognition. The study also notes that most of the dietary patterns that have shown improvement in cognitive function among older adults, including the Mediterranean, MIND, and DASH diets, have a unique scoring matrix based on the amount of servings consumed for each diet component.

"The more we can incorporate green leafy vegetables, other vegetables, berries, olive oil, and fish into our diets, the better it is for our aging brains and bodies. Other studies show that red and processed meat, fried food and low whole grains intake are associated with higher inflammation and faster cognitive decline in older ages," Agarwal said. "To benefit from diets such as the Mediterranean diet, or MIND diet, we would have to limit our consumption of processed foods and other unhealthy foods such as fried foods and sweets."

The study and its findings cannot be readily generalized. Future longitudinal studies on diet and cognition among the middle-aged population are needed to extend these findings.

Credit: 
Rush University Medical Center

New findings help explain how COVID-19 overpowers the immune system

image: Portrait of Pinchas Cohen, MD, Dean of the USC Leonard Davis School of Gerontology

Image: 
Stephanie Kleinman

Seeking to understand why COVID-19 is able to suppress the body’s immune response, new research from the USC Leonard Davis School of Gerontology suggests that mitochondria are one of the first lines of defense against COVID-19 and identifies key differences in how SARS-CoV-2, the virus that causes COVID-19, affects mitochondrial genes compared with other viruses. These differences offer possible explanations as to why older adults and people with metabolic dysfunction have more severe responses to COVID-19 than other individuals. They also provide a starting point for more targeted approaches that may help identify therapeutics, says senior author Pinchas Cohen, professor of gerontology, medicine and biological sciences and dean of the USC Leonard Davis School.

“If you already have mitochondrial and metabolic dysfunction, then you may, as a result, have a poor first line of defense against COVID-19. Future work should consider mitochondrial biology as a primary intervention target for SARS-CoV-2 and other coronaviruses,” he said.

The study, published in the Nature journal Scientific Reports, expands on recent findings that COVID-19 mutes the body’s innate inflammatory response and reports that it does so by diverting mitochondrial genes from their normal function.

“We already knew that our immune response was not mounting a successful defense to COVID-19, but we didn't know why,” said lead author Brendan Miller, a senior doctoral student at the USC Leonard Davis School. “What we did differently was look at how the virus specifically targets mitochondria, a cellular organelle that is a crucial part of the body’s innate immune system and energy production.”

Making use of the vast amounts of public data being uploaded in the early days of the virus outbreak, the research team performed RNA sequencing analyses that compared mitochondrial-COVID interactions to those of other viruses: respiratory syncytial virus, seasonal influenza A virus, and human parainfluenza virus 3. These reanalyses identified three ways in which COVID-19, but not the other viruses, mutes the body’s cellular protective response.

Chief among their findings is that SARS-CoV-2 uniquely reduces the levels of a group of mitochondrial proteins, known as Complex One, that are encoded by nuclear DNA. It is possible that this effect "quiets" the cell's metabolic output and generation of reactive oxygen species, which, when functioning correctly, produce an inflammatory response that can kill a virus, they say.

“COVID-19 is reprogramming the cell to not make these Complex One-related proteins. That could be one way the virus continues to propagate,” says Miller, who notes that this, along with the study’s other observations, still needs to be validated in future experiments.

The study also revealed that SARS-CoV-2 does not change the levels of MAVS mRNA, which encodes a messenger protein that usually tells the cell a viral attack has happened. Normally, when this protein is activated it functions as an alarm system, warning the cell to self-destruct so that the virus cannot replicate, says Miller.

In addition, the researchers found that genes encoded by the mitochondria - whose regulation is believed to produce energy that can help the cell evade a virus - were not being turned on or off by SARS-CoV-2 at the rates expected during a viral attack.

“This study adds to a growing body of research on mitochondrial-COVID interactions and presents tissue and cell-specific effects that should be carefully considered in future experiments,” said Cohen.

Credit: 
University of Southern California

Bacteria can tell the time

image: Shining a light on internal clocks - the bacterium Bacillus subtilis

Image: 
Professor Ákos Kovács, Technical University of Denmark

Humans have them, and so do other animals and plants. Now research reveals that bacteria, too, have internal clocks that align with the 24-hour cycle of life on Earth.

The research answers a long-standing biological question and could have implications for the timing of drug delivery, biotechnology, and how we develop timely solutions for crop protection.

Biological clocks, or circadian rhythms, are exquisite internal timing mechanisms that are widespread across nature, enabling living organisms to cope with the major changes that occur from day to night, and even across seasons.

Existing inside cells, these molecular rhythms use external cues such as daylight and temperature to synchronise biological clocks to their environment. It is why we experience the jarring effects of jet lag as our internal clocks are temporarily mismatched before aligning to the new cycle of light and dark at our travel destination.

A growing body of research in the past two decades has demonstrated the importance of these molecular metronomes to essential processes, for example sleep and cognitive functioning in humans, and water regulation and photosynthesis in plants.

Although bacteria represent 12% of the planet's biomass and are important for health, ecology, and industrial biotechnology, little is known about their 24-hour biological clocks.

Previous studies have shown that photosynthetic bacteria, which require light to make energy, have biological clocks. But free-living, non-photosynthetic bacteria have remained a mystery in this regard.

In this international study, researchers detected free-running circadian rhythms in the non-photosynthetic soil bacterium Bacillus subtilis.

The team applied a technique called luciferase reporting, which involves adding an enzyme that produces bioluminescence, allowing researchers to visualise how active a gene is inside an organism.

They focused on two genes: ytvA, which encodes a blue-light photoreceptor, and kinC, which encodes an enzyme involved in inducing the formation of biofilms and spores in the bacterium.

They observed the levels of the genes in constant darkness and in cycles of 12 hours of light and 12 hours of dark. They found that the pattern of ytvA levels was adjusted to the light and dark cycle, with levels increasing during the dark and decreasing in the light. A cycle was still observed in constant darkness.

The researchers observed that it took several days for a stable pattern to appear and that the pattern could be reversed if the conditions were inverted. These two observations are common features of circadian rhythms and of their ability to "entrain" to environmental cues.

They carried out similar experiments using daily temperature changes - for example, increasing the length or strength of the daily cycle - and found that the rhythms of ytvA and kinC adjusted in a way consistent with circadian rhythms, rather than simply switching on and off in response to the temperature.
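The hallmark distinguishing a genuine circadian rhythm from a simple on/off response is an oscillation with a roughly 24-hour period that persists after the external cue is removed. The sketch below is purely illustrative - the simulated signal, sampling interval, and search window are assumptions, not the study's data or analysis code - but it shows one simple way to estimate the period of a noisy reporter time series using autocorrelation.

```python
# Hypothetical sketch: estimating the free-running period of a
# luciferase-style bioluminescence time series via autocorrelation.
# All parameters (120 h recording, 0.5 h sampling, 24 h rhythm) are
# illustrative assumptions, not values from the study.
import math
import random

def simulate_signal(hours=120, step=0.5, period=24.0, noise=0.1, seed=1):
    """Noisy sinusoidal 'reporter' signal sampled every `step` hours."""
    rng = random.Random(seed)
    times = [i * step for i in range(int(hours / step))]
    values = [math.sin(2 * math.pi * t / period) + rng.gauss(0, noise)
              for t in times]
    return times, values

def estimate_period(values, step, min_h=18.0, max_h=30.0):
    """Return the lag (in hours) of the strongest autocorrelation peak
    within a plausible circadian window (18-30 h)."""
    n = len(values)
    mean = sum(values) / n
    d = [v - mean for v in values]
    def autocorr(lag):
        return sum(d[i] * d[i + lag] for i in range(n - lag))
    lags = range(int(min_h / step), int(max_h / step) + 1)
    best_lag = max(lags, key=autocorr)
    return best_lag * step

times, values = simulate_signal()
print(estimate_period(values, step=0.5))  # close to 24 h for a circadian rhythm
```

For a true free-running rhythm, the estimated period stays near 24 hours even when the signal is recorded in constant conditions, whereas a purely driven response loses its periodicity once the light or temperature cue is removed.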

"We've found for the first time that non-photosynthetic bacteria can tell the time," says lead author Professor Martha Merrow, of LMU (Ludwig Maximilians University) Munich. "They adapt their molecular workings to the time of day by reading the cycles in the light or in the temperature environment."

"In addition to medical and ecological questions we wish to use bacteria as a model system to understand circadian clock mechanisms. The lab tools for this bacterium are outstanding and should allow us to make rapid progress," she added.

This research could be used to help address such questions as: is the time of day of bacterial exposure important for infection? Can industrial biotechnological processes be optimised by taking the time of day into account? And is the time of day of anti-bacterial treatment important?

"Our study opens doors to investigate circadian rhythms across bacteria. Now that we have established that bacteria can tell the time we need to find out the processes that cause these rhythms to occur and understand why having a rhythm provides bacteria with an advantage," says author Dr Antony Dodd from the John Innes Centre.

Professor Ákos Kovács, co-author from the Technical University of Denmark, adds: "Bacillus subtilis is used in applications ranging from laundry detergent production to crop protection, and it has recently been exploited as a human and animal probiotic, so engineering a biological clock in this bacterium will open up diverse biotechnological areas."

Credit: 
John Innes Centre