
Additional data on blood thinner efficacy for COVID-19 and insight on best possible regimens

image: Thromboembolic disease is a complication of COVID-19. Prophylactic and therapeutic anticoagulation are associated with better outcomes in hospitalized patients with COVID-19. Randomized controlled trials evaluating different anticoagulation (AC) regimens in COVID-19 are needed.

Image: 
Mount Sinai Health System

Early in the COVID-19 pandemic, Mount Sinai researchers were among the first to show that anticoagulation therapy was associated with improved survival among hospitalized COVID-19 patients. But many questions remained--about the size of the potential benefit, and about what dosage of this therapy might be more effective. Now, the research team has suggested some possible answers, in a paper published in the August 26 online issue of the Journal of the American College of Cardiology.

In this observational study, the researchers found all regimens of anticoagulants--drugs that prevent blood clotting--were far superior to no anticoagulants in COVID-19 patients. More specifically, patients on either a "therapeutic" (full) dose or a "prophylactic" (lower) dose showed about a 50 percent higher chance of survival, and roughly a 30 percent lower chance of intubation, than those not on anticoagulants. The researchers looked at six different anticoagulant regimens, including both oral and intravenous dosing, within the therapeutic and prophylactic groups. They observed that therapeutic and prophylactic subcutaneous low-molecular-weight heparin, and therapeutic oral apixaban, may lead to better results.

"This work from the Mount Sinai COVID Informatics Center provides additional insight on the role of anticoagulation in the management of patients admitted to the hospital with COVID-19. Although this is an observational study, it helped in the design of a large-scale international clinical trial that we are coordinating. The randomized trial focuses on those three antithrombotic regimens--therapeutic and prophylactic subcutaneous low-molecular-weight heparin, and therapeutic oral apixaban," says senior corresponding author Valentin Fuster, MD, PhD, Director of Mount Sinai Heart and Physician-in-Chief of The Mount Sinai Hospital.

This study is an extension of Mount Sinai research that showed that treatment with anticoagulants was associated with improved outcomes both in and out of the intensive care unit among hospitalized COVID-19 patients. The work was prompted by the discovery that many patients hospitalized with COVID-19 developed high levels of life-threatening blood clots.

The team of investigators evaluated electronic medical records of 4,389 confirmed COVID-19-positive patients admitted to five hospitals in the Mount Sinai Health System in New York City (The Mount Sinai Hospital, Mount Sinai West, Mount Sinai Morningside, Mount Sinai Queens, and Mount Sinai Brooklyn) between March 1 and April 30, 2020. They specifically looked at survival and death rates for patients placed on therapeutic and prophylactic doses of blood thinners (oral antithrombotics, subcutaneous heparin, and intravenous heparin) versus those not placed on blood thinners. The researchers used a hazard score to estimate risk of death, adjusting for relevant risk factors--including age, ethnicity, pre-existing conditions, and whether the patient was already on blood thinners--before evaluating the effectiveness of anticoagulation. They also took into account and corrected for markers of disease severity, including low oxygen saturation levels and intubation.

Of the patients analyzed, 900 (20.5 percent) received a full-treatment dose of anticoagulants. Another 1,959 patients (44.6 percent) received a lower, prophylactic dose of anticoagulants, and 1,530 (34.9 percent) were not given blood thinners. There was a strong association between blood thinners and reduced likelihood of in-hospital deaths: both therapeutic and prophylactic doses of anticoagulants reduced mortality by roughly 50 percent compared to patients on no blood thinners.
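The reported group shares follow directly from the raw counts. As a quick sketch (the counts come from the article; nothing here is from the study's own analysis code), recomputing the percentages:

```python
# Recompute each treatment group's share of the cohort from the raw counts
# reported in the article (4,389 hospitalized COVID-19-positive patients).
total = 4389

groups = {
    "therapeutic": 900,    # full-treatment dose of anticoagulants
    "prophylactic": 1959,  # lower, preventive dose
    "none": 1530,          # no blood thinners
}

# The three groups should partition the whole cohort.
assert sum(groups.values()) == total

for name, n in groups.items():
    print(f"{name}: {n} ({100 * n / total:.1f}%)")
```

This yields 20.5 percent, 44.6 percent, and 34.9 percent, respectively.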

Overall, 467 (10.6 percent) of the patients required intubation and mechanical ventilation during their hospitalization. Those on therapeutic blood thinners had 31 percent fewer intubations than those not on blood thinners, while those on prophylactic blood thinners had 28 percent fewer.

Bleeding rates--a known complication of blood thinners--were surprisingly low overall among all patients (three percent or less), but slightly higher in the therapeutic group compared to the prophylactic and no-blood-thinner groups, the researchers said. Their findings suggest that clinicians should evaluate patients on an individual basis given the benefit-risk tradeoff.

Separately, the researchers looked at autopsy results of 26 COVID-19 patients and found that 11 of them (42 percent) had blood clots--pulmonary, brain, and/or heart--that were never suspected in the clinical setting. These findings suggest that treating patients with anticoagulants may be associated with improved survival.

"This report is much more in-depth than our previous brief report and includes many more patients, longer follow-up, and rigorous methodology. Clearly, anticoagulation is associated with improved outcomes and bleeding rates appear to be low," says corresponding author Anu Lala, MD, Assistant Professor of Medicine (Cardiology) and Director of Heart Failure Research at the Icahn School of Medicine at Mount Sinai. "As a clinician who has treated COVID-19 patients on the front lines, I recognize the importance of having answers as to what the best treatment for these patients entails, and these results will inform the design of clinical trials to ultimately give concrete information."

"These observational analyses were done with the highest level of statistical rigor and provide exciting insights into the association of anticoagulation with critical in-hospital outcomes of mortality and intubation," says first author Girish Nadkarni, MD, Co-Founder and Co-Director of the Mount Sinai COVID Informatics Center and Clinical Director of the Hasso Plattner Institute for Digital Health at Mount Sinai. "We are excited that results from this observational study in one of the largest and most diverse hospitalized populations have led to an ongoing trial of type, duration, and doses of anticoagulation. Ultimately we hope this work will lead to improved outcomes and treatment for COVID-19 patients."

"This work highlights the need to better understand the disease from a diagnostic and therapeutic point of view and the importance of conducting properly designed diagnostic and interventional studies," explains co-author Zahi Fayad, PhD, Co-Founder of the Mount Sinai COVID Informatics Center and Director of Mount Sinai's BioMedical Engineering and Imaging Institute.

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

Genetic causes of severe childhood brain disorders found using new computational methods

Philadelphia, August 26, 2020 - A team of researchers at Children's Hospital of Philadelphia (CHOP) affiliated with the CHOP Epilepsy Neurogenetics Initiative (ENGIN) has combined clinical information with large-scale genomic data to successfully link characteristic presentations of childhood epilepsies with specific genetic variants. The findings were published today in the American Journal of Human Genetics.

Developmental and Epileptic Encephalopathies (DEE), a group of severe brain disorders that can cause difficult-to-treat seizures, cognitive and neurological impairment, and, in some cases, early death, are known to have more than 100 underlying genetic causes. However, matching characteristic clinical features and outcomes with specific genetic mutations can be especially daunting given the large number of genetic causes, each of which is very rare.

When genetic information is collected, a person's phenotype - or clinical features - is typically also documented. However, while genetic information is collected in a standardized manner, the same is not true when describing clinical symptoms, which makes it difficult to pinpoint whether certain genetic mutations are responsible for specific clinical features.

Building upon their previous work, researchers from CHOP utilized the Human Phenotype Ontology (HPO), which provides a standardized format to characterize a patient's phenotypic features and allows clinical data to be used at a similar level as genetic data.

"For this study, we used phenotypic and genetic information that had been collected in several important cohorts for more than a decade," said Ingo Helbig, MD, attending physician at ENGIN, director of the genomic and data science core of ENGIN and lead investigator of the study. "In this study alone, we found associations of 11 genetic causes with specific phenotypes. Without methods to systematically analyze clinical data, we could not have possibly done this previously, even with this robust cohort of patients."

In total, the study team analyzed 31,742 HPO terms in 846 patients with existing whole exome sequencing data. Some examples of causative genes in DEE identified in this study were SCN1A, which was associated with complex febrile seizures and focal clonic seizures; STXBP1, which was associated with absent speech; and SLC6A1, which was associated with EEG with generalized slow activity. In total, variants in 41 genes were present in at least two individuals, and 11 of those genes showed significant phenotypic similarity among the patients carrying them. Using a statistical analysis, the researchers showed that this was more than would be expected by chance.
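The core idea - testing whether patients who share a causative gene have more similar HPO-coded phenotypes than random patient groupings - can be sketched roughly as follows. The patients, HPO term codes, Jaccard similarity measure, and permutation test below are illustrative assumptions, not the study's actual method or data:

```python
import itertools
import random

# Toy sketch: do patients sharing a variant in one gene have more similar
# HPO phenotype profiles than randomly drawn patients? All data invented.
patients = {
    "P1": {"HP:A", "HP:B"},           # e.g. febrile + focal clonic seizures
    "P2": {"HP:A", "HP:B", "HP:C"},
    "P3": {"HP:D"},                   # e.g. absent speech
    "P4": {"HP:D", "HP:C"},
    "P5": {"HP:E"},                   # e.g. EEG generalized slow activity
    "P6": {"HP:C"},
}
gene_carriers = ["P1", "P2"]  # patients sharing a variant in one gene

def jaccard(a, b):
    # Fraction of HPO terms the two patients share.
    return len(a & b) / len(a | b)

def mean_pairwise_sim(ids):
    pairs = list(itertools.combinations(ids, 2))
    return sum(jaccard(patients[x], patients[y]) for x, y in pairs) / len(pairs)

observed = mean_pairwise_sim(gene_carriers)

# Permutation test: how often does a random group of the same size look
# at least as phenotypically similar as the actual carriers?
random.seed(0)
n_perm = 10_000
hits = sum(
    mean_pairwise_sim(random.sample(list(patients), len(gene_carriers))) >= observed
    for _ in range(n_perm)
)
p_value = hits / n_perm
print(f"observed similarity {observed:.2f}, permutation p = {p_value:.3f}")
```

A small p-value indicates the shared-gene group clusters phenotypically more than chance would allow, which is the flavor of evidence the study reports for 11 genes.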

"Traditionally, many of the genetic epilepsies that we now develop treatments for were described because of a specific set of clinical features that stood out. However, this type of traditional description of new diseases requires patients to be seen by the same provider or within the same center. What we have done with this study is re-engineered the cognitive process that goes on when clinicians discover a new syndrome," Helbig said. "We have developed a computational mechanism to replicate this type of discovery from large, de-identified clinical data. As the amount of deep phenotypic data available to us increases, we now have the ability to identify novel genetic causes of particularly severe forms of epilepsy that are targets for new treatments."

Credit: 
Children's Hospital of Philadelphia

Difficult, complex decisions underpin the future of the world's coral reefs

video: Around the world, climate change is triggering more frequent and severe coral bleaching events, which kill coral. Global emissions reduction remains the most important action to minimise the impact of climate change on the Great Barrier Reef. However, with average global temperatures already 1°C above pre-industrial levels, emissions reduction is no longer enough to guarantee survival of the Great Barrier Reef as we know it. Modelling shows that even in the best-case scenario of carbon emissions reduction, water temperatures will continue to increase until 2050, outpacing corals' capacity to naturally adapt. In addition to best-practice reef management, and global action to reduce carbon emissions, bold action is urgently needed to help protect the Great Barrier Reef. Successful intervention is possible and could double the likelihood of sustaining the Reef in good condition by 2050, according to the world's most rigorous and comprehensive investigation into medium- and large-scale reef intervention. The Reef Restoration and Adaptation Program (RRAP) Concept Feasibility Study was conducted by a partnership of leading Australian experts including scientists, engineers, modellers and economists. RRAP is now embarking on a long-term research and development (R&D) program to rigorously develop, test and risk-assess novel interventions to help keep the Reef resilient and sustain critical functions and values. This ambitious undertaking will require not only our best minds working in partnership across many organisations and fields of expertise, but importantly, the input and support of Traditional Owners, reef communities and industries and the wider Australian public. The aim is to provide reef managers and decision-makers with an innovative toolkit of safe, acceptable, cost-effective interventions to help protect the Great Barrier Reef from the impacts of climate change. 
The toolkit would allow for an integrated three-point approach to helping protect the Reef:
o cooling and shading to help protect the Reef from the impacts of climate change
o assisting reef coral species to evolve and adapt to the changing environment, to minimise the need for ongoing intervention
o supporting natural restoration of damaged and degraded reefs.
The interventions would be implemented at an effective scale if, when and where it was decided action was needed. The RRAP R&D Program aims to achieve the best outcomes under a wide range of possible climate change scenarios. A 1:40 animation.
For more information: www.GBRrestoration.org

Image: 
Reef Restoration and Adaptation Program

Effective solutions to the climate challenge threatening the world's coral reefs require complex decisions about risk and uncertainty, timing, quality versus quantity as well as which species to support for the most robust and productive future, according to a science paper released today.

The paper, "Interventions to help coral reefs under global change - a complex decision challenge", by a group of key scientists from Australia's Reef Restoration and Adaptation Program (RRAP), was today published in PLOS ONE.

The paper warns that while best-practice conventional management is essential, it is unlikely to be enough to sustain coral reefs under continued climate change. Nor is reducing emissions of greenhouse gases, on its own, sufficient any longer.

Lead author - marine biologist and decision scientist Dr Ken Anthony, of the Australian Institute of Marine Science (AIMS) - said that even with strong action to reduce carbon emissions, global temperatures could stay elevated for decades.

"Coordinated, novel interventions will most likely be needed - combined with best-practice conventional reef management and reduced carbon emissions - to help the Reef become resilient in the face of climate change," he said.

"Developing new technologies for environmental management and conservation carries some risks but delaying action represents a lost opportunity to sustain the Reef in the best condition possible."

Such interventions include local and regional cooling and shading technologies such as brightening clouds to reflect sunlight and shade the reef, assisting the natural evolution of corals to increase their resilience to the changing environment, and measures to support and enhance the natural recovery of damaged reefs.

The paper draws parallels between the risk assessment of coral reef interventions and that of driverless cars and new drugs. It outlines the prioritisation challenges and the trade-offs that need to be weighed.

"For example, should we aim to sustain minimal coral cover over a very large area of the reef or moderate coral cover over a smaller area?" he said.

"While the net result of coral area sustained may be the same, it could produce very different ecological outcomes and values for industries like tourism.

"Spreading efforts thinly could reduce the Reef's capacity to sustain critical ecological functions, while concentrating efforts on a selection of just a few reefs could sustain most of the Reef's tourism industry, which is spatially concentrated.

"But under severe climate change, preserving more coral cover in smaller areas could reduce the Great Barrier Reef to a fragmented (and therefore vulnerable) network of coral oases in an otherwise desolate seascape."

Dr Anthony said prioritising the coral species to be supported by adaptation and restoration measures added to the decision challenge for reef restoration and adaptation.

"Without significant climate mitigation, sensitive coral species will give way to naturally hardier ones, or to species that can adapt faster," he said.

"Picking who should be winners, and ultimately who will be losers under continued but uncertain climate change is perhaps the biggest challenge facing R&D programs tasked with developing reef rescue interventions."

Co-author and AIMS CEO Dr Paul Hardisty said how interventions were chosen and progressed for research and development would determine what options were available for reef managers and when.

"Ultimately, we need to consider what society wants, what can be achieved and what opportunities we have for action in a rapidly closing window," he said.

"It will require exceptional coordination of science, management and policy, and open engagement with the Traditional Owners and the general public. It will also require compromise, because reefs will change under climate change despite our best interventions."

RRAP is a partnership of organisations working together to create an innovative toolkit of safe, acceptable, large-scale interventions to help the Reef resist, adapt to, and recover from the impacts of climate change.

In April, the Australian Government announced that an initial $150M would be invested in the RRAP R&D Program following endorsement of a two-year feasibility study. Of this, $100M is through the $443.3 million Great Barrier Reef Foundation - Reef Trust Partnership with a further $50M in research and scientific contributions from the program partners.

Dr Hardisty said RRAP aimed to research and develop new methods for management quickly and safely.

"We need to be expediently trialling promising interventions now, whatever emissions trajectory the world follows," he said.

"In this paper we offer a conceptual model to help reef managers frame decision problems and objectives, and to guide effective strategy choices in the face of complexity and uncertainty."

Credit: 
Australian Institute of Marine Science

How plants shut the door on infection

Plants have a unique ability to safeguard themselves against pathogens by closing their pores--but until now, no one knew quite how they did it. Scientists have known that a flood of calcium into the cells surrounding the pores triggers them to close, but how the calcium entered the cells was unclear.

A new study by an international team including University of Maryland scientists reveals that a protein called OSCA1.3 forms a channel that leaks calcium into the cells surrounding a plant's pores, and they determined that a known immune system protein triggers the process.

The findings are a major step toward understanding the defense mechanisms plants use to resist infection, which could eventually lead to healthier, more resistant and more productive crops. The research paper was published on August 26, 2020 in the journal Nature.

"This is a major advance, because a substantial part of the world's food generated by agriculture is lost to pathogens, and we now know the molecular mechanism behind one of the first and most relevant signals for plant immune response to pathogens--the calcium burst after infection," said José Feijó, a professor of cell biology and molecular genetics at UMD and co-author of the study. "Finding the mechanism associated with this calcium channel allows further research into its regulation, which will improve our understanding of the way in which the channel activity modulates and, eventually, boosts the immune reaction of plants to pathogens."

Plant pores--called stomata--are encircled by two guard cells, which respond to calcium signals that tell the cells to expand or contract and trigger innate immune signals, initiating the plant's defense response. Because calcium cannot pass directly through the guard cell membranes, scientists knew a calcium channel had to be at work. But they didn't know which protein acted as the calcium channel.

To find this protein, the study's lead author, Cyril Zipfel, a professor of molecular and cellular plant physiology at the University of Zurich and Senior Group Leader at The Sainsbury Laboratory in Norwich, searched for proteins that would be modified by another protein named BIK1, which genetic studies and bioassays identified as a necessary component of the immune calcium response in plants.

When exposed to BIK1, one protein called OSCA1.3 transformed in a very specific way that suggested it could be a calcium channel for plants. OSCA1.3 is a member of a widespread family of proteins known to exist as ion channels in many organisms, including humans, and it seems to be specifically activated upon detection of pathogens.

To determine if OSCA1.3 was, in fact, the calcium channel they were looking for, Zipfel's team reached out to Feijó, who is well known for identifying and characterizing novel ion channels and signaling mechanisms in plants. Erwan Michard, a visiting assistant research scientist in Feijó's lab and co-author of the paper, conducted experiments that revealed BIK1 triggers OSCA1.3 to open up a calcium channel into a cell and also explained the mechanism for how it happens.

BIK1 only activates when a plant gets infected with a pathogen, which suggests that OSCA1.3 opens a calcium channel to close stomata as a defensive, immune system response to pathogens.

"This is a perfect example of how a collaborative effort between labs with different expertise can bring about important conclusions that would be difficult on solo efforts," Feijó said. "This fundamental knowledge is badly needed to inform ecology and agriculture on how the biome will react to the climatic changes that our planet is going through."

Feijó will now incorporate this new knowledge of the OSCA1.3 calcium channel into other areas of research in his lab, which is working to understand how the mineral calcium was co-opted through evolution by all living organisms to serve as a signaling device for information about stressors from infection to climate change.

"Despite the physiological and ecological relevance of stomatal closure, the identity of some of the key components mediating this closure were still unknown," Zipfel said. "The identification of OSCA1.3 now fills one of these important gaps. In the context of plant immunity this work is particularly apt in 2020, the UN International Year of Plant Health."

Credit: 
University of Maryland

Domesticated chickens have smaller brains

image: Researchers from Linköping University suggest a process by which the timid junglefowl from the rain forest could have become today's domesticated chicken. When the scientists selectively bred the junglefowl with least fear of humans for 10 generations, the offspring acquired smaller brains and found it easier to become accustomed to frightening but non-hazardous events. The results shed new light on how domestication may have changed animals so much in a relatively short time.

Image: 
Per Jensen

Researchers from Linköping University suggest a process by which the timid junglefowl from the rain forest could have become today's domesticated chicken. When the scientists selectively bred the junglefowl with least fear of humans for 10 generations, the offspring acquired smaller brains and found it easier to become accustomed to frightening but non-hazardous events. The results shed new light on how domestication may have changed animals so much in a relatively short time.

Chickens are the most common birds on Earth. There are currently more than 20 billion individuals on the planet. All of them have come from the Red Junglefowl, originally found in south-east Asia. This species was tamed and domesticated by humans approximately 10,000 years ago. The results of the current study show that when our ancestors selected the tamest individuals for breeding, they may at the same time have unconsciously selected birds with a different brain - one that may have been more suitable for a life among humans. The findings are published in Royal Society Open Science.

Researchers Rebecca Katajamaa and Per Jensen started with a group of wild Red Junglefowl and selected as parents the birds that showed least fear of humans in a standard test. The breeding experiment was conducted for 10 generations. The birds that showed greatest fear of humans were placed into a second group. The researchers believe that they have in this way imitated the factor that must have been the most important during early domestication, namely that it was possible to tame the animals.

A somewhat unexpected result of the breeding was that the brains of the domesticated birds gradually became smaller relative to body size, which mirrors what has happened to modern domesticated chickens during the domestication process. The change was particularly pronounced in the brain stem, a primitive part of the brain that is involved in, among other things, certain stress reactions. The brain stem was relatively smaller in animals that were not overly timid.

The scientists carried out two behavioural experiments, to determine whether the difference in brain size and composition affected the ability of the fowl to learn. One test investigated how rapidly the birds became accustomed to something that could be experienced as frightening, but which was actually non-hazardous, in this case a flashing light. The tame birds became accustomed and stopped reacting to the stimulus significantly more rapidly.

"We believe that the ability to become accustomed rapidly is beneficial for the birds that are to live among humans, where events that are unknown and frightening, but not dangerous, are part of everyday life", says Rebecca Katajamaa, doctoral student in the Department of Physics, Chemistry and Biology at Linköping University.

The researchers also investigated whether the birds differed in the ability to learn to associate two things with each other, such as coupling a certain pattern with food. This process is known as "associative learning". However, they found no differences between the two groups.

It is not possible to say whether the differences in behaviour shown in the study are directly connected with the differences in brain size and composition. The researchers plan to investigate this in more detail.

"Our study not only sheds light on a possible process by which chickens - and possibly other species - become domesticated. It may also give new insight into how the structure of the brain is connected with differences in behaviour between individuals and species", says Per Jensen, professor in the Department of Physics, Chemistry and Biology at Linköping University.

Credit: 
Linköping University

Too many COVID-19 patients get unneeded 'just in case' antibiotics

More than half of patients hospitalized with suspected COVID-19 in Michigan during the state's peak months received antibiotics soon after they arrived, just in case they had a bacterial infection in addition to the virus, a new study shows. But testing soon showed that 96.5% of them only had the coronavirus, which antibiotics don't affect.

The 3.5% of patients who arrived at the hospital with both kinds of infection were more likely to die. But the study suggests that faster testing and understanding of infection risk factors could help hospital teams figure out who those patients are - and spare the rest of their COVID-19 patients the risks that come with overuse of antibiotics.

The new paper, published in Clinical Infectious Diseases by a team from the University of Michigan, VA Ann Arbor Healthcare System and St. Joseph Mercy Health Care System, is based on data from more than 1,700 hospitalized patients.

The data came from 38 hospitals taking part in a massive statewide effort called Mi-COVID19 that launched within weeks of the first case of COVID-19 being diagnosed in Michigan on March 10. It leverages the power of multiple quality improvement efforts sponsored by Blue Cross Blue Shield of Michigan.

During March and April, Michigan was one of the nation's early hotspot states, and the authors hope the new data will help patient care teams in current and future hotspots. Inpatient COVID-19 treatment guidelines shared by Michigan Medicine, U-M's academic medical center, have been updated based on these results.

Variation and change

In addition to widespread overuse of antibiotics, the study shows that hospitals varied widely in their use of antibiotics among people newly hospitalized for suspected COVID-19. In some, only a quarter of suspected COVID-19 patients received them within two days of being hospitalized, while in others, nearly all did.

As time went on, and COVID-19 test turnaround time shortened, the use of antibiotics dropped - but was still too high, says Valerie Vaughn, M.D., M.Sc., the study's lead author and a hospitalist physician who helped launch Michigan Medicine's COVID-19 intensive care units.

"For every patient who eventually tested positive for both SARS-CoV-2 and a co-occurring bacterial infection that was present on their arrival, 20 other patients received antibiotics but turned out not to need them," says Vaughn. "These data show the crucial importance of early and appropriate testing, with rapid turnaround, to ensure appropriate use of antibiotics and reduce unneeded harm."

In addition to putting patients at risk of opportunistic infections like Clostridium difficile that can worsen their odds of recovery, antibiotics also pose a broader risk of feeding the epidemic of drug-resistant bacteria that already plagues many hospitals and can put patients and staff at risk.

Massive data

The new study wouldn't have been possible without the Mi-COVID19 registry, which includes detailed data from pre-, post- and in-hospital care on COVID-19 patients treated in hospitals of all sizes and kinds across Michigan.

Mi-COVID19 draws on the network of trained data-harvesting nurses and other staff, and physician partners, who before the pandemic focused on studying and improving care for hospitalized patients through a type of organization called a collaborative quality initiative, or CQI.

The Mi-COVID19 effort is based in a CQI called the Michigan Hospital Medicine Safety Consortium, working in partnership with 11 other CQIs all sponsored by BCBSM. Additional publications about COVID-19 care are now being prepared that will draw on the data generated by the partnership.

Older people, people who had come to the hospital from a nursing home, and people who were admitted straight to intensive care were more likely to turn out to have a bacterial infection in addition to coronavirus. Half of these patients died, compared with 18% of those without bacterial infections.

Those who received antibiotics were more likely to be older, to have lower body mass index measurements, to have visible signs of infection on their chest X-ray, and to be in more critical condition when they arrived at the hospital.

The importance of rapid and appropriate testing

Vaughn, who has studied and worked to improve antibiotic prescribing for hospitalized pneumonia patients, notes that COVID-19 differs in important ways from regular pneumonia, so standard "antibiotic stewardship" techniques may not work.

For instance, many suspected COVID-19 patients had their blood tested soon after admission to the hospital to look for a substance called procalcitonin, which is often used as an early indicator of bacterial infection while doctors wait for more definitive test results.

Just over half of those who turned out to have a bacterial infection plus COVID-19 had a high procalcitonin reading. But so did 22% of those who didn't have bacterial infections. However, a low procalcitonin reading was almost certain to mean that the person didn't have a bacterial infection.
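Plugging the article's approximate figures into Bayes' rule shows why a low reading is so reassuring despite the test's mediocre sensitivity. The exact sensitivity ("just over half") is rounded to 0.5 here as an assumption; nothing below comes from the study's own calculations:

```python
# Back-of-envelope Bayes calculation with the article's approximate figures:
# ~3.5% of patients had a bacterial co-infection on arrival, ~50% of those
# had a high procalcitonin reading (sensitivity), and 22% of patients
# without bacterial infection also read high (false-positive rate).
prevalence = 0.035
sensitivity = 0.50
false_pos_rate = 0.22
specificity = 1 - false_pos_rate

# Positive predictive value: P(bacterial | high reading)
p_high = sensitivity * prevalence + false_pos_rate * (1 - prevalence)
ppv = sensitivity * prevalence / p_high

# Negative predictive value: P(no bacterial | low reading)
npv = specificity * (1 - prevalence) / (
    specificity * (1 - prevalence) + (1 - sensitivity) * prevalence
)

print(f"P(bacterial | high procalcitonin) ~ {ppv:.0%}")
print(f"P(no bacterial | low procalcitonin) ~ {npv:.0%}")
```

With these rough inputs, a high reading only raises the probability of bacterial co-infection to about 8 percent, while a low reading pushes it below 3 percent, matching the article's observation that a low procalcitonin was almost certain to mean no bacterial infection.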

By contrast, elevated white blood cell counts were a good predictor of who had a bacterial infection.

The faster patients got their COVID-19 viral test results back, the faster their antibiotics were stopped. Half were stopped within a day of a positive coronavirus test. The turnaround time for such tests decreased over time, with 89% getting their results within a day in May compared with 54% in March.

The vast majority of the patients tested for bacterial infections didn't have tests that look in the respiratory tract. This may be because these tests require health workers to interact with patients' airways -- which can generate aerosols and risk transmitting coronavirus -- or because they require a sample of coughed-up sputum, which most of the patients didn't have because of the 'dry cough' that typifies COVID-19.

"Since their SARS-Cov2 infection explains their symptoms, we should all be more judicious with prescribing antibiotics unless we see signs of a bacterial infection," says Vaughn. "We need better guidance to help clinicians figure out if the cause of a rapid decline in condition is due to cytokine storm or bacterial infection, and better antibiotic stewardship programs to support physicians in determining if they need to order antibiotics and if so, for how long and with what tests for bacterial infection."

The study actually undercounts the percentage of patients who received antibiotics, Vaughn notes, because it left out those who received azithromycin. For a time that powerful drug was seen as promising for COVID-19 patients, in combination with hydroxychloroquine, though it has since been shown to be ineffective or even potentially harmful.

Patients who were transferred to another hospital as part of their initial COVID-19 stay were also omitted from the analysis.

Credit: 
Michigan Medicine - University of Michigan

How plants close their gates when microbes attack

Like humans, plants protect themselves against pathogens. An international consortium under the lead of UZH professor Cyril Zipfel has now identified a long sought-after component of this plant immune system: a calcium channel that triggers the closure of stomata upon contact with microbes such as bacteria. This innate defense mechanism could help in engineering crop plants that are resistant to pathogens.

Each plant leaf has hundreds of tiny pores that enable the exchange of gases with the environment. By inhaling CO2 and releasing oxygen and water vapor, these stomata are essential for photosynthesis, the survival of plants - and ultimately all life on this planet. The size of the openings is dynamically controlled to allow plants to adapt to changing conditions like sunlight, drought and rain. The opening and closing is facilitated through the swelling and shrinking of two so-called guard cells that form a ring-like border around the pore.

Plants can defend themselves

Plant researchers have long known that leaves also batten down the hatches when they encounter potentially pathogenic microbes. This reaction is part of the innate plant immune system: Receptors at the surface of plant cells recognize typical structures of microbes such as parts of bacterial flagella. This leads to a series of reactions that ultimately block microbe entry and multiplication. One of these responses is stomatal closing, which is equivalent to closing the gates to the pathogens.

Long search for a missing link

However, the mechanism behind this microbe-induced closure of stomata remained largely unexplained, although it was shown that a rapid influx of calcium ions into the guard cells triggered the reaction. "The identity of the channels that mediate this rapid calcium movement was still unknown and has vexed researchers for a long time," says Cyril Zipfel, professor of molecular and cellular plant physiology at the University of Zurich and senior group leader at the Sainsbury Laboratory in Norwich, UK. After six years of research, he has now published a study that closes this gap and identifies the relevant calcium channel in the model plant Arabidopsis. As well as Zipfel's team, several international research groups contributed to the results.

Microbes trigger channel opening

The decisive clue was that the identified channel protein, OSCA1.3 - with a hitherto unknown function - was modified by an important component of the plant immune system. This modification leads to the opening of the OSCA1.3 channel, the influx of calcium ions into the guard cells and the closing of the stomata. Zipfel's team was able to show that this reaction was specifically initiated when Arabidopsis plants were brought into contact with parts of bacterial flagella - one of the microbial triggers of the plant immune system.

Specific to immune response

The researchers confirmed this result by introducing several genetic mutations that abolished the function of the OSCA1.3 calcium channel. In these mutated plants, the microbial trigger did not lead to the closing of the pores. Further experiments showed that the channel is also not activated by drought and salinity, other environmental factors that induce the closure of stomata. "This finding reveals the first plant calcium channel with a role in stomatal closure," says Zipfel. "Interestingly, this channel seems specific to plant immunity." He therefore speculates that other plant calcium channels from the same family may respond specifically to other stress factors like drought. This will be the topic of future research.

"Obviously, this channel is involved in an important immune response in plants," says Zipfel. "These findings therefore have the potential to help with the engineering of pathogen-resistant crops." Under real and significant threat by pathogens the plants could then close the gates that would normally allow dangerous microbes to enter into their tissues.

Credit: 
University of Zurich

Corona pandemic: What dashboards do not show

How can the course of the corona pandemic and its effects be illustrated? In recent months, dashboards - interactive, graphically depicted online summaries - have become the new norm of displaying infection rates, deaths and patterns of spread. This is problematic, as geographer Professor Jonathan Everts at Martin Luther University Halle-Wittenberg (MLU) writes in a commentary for the journal Dialogues in Human Geography. He criticises the way the programmes are handled and explains which aspects of the pandemic they are not taking into account.

Dashboards are computer programmes that compile various data, information and statistics about a topic and present them graphically and as concisely as possible. These can be simple figures, diagrams or enriched maps. During the corona pandemic, the "COVID-19 Map" produced by Johns Hopkins University in the U.S. has been held up as the standard. "A dashboard always suggests that you are getting a summary of all the important data," says Jonathan Everts. The data are updated in almost real time and can be viewed by everyone. This is problematic, explains Everts, because these tools are no longer used only by health authorities, but by many people around the world.

Everts says that dashboards often lack a clear explanation, whether precise or general, of how these figures are compiled. "These values are actually way too complex to be used like this. This leads to overly simplified explanations for very complex phenomena," he criticises.

One example of this is mortality rates, which can vary dramatically from region to region. "These differences cannot be explained solely by the health and prevention measures taken locally. However, dashboards suggest that they can be depicted geographically," says Everts. In order to understand the causes better, there needs to be a differentiation of regional and demographic features, but this usually is not done in dashboards. For example, they do not provide any information about the social groups and places where the virus is spreading particularly fast at a local level. However, this information is necessary if appropriate measures to contain the virus are to be taken, says Everts.

Focusing on individual indicators, such as declining case numbers, could quickly create the false impression that the crisis will soon be over. Another possible side effect is that the potential long-term negative effects of the pandemic, and of the measures taken to control it, might go unnoticed: "There is major concern for countries that now face problems in the future as a result of the pandemic. These include countries in Africa, where vaccination campaigns have been interrupted for long periods due to curfews and social distancing. This will create serious problems in the coming years," says Everts. The fact that people might miss routine appointments with physicians out of fear of contracting the new coronavirus could also lead to problems in the medium term; for example, high blood pressure or heart attacks might be diagnosed too late. Thus, says Everts, social inequalities remain hidden, which could be further aggravated - or even introduced - by the pandemic.

The human geographer, who also did research on the H1N1 swine flu pandemic of 2009-2010, advocates for a more differentiated and cautious approach to indicators and dashboards in general. "There are two parts to every pandemic crisis: One is the spread of the pathogen around the world, the other is the way society deals with it," he says. There also needs to be a critical, balanced examination of the old problems as well as the new problems created by the pandemic.

Credit: 
Martin-Luther-Universität Halle-Wittenberg

Novel alkaline hydrogel advances skin wound care

image: A new method that requires no specialized equipment and can be performed at room temperature to produce an alkaline hydrogel in five minutes, allowing its easy implementation in any medical practice for superior wound healing.

Image: 
Tokyo University of Science

With an increase in the elderly and aging population and also in the number of invasive surgeries, wound healing has become a critical focus area in medicine. The complex bodily processes involved in wound healing make it challenging as well as rewarding to identify newer methods and materials for effective wound healing. Now, in a new study, published in Polymers for Advanced Technologies, led by undergraduate student (yes, you read that right) Ryota Teshima, researchers from Tokyo University of Science, Japan, have developed a groundbreaking novel material with possible applications in wound healing. But exactly why is this new material so exciting?

It is important to create an optimal physiological environment around a wound to promote the growth of new cells. Recent research has revealed that a type of material called "hydrogel" is exceptionally useful for achieving such conditions given its molecular structure. Hydrogels are three-dimensionally cross-linked networks of polymers that can absorb more than 95% of their volume in water. Hydrogels with natural polymers have excellent compatibility with the biological conditions of our skin and tissues (referred to as "biocompatibility"), can absorb fluids from the wound, and continuously provide moisture into the wound, creating a highly suitable environment for the wound to heal.

One such natural polymer that is used in hydrogels for wound dressing is alginate, a carbohydrate derived from seaweed, and therefore, abundantly available. Alginate gels are very easy to prepare, but gelation occurs quickly, making it difficult to control the gelation time. Although methods to achieve this control have previously been reported, ensuring short gelation time while maintaining transparency results in hydrogels with a slightly acidic (4-6) or neutral pH. Slightly acidic conditions were, until recently, believed to be beneficial for wound healing, but newer research has found that a slightly alkaline pH (8-8.5) is better for promoting the growth of "skin healing" cells such as fibroblasts and keratinocytes.

This is the context that shaped the characteristics of the next level alginate hydrogel production method that Mr Teshima and his team developed. He summarizes their breakthrough: "We have succeeded in preparing a novel alkaline alginate hydrogel (pH 8.38-8.57) suitable for wound healing via a method that requires no special equipment and can be carried out at room temperature. This, in addition to the fact that the hydrogel forms in 5 minutes, makes it ideal for potential use in any medical practice anywhere for superior wound healing."

Their method involves mixing calcium carbonate and potassium alginate, and then adding carbonated water to this mixture and letting the "gelation" (gel formation) process take place. In this method, the pH of the gel shifts to alkaline because the carbon dioxide volatilizes after gelation. This also ensures transparency of the gel, which in turn allows the visual assessment of wounds and helps in easily ascertaining the progress of healing. Also, regardless of the amounts of ingredients used, the resultant hydrogels have extremely high water content--up to 99%.

When the team placed their hydrogel in physiological saline solution, it passed the test for another critical requisite for a wound dressing: the potential to absorb exudates from the wound. And while the hydrogel did become structurally weak and could not be lifted with tweezers after a week of immersion, it retained its shape.

Speaking about the motivation behind this exciting study, Mr Teshima says, "I have been experimenting with alginate gels ever since junior high school. There was also increasing interest in regenerative medicine when I was growing up, which compelled me to focus on the creation of useful biocompatible materials that can be used in medical therapy." Well, there's no denying that this novel hydrogel developed by Mr Teshima's team shows immense potential for near-future application to wound healing in medicine.

Hopeful of even more potential applications of their method in medicine beyond wound healing, Mr Teshima says, "In the future, if it is possible to control the sustained release of an effective drug held inside it, this novel hydrogel can be used as a drug carrier as well."

For now, the next step is to assess its viability and effectiveness in living cells and animal models. When that is done, Mr Teshima's Japan, and subsequently, the world, can be made a better place.

Credit: 
Tokyo University of Science

A ribosome odyssey in mitochondria

video: Structure of ciliate mitoribosome provides new insights into the diversity of translation and its evolution.

Image: 
Victor Tobiasson

Proteins make life, and proteins are made by ribosomes. In mitochondria, the repertoire of mitoribosomal architectures turns out to be much more diverse than previously thought.

A paper published in eLife by the Alexey Amunts lab reports extraction of the mitoribosome from a ciliated protozoan and its reconstruction using cryo-EM. The ciliate mitoribosome differs substantially in structure, with the reconstruction revealing a 4.0-MDa complex of 94 proteins. The high resolution of the reconstruction allowed the identification of nine novel proteins encoded in the mitochondrial genome.

Not only does the compositional complexity of the ciliate mitoribosome rival that of the human mitoribosome; it also provides a possible evolutionary intermediate that explains how translation in mitochondria has evolved. A particularly surprising feature is that a single functional protein, uS3m, is encoded by three complementary genes from the nucleus and mitochondrion, establishing a link between genetic drift and mitochondrial translation.

Among the functional characteristics, the analysis revealed a mitochondria-specific protein, mL105, at the exit tunnel, which may constitute an intrinsic protein-targeting system in mitochondria through possible recruitment of the newly synthesized polypeptide.

The exploration of the ciliate mitoribosome structure gauges the full extent of mitochondrial structural and functional complexity and identifies potential evolutionary trends. The results emphasize the power of the cryo-EM based analysis of mitochondria in revealing novel proteins in different eukaryotic lineages.

Credit: 
Science For Life Laboratory

COVID-19 -- Scenarios for the post-lockdown period in Italy

image: The comparative analysis of data and model results for hospitalizations in 107 Italian provinces as of May 1, 2020 is supported by: (a) a sketch of the Italian regions; (b, c) the prevalence of cumulative hospitalizations in each Italian province up to May 1, as reconstructed data (b) and model simulations (c); (d) the ratio between the transmission rate estimated on May 1 and the one estimated at the beginning of the outbreak (February 24).

Image: 
none

Transmission had been reduced by up to 70% as of May 1. Thanks to the model the team developed, scenarios can be drawn regarding future containment measures.

While the pandemic caused by SARS-CoV-2 is still ravaging most countries of the world and containment measures are implemented worldwide, a debate is emerging on whether these measures might be partially alleviated, and if so, how and when. This discussion requires appropriate models that can guide decision-makers through alternative actions via scenarios of the related trajectories of the epidemic. This is the subject of a study whose results are published today in the journal Nature Communications by a team of Italian scientists from Università Ca' Foscari (Venice), Politecnico di Milano (Milan), Università di Padova (Padua), and École Polytechnique Fédérale de Lausanne (Lausanne, Switzerland).

The analysis is based on a spatially explicit model of the COVID-19 spread in Italy (first published in the journal Proceedings of the National Academy of Sciences USA by the team in April), inclusive of mobility among communities, progressive mobility restrictions and social distancing. The benchmark model has been updated through the estimation of parameters using the number of daily hospitalized cases in all 107 Italian provinces from February 24 to May 1st.

The researchers have generated scenarios of the Italian infection dynamics resulting from the bulk effect of lockdown lifting, which began on May 4. They wondered how the modes of relaxing the previous confinement measures might affect residual epidemic trajectories. The answer to this question is not trivial, because different activities have been allowed to resume at different times. In addition, acquired awareness may have different lasting effects on social behaviour regardless of imposed measures, and compliance with the proper use of Personal Protective Equipment (such as masks) may fade over time.

Using the epidemiological data up to June 17 the researchers have provided an ex-post assessment of the explored scenarios comparing them with the actual space-time progression of the outbreak. The actual change in overall transmission has been tracked via the departure of the epidemic curve from the one projected by using the transmission rate achieved during the lockdown (termed the baseline scenario). The great majority of Italian regions have been close to the baseline scenario for the considered one-month-and-a-half period.

Scientists have then addressed the mitigation of the likely increased exposure, in particular by estimating the sufficient number of case isolation interventions that would prevent rebounding of the epidemics. A control effort capable of isolating daily about 5.5% of the exposed and highly infectious individuals is necessary to maintain the epidemic curve onto the decreasing baseline trajectory.
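The effect of such a case-isolation effort can be illustrated with a minimal compartmental-model sketch. This is not the spatially explicit model the team actually fitted; the transmission, incubation, and recovery parameters below are assumed, illustrative values chosen so that the epidemic slowly rebounds without isolation:

```python
def simulate(days, beta, sigma=1/3, gamma=1/7, q=0.0):
    """Discrete-time SEIR-type sketch; q is the fraction of exposed and
    infectious individuals isolated (removed from circulation) each day."""
    S, E, I, R = 0.99, 0.005, 0.005, 0.0   # population fractions
    prevalence = []
    for _ in range(days):
        new_inf = beta * S * I              # new exposures this day
        S -= new_inf
        dE = new_inf - (sigma + q) * E      # progression + isolation of exposed
        dI = sigma * E - (gamma + q) * I    # recovery + isolation of infectious
        R += gamma * I + q * (E + I)
        E += dE
        I += dI
        prevalence.append(I)
    return prevalence

# With these assumed parameters the post-lockdown epidemic slowly rebounds;
# isolating ~5.5% of exposed/infectious individuals daily keeps it declining.
no_isolation = simulate(days=60, beta=0.16)
with_isolation = simulate(days=60, beta=0.16, q=0.055)
```

The qualitative point is the one made in the study: even a modest but sustained daily isolation rate shifts the effective reproduction number below one, because infectious individuals are removed faster than they can transmit.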

Credit: 
Politecnico di Milano

Catching genes from chlamydiae allowed complex life to live without oxygen

image: Collection of pore fluid from a sediment core for geochemical analysis. Samples were collected with so-called Rhizon samplers onboard R/V G.O. Sars in the Norwegian-Greenland Sea during the Centre of Geobiology expedition in 2015.

Image: 
Michel Melcher

An international team of researchers has discovered a new group of Chlamydiae - Anoxychlamydiales - living under the ocean floor without oxygen. These Chlamydiae have genes that allow them to survive without oxygen while making hydrogen gas. The researchers found that our single-cell ancestors 'caught' these hydrogen-producing genes from ancient Chlamydiae up to two billion years ago - an event that was critical for the evolution of all complex life alive today. The results are published in Science Advances.

Life on Earth can be classified into two main categories: eukaryotes (e.g., plants, animals, fungi, amoebae) and prokaryotes (e.g., bacteria and archaea). In comparison to relatively simple prokaryotic cells, eukaryotic cells have complex cellular organisation. How such cellular complexity evolved has puzzled scientists for decades. The prevailing hypothesis for the evolution of eukaryotes involves the merger, or symbiosis, of two prokaryotes - an archaeon and a bacterium - nearly two billion years ago, in environments with little oxygen. Scientists assume that these microbes co-operated with each other to survive without oxygen by exchanging nutrients. While we do not know what these nutrients were, many scientists think that hydrogen might be the answer.

To find an answer to this two-billion-year-old mystery, scientists examine the genomes of modern prokaryotes and eukaryotes for genes involved in living without oxygen and in hydrogen-based nutrient metabolism. Much like fossils, genomes hold clues to the evolutionary history of their ancestors. In our cells, we have a specialized factory called the mitochondrion - or powerhouse of the cell - that helps us make energy using the oxygen we breathe and the sugar we eat. However, some mitochondria are able to make energy without oxygen by producing hydrogen gas. Since hydrogen has been proposed to have been an important nutrient for the origin of eukaryotes, scientists think that hydrogen production was present in one of the two-billion-year-old partners: the archaeon or the bacterium. However, until now there was no evidence for this.

In an article published in Science Advances, a team of international researchers has discovered an unexpected source of these genes at the bottom of the ocean from the Anoxychlamydiales, a newly discovered group of Chlamydiae. Anoxychlamydiales live without oxygen, and have genes for producing hydrogen - a trait that has never before been identified in Chlamydiae. The researchers were surprised to find that the chlamydial genes for hydrogen production closely resembled those found in eukaryotes. This strongly suggests that ancient chlamydiae contributed these genes during the evolution of eukaryotes.

"In our study we identified the first evidence for how eukaryotes got the genes to make hydrogen and it was from a completely unexpected source!" says co-lead author Courtney Stairs, postdoctoral researcher at Uppsala University in Sweden. Fellow co-lead author Jennah Dharamshi, PhD student from Uppsala University, adds: "We found new evidence that the eukaryotic genome has a mosaic evolutionary history, and has come not only from Archaea and the mitochondrion, but also from Chlamydiae".

"Understanding where hydrogen metabolism came from in eukaryotes is important for gaining insight into how our two-billion-year old ancestors evolved," says senior author Thijs Ettema, Professor at Wageningen University and Research in The Netherlands, and coordinator of the international team of researchers. "For years, I thought that if we ever found out where eukaryotic hydrogen metabolism came from, we would have a clearer picture of how eukaryotes evolved - however, finding out that these genes might have come from Chlamydiae has raised even more questions", Courtney Stairs adds.

How did the eukaryotes get a hold of these genes?

"We know that microorganisms routinely share genes with each other in a process called 'gene transfer'. We can find these transfer events by building family trees of each gene and looking for patterns in their evolution" explains Courtney Stairs. Today, the closest relatives of the archaeon that participated in the initial symbiosis are Asgard archaea. These archaea are also found at the bottom of the ocean where Anoxychlamydiales reside. "Asgard archaea and Anoxychlamydiales are both found living under the ocean floor where there is no oxygen" Thijs Ettema explains, "their cohabitation could have allowed for genes to be transferred between the ancestors of these microbes".

Finding chlamydiae that can live without oxygen has important implications in itself. These bacteria are typically known as pathogens of humans and other animals, even though they can also infect single-cell eukaryotes such as amoeba. All chlamydiae known to date live inside eukaryotic cells.

"Finding chlamydiae that might be able to live without oxygen, produce hydrogen, and live outside a eukaryote challenges our previously held conceptions" says Jennah Dharamshi, "our findings suggest that chlamydiae may be important members of the ecosystem on the ocean floor and that perhaps all chlamydiae are not that bad after all."

Credit: 
Uppsala University

Samara Polytech scientists studied a new compound for lithium and sodium-ion batteries

image: NaVPO4F metrics and graphs

Image: 
@SamaraPolytech

Cathode materials based on sodium and d-metal fluorophosphates are in great demand in the production of metal-ion batteries, because their rich chemical composition allows their electrochemical properties to be tuned. A research team of scientists from the Samara Center for Theoretical Materials Science of Samara Polytech, the Institute of Solid State Chemistry and Mechanochemistry of the Siberian Branch of the Russian Academy of Sciences, the Federal Research Center Boreskov Institute of Catalysis, and the P.N. Lebedev Physical Institute of the Russian Academy of Sciences found an original way to obtain monoclinic NaVPO4F by quenching. The research results are published in the journal Physical Chemistry Chemical Physics.

A detailed study of the NaVPO4F structure was conducted by means of X-ray diffraction analysis, infrared spectroscopy and nuclear magnetic resonance, and a comparative analysis of NaVPO4F (with a monoclinic structure) and LiVPO4F (with a triclinic structure) was carried out. Theoretical methods showed that NaVPO4F has low sodium-ion mobility, making its use as a cathode active material impractical. It was also shown that the triclinic modification is energetically more favorable for the LiVPO4F compound than the monoclinic one, which, together with the low mobility of sodium ions in NaVPO4F, makes the synthesis of monoclinic LiVPO4F by electrochemical exchange hardly possible.

The work of Samara Polytech employees in this area was supported by Russian Science Foundation grant 19-73-10026.

Incidentally, the team's article "Crystal structure and migration paths of alkaline ions in NaVPO4F" was included in the list of the most relevant articles of 2020 according to the editors of the journal Physical Chemistry Chemical Physics.

"The field of electrochemical energy storage systems (lithium-ion batteries are the best known example) is extremely popular in modern science. The reason is the ongoing transition of modern society from fossil fuels to renewable energy sources and the widespread adoption of mobile/autonomous devices," explains Artem Kabanov, senior researcher at the Samara Center for Theoretical Materials Science of Samara Polytech. "The search for new materials for such systems is very important. Our joint work with colleagues from Novosibirsk is devoted to the study of a new compound - NaVPO4F - that could potentially serve as a positive electrode. We have synthesized the NaVPO4F compound using quenching and shown that it possesses low sodium-ion mobility and is therefore unsuitable as an active material for battery cathodes. Polyanionic compounds are very popular in the field right now, and I believe this is why our article was included in the "hot list" of the PCCP journal (Q1, IF=3.5). Our collaboration with Dr. Nina Kosova's group from Novosibirsk is very fruitful, and we have just published a second joint article, in the journal Electrochimica Acta."

Credit: 
Samara Polytech (Samara State Technical University)

Consuming your own fecal microbiome when dieting may limit weight regain -- Ben-Gurion University

image: In the weight loss trial, abdominally obese or dyslipidemic (high cholesterol) participants in Israel were randomly assigned to one of three groups (1) healthy dietary guidelines, (2) Mediterranean diet, and (3) green-Mediterranean diet. In the green-Mediterranean diet group, participants were provided with Mankai. In a complementary Mankai-specific mouse model experiment, the researchers were able to reproduce the effects of weight-nadir-based transplantation on weight regain and insulin sensitivity, and to isolate the specific contribution of Mankai consumption to induce these effects.

Image: 
Ben-Gurion U.

BEER-SHEVA, Israel...August 26, 2020 - People who consume frozen microbiome capsules derived from their own feces when dieting may limit their weight regain, according to a new study published in Gastroenterology, conducted by a team of researchers led by researchers at Ben-Gurion University of the Negev (BGU).

In an unprecedented, 14-month clinical trial in Israel, BGU Prof. Iris Shai, BGU Ph.D. student Dr. Ehud Rinott and Dr. Ilan Youngster from Tel-Aviv University, collaborated with a group of international experts from U.S. and European research institutes.

"It is well known that most weight-loss dieters reach their lowest body weight after 4-6 months, and are then challenged by the plateau or regain phase, despite continued dieting," says Prof. Shai, a member of the School of Public Health. In this groundbreaking study, the international group of researchers explored whether preserving a person's optimized microbiome, sampled after six months of weight loss, and transplanting it back during the subsequent expected regain phase helps maintain the weight loss.

In the weight loss trial, abdominally obese or dyslipidemic (high cholesterol) participants in Israel were randomly assigned to one of three groups: (1) healthy dietary guidelines, (2) Mediterranean diet, and (3) green-Mediterranean diet. After six months of the weight-loss phase, 90 eligible participants provided a fecal sample that was processed into autologous fecal microbiota transplant (aFMT) capsules, which were frozen, opaque and odorless. The participants were then randomly assigned to receive 100 capsules containing their own fecal microbiota or a placebo, which they ingested until month 14.

In the green-Mediterranean diet group, participants were provided with Mankai, a specific duckweed aquatic strain in a green shake, green tea and 28g of walnuts. This was the group diet strategy that induced the largest significant change in the gut microbiome composition during the weight loss phase.

The 90 participants lost 8.3 kg (18.2 lbs.) on average after six months. However, only in the green-Mediterranean diet group did aFMT limit weight regain: participants regained only 17.1% of the lost weight, versus 50% for the placebo group.

"The green-Mediterranean diet also resulted in preservation of weight loss-associated specific bacteria and microbial metabolic pathways, mainly glucose transport, following the microbiome intervention, compared to the control," says Dr. Rinott.

In a complementary Mankai-specific mouse model experiment conducted by Prof. Omry Koren at Bar-Ilan University, the researchers were able to reproduce the effects of weight-nadir-based transplantation on weight regain and insulin sensitivity, and to isolate the specific contribution of Mankai consumption to induce these effects.

"This study is the first of its kind to prove in humans that preservation of an "ideal" gut microbial composition can be used at a later time point to achieve metabolic benefits," says Dr. Youngster, director of the Pediatric Infectious Diseases Unit and the Center for Microbiome Research at Shamir Medical Center. "Using the patient's own stool after optimization is a novel concept that overcomes many of these barriers. It is my belief that the use of autologous fecal microbiota transplantation will be applicable in the future for other indications as well."

Furthermore, a green plant-based diet such as one including Mankai better primes the microbiome for the transplantation procedure, potentially optimizing the conditions for the aFMT collected during the maximal weight-loss phase. The Mankai duckweed plant is grown in Israel and other countries in a closed environment and is highly environmentally sustainable, requiring a fraction of the water to produce each gram of protein compared with soy, kale or spinach.

According to Prof. Omry Koren of Bar-Ilan University, who led the animal experiments: "This study demonstrates the nutrition-microbiome axis: a diet high in polyphenols and dietary fiber, and specifically Mankai, a protein-rich plant, can optimize the microbiome during the weight-loss phase so that, after transplantation, the preserved flora promotes attenuated weight regain and an improved glycemic state."

"These findings might be a good application of personalized medicine," says Dr. Shai, who is also an adjunct professor at Harvard. "Freezing a personal microbiome bank could be an effective way to maintain healthy weight while dieting, as the rapid weight-loss phase is accompanied by an optimal cardiometabolic state. By optimizing the composition and function of the gut microbiome within the host, we have a novel approach for metabolic-memory preservation: taking a sample of the gut microbiome in its ideal phase, and administering it when dieters start regaining their lost weight."

Credit: 
American Associates, Ben-Gurion University of the Negev

UofSC researchers reveal how THC may treat acute respiratory distress syndrome

COLUMBIA, SC - Acute Respiratory Distress Syndrome (ARDS), when caused by a bacterial toxin known as Staphylococcal enterotoxin, can be completely prevented by treatment with Δ9-tetrahydrocannabinol (THC), a cannabinoid found in the cannabis plant. This exciting finding, recently published in the highly cited British Journal of Pharmacology, also suggests a possible treatment for ARDS caused by COVID-19.

This new paper is based on research studies from the laboratories of Dr. Mitzi Nagarkatti and Dr. Prakash Nagarkatti at the University of South Carolina (UofSC) School of Medicine, Department of Pathology, Microbiology and Immunology. The Nagarkattis published "Protective Effects of Δ9-Tetrahydrocannabinol Against Enterotoxin-induced Acute Respiratory Distress Syndrome is Mediated by Modulation of Microbiota," with co-authors Amira Mohammed, Hasan Alghetaa and Juhua Zhou, who also work in their UofSC School of Medicine laboratories, and Saurabh Chatterjee from the UofSC Arnold School of Public Health. Drs. Mitzi and Prakash Nagarkatti have for years studied how plant-derived compounds can be used to prevent and reduce inflammation throughout the body.

The incidence of ARDS in the United States is 78.9 per 100,000 persons/year and the mortality rate is 38.5 percent. When inhaled, Staphylococcal enterotoxin can cause ARDS by activating immune cells to produce massive amounts of cytokines leading to "cytokine storm," which can cause the lungs and other organs to fail, often resulting in death. This immune process is similar to that seen in patients with severe COVID-19 who are admitted to the hospital and develop ARDS accompanied by cytokine storm, which leads to respiratory and multi-organ failure. These studies therefore raise the exciting possibility of using cannabinoids to treat ARDS seen in COVID-19 patients.

These studies also showed that Staphylococcal enterotoxin alters the microbiome in the lungs, leading to the emergence of pathogenic microbiota. THC counteracted this effect as well, promoting beneficial bacteria that suppress inflammation and thereby preventing damage to the lungs.

"Acute respiratory distress syndrome is triggered by a variety of etiologic agents. Currently, there are no FDA-approved drugs to treat ARDS, which is why the mortality rate is close to 40 percent. Our studies suggest that THC is highly effective at treating ARDS, and thus clinical trials are critical to investigate if this works," said Mitzi Nagarkatti.

"Cytokine storm is a huge clinical issue which leads to multiorgan failure and often death. It is also seen in COVID-19 patients, and there are no effective treatment modalities against this syndrome. We have been working on cannabinoids for over 20 years and found that cannabinoids such as THC are highly anti-inflammatory. Thus, our studies raise the exciting suggestion to test THC against ARDS seen in COVID-19 patients," said Prakash Nagarkatti.

The Nagarkatti laboratory has performed decades of pioneering studies on cannabinoids. In fact, their studies on the use of another cannabinoid derived from the cannabis plant, cannabidiol (CBD), to treat autoimmune hepatitis have been well-recognized in the field and have led to FDA approval of CBD as an orphan drug to treat this disorder.

The Nagarkatti Laboratory has published extensively to demonstrate that cannabinoids are potent anti-inflammatory agents that can be used safely to treat a variety of inflammatory and autoimmune diseases such as multiple sclerosis, colitis, hepatitis and the like.

Credit: 
University of South Carolina