New metabolism discovered in bacteria

image: While acetic acid-producing bacteria and methanogens depend on the transfer of hydrogen between cells in anoxic biotopes, Acetobacterium woodii recycles hydrogen within its own cell

Image: 
Sarah Ciurus, Goethe University Frankfurt

They make sauerkraut sour, turn milk into yogurt and cheese, and give rye bread its intense flavour: bacteria that ferment nutrients instead of using oxygen to extract their energy. Acetobacterium woodii (A. woodii for short) is one of these anaerobic microbes. Cheese and bread are not its line of business - it lives far from oxygen in ocean-floor sediments, and can also be found in sewage treatment plants and the intestines of termites.

These biotopes are teeming with microbes that exploit the available organic substances in different ways. A number of bacteria ferment sugars, fatty acids and alcohols to acetic acid, producing hydrogen (H2) in the process. At higher concentrations, however, hydrogen inhibits this fermentation. For this reason, fermenting bacteria live together with microbes that depend on precisely this hydrogen - methanogens, for example, which gain energy by producing methane from hydrogen and carbon dioxide. Both partners profit from this association - and are simultaneously so dependent on each other that neither can survive without the other.

A. woodii masters both disciplines of the anaerobic "hydrogen association": it can ferment organic substances to acetic acid, and it can also form acetic acid from carbon dioxide and hydrogen. In doing so, A. woodii recycles the all-important hydrogen within its own cell, as microbiologists in Professor Volker Müller's team at the Institute for Molecular Biosciences at Goethe University Frankfurt have now discovered.

In the laboratory, the Frankfurt scientists switched off the gene for hydrogenase, the enzyme that produces hydrogen in A. woodii. The result: the bacterium could only grow - for example, in a medium with fructose - if hydrogen was supplied externally. Further tests confirmed that both pathways for producing acetic acid are linked by hydrogen that never leaves the cell.

"Though the 'hydrogen recycling' we discovered, A. woodii possesses a maximum of metabolic flexibility," says the Frankfurt experimenter Dr Anja Wiechmann. "In one cycle, it can both create and use hydrogen itself, or utilise hydrogen from external sources. This makes it capable of living both from organic as well as solely from inorganic substances."

Professor Volker Müller explains: "Our findings have implications far beyond the study of Acetobacterium woodii. There has already been speculation that many ancient life forms possessed the kind of metabolism we have described in A. woodii. This is assumed, for example, for the Asgard archaea that were discovered just a few years ago on the seabed off California. Our investigations provide the first evidence that these metabolic pathways actually exist."

Credit: 
Goethe University Frankfurt

New mechanism underlying organelle communication revealed in brown fat cells

In recent years, brown fat has garnered increasing attention as the so-called good fat that can protect against obesity and associated health risks, like cardiovascular disease and diabetes. Brown fat is located in small pockets throughout the body and helps maintain body temperature in cold environments. It gets its color from high amounts of iron-containing mitochondria, unlike the standard white fat linked to obesity.

A team led by Ling Qi, Ph.D., professor of molecular & integrative physiology and internal medicine at the U-M Medical School, has been studying how mitochondria, the power plants of the cell, interact inside brown fat cells with another cellular structure called the endoplasmic reticulum (ER), which is involved in the production of proteins and lipids.

In particular, they've studied the role of a protein complex involved in a process called ER-associated protein degradation, or ERAD. Simply put, ERAD is the process of removing and destroying misfolded proteins, like taking the trash out of the ER.

"Everyone thought that ERAD was just part of the general cellular response when cells are undergoing ER stress," says Qi. "We've shown over the past six years that it plays a fundamental role in health and disease."

In a new study, published as a Research Article in Science, Qi along with first authors Zhangsen Zhou, Ph.D., Mauricio Torres, Ph.D., and their colleagues demonstrate how an ERAD protein complex affects the proper function of mitochondria.

Typically, the ER and mitochondria interact continually at touch points called mitochondria-associated membranes. These points of contact mark areas where mitochondria divide to produce new mitochondria and where other molecules such as lipids and calcium are exchanged. The ER forms tubules that surround the mitochondria to ready them for division.

Using state-of-the-art 3-D imaging, the researchers discovered what happens to mitochondria in brown fat that are missing part of an ERAD protein complex, called Sel1L-Hrd1, when exposed to cold.

"When you delete this complex in brown adipocytes, the mitochondria become elongated and enlarged," says Qi. The 3-D image enabled them to view a previously unrecognized interaction between the mitochondria and the ER, with the mitochondria wrapping in a U-shape around the ER tubules.

When the mice were placed in a cold environment, the ends of the mitochondria's outer membranes folded back on themselves, eventually fusing and completely enveloping the ER tubules. The result, says Qi, is abnormally large, misshapen, dysfunctional mitochondria.

"We showed that these mitochondria don't function normally and the mice become cold sensitive, their body temperature dropping very quickly," says Qi. In other words, without this ERAD protein complex, the brown fat is not being used to generate heat. Under a microscope, this dysfunctional brown fat had larger droplets of lipids than brown fat from mice with the protein complex intact. "This is highly unexpected. The results here fundamentally change our understanding of ER-mitochondrial communication and further demonstrate the importance of an ER degradation complex in cell biology."

Credit: 
Michigan Medicine - University of Michigan

Nafamostat is expected to prevent the transmission of new coronavirus infection (COVID-19)

image: Nafamostat, an existing safe drug, may inhibit entry of SARS-CoV-2.

Image: 
2020 The University of Tokyo.

Nafamostat mesylate (brand name: Fusan), a drug used to treat acute pancreatitis, may effectively block the viral entry process that the new coronavirus (SARS-CoV-2) requires to spread and cause disease (COVID-19). The University of Tokyo announced these new findings on March 18, 2020.

According to the new research, Nafamostat can prevent the fusion of the envelope of the virus with host cell surface membranes, the first step in infection with the causative virus SARS-CoV-2. Nafamostat can inhibit the membrane fusion at a concentration less than one-tenth that of Camostat mesylate (brand name: Foypan), which was recently identified by a German group as an inhibitor of SARS-CoV-2 infection (Reference 1).

Both Nafamostat and Camostat were developed in Japan as treatments for pancreatitis and some other diseases. These drugs have been prescribed in Japan for many years and have adequate clinical data with regard to safety.

The University of Tokyo plans to launch clinical trials in April 2020 in order to evaluate the effectiveness of these two drugs for treating COVID-19.

Searching for therapeutics among existing drugs that have been confirmed to be safe (drug repurposing) seems to be extremely worthwhile

Professor Jun-ichiro Inoue and Assistant Professor Mizuki Yamamoto of the Research Center for Asian Infectious Diseases of the Institute of Medical Science, the University of Tokyo, have identified Nafamostat as a strong candidate to fight COVID-19.

Even after the World Health Organization's declaration of a pandemic, no drug has yet been shown to be effective for treating COVID-19 caused by the new coronavirus (SARS-CoV-2). The development of effective drugs is an urgent issue.

"Considering that SARS-CoV-2 infection is already spreading worldwide, drug repurposing (*1), which searches for therapeutics among existing drugs with established safety records, seems to be extremely worthwhile," Inoue said.

The genomic RNA of coronaviruses such as SARS-CoV-2 is surrounded by an envelope composed of a lipid bilayer and envelope proteins. SARS-CoV-2 initiates entry into human cells after the Spike protein (S protein) on its envelope binds to the cell membrane receptor ACE2 (*2). The S protein is cleaved into S1 and S2 by a human cell-derived protease (proteolytic enzyme), assumed to be Furin. S1 then binds to its receptor, ACE2. The other fragment, S2, is cleaved by TMPRSS2 (*3), a serine protease on the human cell surface, resulting in membrane fusion. According to Hoffmann et al., ACE2 and TMPRSS2 are essential for SARS-CoV-2 infection of airway cells (Reference 1).

The research group already reported in 2016 that Nafamostat effectively inhibits MERS-CoV S protein-initiated membrane fusion. The researchers did this using the Dual Split Protein (DSP) reporter fusion assay (*4) to screen a library consisting of 1,017 FDA-approved drugs. This screening result, together with experimental data from MERS-CoV infection of cultured airway epithelial cell-derived Calu-3 cells (*5), led them to propose that Nafamostat could be effective at inhibiting MERS-CoV infection (Reference 2).

In the present study, they established a new SARS-CoV-2 S protein-initiated fusion assay and found that, at concentrations from 10 to 1000 nM, Nafamostat suppressed S protein-initiated fusion in 293FT cells (derived from human fetal kidney) (*6) ectopically expressing ACE2 and TMPRSS2. A similar experiment was then performed using Calu-3 cells, which are considered an appropriate model for the cells SARS-CoV infects in humans. Here, Nafamostat at low concentrations in the 1-10 nM range significantly suppressed membrane fusion - almost the same concentration range at which it inhibits membrane fusion by the MERS-CoV S protein.

The research group also compared the effects of Nafamostat and Camostat. They found that Nafamostat inhibited SARS-CoV-2 S protein-initiated fusion at a concentration less than one-tenth that needed for Camostat. Based on the above explanation, they concluded that Nafamostat is the most effective drug against SARS-CoV-2 S protein-initiated fusion among the protease inhibitors used in clinical practice and tested so far.

Future potential of Nafamostat and Camostat

Nafamostat is administered clinically by intravenous infusion. The research group expects that the blood concentration of Nafamostat after administration would exceed the concentration needed in their experiments to inhibit membrane fusion via the SARS-CoV-2 S protein. Therefore, Nafamostat is expected to prevent SARS-CoV-2 from entering human cells. Camostat is an oral drug, and blood levels after oral administration may be lower than those achieved with intravenous Nafamostat.

"Both drugs could be used alone, or in combination with other antiviral drugs that target separate processes needed for virus production, such as RNA replication or viral protein processing," said Inoue.

Credit: 
The Institute of Medical Science, The University of Tokyo

Screening of zebrafish identifies gene involved in human nicotine addiction

Researchers at Queen Mary University of London have shown that zebrafish can provide genetic clues to smoking, a complex human behaviour.

By studying genetically-altered zebrafish they were able to pinpoint a human gene, Slit3, involved in nicotine addiction and also discover the ways in which it may act.

While zebrafish have been used extensively in genetic research, they have mainly been used in developmental models - for example, to identify genes associated with disease - rather than to predict genes involved in a complex cognitive behaviour such as smoking.

Although smoking has long been known to have a genetic element, relatively little has been known about the genes involved since it has been difficult to identify them from human studies alone.

In a study published in the journal eLife, the researchers tested families of genetically altered zebrafish for nicotine preference. When one family showed a much stronger nicotine preference than the others, the researchers identified all the mutations in that family, eventually narrowing the cause down to a mutation in the Slit3 gene linked to the behaviour.

To see if the same gene affected nicotine preference in people, the researchers looked for association between variants in the human Slit3 gene and smoking behaviour, such as decreased or increased desire to smoke and how easy it was to quit, in groups of people in the UK and Finland. They found 3 variants in the human Slit3 gene that were significantly linked to smoking activity.

To learn more about how the Slit3 gene might be working, the researchers then tested both mutant and wild-type fish for sensitivity to a dopaminergic drug. In humans this drug affects the startle reflex - our physical reaction to a sudden loud noise - which is linked to addictions, including nicotine addiction. In the startle test, the mutant fish showed decreased sensitivity to the drug. After testing various receptors that might be involved in the reduced drug sensitivity, the researchers found that only one was implicated - the serotonin receptor 5-HT1AA.

Caroline Brennan, Professor of Molecular Genetics at Queen Mary University of London, led the research. She explained: "This gives us a hypothesis for how the Slit3 gene works in humans. It is somehow altering the level of serotonin receptors present; and the differences in the levels are presumably then influencing sensitivity to nicotine addiction."

Professor Brennan added: "As well as finding out more about the genes involved in nicotine addiction, most importantly, we've found an easier way of finding these genes in the future. Although zebrafish are a 'lower' organism, they have a similar genetic structure to humans and share 70% of genes with us. 84% of genes known to be associated with human disease have a zebrafish counterpart; and while there has been scepticism regarding their usefulness in terms of human cognition, we have shown that they can give insight into the genetics of that as well."

Credit: 
Queen Mary University of London

A 'cardiac patch with bioink' developed to repair heart

image: Schematic diagram of the underlying mechanism of in vivo priming of BM-MSCs with HGF-eMSC

Image: 
Jinah Jang (POSTECH)

The heart is the driving force of blood circulation in the body, pumping blood to the entire body through continuous, repeated contraction and relaxation of the heart muscle. Human stem cells are used clinically to treat heart tissue that has died, which happens when a blood vessel becomes clogged and all or part of the heart muscle is damaged. Clinical use of human bone marrow-derived mesenchymal stem cells (BM-MSCs) has expanded, but failure of the transplanted stem cells to survive in the heart remains a problem. Recently, an international joint research team from POSTECH, Seoul St. Mary's Hospital, and City University of Hong Kong developed a 'cardiac patch with bioink' that enhances the ability of stem cells to regenerate blood vessels, which in turn improved the area affected by myocardial infarction.

The joint research team consisted of Prof. Jinah Jang and Dr. Sanskrita Das of POSTECH Creative IT Engineering, Mr. Seungman Jung of the POSTECH School of Interdisciplinary Bioscience and Bioengineering, Prof. Hun-Jun Park, Mr. Bong-Woo Park, and Ms. Soo-Hyun Jung of The Catholic University, and Prof. Kiwon Ban and colleagues from City University of Hong Kong. The team mixed genetically engineered stem cells (hepatocyte growth factor-expressing MSCs, HGF-eMSCs) developed by SL Bigen Co., Ltd. into a bioink, formed it into a patch, and introduced a new therapy by transplanting the patch onto a damaged heart. They call this new strategy 'in vivo priming'. The name comes from the principle that the function of the mesenchymal stem cells is maximised and maintained in vivo through exposure to the growth factor secreted by the genetically engineered stem cells.

The joint research team first genetically engineered existing BM-MSCs to produce hepatocyte growth factor continuously, improving the therapeutic potential of the stem cells. The engineered cells (HGF-eMSCs) were then mixed with ordinary BM-MSCs to make the bioink. Given the limited number of cells that can be transferred, they used a heart-derived extracellular matrix bioink to form the cardiac patch, which they transplanted onto heart muscle affected by myocardial infarction.

Cells implanted in the patch survived longer in vivo, and more cardiomyocytes survived, than in the experimental group that received BM-MSCs alone. This was because secretion of cytokines that promote blood vessel formation and cell growth was maximised, delivering nutrients that supported vascular regeneration and enhanced the survival of the cardiomyocytes.

The research team anticipates that this new method could be a breakthrough treatment for myocardial infarction, as the stem cells primed by the HGF-eMSCs ultimately enhanced vascular regeneration and improved the infarcted area.

"We can augment the function of adult stem cells approved by Ministry of Food and Drug Safety and FDA using this newly developed and promising 3D bioprinting technology with the engineered stem cells. It is our goal to develop a new concept of medicine for myocardial infarction in the near future," said Prof. Jinah Jang who led the research.

POSTECH began developing medicines for cardiovascular diseases based on this bioprinting method with the research team from The Catholic University in 2017. The approach is now undergoing animal testing for efficacy evaluation with Chonnam National University, and the technology has already been transferred to T&R Biofab, a company that develops 3D printers, software, and bioinks for cell printing.

Credit: 
Pohang University of Science & Technology (POSTECH)

Study finds 'smart' devices effective in reducing adverse outcomes of heart condition

A new study, published in the Journal of the American College of Cardiology, highlights the feasible use of mobile health (mHealth) devices to help with the screening and detection of a common heart condition.

Atrial fibrillation (AF) is a heart rhythm condition that causes an irregular and sometimes abnormally fast heart rate. In AF, the heart's upper chambers (atria) contract randomly, and sometimes so fast that the heart muscle cannot relax properly between contractions. This reduces the heart's efficiency and performance - and also leads to a higher risk of blood clots.

AF is the most common heart rhythm disturbance, affecting around one million people in the UK. People with AF are at increased risk of stroke and death, as well as heart failure and dementia. Low detection rates, due to a lack of visible symptoms, and poor adherence to treatment are major problems in current management approaches for patients with suspected AF.

Photoplethysmography technology

mHealth devices, such as fitness trackers, smart watches and mobile phones, may enable earlier AF detection, and improved AF management through the use of photoplethysmography (PPG) technology.

PPG is a simple and low-cost optical technique that can be used to detect blood volume changes in the microvascular bed of tissue. It is often used non-invasively to make measurements at the skin surface.
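To make the idea concrete, here is a minimal sketch - in Python, and not drawn from the study itself - of how a PPG trace can flag a possibly irregular rhythm: detect the pulse peaks, compute the inter-beat intervals, and measure their variability. The sampling rate, synthetic waveforms and decision threshold are all illustrative assumptions, not the algorithm used in the devices studied.

```python
# Illustrative sketch only (not the study's algorithm): flag an irregular
# pulse from a PPG waveform via the variability of inter-beat intervals.
import numpy as np
from scipy.signal import find_peaks

FS = 50  # assumed sampling rate in Hz

def irregularity_index(ppg):
    """Coefficient of variation of inter-beat intervals (higher = more irregular)."""
    peaks, _ = find_peaks(ppg, height=0.5, distance=FS * 0.4)  # at most ~150 bpm
    ibi = np.diff(peaks) / FS  # inter-beat intervals in seconds
    return float(np.std(ibi) / np.mean(ibi))

# Synthetic demo: a steady 60-bpm pulse vs. an erratically spaced one.
rng = np.random.default_rng(0)
t = np.arange(0, 60, 1 / FS)
regular = ((1 + np.cos(2 * np.pi * t)) / 2) ** 20        # one sharp peak per second
beats = np.cumsum(rng.uniform(0.5, 1.4, 60))             # irregular beat times
irregular = sum(np.exp(-((t - b) ** 2) / 0.001) for b in beats[beats < 59])

for name, sig in [("regular", regular), ("irregular", irregular)]:
    cv = irregularity_index(sig)
    verdict = "possible AF - refer for ECG" if cv > 0.10 else "rhythm looks regular"
    print(f"{name}: CV of inter-beat intervals = {cv:.2f} -> {verdict}")
```

Real devices add noise filtering, motion-artifact rejection and confirmation over repeated episodes, but the core signal - erratic spacing between pulses - is what this sketch captures.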

To help determine whether an mHealth technology-supported integrated AF management strategy would reduce AF-related adverse events compared with usual care, an international team of researchers, led by Associate Professor Guo from Chinese PLA General Hospital in Beijing and Professor Gregory Lip, Lead for the Liverpool Centre for Cardiovascular Science (LCCS) and Price-Evans Chair of Cardiovascular Medicine at the University of Liverpool, conducted a randomised trial.

Central to the study were mobile health technologies developed by leading global technology companies, with a focus on wearable smart devices such as those from Huawei working in conjunction with a specially developed mobile app. This equipment and software can monitor a person's vital signs in great detail and, most importantly for this study, 24 hours a day.

The specially designed mobile app not only charted the patient's biometrics, it also gave clinicians the ability to offer integrated care throughout the trial. Doctors were able to periodically assess a patient's updated statistics and contact them through the app to offer advice via the ABC care pathway. The ABC pathway, developed in part by the LCCS's Professor Gregory Lip, is a set of guidance for patients and clinicians that aims to promote a streamlined, holistic approach to the management of AF and to ensure that the danger of complications is minimised.

The researchers enrolled 3,324 AF patients aged over 18 years from 40 cities across China. The patients were cluster-randomised, with 1,678 receiving usual care and 1,646 receiving integrated care based on a mobile AF application (mAFA) incorporating the ABC pathway: 'A' Avoid stroke; 'B' Better symptom management; 'C' Cardiovascular and other comorbidity risk reduction. All patients were followed up in outpatient clinics at 6 and 12 months.

Results

Upon completion of the study, the researchers were able to show that the composite of stroke, systemic thromboembolism, death and rehospitalisation occurred significantly less often among patients in the mHealth intervention group than among those receiving usual care (1.9% compared with 6%). Rehospitalisation rates alone were also notably reduced, with only 1.2% of patients in the intervention group needing readmission to hospital, compared with 4.5% of patients in the control group.
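For readers who want to see where those percentages lead, the arithmetic below converts the reported rates into approximate event counts and relative risks. This is a rough illustration only; the trial's own analysis used adjusted, cluster-aware statistics, and the event counts here are back-calculated from the published percentages.

```python
# Back-of-the-envelope reading of the reported results; event counts are
# inferred from the published percentages and arm sizes, so approximate.
n_int, n_usual = 1646, 1678  # patients per arm, as reported

for outcome, p_int, p_usual in [("composite outcome", 0.019, 0.060),
                                ("rehospitalisation", 0.012, 0.045)]:
    events_int = round(p_int * n_int)
    events_usual = round(p_usual * n_usual)
    rr = p_int / p_usual  # relative risk (intervention vs. usual care)
    print(f"{outcome}: ~{events_int} vs ~{events_usual} events, "
          f"relative risk = {rr:.2f} ({1 - rr:.0%} relative reduction)")
```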

In addition, subgroup analyses by gender, age, type of condition, risk score and comorbidities demonstrated consistently lower risks of the composite outcome for patients receiving the mAFA intervention compared with usual care.

These results show a clear benefit of adopting an integrated approach to monitoring and treating cardiac conditions such as AF.

With smart technologies such as phones, watches and integrated smart home systems becoming increasingly accessible and affordable, clinicians and researchers can use them to gather large amounts of health data passively and unobtrusively across populations - opening up enormous opportunities for assessing and treating all manner of diseases and conditions.

Integrated care approach

Associate Professor Guo said: "Our study clearly highlights the need for an integrated care approach to holistic AF care, supported by mobile health technology, as it helps to reduce the risks of rehospitalisation and clinical adverse events."

Professor Lip said: "Improved AF care requires early detection which enables the implementation of the priorities of AF management, which is as 'easy as ABC': Avoid stroke; Better symptom optimisation; Cardiovascular and risk factor management. Our clinical trial shows how the mAFA App and smart devices can improve detection of AF and the holistic management of AF patients, improving outcomes in this common heart rhythm disorder."

Credit: 
University of Liverpool

Bison in northern Yellowstone proving to be too much of a good thing

image: Bison in Yellowstone's northern range.

Image: 
Bob Beschta, OSU

CORVALLIS, Ore. - Increasing numbers of bison in Yellowstone National Park in recent years have become a barrier to ecosystem recovery in the iconic Lamar Valley in the northern part of the park, according to a study by Oregon State University scientists.

In the valley, foraging by bison exerts 10 times the environmental pressure of elk, historically the area's dominant herbivore - that's a problem because bison are powerful "ecosystem engineers."

Large numbers of bison disrupt species distribution across shrub steppe and grasslands. They do so through what they eat, what they trample, and what they rub their horns and bodies against - tree bark, for example. Bison thus have a tremendous capacity to limit the structure and composition of woody plant communities.

That in turn affects the character of riparian plant communities, as well as stream and river channels, altering habitats and food webs for terrestrial and aquatic wildlife species alike.

The findings were recently published in the journal Food Webs.

In the United States, the range of the bison originally ran from east of the Appalachians to west of the Rocky Mountains, with most of them living on their evolutionary home base, the Great Plains.

Their numbers once totaled an estimated 30 million, perhaps more, said OSU College of Forestry researcher Bob Beschta, corresponding author of the Lamar Valley ecosystem study.

"The bison population sharply decreased in the 1800s and their distribution became more constricted as European-Americans extended their influences westward across the country," Beschta said.

By the 1830s, there were no bison east of the Mississippi River or on the Snake River Plains. Fifty years later, the Plains bison were close to extinction.

"Several small herds were reported near Yellowstone National Park just before the park's establishment in 1872, perhaps driven there by hunting pressure on the Great Plains," said study co-author Bill Ripple, also of the OSU College of Forestry. "Poaching of bison occurred after park establishment, until 1901, at which time only 22 bison were present in the park."

In 1907, more than 60 bison from a growing herd in the Mammoth Hot Springs area of Yellowstone were transferred to the Lamar Valley. By 1925 the Lamar Valley bison herd had grown to more than 750, necessitating population reduction measures. Culling of the Lamar herd continued for more than four decades.

Meanwhile, National Park Service managers became increasingly concerned about the environmental effects of Rocky Mountain elk in the park's northern range, which includes the Lamar Valley, and began to cull them as well. In the early 1900s both gray wolves and cougars, predators that influenced elk behavior and density, had been extirpated.

In the absence of these predators, combined with hunting prohibitions inside the park, wintering elk populations began to heavily browse young woody plants in the northern range, which led to a decrease in "recruitment" - the growth of seedlings and sprouts into tall saplings and trees - of quaking aspen, cottonwood, willow, thinleaf alder and berry-producing shrubs.

Culling of both elk and bison stopped amid public and congressional concerns in 1968, at which time there were about 4,000 elk and 100 bison in the northern range. Within two decades, those numbers had increased to 20,000 and 1,000.

Cougars returned to the northern range in the 1980s, followed by wolf reintroduction a decade later, thus restoring the park's guild of large predators.

"Changes in elk behavior were observed shortly after the return of wolves" Beschta said. "And, with predation pressure from wolves, cougars and grizzly bears, a degraded winter range, and human hunting of elk that wintered outside the park, annual counts of the northern range elk herd began to decrease from their historical highs in the 1990s."

In the years since wolf reintroduction, the northern range's elk population has declined to about 5,000, with most of them wintering outside the park. Bison numbers inside the park, on the other hand, have increased to a historical high of about 4,000.

Deciduous woody plant species in many areas of the northern range started to increase in establishment, young plant height, diameter growth, recruitment, canopy cover and berry production - all associated with reduced browsing pressure from elk.

"But in portions of the northern range, like the Lamar Valley where bison are common, woody vegetation has continued to decline," Ripple said. "We hypothesized that was because of the bison. We also hypothesized that bison, via the suppression of riparian vegetation and trampling of streambanks, may be increasingly influencing the channel of the Lamar River and tributary streams that cross the valley floor."

Photo analysis indicated a near complete loss of willow-dominated riparian communities for at least some parts of the Lamar River and the West Fork of Rose Creek.

"And the roughly 7.5 hectares of aspen stands that were present on the valley floor in 1954 had diminished to one-tenth of a hectare by 2015, representing a 99% loss in the cover of overstory aspen trees," Ripple said. "The rapid increase in bison numbers in recent years suggests the park's large carnivore guild may be incapable of controlling bison populations. And prey switching by wolves - from elk to bison - looks unlikely to provide a stabilizing effect on bison populations."

The researchers stress that the long-term recovery of the Yellowstone bison herd has been a major conservation success story and, as one of the few remaining herds that has not hybridized with cattle, Yellowstone bison "are an invaluable conservation resource."

"However, increased bison numbers over the last two decades appear to have come at a major ecological cost to the biological diversity and functioning of the riparian ecosystems in the Lamar Valley," Beschta said. "Even to a casual observer there are clear indicators of highly altered ecological conditions across the Lamar Valley, including a high density of bison trails, wallows and scat. High bison numbers have been an effective agent for accelerating the biological and physical modification of the valley's seeps, wetlands, floodplains, riparian areas and channels, trends that had begun decades earlier by elk."

Ecosystem simplification - a loss of biodiversity, landscape complexity and ecological integrity - is well underway, much like that associated with high levels of domestic livestock use in areas of the mountain west, Beschta added.

"The ongoing environmental effects of bison would have to be significantly reduced in order to restore biologically diverse communities dominated by willows, cottonwoods and aspen," Beschta said. "As park administrators make management decisions that affect ungulate densities and distributions in Yellowstone, as well as those in other parks and reserves with high ungulate densities, our findings indicate a need to take into account the often wide range of ecological effects that abundant large herbivores can have on terrestrial and aquatic ecosystems."

Credit: 
Oregon State University

Solving a medical mystery and changing CDC screenings for COVID-19

(SACRAMENTO, Calif.) - When UC Davis announced the first case of community transmission of COVID-19 in the U.S. on Feb. 26, it solved a medical mystery at the hospital and led to important changes to the U.S. Centers for Disease Control and Prevention (CDC) guidelines for novel coronavirus testing.

In a paper published today in Clinical Infectious Diseases, UC Davis Health physicians and medical staff who treated the severely ill patient provide a detailed case study of her condition and the medical steps and challenges they experienced before arriving at a diagnosis and treatment. The case study also reveals how her symptoms matched - and sometimes varied from - published studies of COVID-19 infection at the time.

SARS-CoV-2, the novel coronavirus that causes COVID-19, was first identified in a small number of cases in Wuhan, China in December 2019. As of March 30, the virus has infected over 752,830 people around the globe and caused more than 36,230 deaths. In the U.S., the CDC reports 140,904 have tested positive with 2,405 deaths. In California, the California Department of Public Health reported 4,643 positive cases and 101 deaths.

No known risks for novel coronavirus

An otherwise healthy woman in her 40s, the patient was admitted to UC Davis with a respiratory infection. Her chest imaging suggested community-acquired pneumonia. The patient was immediately placed on droplet and contact precautions to prevent transmission of infection.

Within 24 hours of admission, her respiratory status deteriorated. She was intubated and given antibiotics including linezolid, piperacillin-tazobactam and azithromycin. Testing over several days--viral panel, respiratory culture, blood cultures, bronchoscopy cultures--failed to indicate a clear infectious source.

The UC Davis team suspected a potential COVID-19 infection. However, because the patient had not traveled to high-risk countries (at the time China) and had no contact with an individual with high-risk travel, she did not meet the CDC criteria. As a result, public health officials did not pursue testing.

The patient developed acute respiratory distress syndrome, a condition in which fluid builds up in the lungs and limits the oxygen that can reach the bloodstream. She also developed septic shock, a potentially fatal sharp drop in blood pressure in reaction to severe infection.

Because of the severity of her respiratory condition, the patient was reviewed again for possible coronavirus infection. This time, the CDC recommended COVID-19 testing. The patient was put on airborne precautions and strict contact precautions. Two days later, the results came back positive.

No approved therapies for the novel coronavirus

Currently, there are no approved antiviral therapies for COVID-19 in humans, but clinical trials are underway at several academic medical centers, including UC Davis. Because of the severity of the patient's illness, the team received approval from the Food and Drug Administration (FDA) to treat her with an investigational drug called remdesivir. The broad-spectrum antiviral, developed by Gilead Sciences Inc., has been tested in humans with Ebola and has shown promise against coronaviruses in animal models.

After receiving remdesivir infusions, the patient improved, needing significantly less ventilator support and having better blood oxygen levels and chest X-ray results. Fourteen days after first arriving at UC Davis, she was removed from mechanical ventilation. She has since been discharged and is recovering at home.

The team emphasizes that whether remdesivir is effective against human COVID-19 is not yet known. Clinical trials funded by the National Institutes of Health (NIH) and the pharmaceutical industry will be key to analyzing the drug's efficacy against SARS-CoV-2.

"Given the urgent need to find an effective treatment for COVID-19, clinical trials are essential for determining, from a scientific standpoint, if remdesivir is safe and effective," said Allison Brashear, dean of the UC Davis School of Medicine. "With this new study funded by the NIH, UC Davis will be an important contributor to these critical efforts."

Testing is key to tackling pandemic

The case highlights significant knowledge gaps in the diagnosis and management of COVID-19. Without clear risk factors, the patient's infection first masqueraded as community-acquired pneumonia. She was also relatively young and without other health conditions that would identify her as at risk for severe disease.

"Our case has influenced national health policies for revising screening criteria," said Angela Haczku, associate dean for translational research at the UC Davis School of Medicine and senior author on the study.

Because of this specific case, and other similar cases of community-acquired COVID-19, the CDC updated its guidelines so that any hospitalized patient with severe symptoms, such as acute respiratory distress syndrome or pneumonia without an explanatory diagnosis, can now be tested for COVID-19 even if no clear source of exposure is identified.

"There are individuals in the community who are not manifesting severe enough symptoms to check with their health care providers," said Michael Schivo, co-director of UC Davis Comprehensive COPD Clinic and senior author on the study. "We expect community spread to occur more frequently, challenging the ability of health care systems to adequately contain the spread of COVID-19."

As the virus continues to spread, and more data about it becomes known, the authors expect guidelines to change yet again. But they are adamant that to tackle the pandemic, there needs to be significantly faster, less expensive and more widespread testing of all patients who potentially have COVID-19.

Credit: 
University of California - Davis Health

American Society of Nephrology provides insights on COVID-19 and kidney disease

Highlight

The American Society of Nephrology has launched several initiatives to provide guidance on COVID-19 as it relates to the care of patients with kidney disease.

Washington, DC -- The pandemic of coronavirus disease 2019 (COVID-19) is especially threatening to patients with kidney diseases and to their caregivers. The American Society of Nephrology (ASN) has launched several initiatives to provide accurate and updated COVID-19-related information on an ongoing basis to clinicians who care for patients with kidney diseases.

As an initial step, ASN, in conjunction with the Centers for Disease Control and Prevention (CDC), has established a COVID-19 Response Team. Nephrologists, CDC physicians, infection preventionists, and dialysis nurses meet weekly to acquire new information and share it with the community, inform best practices, and adapt to the changing environment as the pandemic spreads.

ASN's publications--JASN, CJASN, and Kidney360--will provide valid, peer-reviewed information on COVID-19 as quickly as possible, with collections updated as new articles are published. Recent examples include the following:

In a CJASN article on COVID-19, experts brought together the evidence-based guidance of the CDC and the practical judgment of clinicians to mitigate the risk of COVID-19 in dialysis facilities, offering information on patient screening, placement, and instructions; use of face masks and other personal protective equipment by staff; disinfection practices; and communication with health departments. (https://doi.org/10.2215/CJN.03340320)

A Kidney360 article offers the perspective of 2 U.S. nephrologists on managing patients undergoing hemodialysis who have suspected or confirmed COVID-19, noting the steps and precautions taken for patients presenting to dialysis clinics or emergency departments. (https://doi.org/10.34067/KID.0001452020)

Another CJASN article provides insights from clinicians in Washington state whose patient--who had been undergoing outpatient hemodialysis--was the first to die from COVID-19 in the United States. A series of policies and procedures were immediately put in place to protect other patients and staff. (https://doi.org/10.2215/CJN.03540320)

The effect of certain antihypertension drugs on the virus that causes COVID-19 is the topic of another CJASN article. The authors stress that these medications are critical to the health of many patients, who should continue to take them as prescribed until definitive studies can determine whether the drugs inhibit the virus or make people more susceptible to infection. The authors have created a website that is being updated in real time to provide a reliable source of information (http://www.nephjc.com/news/covidace2). (https://doi.org/10.2215/CJN.03530320)

Credit: 
American Society of Nephrology

Blood test accurately detects over 50 types of cancer, often before symptoms show

image: Identification of cancer status for more than 50 cancer types, as well as tissue of origin localisation, from a single blood draw. Cell-free DNA is isolated from blood samples drawn from a patient without cancer (top) or with cancer (bottom), and subjected to a targeted methylation sequencing assay. Sequencing results identifying methylated (red) or unmethylated (blue) CpG regions are fed into a machine-learning classifier that can identify the presence or absence of cancer, as well as identify the tissue of origin.

Image: 
Allen McCrodden, Associate Director, Creative Group of ProEd Communications

Researchers have developed the first blood test that can accurately detect more than 50 types of cancer and identify in which tissue the cancer originated, often before there are any clinical signs or symptoms of the disease.

In a paper published in the leading cancer journal Annals of Oncology [1] today (Tuesday), the researchers show that the test, which could eventually be used in national cancer screening programmes, has a 0.7% false positive rate for cancer detection, meaning that less than 1% of people would be wrongly identified as having cancer. By comparison, about 10% of women are wrongly identified as having cancer in national breast cancer screening programmes, although this rate can be higher or lower depending on the number and frequency of screenings and the type of mammogram performed.

The test returned a prediction for the tissue in which the cancer originated in 96% of samples, and that prediction was accurate in 93% of cases.

Tumours shed DNA into the blood, and this contributes to what is known as cell-free DNA (cfDNA). However, as the cfDNA can come from other types of cells as well, it can be difficult to pinpoint cfDNA that comes from tumours. The blood test reported in this study analyses chemical changes to the DNA called "methylation" that usually control gene expression. Abnormal methylation patterns and the resulting changes in gene expression can contribute to tumour growth, so these signals in cfDNA have the potential to detect and localise cancer.

The blood test targets approximately one million of the 30 million methylation sites in the human genome. A machine learning classifier (an algorithm) was used to predict the presence of cancer and the type of cancer based on the patterns of methylation in the cfDNA shed by tumours. The classifier was trained using a methylation database of cancer and non-cancer signals in cfDNA. The database is believed to be the largest in the world and is owned by the company involved in this research, GRAIL, Inc. (California, USA).
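As a toy illustration of this kind of approach - not GRAIL's actual classifier, whose features and training data are proprietary - the sketch below simulates binary methylation calls at a small panel of CpG sites, shifts the methylation probabilities at a few "informative" sites in cancer samples, and trains a logistic-regression classifier to separate the two groups. The panel size, number of informative sites and probability shift are invented for illustration.

```python
# Toy illustration (not GRAIL's classifier; all data simulated): classify
# cancer vs. non-cancer from binary methylation states at CpG sites.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(42)
n_samples, n_sites = 2000, 200  # assumed panel size, purely for illustration

labels = rng.integers(0, 2, n_samples)                # 1 = cancer
base = rng.uniform(0.2, 0.8, n_sites)                 # baseline methylation prob.
shift = np.where(np.arange(n_sites) < 40, 0.25, 0.0)  # 40 "informative" sites
probs = np.clip(base + np.outer(labels, shift), 0, 1)
X = rng.binomial(1, probs)                            # 1 = methylated CpG site

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print(f"sensitivity = {tp / (tp + fn):.1%}, specificity = {tn / (tn + fp):.1%}")
```

In the real test the feature space is vastly larger (around one million methylation sites) and the classifier must also predict the tissue of origin, but the pipeline - methylation calls in, cancer/non-cancer call out - has the same shape.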

Senior author of the paper, Dr Michael Seiden (MD, PhD), President of US Oncology (Texas, USA), said: "Our earlier research showed that the methylation approach outperformed both whole genome and targeted sequencing in the detection of multiple deadly cancer types across all clinical stages, and in identifying the tissue of origin. It also allowed us to identify the most informative regions of the genome, which are now targeted by the refined methylation test that is reported in this paper."

In the part of the Circulating Cell-free Genome Atlas (CCGA) study reported today, blood samples from 6,689 participants with previously untreated cancer (2,482 participants) and without cancer (4,207 participants) from North America were divided into a training set and a validation set. Of these, results from 4,316 participants were available for analysis: 3,052 in the training set (1,531 with cancer, 1,521 without cancer) and 1,264 in the validation set (654 with cancer and 610 without cancer). Over 50 types of cancer were included.

The machine learning classifier analysed blood samples from the participants to identify methylation changes and to classify the samples as cancer or non-cancer, and to identify the tissue of origin.

The researchers found that the classifier's performance was consistent in both the training and validation sets, with a false positive rate of 0.7% in the validation set.

The classifier's ability to correctly identify when cancer was present (the true positive rate) was also consistent between the two sets. In 12 types of cancer that are often the most deadly (anal, bladder, bowel, oesophageal, stomach, head and neck, liver and bile duct, lung, ovarian and pancreatic cancers, lymphoma, and cancers of white blood cells such as multiple myeloma), the true positive rate was 67.3% across clinical stages I, II and III. These 12 cancers account for about 63% of cancer deaths each year in the USA and, at present, there is no way of screening for the majority of them before symptoms show. The true positive rate was 43.9% for all cancer types in the study across the three clinical stages.

Detection improved with each cancer stage. In the 12 pre-specified cancers, the true positive rate was 39% in stage I, 69% in stage II, 83% in stage III and 92% in stage IV. In all of more than 50 cancer types, the corresponding rates were 18%, 43%, 81% and 93%, respectively.

The test was also consistent between the training and validation sets in its ability to identify the tissue where cancer had originated, with an accuracy of 93% in the validation set.

Dr Seiden said: "These data support the ability of this targeted methylation test to meet what we believe are the fundamental requirements for a multi-cancer early detection blood test that could be used for population-level screening: the ability to detect multiple deadly cancer types with a single test that has a very low false positive rate, and the ability to identify where in the body the cancer is located with high accuracy to help healthcare providers to direct next steps for diagnosis and care.

"Considering the burden of cancer in our society, it is important that we continue to explore the possibility that this test might intercept cancers at an earlier stage and, by extension, potentially reduce deaths from cancers for which screening is either not available or has poor adherence. To our knowledge, this is the largest clinical genomics study, in participants with and without cancer, to develop and validate a blood test for early detection of multiple cancers."

The study is funded by GRAIL, the maker of the blood test. Researchers are continuing to validate the test in large, prospective studies in the USA (STRIVE and PATHFINDER studies) and the UK (SUMMIT study), and to examine its feasibility for screening populations [2].

A strength of the CCGA study is that it includes more than 15,000 participants from 142 clinics in North America, ensuring results are generalisable to a diverse population. The ongoing studies are assessing the test's performance in even broader populations. Limitations include: all the participants with cancer had already been diagnosed with cancer (e.g. via screening or patients presenting with symptoms); the study was not designed to establish the test's impact on death from cancer or other causes; at the time of this analysis, not all patients had been followed for a year, which is needed to ensure their non-cancer status was accurate; and some inaccuracy occurred in the detection of the tissue of origin for cancers that are driven by the human papilloma virus (HPV), such as cancers of the cervix, anus, and head and neck - this information is being used to improve the test's performance.

Editor-in-chief of Annals of Oncology, Professor Fabrice André, Director of Research at the Institut Gustave Roussy, Villejuif, France, said: "This is a landmark study and a first step toward the development of easy-to-perform screening tools. Earlier detection of more than 50% of cancers could save millions of lives every year worldwide and could dramatically reduce morbidity induced by aggressive treatments.

"While numbers are still small, the performance of this new technology is particularly intriguing in pancreatic cancer, for which mortality rates are very high because it is usually diagnosed when it's at an advanced stage."

Credit: 
European Society for Medical Oncology

Blood test detects wide range of cancers, available to at risk individuals in clinical study

image: Cell-free DNA is isolated from blood samples drawn from a patient without cancer (top) or with cancer (bottom), and subjected to a targeted methylation sequencing assay. Sequencing results identifying methylated (red) or unmethylated (blue) CpG regions are fed into a machine-learning classifier that can identify the presence or absence of cancer, as well as identify the tissue of origin (TOO).

Image: 
Editorial Figure by Allen McCrodden, Associate Director, Creative Group of ProEd Communications

In a study involving thousands of participants, a new blood test detected more than 50 types of cancer as well as their location within the body with a high degree of accuracy, according to an international team of researchers led by Dana-Farber Cancer Institute and the Mayo Clinic.

The results, published online today by the Annals of Oncology, indicate that the test - which identified some particularly dangerous cancers that lack standard approaches to screening - can play a key role in early detection of cancer. Early detection can often be critical to successful treatment.

Developed by GRAIL, Inc., of Menlo Park, Calif., the test uses next-generation sequencing to analyze the arrangement of chemical units called methyl groups on the DNA of cancer cells. Adhering to specific sections of DNA, methyl groups help control whether genes are active or inactive. In cancer cells, the placement of methyl groups, or methylation pattern, is often markedly different from that of normal cells - to the extent that abnormal methylation patterns are even more characteristic of cancer cells than genetic mutations are. When tumor cells die, their DNA, with methyl groups firmly attached, empties into the blood, where it can be analyzed by the new test.

"Our previous work indicated that methylation-based tests outperform traditional DNA-sequencing approaches to detecting multiple forms of cancer in blood samples," said Dana-Farber's Geoffrey Oxnard, MD, co-lead author of the study with Minetta Liu, MD, of the Mayo Clinic. "The results of this study suggest that such assays could be a feasible way of screening people for a wide variety of cancers."

In the study, investigators used the test to analyze cell-free DNA (DNA from normal and cancerous cells that had entered the bloodstream upon the cells' death) in 6,689 blood samples, including 2,482 from people diagnosed with cancer and 4,207 from people without cancer. The samples from patients with cancer represented more than 50 cancer types, including breast, colorectal, esophageal, gallbladder, bladder, gastric, ovarian, head and neck, lung, lymphoid leukemia, multiple myeloma, and pancreatic cancer.

The overall specificity of the test was 99.3%, meaning that only 0.7% of the results incorrectly indicated that cancer was present. The sensitivity of the assay for the 12 cancers that account for nearly two-thirds of U.S. cancer deaths was 67.3%, meaning the test found the cancer two-thirds of the time and returned a false-negative result a third of the time. Within this group, the sensitivity was 39% for patients with stage I cancer, 69% for those with stage II, 83% for those with stage III, and 92% for those with stage IV. The stage I-III sensitivity across all 50-plus cancer types was 43.9%. When cancer was detected, the test correctly identified the organ or tissue where the cancer originated in more than 90% of cases - critical information for determining how the disease is diagnosed and managed.
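A quick calculation shows what those headline figures imply at the scale of this study. This is illustrative only; the counts are back-calculated from the published percentages rather than taken from the paper's tables.

```python
# What the headline figures imply for this study's samples (approximate:
# counts are back-calculated from the published percentages).
n_no_cancer = 4207
specificity = 0.993  # reported
print(f"expected false positives: ~{(1 - specificity) * n_no_cancer:.0f} "
      f"of {n_no_cancer} cancer-free samples")

# Reported stage-wise sensitivity for the 12 pre-specified deadly cancers.
for stage, sens in [("I", 0.39), ("II", 0.69), ("III", 0.83), ("IV", 0.92)]:
    print(f"stage {stage}: {sens:.0%} detected, {1 - sens:.0%} false negatives")
```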

"Our results show that this approach to testing cell-free DNA in blood can detect a broad range of cancer types at virtually any stage of the disease, with specificity and sensitivity approaching the level needed for population-level screening," Oxnard observed. "The test can be an important part of clinical trials for early cancer detection."

Credit: 
Dana-Farber Cancer Institute

Comprehensive COVID-19 hospitalisation and death rate estimates help countries best prepare as global pandemic unfolds

image: Estimated proportion of COVID-19 cases requiring hospitalisation, by age.

Image: 
The Lancet Infectious Diseases

First comprehensive estimates from mainland China of the proportion of people with COVID-19 who require hospitalisation, together with the latest death rate estimates, both show sharp increases with age. The authors warn that without intervention the number of people needing hospital treatment is likely to overwhelm even the most advanced healthcare systems worldwide, though they caution that outcomes could improve as the pandemic unfolds, in which case the estimates in this study will need to be revised.

Key findings:

The death rate from confirmed COVID-19 cases is estimated at 1.38%, while the overall death rate, which includes unconfirmed cases, is estimated at 0.66%; these rates are slightly lower than some estimates for COVID-19 to date, which had not adjusted for undiagnosed cases or for the number of people in each age group of a population

Death rates vary substantially, ranging from 0.0016% in 0 to 9-year-olds to 7.8% for people aged 80 and above

Hospitalisation rates were also reported to increase sharply with age, with 11.8% of people in their 60s, 16.6% of people in their 70s, and 18.4% of those in their 80s and above estimated to develop symptoms severe enough to require hospitalisation

These hospitalisation rates compare with 0.04% of 10 to 19-year-olds, 1.0% of people in their 20s, and 3.4% of people aged 30 to 39. Hospitalisation rates nearly double from 4.3% in 40-49-year-olds to 8.2% in 50-59-year olds

Nearly one in five over-80s infected with COVID-19 are likely to require hospitalisation, compared with around 1% of people under 30, according to an analysis of 3,665 cases in mainland China, published in The Lancet Infectious Diseases journal.

The new analysis also finds that the estimated proportion of deaths from both diagnosed cases and from milder, unconfirmed cases is strongly influenced by age. The estimates are slightly lower than others that have been made for the virus, but are still much higher than for previous pandemics such as 2009 pandemic influenza H1N1, which was estimated to be fatal in around 0.02% of cases [1]. The new estimates are based on an analysis of 70,117 laboratory-confirmed and clinically-diagnosed cases in mainland China [2], combined with 689 positive cases among people evacuated from Wuhan on repatriation flights.

"This study provides critical estimates on the proportion of people requiring hospitalisation which, when applied to the UK population, enabled us to get a handle on how many people might need to access NHS services," says Professor Neil Ferguson from Imperial College London, UK. "As the UK epidemic unfolds, more data are becoming available, and at the moment the proportion of people in each age group most likely to require hospitalisation, and most likely to die from infection, are consistent with the estimates in this study." [3]

The authors warn that as 50% to 80% of the global population could be infected with COVID-19 [4] the number of people needing hospital treatment is likely to overwhelm even the most advanced healthcare systems worldwide. However, they caution that it is possible that outcomes could improve, in which case it will be important to revise the estimates in this study.

"Our estimates can be applied to any country to inform decisions around the best containment policies for COVID-19," says Professor Azra Ghani from Imperial College London, UK. "There might be outlying cases that get a lot of media attention, but our analysis very clearly shows that at aged 50 and over, hospitalisation is much more likely than in those under 50, and a greater proportion of cases are likely to be fatal." [3]

Previous estimates of deaths from confirmed cases of COVID-19 have ranged from 2% to 8% [5], while deaths from overall infections have been estimated at 0.2% to 1.6%. Estimates of the proportion of deaths in the oldest age group, the over-80s, have ranged from 8% to 36%. However, these past estimates did not adjust for the fact that those tested were mostly people with more severe symptoms or people in quarantine following repatriation to other countries, so they did not reflect the true number of cases across populations. No previous studies have estimated the proportion of infections that will require hospitalisation.

For the current analysis, a team of international researchers used 3,665 cases from mainland China to estimate the proportion of cases likely to be severe enough to require hospitalisation. To estimate the average time between a person displaying symptoms and dying, they analysed 24 deaths in Hubei Province. The average recovery time was estimated using data from 2,010 international cases, of whom 169 people recovered. Death rates from confirmed cases were estimated using data on 44,672 cases in mainland China [2]. To estimate death rates relevant to the wider population, data from 689 people repatriated from Wuhan to other countries and 3,711 people quarantined on board the cruise liner Diamond Princess were used.

For all the estimates, the researchers assumed that people of all ages are equally likely to become infected, which is consistent with previous studies on respiratory infections.

The analysis found the greatest number of severe cases requiring hospitalisation among people in their 50s (222 out of 790 diagnosed cases), but once the researchers adjusted for the many milder cases that go undiagnosed, the estimated hospitalisation rate for this group was 8.2%, compared with 18.4% in the most at-risk group, the over-80s (51 out of 76 cases before adjustment). 154 out of 743 people in their 40s had severe symptoms, as did 133 out of 263 people in their 70s, but the adjusted hospitalisation rates were again further apart than the raw counts suggest: 4.3% for 40 to 49-year-olds compared with 16.6% for 70 to 79-year-olds. Among those in their 60s, 201 out of 560 cases were severe, giving an adjusted hospitalisation rate of 11.8%.

The hospitalisation rates were lower in younger age groups: an estimated 3.4% of people in their 30s are likely to be hospitalised (124 out of 733 diagnosed cases were severe before adjustment), falling to 1.0% for people in their 20s (49 out of 437 cases before adjustment). There was only one severe case out of 50 in those aged 10 to 19, giving an estimated hospitalisation rate of 0.04%, and none of the 13 cases analysed in the under-10s were severe.
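
To illustrate the size of the adjustment, the crude severe-case proportions implied by the raw counts above can be set beside the study's adjusted rates. The short Python sketch below does exactly that, using only the numbers quoted in this article; the adjustment for undiagnosed mild cases is the authors' own and is not reproduced here.

    # Crude severe-case proportions from the counts quoted above, next to
    # the study's adjusted hospitalisation rates. The adjustment itself
    # (for undiagnosed mild cases) is the authors' and is not redone here.
    cases = {
        # age band: (severe cases, diagnosed cases, adjusted rate in %)
        "0-9":   (0,   13,  None),
        "10-19": (1,   50,  0.04),
        "20-29": (49,  437, 1.0),
        "30-39": (124, 733, 3.4),
        "40-49": (154, 743, 4.3),
        "50-59": (222, 790, 8.2),
        "60-69": (201, 560, 11.8),
        "70-79": (133, 263, 16.6),
        "80+":   (51,  76,  18.4),
    }
    for band, (severe, diagnosed, adjusted) in cases.items():
        crude = 100 * severe / diagnosed
        adj = f"{adjusted}%" if adjusted is not None else "n/a"
        print(f"{band:>6}: crude {crude:5.1f}% vs adjusted {adj}")

The crude proportions run far above the adjusted rates precisely because diagnosed cases over-represent severe illness.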

The average time between the first recorded symptoms and death from COVID-19 was estimated to be 17.8 days. The authors note that as the data are from early in the epidemic, more people might die following a longer time lag. Recovering from the disease is estimated to take slightly longer, with patients being discharged from hospital after an average of 22.6 days.

Most people will recover, even from severe symptoms. Death rates among confirmed cases were estimated at 1.38% across all age groups (1,023 deaths among 44,672 confirmed cases in mainland China, a crude ratio the researchers adjusted both for deaths still to come among unresolved severe cases and for milder cases missing from the confirmed total), but the estimates rise rapidly with age. For example, there were no deaths among 416 confirmed cases in under 10-year-olds, whereas an estimated 13.4% of people aged 80 or above die (208 out of 1,408 cases before adjustment). [5]

The proportion of all infected people -- most of whom will display only mild to moderate symptoms -- who die from the disease is estimated to be lower still, at 0.66%. Again, the risk of death is much higher in older age groups: an estimated 0.031% of people in their 20s die, compared with 7.8% of the over-80s.
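
As a quick check on these figures, the crude case fatality ratio implied by the raw counts is straightforward to compute; the gap between it and the adjusted 1.38% reflects the corrections described above.

    # Crude case fatality ratio from the counts quoted above. The study's
    # adjusted estimates (1.38% among confirmed cases, 0.66% among all
    # infections) correct for unreported cases and unresolved outcomes.
    deaths, confirmed = 1_023, 44_672
    print(f"Crude CFR: {100 * deaths / confirmed:.2f}%")  # ~2.29%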

The authors note that they are unable to adjust for the effect of underlying health conditions on prognosis until individual-level data become available. However, underlying conditions are likely to be correlated with age, and their prevalence will also vary geographically, particularly between low-income and high-income regions and countries.

Writing in a linked Comment, Dr Shigui Ruan (who was not involved in the study) from the University of Miami, USA, says: "Estimates of case fatality ratios might vary slightly from country to country because of differences in prevention, control, and mitigation policies implemented, and because the case fatality ratio is substantially affected by the preparedness and availability of health care. Early studies have shown that delaying the detection of infected cases not only increases the probability of spreading the virus to others (most likely family members, colleagues, and friends) but also makes the infection worse in some cases, thereby increasing the case fatality ratio."

Credit: 
The Lancet

Projecting the outcomes of people's lives with AI isn't so simple

image: Brian J. Goode, a research scientist from Virginia Tech's Fralin Life Sciences Institute, was one of 112 data and social scientists tasked with building statistical and machine-learning models. Photo courtesy of the Data Analytics Center.

Image: 
Virginia Tech

The machine learning techniques scientists use to predict outcomes from large datasets may fall short when it comes to projecting the outcomes of people's lives, according to a mass collaborative study led by researchers at Princeton University together with researchers at many institutions, including Virginia Tech.

This mass collaboration, called the Fragile Families Challenge, brought together scientists who built statistical and machine-learning models to predict and measure life outcomes for children, parents, and households across the United States.

Published by 112 co-authors in the Proceedings of the National Academy of Sciences, the results suggest that sociologists and data scientists should use caution when using predictive modeling, especially in the criminal justice system and social programs.

Even after using state-of-the-art modeling and a high-quality dataset containing 13,000 data points for more than 4,000 families, the best AI predictive models were not very accurate.

Brian J. Goode, a research scientist at Virginia Tech's Fralin Life Sciences Institute, was among the 112 data and social scientists who took part in the Fragile Families Challenge.

"It's one effort to try to capture the complexities and intricacies that compose the fabric of a human life in data and models. But, it is compulsory to take the next step and contextualize models in terms of how they are going to be applied in order to better reason about expected uncertainties and limitations of a prediction. That's a very difficult problem to grapple with, and I think the Fragile Families Challenge shows that we need more research support in this area, particularly as machine learning has a greater impact on our everyday lives," said Goode.

Goode's modeling was conducted through the Discovery Analytics Center at Virginia Tech. There, he teamed up with the center's director, Naren Ramakrishnan, the Thomas L. Phillips Professor of Engineering, and with Debanjan Datta, a Ph.D. student in the Department of Computer Science in the College of Engineering; both were instrumental in gathering and analyzing the data.

The Virginia Tech team has also published research in a special issue of Socius, a new open-access journal from the American Sociological Association. In order to support additional research in this area, all the submissions to the Challenge -- code, predictions and narrative explanations -- are publicly available.

"The study also shows us that we have so much to learn, and mass collaborations like this are hugely important to the research community," said the PNAS study co-lead author Matt Salganik, professor of sociology at Princeton and interim director of the Center for Information Technology Policy, based at Princeton's Woodrow Wilson School of Public and International Affairs.

The project was inspired by Wikipedia, one of the world's first mass collaborations, which was created in 2001 as a shared encyclopedia. Salganik pondered what other scientific problems could be solved through a new form of collaboration, and that's when he joined forces with Sara McLanahan, the William S. Tod Professor of Sociology and Public Affairs at Princeton, as well as Princeton graduate students Ian Lundberg and Alex Kindel, both in the Department of Sociology.

McLanahan is principal investigator of the Fragile Families and Child Wellbeing Study based at Princeton and Columbia University, which has been studying a cohort of about 5,000 children born in large American cities between 1998 and 2000, with an oversampling of children born to unmarried parents. The longitudinal study was designed to understand the lives of children born into unmarried families.

Through surveys collected in six waves (when the child was born and then when the child reached ages 1, 3, 5, 9, and 15), the study has captured millions of data points on children and their families. Another wave will be captured at age 22.

At the time the researchers designed the challenge, data from age 15 (which the researchers call the "hold-out data" in the paper) had not yet been made publicly available. This created an opportunity to ask other scientists to predict life outcomes of the people in the study through a mass collaboration.

The co-organizers received 457 applications from 68 institutions around the world, including several teams based at Princeton. Using the Fragile Families data, participants were asked to predict one or more of six life outcomes at age 15: child grade point average (GPA); child grit; household eviction; household material hardship; primary caregiver layoff; and primary caregiver participation in job training.

The challenge was built around the common task method, a research design used frequently in computer science but rarely in the social sciences. Under this method, some but not all of the data are released, and participants may use whatever technique they want to predict the withheld outcomes. The goal is to predict the hold-out data as accurately as possible, no matter how fancy a technique it takes to get there.
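
In outline, a submission under the common task method looks something like the Python sketch below. The file names, outcome column, and model choice are hypothetical stand-ins, not the challenge's actual data format or any team's method; the defining feature is simply that participants never see the held-out outcomes they are scored against.

    # Hypothetical common-task-method workflow: train on released data,
    # predict the withheld outcomes, submit predictions for scoring.
    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor

    train = pd.read_csv("train_with_outcomes.csv")    # released rows (hypothetical file)
    holdout = pd.read_csv("holdout_background.csv")   # background only (hypothetical file)

    model = RandomForestRegressor(n_estimators=500, random_state=0)
    model.fit(train.drop(columns=["gpa"]), train["gpa"])  # "gpa": one of the six outcomes

    # Organisers score this file against hold-out outcomes participants never see.
    pd.DataFrame({"gpa": model.predict(holdout)}).to_csv("submission.csv", index=False)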

The team is currently applying for grants to continue research in this area.

Credit: 
Virginia Tech

'Living drug factories' might treat diabetes and other diseases

One promising way to treat diabetes is with transplanted islet cells that produce insulin when blood sugar levels rise too high. However, patients who receive such transplants must take drugs to prevent their immune systems from rejecting the transplanted cells, so the treatment is not often used.

To help make this type of therapy more feasible, MIT researchers have now devised a way to encapsulate therapeutic cells in a flexible protective device that prevents immune rejection while still allowing oxygen and other critical nutrients to reach the cells. Such cells could pump out insulin or other proteins whenever they are needed.

"The vision is to have a living drug factory that you can implant in patients, which could secrete drugs as-needed in the patient. We hope that technology like this could be used to treat many different diseases, including diabetes," says Daniel Anderson, an associate professor of chemical engineering, a member of MIT's Koch Institute for Integrative Cancer Research and Institute for Medical Engineering and Science, and the senior author of the work.

In a study of mice, the researchers showed that genetically engineered human cells remained viable for at least five months, and they believe they could last longer to achieve long-term treatment of chronic diseases such as diabetes or hemophilia, among others.

Suman Bose, a research scientist at the Koch Institute, is the lead author of the paper, which appears today in Nature Biomedical Engineering.

Protective effect

Patients with type 1 diabetes usually have to inject themselves with insulin several times a day to keep their blood sugar levels within a healthy range. Since 1999, a small number of diabetes patients have received transplanted islet cells, which can take over for their nonfunctioning pancreas. While the treatment is often effective, the immunosuppressant drugs that these patients have to take make them vulnerable to infection and can have other serious side effects.

For several years, Anderson's lab has been working on ways to protect transplanted cells from the host's immune system, so that immunosuppressant drugs would not be necessary.

"We want to be able to implant cells into patients that can secrete therapeutic factors like insulin, but prevent them from being rejected by the body," Anderson says. "If you could build a device that could protect those cells and not require immune suppression, you could really help a lot of people."

To protect the transplanted cells from the immune system, the researchers housed them inside a device built out of a silicon-based elastomer (polydimethylsiloxane) and a special porous membrane. "It's almost the same stiffness as tissue, and you make it thin enough so that it can wrap around organs," Bose says.

They then coated the outer surface of the device with a small-molecule drug called THPT. In a previous study, the researchers had discovered that this molecule can help prevent fibrosis, a buildup of scar tissue that results when the immune system attacks foreign objects.

The device contains a porous membrane that allows the transplanted cells to obtain nutrients and oxygen from the bloodstream. These pores must be large enough to let nutrients and insulin pass through, but small enough that immune cells such as T cells can't get in and attack the transplanted cells.

In this study, the researchers tested polymer coatings with pores ranging from 400 nanometers to 3 micrometers in diameter, and found that a size range of 800 nanometers to 1 micrometer was optimal. At this size, small molecules and oxygen can pass through, but not T cells. Until now, it had been believed that 1-micrometer pores would be too large to stop cellular rejection.

Drugs on demand

In a study of diabetic mice, the researchers showed that transplanted rat islets inside microdevices maintained normal blood glucose levels in the mice for more than 10 weeks.

The researchers also tested this approach with human embryonic kidney cells that were engineered to produce erythropoietin (EPO), a hormone that promotes red blood cell production and is used to treat anemia. These therapeutic human cells survived in mice for at least the 19-week duration of the experiment.

"The cells in the device act as a factory and continuously produce high levels of EPO. This led to an increase in the red blood cell count in the animals for as long as we did the experiment," Anderson says.

In addition, the researchers showed that they could program the transplanted cells to produce a protein only in response to treatment with a small molecule drug. Specifically, the transplanted engineered cells produced EPO when mice were given the drug doxycycline. This strategy could allow for on-demand production of a protein or hormone only when it is needed.

This type of "living drug factory" could be useful for treating any kind of chronic disease that requires frequent doses of a protein or hormone, the researchers say. They are currently focusing on diabetes and are working on ways to extend the lifetime of transplanted islet cells.

"This is the eighth Nature journal paper our team has published in the past four-plus years elucidating key fundamental aspects of biocompatibility of implants. We hope and believe these findings will lead to new super-biocompatible implants to treat diabetes and many other diseases in the years to come," says Robert Langer, the David H. Koch Institute Professor at MIT and an author of the paper.

Credit: 
Massachusetts Institute of Technology

Does preterm delivery contribute to increased cardiovascular disease burden in women?

image: Journal of Women's Health, published monthly, is a core multidisciplinary journal dedicated to the diseases and conditions that hold greater risk for or are more prevalent among women, as well as diseases that present differently in women.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, March 30, 2020--A new study quantifies the future economic burden of cardiovascular disease (CVD) in women with a history of preterm delivery (PTD). The study, which used a novel Markov microsimulation model to quantify CVD burden in terms of cost and years of life lost over a 50-year period, is published in Journal of Women's Health, a peer-reviewed publication from Mary Ann Liebert, Inc., publishers.

The article entitled "How Does Preterm Delivery Contribute to the Increased Burden of Cardiovascular Disease? Quantifying the Economic Impact of CVD in Women with a History of Preterm Delivery" was coauthored by Lan Gao, PhD, Shu-chuen Li, PhD, and Marj Moodie, DrPH, of Deakin University (Geelong) and The University of Newcastle (Callaghan), Australia. While PTD is not traditionally recognized as a CVD risk factor, it places the mother at increased risk of developing CVD later in life, including coronary heart disease and stroke, and women who have had a PTD have about twice the risk of CVD mortality.

From an Australian healthcare system perspective, the study comprised two models -- a dynamic model and a static model -- which put the total CVD cost burden at 11.4 billion and 4.5 billion Australian dollars, respectively, over the 50-year study period; the corresponding years of life lost were 0.34 and 0.52 per capita.
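
For readers unfamiliar with the technique, a Markov microsimulation of this general kind can be sketched in a few lines of Python. The health states, transition probabilities, and annual costs below are illustrative placeholders, not the study's inputs; the point is the mechanism, in which each simulated individual moves between states year by year while costs and life years accumulate.

    # Illustrative Markov microsimulation: simulated individuals transition
    # yearly between health states over a 50-year horizon. All states,
    # probabilities, and costs are hypothetical placeholders.
    import random

    STATES = ["well", "cvd", "dead"]
    # P(next state | current state), per year -- hypothetical values
    TRANSITIONS = {
        "well": {"well": 0.97, "cvd": 0.02, "dead": 0.01},
        "cvd":  {"well": 0.00, "cvd": 0.93, "dead": 0.07},
        "dead": {"well": 0.00, "cvd": 0.00, "dead": 1.00},
    }
    ANNUAL_COST = {"well": 0.0, "cvd": 12_000.0, "dead": 0.0}  # hypothetical AUD

    def simulate(n_people=10_000, years=50, seed=0):
        rng = random.Random(seed)
        total_cost = life_years = 0.0
        for _ in range(n_people):
            state = "well"
            for _ in range(years):
                if state != "dead":
                    life_years += 1
                total_cost += ANNUAL_COST[state]
                weights = [TRANSITIONS[state][s] for s in STATES]
                state = rng.choices(STATES, weights=weights)[0]
        return total_cost / n_people, life_years / n_people

    cost_pc, ly_pc = simulate()
    print(f"Mean 50-year cost per person: {cost_pc:,.0f} AUD; life years: {ly_pc:.1f}")

Because each individual's event history is tracked separately, recurrent CVD events over a lifetime can be counted, which is the feature of the study's approach that the accompanying editorial highlights below.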

In an accompanying Editorial entitled "The Economic Burden of CVD in Women with a History of Preterm Delivery", Margo Minissian, PhD, Cedars-Sinai Medical Center (Los Angeles, CA) states: "Considering the substantial economic burden eloquently described by Gao et al., future prevention strategies for women who experience PTD are imperative." In addition, "recognizing PTD as a potential CVD risk factor/enhancer is important."

Dr. Minissian highlights the novel microsimulation modeling technique used in this study, which allows for subsequent recurrent CVD events to be captured over a lifetime. Most notable was the 19.8% 4-year recurrence rate of stroke.

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News