
Biomedical sciences researchers find new way to prevent and cure rotavirus, other viral infections

image: Dr. Andrew Gewirtz, a professor in the Institute for Biomedical Sciences at Georgia State University

Image: 
Georgia State University

ATLANTA--A combination of two substances secreted by the immune system can cure and prevent rotavirus infection, as well as potentially treat other viral infections that target epithelial cells, which cover body surfaces such as skin, blood vessels, organs and the urinary tract, according to researchers in the Institute for Biomedical Sciences at Georgia State University.

Rotavirus, which causes severe, life-threatening diarrhea in young children and moderate gastrointestinal distress in adults, leads to thousands of deaths in children annually, particularly in developing countries where rotavirus vaccines are only moderately effective. Rotavirus is an RNA virus that primarily infects intestinal epithelial cells.

The substances identified in the study, known as cytokines, are interleukin 18 (IL-18) and interleukin 22 (IL-22). IL-18 and IL-22 are produced when the body detects flagellin, a protein in the whip-like appendage of bacteria.

The study, which investigated how these cytokines inhibit rotavirus infection, found that when mice were treated with both IL-18 and IL-22, the cytokines not only promoted each other's expression but also impeded rotavirus through independent, distinct mechanisms that involved activating receptors in intestinal epithelial cells. These actions resulted in rapid and complete expulsion of rotavirus, even in hosts with severely compromised immune systems. The therapy was also found to be effective against norovirus, a contagious virus that causes vomiting and diarrhea. The findings are published in the journal Science Immunology.

"Our study reports a novel means of eradicating a viral infection, particularly viruses that infect epithelial cells," said Dr. Andrew Gewirtz, senior author of the study and a professor in the Institute for Biomedical Sciences at Georgia State. "The results suggest that a cocktail that combines IL-18 and IL-22 could be a means of treating viral infections that target short-lived epithelial cells with high turnover rates."

Credit: 
Georgia State University

The mode of detection of high-risk breast cancers is linked to patient prognosis

Breast cancers that are detected in the interval between national screening programme mammograms have a worse prognosis than those detected at the time of a screening, even if they have the same biology, according to research presented at the 12th European Breast Cancer Conference on Saturday.

Analysis of results from over eight years' follow-up of the international MINDACT randomised phase III clinical trial shows that although tumours may have the same genetic make-up, the way they are detected makes a significant difference to the period of time before the disease starts spreading to other parts of the body or results in death, whichever comes first. This is known as the distant metastasis-free interval (DMFI).

Dr Josephine Lopes Cardozo (MD), a PhD candidate at the Netherlands Cancer Institute (NKI) in Amsterdam, The Netherlands, and medical fellow at the European Organisation for Research and Treatment of Cancer (EORTC) in Brussels, Belgium, told the conference that the method of detection gave additional prognostic information and should be taken into account when deciding on what treatments in addition to surgery might be needed.

Dr Lopes Cardozo and her colleagues had found previously that tumours that occurred in the interval between screening mammographies, known as interval cancers, were more likely to have a high-risk genetic profile, as shown by a test that looks at the activity of 70 genes in the tumour tissue (the 70-gene signature, commercially known as MammaPrint), and were therefore at higher risk of distant metastases.

"However, there are also screen-detected cancers with a high-risk 70-gene signature," she said. "In our current analysis, we found a significant difference in survival between high-risk cancers detected during screening or in the interval between screenings. The eight-year DMFI rate was higher among women with screen-detected cancers than for women with interval cancers: 93.8% versus 85.2%.

"Although these tumours have the same biology - all 70-gene, high-risk and with similar tumour characteristics - they have different prognoses based on their method of detection. This suggests that the method of detection is an additional prognostic factor in this group of patients. The method of detection combined with the 70-gene signature can further optimise treatment for patients at high risk of recurrence. For patients with a very low risk of recurrence, longer follow-up may also help to identify those who are currently at risk of being over-treated."

A total of 1102 Dutch breast cancer patients enrolled in the MINDACT trial between 2007 and 2011, who participated in the national screening programme and who were aged 50-75, were included in the analysis. The Dutch national screening programme invites women aged 50-75 years for screening every two years. The researchers evaluated differences in DMFI for high, low and ultra-low risk tumours, as classified by the 70-gene signature. A total of 754 cases were detected during screening, and 348 during the interval between screenings.

With 50% of patients having reached at least 8.6 years of follow-up, there were 83 occurrences of distant metastases or death due to breast cancer. Among patients with screen-detected cancers, 36% received no adjuvant systemic treatment (such as chemotherapy and hormone therapy, in addition to surgery and radiotherapy), 33% received hormone therapy only and 30% received chemotherapy with or without hormone therapy. Among patients with interval cancers, 17% received no adjuvant systemic treatment, 35% had hormone therapy only and 47% had chemotherapy with or without hormone therapy.

"Most patients who received no adjuvant systemic therapy had grade I tumours, smaller than 2cms, had no signs of cancer in their lymph nodes and were classified as ultra-low or low risk by the 70-gene signature," said Dr Lopes Cardozo.

When the researchers looked at survival rates at eight years, they found that among patients with screen-detected cancers the eight-year DMFI rate was 98.2% in the 118 women with ultra-low-risk tumours, 94.6% in the 398 women with low-risk tumours, and 93.8% in the 238 women with high-risk tumours.

Among patients with interval cancers, the eight-year DMFI rate was 97.4% in the 39 women with ultra-low-risk tumours, 92.2% in the 143 women with low-risk tumours, and 85.2% in the 166 women with high-risk tumours.

Among patients with high-risk tumours, those that were detected in the interval between screenings had a 2.4-fold increased chance of developing distant metastases compared to those whose cancer was detected during screening.

Dr Lopes Cardozo concluded: "Both screen-detected and interval breast cancers have very good eight-year distant metastasis-free interval rates. However, among patients with high-risk tumours as classified by the 70-gene signature, there is a significant difference in these rates between screen-detected and interval cancers. Combining the prognostic information provided by the 70-gene signature and the method of detection can help to choose the best treatment for these patients."

Professor David Cameron, from the University of Edinburgh Cancer Centre, UK, who represents the European Breast Cancer Council at EBCC12, was not involved with the research. He commented: "This study highlights an interesting difference between breast cancers that are detected at the time a woman attends a scheduled appointment as part of a national screening programme (screen detected) and those that are diagnosed in the interval between screenings (interval cancers). It has been previously noted that interval cancers are more likely to be high grade and that would be associated with a poorer outcome, but the novel finding here is that for those cancers identified biologically as being high risk by the 70-gene signature test, the screen detected ones do better than those presenting as interval cancers.

"If these results are confirmed in another series, it would suggest that earlier diagnosis via screening of more biologically aggressive cancers is worthwhile: screen-detecting such cancers may improve patients' survival. The findings also suggest that clinicians should take into account the method of detection as an additional prognostic factor when considering adjuvant therapy, enabling further personalisation of therapy to the individual woman and her cancer. This is as important for low-risk cancers as for high-risk ones. Longer follow-up for low-risk cancers could give us more information as to whether more aggressive treatments could be avoided, as these tumours can recur 15 to 20 years later."

Credit: 
European Organisation for Research and Treatment of Cancer

Personalized cancer therapy improves outcomes in advanced disease, says study

image: Razelle Kurzrock, MD, director of the Center for Personalized Cancer Therapy at Moores Cancer Center.

Image: 
University of California San Diego

Patients receiving care for advanced cancer at Moores Cancer Center at UC San Diego Health were more likely to survive or experience a longer period without their disease progressing if they received personalized cancer therapy, report University of California San Diego School of Medicine researchers.

A multidisciplinary molecular tumor board, led by Razelle Kurzrock, MD, director of the Center for Personalized Cancer Therapy at Moores Cancer Center and senior author of the study, was established to advise treating physicians on the course of care, using an individual patient's molecular tumor makeup to design precision medicine strategies.

"Patients who underwent a molecular tumor board-recommended therapy were better matched to genomic alterations in their cancer and had improved outcomes," said Kurzrock. "The three-year survival for patients with the highest degree of matching and who received a personalized cancer therapy was approximately 55 percent compared to 25 percent in patients who received therapy that was unmatched or had low degrees of matching."

Of 429 patients evaluated by the molecular tumor board, 62 percent were matched to at least one drug, report the researchers in the October 2, 2020 online issue of Nature Communications. Twenty percent of patients matched to all recommended drugs, including combination therapies.

The tumor board acted in an advisory role, and treating physicians chose not to use the board's recommended strategy in 38 percent of cases, opting instead for a standard therapy approach that might have been unmatched to the patient's genetic alterations or had a low degree of matching. These patients experienced lower progression-free survival and overall survival rates.

The use of next-generation sequencing allows for the identification of novel potential targets for patients with cancer to improve outcomes, but there are challenges to using this approach widely, said Shumei Kato, MD, associate professor of medicine at UC San Diego School of Medicine and first author.

"One of the hurdles is that every cancer patient appears to be carrying different molecular and genomic patterns despite having the same cancer type," said Kato, a Moores Cancer Center medical oncologist specializing in rare and gastrointestinal cancers. "This can be challenging since we are customizing therapy based on the unique genomic pattern patients have, and thus it is difficult to predict the response. In addition, this approach requires multidisciplinary expertise as well as access to drugs or clinical trials not always available in smaller practices."

At Moores Cancer Center, the molecular tumor board is composed of experts in basic, translational and clinical research as well as bioinformatics, genetics, radiology, pathology and physicians in multiple specialties such as medical, surgical and radiation oncology.

Further clinical investigations with a larger sample size are necessary to identify the matching score thresholds that determine the usefulness of a precision medicine approach, said the researchers.

Credit: 
University of California - San Diego

Harvesting vegetation on riparian buffers barely reduces water-quality benefits

image: Riparian buffer designs studied included widths of 35 to 100 feet, some all grass, some all trees, and some -- like the one shown -- both trees and grass. On some, the effects of harvesting grass every year and trees every three years were modeled.

Image: 
Rob Brooks/Penn State

Allowing farmers to harvest vegetation from their riparian buffers will not significantly impede the ability of those streamside tracts to protect water quality by capturing nutrients and sediment -- and it will boost farmers' willingness to establish buffers.

That is the conclusion of Penn State College of Agricultural Sciences researchers, who compared the impacts of six riparian buffer design scenarios over two four-year crop rotations in two small central and southeastern Pennsylvania watersheds. Two of the buffer scenarios included the harvesting of switchgrass and swamp willow trees.

Allowing farmers to harvest vegetation from their riparian buffers and sell it for biofuels -- not permitted under current Conservation Reserve Enhancement Program, or CREP, federal regulations -- would go a long way toward persuading farmers to establish riparian buffers, researchers contend. And farmers' buy-in is badly needed in Pennsylvania, where hundreds of miles of new buffers are needed along streams emptying into the Chesapeake Bay to help the state meet water-quality standards.

"This is the first long-term study in the Chesapeake Bay watershed to model how harvesting vegetation affects riparian buffer performance over the full length of a buffer contract," said researcher Heather Preisendanz, associate professor of agricultural and biological engineering. "Allowing harvesting of the buffer vegetation -- either trees or grasses -- minimally impacted water quality, with only slight annual average reductions in the capture of nitrogen, phosphorus and sediment."

In addition, she noted, under the highest input loading conditions -- heavy runoff after storms -- buffers with lower removal efficiencies removed more total mass of pollutants than did buffers with high removal efficiencies, if they were located between streams and fields with row crops such as corn and soybeans. The location of the buffer was most important.

The researchers, who modeled runoff and resulting pollution from agricultural fields reaching the streams, studied riparian buffer performance on Spring Creek in Centre County and Conewago Creek in Lancaster County. Buffer design scenarios studied included 35-foot-wide grass; 50-foot-wide grass; 50-foot-wide deciduous trees; 100-foot-wide grass and trees; 100-foot-wide grass and trees, with trees harvested every three years; and 100-foot-wide grass and trees, with grass harvested every year.

The research team developed these scenarios after considering feedback from focus group meetings with farmers in the two watersheds. Farmers indicated they wanted to be able to install buffers tailored to their properties with the prospect of generating limited revenue.

In the Spring Creek watershed -- which has been studied closely by Penn State agricultural scientists for decades -- 16 years of daily-scale nutrient and sediment loads from three crop rotations and two soils were simulated in a soil and water assessment tool. That data was used as an input to a riparian ecosystem management model used nationally to better understand how a buffer's effectiveness changes as a function of input load, buffer design and buffer management.

The simulation results, recently published in the Journal of Environmental Quality, suggest that for buffers of the same width, the farmer-preferred grass vegetation outperformed policy-preferred vegetation of trees for sediment, nitrogen and phosphorus removal.

The findings of the research have important implications for informing flexible buffer design policies and enhanced placement of buffers in watersheds impaired by nutrient and sediment, Preisendanz explained. She pointed out, however, that more research may be needed to examine tradeoffs between water-quality impacts and other ecosystem services, such as streambank stabilization, habitat and stream shading.

"If incorporated into policy, these findings could remove one barrier to farmer adoption of riparian buffers," she said. "Based on our conversations with famers in focus groups, we think this approach -- government being more flexible with buffer designs and allowing harvesting -- would go a long way toward farmers agreeing to create more riparian buffers."

The state Department of Conservation and Natural Resources currently is promoting "multifunctional" buffers, Preisendanz added. "Our hope is that this work will help to inform tradeoffs of flexible buffer designs and management options in this new program."

Credit: 
Penn State

New COVID test doesn't use scarce reagents, catches all but the least infectious

image: Jason Botten and Emily Bruce, who pioneered a streamlined COVID-19 test that doesn't use scarce chemicals, in their research lab in the University of Vermont's Larner College of Medicine. The machine between them is used to measure the presence and quantity of viral RNA in patient samples.

Image: 
Brian Jenkins

A major roadblock to large scale testing for coronavirus infection in the developing world is a shortage of key chemicals, or reagents, needed for the test, specifically the ones used to extract the virus's genetic material, or RNA.

A team of scientists at the University of Vermont, working in partnership with a group at the University of Washington, has developed a method of testing for the COVID-19 virus that doesn't make use of these chemicals but still delivers an accurate result, paving the way for inexpensive, widely available testing in both developing countries and industrialized nations like the United States, where reagent supplies are again in short supply.

The method for the test, published Oct. 2 in PLOS Biology, omits the step in the widely used reverse transcription polymerase chain reaction (RT-PCR) test where the scarce reagents are needed.

92% accuracy, missing only lowest viral loads

The accuracy of the new test was evaluated by a team of researchers at the University of Washington led by Keith Jerome, director of the university's Molecular Virology Lab, using 215 COVID-19 samples that RT-PCR tests had shown were positive, with a range of viral loads, and 30 that were negative.

It correctly identified 92% of the positive samples and 100% of the negatives.

The positive samples the new test failed to catch had very low levels of the virus. Public health experts increasingly believe that ultra-sensitive tests that identify individuals with even the smallest viral loads are not needed to slow spread of the disease.

"It was a very positive result," said Jason Botten, an expert on pathogenic RNA viruses at the University of Vermont's Larner College of Medicine and senior author on the PLOS Biology paper. Botten's colleague Emily A. Bruce is the paper's first author.

"You can go for the perfect test, or you can use the one that's going to pick up the great majority of people and stop transmission," Botten said. "If the game now is focused on trying to find people who are infectious, there's no reason why this test shouldn't be front and center, especially in developing countries where there are often limited testing programs because of reagent and other supply shortages."

Skipping a step

The standard PCR test has three steps, while this simpler version of the standard test has only two, Botten said.

"In step 1 of the RT-PCR test, you take the swab with the nasal sample, clip the end and place it in a vial of liquid, or medium. Any virus on the swab will transfer from the swab into the medium," he said. "In step 2, you take a small sample of the virus-containing medium and use chemical reagents, the ones that are often in short supply, to extract the viral RNA. In step 3, you use other chemicals to greatly amplify any viral genetic material that might be there. If virus was present, you'll get a positive signal."

The new test skips the second step.

"It takes a sample of the medium that held the nasal swab and goes directly to the third, amplification step," Botten said, removing the need for scarce RNA extraction reagents as well as significantly reducing the time, labor and costs required to extract viral RNA from the medium in step 2.

Botten said the test is ideally suited to screening programs, in both developed and developing countries, since it is inexpensive, takes much less processing time and reliably identifies those who are likely to spread the disease.

Its low cost and efficiency could extend testing capacity to groups not currently being tested, Botten said, including the asymptomatic, nursing home residents, essential workers and school children. The standard RT-PCR test could be reserved for groups, like health care workers, where close to 100% accuracy is essential.

An influential pre-print points way to widespread adoption of test

The two-step test developed by the University of Vermont team first caught the attention of the scientific community in March, when preliminary results that accurately identified six positive and three negative Vermont samples were published as a preprint in bioRxiv, an open access repository for the biological sciences. The preprint was downloaded 18,000 times -- in its first week, it ranked 17th among 15 million papers the site had published -- and the abstract was viewed 40,000 times.

Botten heard from labs around the world who had seen the preprint and wanted to learn more about the new test.

"They said, 'I'm from Nigeria or the West Indies. We can't test, and people's lives are at stake. Can you help us?'"

Botten also heard from Syril Pettit, the director of HESI, the Health and Environmental Sciences Institute, a non-profit that marshals scientific expertise and methods to address a range of global health challenges, who had also seen the preprint.

Pettit asked Botten to join a think tank of likeminded scientists she was organizing whose goal was to increase global testing capacity for COVID-19. The test developed by the University of Vermont and University of Washington teams would serve as a centerpiece.

To catalyze a global response, the group published a call to action in EMBO Molecular Medicine.

And it took action, reaching out to 10 laboratories in seven countries, including Brazil, Chile, Malawi, Nigeria and Trinidad/Tobago, as well as the U.S. and France, to see if they would be interested in giving the two-step test a trial run.

"Universally, the response was yes," Pettit said.

The outreach led to a new HESI program called PROPAGATE. Each of the labs in the PROPAGATE Network will use the two-step test on a series of positive and negative samples sent to them by the University of Washington to see if they can replicate the results the university achieved.

The study has already shown promising results. One of the labs in Chile has also used the test on its own samples from the community and got accurate results.

Assuming all goes well, Pettit and her colleagues at the University of Vermont and the University of Washington as well as scientists from the 10 partner sites plan to publish the results.

"The goal is the make the two-step test accessible to any lab in the world facing these hurdles and see a broad uptake," she said.

Credit: 
University of Vermont

Subsidized cars help low-income families economically, socially

ITHACA, N.Y. - For one low-income woman, not having a car meant long commutes on public transit with her children in tow, sometimes slogging through cold or inclement weather. But after buying a subsidized car through a Maryland-based nonprofit, she was able to move to a home located farther from bus stops, send her children to better schools and reach less expensive medical services.

"So many different things open up to a person that is mobile," the woman told Nicholas Klein, assistant professor of city and regional planning at Cornell University.

In "Subsidizing Car Ownership for Low-Income Individuals and Households," published in the Journal of Planning Education and Research, Klein reports insights from interviews with 30 people who gained access to inexpensive, reliable cars through the nonprofit Vehicles for Change (VFC).

He found that the cars conferred wide-ranging benefits, not only shortening commutes and opening opportunities for higher-paying jobs, but also dramatically improving quality of life. The recipients of subsidized cars spent more time with family, visited doctors they preferred, shopped for groceries more efficiently, attended more school events and enrolled kids in previously inaccessible after-school enrichment programs.

"For a lot of families, it's a really transformative moment that allows them to move up the economic ladder, to access all sorts of sort of social benefits and to just make their lives easier," Klein said of the access to subsidized cars. "It permeated everyone's lives in all sorts of different ways."

Transportation planners and scholars have debated subsidizing car ownership for decades, and VFC, which has provided more than 6,000 cars in Maryland and Virginia since 1999, is one of only a handful of such programs across the country. Critics say subsidizing cars on a large scale would exacerbate environmental pollution, traffic congestion and sprawl, and impose new cost burdens on car owners.

Klein said his research took a longer, more nuanced view that suggested such answers are "not so clear-cut." Beyond interviewees' experiences with a subsidized car, he also learned about their personal and car-ownership histories.

Most had owned cars before and planned to purchase cars again, typically through used car dealers that Klein called "pernicious." The interviewees had typically paid significantly more for used cars that were less reliable than those provided by VFC, which cost less than $1,000 and passed thorough inspections (through a job training program for formerly incarcerated individuals).

Considering that context, Klein said, scholars and policymakers should be asking not only about the benefits and consequences of having a car, but about the consequences of not making subsidized car ownership available to low-income families.

"What I see is that a lot of low-income households are going out and spending quite a bit more on unreliable used cars, and those cars may be polluting much more," he said.

Klein concluded that subsidized car ownership should be implemented more broadly, along with complementary programs providing subsidized repairs or replacement of older, more polluting and less efficient cars.

Such programs shouldn't come at the expense of longer-term investments in public transit and infrastructure expanding alternatives to cars, Klein said. But that infrastructure takes time to build and can't support everyone living in suburban or rural areas.

"In the meantime, these families are struggling, and we can think about ways to help them while also investing in high-quality public transit, and biking and walking infrastructure," Klein said.

Klein said his research relying on interviews proved valuable in a transportation field that emphasizes quantitative methods - for example, to measure economic outcomes such as how car ownership affects income or employment.

"When we only do that, we miss a lot of important nuance and details and we miss people's voices and stories," he said. "Qualitative research lets us understand the broader scope of effects that we might miss if we only rely on what's in the data, allowing us to see a broader range of possibilities."

Credit: 
Cornell University

Yan report's claims that SARS-CoV-2 was created in a Chinese lab are misleading, unethical

CAMBRIDGE, MA - September 30, 2020--The MIT Press journal Rapid Reviews: COVID-19 (RR:C19) has openly published the first official scholarly peer reviews of pre-print research from Li-Meng Yan, Shu Kang, Jie Guan, and Shanchang Hu that claims to show that unusual features of the SARS-CoV-2 genome suggest sophisticated laboratory modification rather than natural evolution. Reviewers Robert Gallo, Takahiko Koyama, and Adam Lauring rate the study as misleading and write that the "manuscript does not demonstrate sufficient scientific evidence to support its claims."

Find peer reviews and information about this study at the Rapid Reviews website.

While this research has been widely debunked in popular media, scholarly peer review represents a different type of rebuke from the scientific community. The original study was posted on a public pre-print server without the benefit of peer review--a necessary part of the scientific publishing process in which scientists review one another's work, vetting research for accuracy and evaluating methods and evidence. Pre-prints enable researchers to share information more quickly, but they have created a need for rapid and transparent peer review to correct misinformation about COVID-19 and to minimize the influence of unverified research.

"While pre-print servers offer a mechanism to disseminate world-changing scientific research at unprecedented speed, they are also a forum through which misleading information can instantaneously undermine the international scientific community's credibility, destabilize diplomatic relationships, and compromise global safety," explains the RR:C19 Editorial Office.

RR:C19 was launched in June 2020 to provide rapid and transparent peer review of COVID-19 pre-prints. When the 'Yan Report' was published in September, RR:C19 quickly sought out peer reviews from world-renowned experts in virology, molecular biology, structural biology, computational biology, vaccine development, and medicine.

These reviews are now openly published, along with a response from the RR:C19 Editorial Office, that states, "Collectively, reviewers have debunked the authors' claims that: (1) bat coronaviruses ZC45 or ZXC21 were used as a background strain to engineer SARS-CoV-2, (2) the presence of restriction sites flanking the RBD suggest prior screening for a virus targeting the human ACE2 receptor, and (3) the furin-like cleavage site is unnatural and provides evidence of engineering. In all three cases, the reviewers provide counter-arguments based on peer-reviewed literature and long-established foundational knowledge that directly refute the claims put forth by Yan et al. There was a general consensus that the study's claims were better explained by potential political motivations rather than scientific integrity."

Reviewer Dr. Robert Gallo, biomedical researcher and co-founder of The Institute of Human Virology
Evidence Scale Rating: Misleading

"Widely questionable, spurious, and fraudulent claims are made throughout the paper about the thought-to-be precursor of SARS-2, RaTG13, found in bat caves. The author's attacks include quotes which have not been referenced, including how this 'has been disputed and its truthfulness widely questioned. Soon a paper proving that will be submitted.' She then goes on to attack several genome sequences as fraudulent, ranging from pangolin coronaviruses to bat coronaviruses, again without evidence. The reference she cites for that, in fact, does not make that claim."

Reviewer Dr. Takahiko Koyama, IBM Research, Computational Biology Center
Evidence Scale Rating: Misleading

"[The] authors' speculation of furin cleavage insert PRRA in spike protein seemed quite interesting at first. Nevertheless, recently reported RmYN02 (EPI_ISL_412977), from a bat sample in Yunnan Province in 2019, has PAA insert at the same site[2]. While the authors state that RmYN02 is likely fraudulent, there are no concrete evidences to support the claim in the manuscript. In addition, argument of codon usage of arginine in PRRA is not convincing since these are likely derived from some kind of mobile elements in hosts or other pathogens. Further investigations are necessary to unravel the mystery of the PRRA insert. For these reasons, we conclude that the manuscript does not demonstrate sufficient scientific evidences to support genetic manipulation origin of SARS-CoV-2."

Reviewer Dr. Adam Lauring, University of Michigan, Internal Medicine
Evidence Scale Rating: Misleading

"A key aspect of research ethics and the responsible conduct of research is to include information on who supported the work - financially or otherwise. The authors' affiliation is the "Rule of Law Society & Rule of Law Foundation." It is not clear who supports this Foundation or what its purpose is. It is important for there to be transparency regarding research support, especially for a manuscript that is based on conjecture as opposed to data or empiricism. It is also unethical to promote what are essentially conspiracy theories that are not founded in fact."

Credit: 
The MIT Press

Effect of avoiding cow's milk formula at birth on preventing asthma in children

What The Study Did: Extended follow-up of randomized clinical trial participants was used to investigate whether the risk of asthma or recurrent wheeze among young children was changed by avoiding supplementing breastfeeding with cow's milk formula after birth.

Authors: Mitsuyoshi Urashima, M.D., Ph.D., M.P.H., of the Jikei University School of Medicine in Tokyo, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2020.18534)

Editor's Note: The article includes conflicts of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

New research on cataract surgery aims to improve health care

image: Prof. Madeleine Zetterberg, University of Gothenburg.

Image: 
Photo by Anna-Maria Timbus

In general, surgeons who perform numerous cataract operations every year encounter relatively few severe cases, and this probably contributes to their lower complication rate, as shown by a study led by researchers at the University of Gothenburg. These results provide new knowledge in the endeavor to further improve healthcare for a large group of patients.

Cataract surgery is the most frequent surgical intervention in Sweden, carried out some 130,000 times annually. During the surgery, which takes about ten minutes, the cloudy lens in the patient's own eye is replaced by an implanted artificial one made of plastic ("intraocular lens").

Data from the Swedish National Cataract Register, where these operations are recorded, enabled the researchers to analyze surgical outcomes in the years 2007-2016 at an individual level -- that is, for each surgeon. Although the surgeons' identities were coded, the results provide new knowledge and understanding.

A previous study showed that surgeons who performed 500 or more cataract operations a year had the best outcomes, in terms of an established quality measure: the proportion of patients suffering the complication known as "posterior capsule rupture." This means that a hole forms in the capsule that surrounds the old lens and is used as support for the new one.

In the current study, now published in the journal Ophthalmology, the researchers have proceeded to investigate how much case mix -- that is, how many cases of a more severe nature are treated by individual surgeons -- affects the outcome.

Madeleine Zetterberg, Professor of Ophthalmology at Sahlgrenska Academy, University of Gothenburg, and senior consultant at the University Hospital, is the lead and corresponding author.

"We saw that although the high-volume surgeons, too, had severe surgical cases, that proportion was lower than for the other surgeons. So an overwhelming proportion of the high-volume surgeons' operations were simple cases, which may contribute to their lower incidence of complications," she says.

Zetterberg and her fellow authors belong to the steering group for the Swedish National Cataract Register. Since data on operations and complications for each individual surgeon began to be registered in 2007, the proportion of high-volume surgeons has risen successively.

"Surgeons need to be aware of how operation volumes affect outcomes. A cataract operation is mostly a routine intervention, and the complication rate is generally low. But it's important to remember that complications do occur and they may be serious."

Since 2015, a majority of cataract procedures in Sweden have been performed at private clinics, to which the work is outsourced by the regional authorities to meet the high level of need. In the next study, Zetterberg will look more closely at the role of the type of clinical unit -- whether under private or public management -- in terms of operation volume, incidence of complications and degree of severity among the cases dealt with.

"This is vital information for us to learn how best to shape healthcare so as to ensure both high accessibility and good patient safety," she concludes.

Credit: 
University of Gothenburg

How speech propels pathogens

image: Production of saliva filaments on the lips.

Image: 
M. Abkarian and H.A. Stone

Speech and singing spread saliva droplets, a phenomenon that has attracted much attention in the current context of the Covid-19 pandemic. Scientists from the CNRS, l'université de Montpellier, and Princeton University* sought to shed light on what takes place during conversations. A first study published in PNAS revealed that the direction and distance of airflow generated when speaking depend on the sounds produced. For example, the accumulation of plosive consonants, such as the "P" in "PaPa," produces a conical airflow that can travel up to 2 metres in 30 seconds. These results also emphasize that the time of exposure during a conversation influences the risk of contamination as much as distance does. A second study published on 2 October in the journal Physical Review Fluids describes the mechanism that produces microscopic droplets during speech: saliva filaments form on the lips for the consonants P and B, for example, and are then extended and fragmented in the form of droplets. This research is being continued with the Metropolitan Opera Orchestra ("MET Orchestra") in New York, as part of a project to identify the safest conditions for continuing this prestigious orchestra's activity.

Credit: 
CNRS

Tweaks to land-based conservation efforts would pay huge freshwater ecosystem dividends

image: Flona Tapajós, Pará, Brasil by Marizilda Cruppe - Rede Amazônia Sustentável (6)

Image: 
Rede Amazônia Sustentável (6)

CORVALLIS, Ore. - Conservation projects aimed at protecting land-dwelling species could net major gains in helping species living in streams, lakes and wetlands with relatively minor adjustments, an international research collaboration that included Oregon State University has discovered.

Published today in Science, the findings are important because freshwater ecosystems host roughly 10% of all known species and one-third of all vertebrates despite comprising less than 1% of the Earth's surface.

Streams, ponds, lakes, rivers and swamps also play a key role in climate regulation and in providing food and fuel for communities around the globe.

Due to human-caused pressures over the last half-century - including habitat loss, overexploitation of resources, dam building and introduction of non-native species - freshwater vertebrates have seen their populations fall by about 80%, more than double the decline for marine and terrestrial vertebrate populations.

Climate change and pollution are exacerbating the problem, and scientists say new conservation approaches are needed to save freshwater ecosystems and the species that live in them. But typically, conservation efforts have focused much more heavily on land than water.

Bob Hughes, senior research professor of fisheries and wildlife in OSU's College of Agricultural Sciences, was part of a collaboration led by scientists at Brazil's University of São Paulo and University of Lavras and Britain's Lancaster University that analyzed more than 1,500 species in the Amazon.

The scientists looked at both terrestrial and aquatic species - including fish, dragonflies, caddisflies, birds and beetles - and ran simulations of various conservation strategies.

"When the strategies prioritized only terrestrial species, the benefit to freshwater species was on average 22% of what it was for strategies that prioritized freshwater species," Hughes said. "But when we used a joint focus, it was possible to increase freshwater benefits by as much as 600% at the expense of just a 1% drop in terrestrial benefits."

Conservation projects have generally focused on protecting species that live on land, said co-lead author Cecilia Gontijo Leal from the University of São Paulo and University of Lavras, and if freshwater species are considered at all, it is assumed that they will be protected incidentally - as a by-product of efforts to conserve land species.

"But we found that to address the freshwater biodiversity crisis, freshwater species need to be explicitly incorporated into conservation planning," she said.

Co-lead author Gareth Lennox of Lancaster University said the findings illustrate "a great opportunity for conservation, where protection for one species group does not require either loss of protection for others or significant funding increases."

Because conservation energy has more often been directed at land species, information is somewhat lacking on the distribution of freshwater species, the scientists say. That means one challenge in protecting those species is not necessarily knowing where they are, particularly in tropical regions.

But because a critical factor for freshwater conservation is the concept of connectivity - the surface links between lakes, wetlands and streams - the researchers developed a new method for protecting freshwater species.

"Freshwater species crucially depend on the connectivity of river systems," said Silvio Ferraz of the University of São Paulo. "By designing conservation reserve networks that take such connectivity into consideration, we found that freshwater protection could still be doubled in the absence of species distribution data. This shows that there are few impediments to vastly improving freshwater conservation in data-poor regions of the world."

Jos Barlow from Lancaster University stressed that the urgency of the biodiversity crisis facing humanity means that the many important and endangered freshwater species can no longer be overlooked.

"Our findings show that conservation that thinks across ecosystems and habitats can provide substantially improved outcomes compared to more narrowly focused efforts," Barlow said.

Credit: 
Oregon State University

New model examines how societal influences affect US political opinions

EVANSTON, Ill. -- Northwestern University researchers have developed the first quantitative model that captures how politicized environments affect U.S. political opinion formation and evolution.

Using the model, the researchers seek to understand how populations change their opinions when exposed to political content, such as news media, campaign ads and ordinary personal exchanges. The math-based framework is flexible, allowing future data to be incorporated as it becomes available.

"It's really powerful to understand how people are influenced by the content that they see," said David Sabin-Miller, a Northwestern graduate student who led the study. "It could help us understand how populations become polarized, which would be hugely beneficial."

"Quantitative models like this allow us to run computational experiments," added Northwestern's Daniel Abrams, the study's senior author. "We could simulate how various interventions might help fix extreme polarization to promote consensus."

The paper will be published on Thursday (Oct. 1) in the journal Physical Review Research.

Abrams is an associate professor of engineering sciences and applied mathematics in Northwestern's McCormick School of Engineering. Sabin-Miller is a graduate student in Abrams' laboratory.

Researchers have been modeling social behavior for hundreds of years. But most modern quantitative models rely on network science, which simulates person-to-person human interactions.

The Northwestern team takes a different, but complementary, approach. They break down all interactions into perceptions and reactions. A perception takes into account how people perceive a politicized experience based on their current ideology. A far-right Republican, for example, likely will perceive the same experience differently than a far-left Democrat.

After perceiving new ideas or information, people might change their opinions based on three established psychological effects: attraction/repulsion, tribalism and perceptual filtering. Northwestern's quantitative model incorporates all three of these and examines their impact.

"Typically, ideas that are similar to your beliefs can be convincing or attractive," Sabin-Miller said. "But once ideas go past a discomfort point, people start rejecting what they see or hear. We call this the 'repulsion distance,' and we are trying to define that limit through modeling."

People also react differently depending on whether or not the new idea or information comes from a trusted source. Through this effect, known as tribalism, people tend to give the benefit of the doubt to a perceived ally. In perceptual filtering, people -- either knowingly through direct decisions or unknowingly through algorithms that curate content -- determine what content they see.

"Perceptual filtering is the 'media bubble' that people talk about," Abrams explained. "You're more likely to see things that are consistent with your existing beliefs."

Abrams and Sabin-Miller liken their new model to thermodynamics in physics -- treating individual people like gas molecules that distribute around a room.

"Thermodynamics does not focus on individual particles but the average of a whole system, which includes many, many particles," Abrams said. "We hope to do the same thing with political opinions. Even though we can't say how or when one individual's opinion might change, we can look at how the whole population changes, on average."

Credit: 
Northwestern University

Medicine for multiple sclerosis patients inhibits coronavirus - at least in a test tube

An antiviral medication that effectively inhibits replication of the coronavirus causing COVID-19 and, at the same time, fights the immune reaction that is killing COVID-19 patients around the world.

This is the hope of a group of researchers headed by Christian Kanstrup Holm and David Olagnier, who are behind a newly published study in the journal Nature Communications. The study shows that a drug called dimethyl fumarate (DMF), which is approved for the treatment of multiple sclerosis patients, inhibits the growth of a range of viruses in the body's cells and that this includes the coronavirus (SARS-CoV-2) - at least when the researchers test it in a test tube.

"As we're doing basic research, we obviously don't know whether the drug works on infections in humans, and it's up to the infectious disease experts to test for this. However, I have to say that I'm very optimistic," says Christian Kanstrup Holm, who, like his colleague David Olagnier, is associate professor at the Department of Biomedicine at Aarhus University, Denmark.

The research behind the current results has been underway for some time. When the pandemic struck, Christian Kanstrup Holm and his colleagues were in the process of testing the effects of a drug virtually identical to a particular sclerosis medicine, a substance called 4-octyl-itaconate, against e.g. the herpes virus, the smallpox virus (vaccinia virus) and the zika virus, which is known to cause foetal defects - all as part of the hunt for a broad-spectrum antiviral medication. And their testing succeeded beyond expectations.

"Then the coronavirus suddenly appeared, which we therefore also tested, and saw an enormous effect. The number of duplications that the coronavirus makes of itself in the body's cells were simply drastically reduced," explains Christian Kanstrup Holm.

"At the same time, the drug inhibited the immune reaction or inflammatory condition that constitutes a large portion of the actual threat for coronavirus patients. People don't just die of the virus in itself, but also of the inflammation that occurs in the lungs," he says.

When the research group saw the encouraging results with 4-octyl-itaconate, they repeated the tests with a corresponding approved product, dimethyl fumarate (DMF), which showed virtually the same inhibitory effect.

This means that the effect of dimethyl fumarate (DMF) can be tested on COVID-19 patients 'here and now', if clinicians in Denmark or abroad - and the company that holds the patent - are prepared to test it in human trials.

"You can really save a lot of time when you're testing a medication that has already been approved and tested in another context," says Christian Kanstrup Holm with reference to the statutory phases involved in getting a medication approved from scratch.

"I'd really likely to be attached to this type of clinical trial if there are infectious disease researchers who assess that the result is worth proceeding with. As a basic researcher, I have neither access to patients, nor am I qualified to conduct clinical testing," he adds.

Credit: 
Aarhus University

Nitric oxide a possible treatment for COVID-19

Researchers at Uppsala University have found that an effective way of treating the coronavirus behind the 2003 SARS epidemic also works on the closely related SARS-CoV-2 virus, the culprit in the ongoing COVID-19 pandemic. The substance concerned is nitric oxide (NO), a compound with antiviral properties that is produced by the body itself. The study is published in the journal Redox Biology.

"To our knowledge, nitric oxide is the only substance shown so far to have a direct effect on SARS-CoV-2," says Åke Lundkvist, a professor at Uppsala University, who led the study.

Since there is still no effective cure for COVID-19, the main emphasis in the treatments tested has been on relieving symptoms. This can shorten hospital stays and reduce mortality. To date, however, it has not been possible to prove that any of these treatments has affected the actual virus behind the infection.

Nitric oxide (NO) is a compound produced naturally in the body. Its functions include acting like a hormone in controlling various organs. It regulates, for example, tension in the blood vessels and blood flow between and within organs. In acute lung failure, NO can be administered as inhaled gas, in low concentrations, to boost the blood-oxygen saturation level. During the SARS (severe acute respiratory syndrome) coronavirus epidemic of 2003, this therapy was tried out with success. One key reason for the successful results was that inflammation in the patients' lungs decreased. This property of nitric oxide - the protection it affords against infections, by being both antibacterial and antiviral - is the very one that now interests the researchers.

Their study builds further on a discovery about the coronavirus that caused the first SARS epidemic. In 2003, NO released from S-Nitroso-N-acetylpenicillamine (SNAP) proved to have a distinct antiviral effect. The researchers from Uppsala University and Karolinska Institute have now investigated how the novel coronavirus involved in the current pandemic, SARS-CoV-2, reacts to the compound. And SNAP was shown to have a clear antiviral effect on this virus, too - an effect that grew stronger as the dose was raised.

"Until we get a vaccine that works, our hope is that inhalation of NO might be an effective form of treatment. The dosage and timing of starting treatment probably play an important part in the outcome, and now need to be explored as soon as possible," Åke Lundkvist says.

The research group are now planning to proceed by investigating the antiviral effects of NO emitted in gas form. To do so, they will construct a model in the laboratory in order to safely simulate a conceivable form of therapy for patients.

Credit: 
Uppsala University

Research shows cell perturbation system could have medical applications

image: This image depicts the delivery/sampling system.

Image: 
Northwestern McCormick School of Engineering

Cell lines injected with free nucleic acid are widely used for drug discovery and disease modeling. To avoid genetically mixed cell populations, investigators use dilution techniques to select single cells that will then generate identical lines. However, the route of limiting dilutions is tedious and time-consuming.

A new study by Northwestern researchers shows how Nanofountain Probe Electroporation (NFP-E), a tool that delivers molecules into single-cells, could solve that issue, and could lead to new applications for drug screening and designing patient-specific courses of treatment.

The team, led by Northwestern Engineering's Horacio Espinosa and including Joshua Leonard, demonstrates the versatility of NFP-E -- which introduces DNA or RNA into cells using electricity. It can also deliver both proteins and plasmids in a variety of animal and human cell types with dosage control. The team included John Kessler, the Ken and Ruth Davee Professor of Stem Cell Biology and professor of neurology and pharmacology at the Northwestern University Feinberg School of Medicine.

The new method can be used to study disease or for cell therapy. In the former, the genome is manipulated. In the latter, gene-editing occurs in cells such as T-cells to treat cancer with immunotherapies.

By employing single-cell electroporation -- the process of introducing DNA or RNA into single cells using a pulse of electricity that briefly opens pores in the cell membrane -- their work shows how NFP-E achieves fine control over the relative expression of two co-transfected plasmids. Moreover, by pairing single-cell electroporation with time-lapse fluorescent imaging, their investigation reveals characteristic times for electro-pore closure.

"We demonstrated the potential of the NFP-E technology in manipulating a variety of cell types with stoichiometric control of molecular cargo that can be used for conducting a wide range of studies in drug screening, cell therapies, and synthetic biology," said Espinosa, James N. and Nancy J. Farley Professor in Manufacturing and Entrepreneurship and professor of mechanical engineering and (by courtesy) biomedical engineering and civil and environmental engineering.

Currently, biomolecules can be delivered into cells in numerous ways: viral vectors; chemical carriers, such as cell-penetrating peptides and polymer nano-capsules; Lipofectamine; and bulk electroporation.

"There exist a number of strategies for delivering biomolecules into cells, but each has its limitations," said Leonard, associate professor of chemical and biological engineering and Charles Deering McCormick Professor of Teaching Excellence. "For instance, chemical carriers confer relatively slow delivery and can be toxic to the cell; viral vectors are often efficient but can induce adverse immune responses and insertional genotoxicity. Use of any traditional method often requires substantial effort to optimize the protocol depending on the cell type and molecule to be delivered, and, therefore, a readily generalizable biomolecule delivery strategy would offer some meaningful advantages."

The new NFP-E system enables single-cell delivery of DNA, RNA, and proteins into different immortalized cell lines as well as primary cells with more than 95 percent efficiency and more than 90 percent cell viability.

"The results indicate that the cell membrane resealing time scales non-linearly with the pulse voltage and the number of electroporation pulses, reaching a maximum at intermediate values," Espinosa said. "That means long pulsing times or high voltages appear not to be necessary for efficient molecular transport across cell membranes. That feature is important in obtaining high transport efficiency while keeping cell toxicity to a minimum."

Using single-cell electroporation technology, the researchers were able to understand transport mechanisms involved in localized electroporation-based cell sampling. One obstacle to nondestructive temporal single-cell sampling is the small amounts of cytosol -- the fluid inside cells -- that are extracted, which makes it challenging to test or detect RNA sequences or proteins.

Research showed that the scaling of membrane resealing time is a function of various electroporation parameters, providing insight into post-pulse electro-pore dynamics.

"The work addresses the need to understand ways to increase the cytosol-sampled amount, without adversely affecting cells," Espinosa said. "That can guide the research community in designing experiments aimed at electroporation-based sampling of intracellular molecules for temporal cell analysis."

This research is related to previous work that developed a minimally invasive method to sample cells that can be repeated multiple times. That earlier investigation, which used electric pulses to extract enzymes from the cytosol, assisted understanding of the kinetics of pore formation and closure.

Credit: 
Northwestern University