Body

Head to head comparison of five assays used to detect SARS-CoV-2 antibodies shows Siemens and Oxford assays met regulatory targets

New research being presented at the ESCMID Conference on Coronavirus Disease (ECCVID, online 23-25 September) shows that, in a head-to-head comparison of five tests used to detect SARS-CoV-2 antibodies (known as 'immunoassays'), an assay manufactured by Siemens and one developed by an academic partnership led by the University of Oxford had the most accurate results. The study is published in The Lancet Infectious Diseases, as part of a special ECCVID session featuring The Lancet journals.

Testing for SARS-CoV-2 antibodies can help establish how many people have been infected with SARS-CoV-2, and how people respond to vaccines being evaluated in research studies. The presence of antibodies may also correlate with protective immunity against SARS-CoV-2 re-infection, although this remains to be clearly demonstrated.

Several manufacturers have developed SARS-CoV-2 antibody immunoassays compatible with global laboratory infrastructures, enabling widespread testing of hundreds to thousands of samples per day. Understanding the performance of these tests is highly relevant to optimising their usage. The scale-up required for regular population-wide testing (e.g., every few weeks or months) might exceed the capacity of currently available commercial platforms, and additional, accurate, high-throughput tests would be of value.

To date, few thorough, direct assessments of immunoassay performance on large sample sets have been done, and governments, regulators, and clinical laboratories have had to balance the urgent need to facilitate the demand for serological testing with the few data available on assay performance. This has led to a relaxation of typical assessment criteria in the regulation and approval of tests on the market.

This study, carried out by The National SARS-CoV-2 Serology Assay Evaluation Group, a team of researchers and scientists collaborating across several UK institutions including Public Health England (Porton Down), involved a head-to-head assessment of four widely available commercial assays: the SARS-CoV-2 IgG assay (Abbott, Chicago, IL, USA), LIAISON SARS-CoV-2 S1/S2 IgG assay (DiaSorin, Saluggia, Italy), Elecsys Anti-SARS-CoV-2 assay (Roche, Basel, Switzerland), SARS-CoV-2 Total assay (Siemens, Munich, Germany); and a novel 384-well assay (the Oxford immunoassay). The study calculated the sensitivity (the ability of a test to correctly identify those with SARS-CoV-2 antibodies or 'true positive' rate) and the specificity (the ability of the test to correctly identify those without SARS-CoV-2 antibodies or 'true negative' rate).

Sensitivity and specificity were calculated by testing 976 pre-pandemic blood samples (collected several years before the SARS-CoV-2 pandemic started, and therefore known to be negative for SARS-CoV-2 antibodies) and 536 blood samples from patients with laboratory-confirmed SARS-CoV-2 infection (by RT-PCR), collected at least 20 days post symptom onset. This was in line with the UK Medicines and Healthcare products Regulatory Agency (MHRA) guidance on how these tests should be evaluated.

Using the tests exactly as specified by the manufacturers, the best results were delivered by the Siemens assay (sensitivity 98·1% / specificity 99·9%) and the Oxford immunoassay (sensitivity 99·1% / specificity 99·0%). For the Abbott assay sensitivity was 92·7% and specificity was 99·9%; for the DiaSorin assay sensitivity was 95·0% and specificity was 98·7%; for the Roche assay sensitivity was 97·2% and specificity was 99·8%. The researchers also found that changing the assay thresholds (i.e. the test value distinguishing between a 'positive' and a 'negative' test result) and using them on samples taken 30 days or more post-symptom onset (i.e. allowing more time for antibody responses to develop in affected individuals) could result in improved test performance.
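The sensitivity and specificity reported above reduce to simple proportions of true positives and true negatives. A minimal sketch of that arithmetic, using hypothetical confusion counts chosen only to reproduce figures of the same order as the Siemens result (these are not the study's raw data):

```python
# Sensitivity and specificity from a binary test's confusion counts.
# The counts below are hypothetical illustrations, not the study's data.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Share of truly infected samples the assay flags positive."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Share of truly negative (pre-pandemic) samples flagged negative."""
    return true_neg / (true_neg + false_pos)

# e.g. an assay that calls 526 of 536 PCR-confirmed samples positive
# and 975 of 976 pre-pandemic samples negative:
sens = sensitivity(true_pos=526, false_neg=10)
spec = specificity(true_neg=975, false_pos=1)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
# → sensitivity 98.1%, specificity 99.9%
```

Moving an assay's positivity threshold shifts samples between these cells, which is why threshold adjustment can trade sensitivity against specificity to meet a regulatory target.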

"By running all the assays on the same large panel of blood samples, we showed that the Siemens assay and the Oxford immunoassay both achieved sensitivity and specificity of at least 98% on samples taken at least 20 days post symptom onset, in line with the current MHRA guidance for the regulatory approval of these tests.

However, all assays could potentially achieve these specifications through threshold adjustment, or by assessing samples collected at least 30 days post symptom onset, consistent with the time-dependent nature of antibody responses," explain the authors, who include Dr Nicole Stoesser, a clinician-scientist from the Nuffield Department of Medicine at the University of Oxford, UK.

She adds: "There is no such thing as a 'perfect test', but accurately evaluating how these tests perform can help us understand their limitations and improve how they are used. Importantly, consideration needs to be given to how many false-positive and false-negative results might occur with any given test; this depends on both the test performance, and how many people in the population being tested genuinely have SARS-CoV-2 antibodies. Overall, however, our study supports the fact that global serology testing needs can be met using different assays, mitigating against the risk of shortages, and allowing deployment in laboratories with different analysers already installed for other testing purposes."

However, she cautions: "Although all these assays can effectively detect SARS-CoV-2 antibodies, the nature and durability of any immunity conferred by these antibodies remain unclear."

She concludes: "This study represents a benchmark for future assessments of serological tests. New tests should be similarly rigorously evaluated. Such assays will be an important part of the clinical and research landscape in guiding public health policy, with effects to be delivered at the individual level and population level."

Credit: 
European Society of Clinical Microbiology and Infectious Diseases

Study suggests elderly care home outbreaks in England were caused by multiple independent infections and also within-home spread

New research presented at this week's ESCMID Conference on Coronavirus Disease (ECCVID, held online 23-25 September) shows that outbreaks of COVID-19 in elderly care homes were caused by multiple independent infections from outside, plus within care home spread. There is also evidence of transmission between residents and healthcare workers, including paramedics, possibly linking care home outbreaks to hospital outbreaks (though the direction of transmission between individuals could not be confirmed). The study is by Dr William Hamilton, Cambridge University Hospitals NHS Foundation Trust, Cambridge, UK, and colleagues.

COVID-19 poses a major challenge to infection control in care homes. SARS-CoV-2 is readily transmitted between people in close contact and causes disproportionately severe disease in older people. Understanding the burden and transmission dynamics of COVID-19 in care home residents is therefore a public health priority.

In this study, data and SARS-CoV-2 samples were collected from patients in the East of England between 26th February and 10th May 2020, and tested at the Cambridge Public Health England Clinical Microbiology Laboratory, UK. Care home residents were identified using address search terms and Care Quality Commission registration information. Samples were genetically sequenced at the University of Cambridge or the Wellcome Sanger Institute, and viral clusters were identified within each care home based on integrated genomic and temporal differences between cases.

A total of 7,406 SARS-CoV-2 positive samples from 6,600 patients were identified, of which 1,167 (18%) were residents from 337 different care homes. 40% of the care home residents tested acutely at Cambridge University Hospitals NHS Foundation Trust (CUH) died, roughly double the unadjusted mortality rate for non-care home residents.

Genetic sequences (genomes) were available for 700/1,167 (60%) residents from 292 care homes, and 409 distinct viral clusters were defined. The largest clusters comprised more than 10 samples from the same care home, consistent with care home COVID-19 outbreaks.

"Care homes with multiple clusters suggested multiple independent viral acquisitions among residents," say the authors. "We also identified several probable transmissions between care home residents and healthcare workers, based both in the community (carers and paramedics) and the hospital, suggesting a potential link between care home-associated and healthcare-associated COVID-19 infections."

They conclude: "We present a large genomic epidemiology study of care home-associated COVID-19 infections in the UK. Care home residents had a significant burden of COVID-19 infections and high mortality. Larger viral clusters suggested within-care home outbreaks, while multiple clusters per care home suggested independent acquisitions. Integrated genomic and epidemiological data collected at scale can provide valuable insights into SARS-CoV-2 transmission dynamics; in future, such analyses could be used for targeting public health responses."

Credit: 
European Society of Clinical Microbiology and Infectious Diseases

New study: Face-covering use up, more people are taking COVID-19 threats seriously

ORLANDO, Sept. 23, 2020 - A new National Science Foundation-funded survey of six states has found that during the past two months, more people are wearing masks, vaccine uncertainty is on the rise, and many people are overestimating their risk of becoming seriously ill and dying from COVID-19.

The results are in a new report published this month by the Risk and Social Policy Group, a team of more than 15 scholars across the country that includes University of Central Florida associate professor Lindsay Neuberger.

"One of our primary goals is to get essential COVID-19 data into the hands of policymakers to try to help guide not only policy but also the effective communication of those policies to increase public health," says Neuberger, who is with UCF's Nicholson School of Communication and Media.

"These data provide valuable insights into public perceptions and behavior and demonstrate where messaging should be focused, such as priority populations, and potential pathways for effective communication," she says.

The survey, conducted in August, is the second of a three-part, six-month study that is examining perceptions and behaviors in response to the risk of COVID-19. The first survey was completed in late May and early June.

Respondents from the first survey were surveyed again for the next round to track any changes.

More than 2,000 people responded to the second survey. The respondents were from Colorado, Idaho, Louisiana, Massachusetts, Michigan and Washington.

The researchers selected the states to capture variation in U.S. demographic and social factors.

Mask-Wearing Practices

The researchers found that since late May and early June, indoor mask wearing in public places has increased among the respondents from 66% in the first survey to 79% in the second survey.

Although the reasons for this can't be determined from the survey, Neuberger says the increase may be the result of more evidence that supports mask wearing and an increase in mask policies at state, local and business levels.

Survey respondents least likely to wear a mask in indoor, public spaces were conservative men with a high school education or less than a four-year degree.

Neuberger says one way to reach people not wearing masks may be to focus on efficacy in risk communication messaging.

"Our data suggest one of the strongest predictors of mask wearing is actually efficacy - so the beliefs that one is both able to wear a mask and the belief that a mask can be effective in avoiding a risk," she says. "I have not seen many efficacy-boosting messages, and I think that could be a strong approach for future messaging."

Chances of Getting COVID-19

The respondents, on average, perceived a 30% chance of contracting the virus in the next three months, a 2-percentage-point increase from survey one.

They perceived a 36% chance of getting seriously ill from COVID-19 and a 23% chance of dying from it, up 2 and 1 percentage points, respectively, from survey one.

The researchers say that although individual risk is difficult to calculate because of differences in people's choices to social distance, wear protection and their pre-existing conditions, this is an overestimation of COVID-19 risks.

Current data from Johns Hopkins University suggest a survival rate of about 97% for COVID-19. However, the U.S. Centers for Disease Control and Prevention states that people of any age with certain underlying medical conditions are at increased risk for severe illness from COVID-19.

Will People Get Vaccines?

People intending to get a vaccine decreased from 54% to 46% between the first and second surveys.

The main reasons respondents did not intend to get a vaccine included concerns about vaccine safety, vaccine effectiveness, and potentially high cost.

Information Seeking

Respondents stated that they received most of their COVID-19 information from television if using traditional media and Facebook if using social media.

Survey Administration

Qualtrics, a U.S.-based survey company, conducted the surveys through its online panels in which people sign up to take surveys for a fee.

Qualtrics implemented quotas to recruit a sample for each state that is roughly representative of the state's age, race and ethnicity, and income demographics based on U.S. Census data.

Due to the attrition of about 900 respondents from the first survey of more than 3,000 individuals, there was a higher proportion of young, white, female respondents relative to census demographics for each state.

What's Next?

The group will distribute the third survey in October, and the subsequent report will be available at the group's website, where the survey one report can also be found.

Credit: 
University of Central Florida

Ultra-low-cost hearing aid could address age-related hearing loss worldwide

image: Georgia Tech assistant professor M. Saad Bhamla assembles a prototype LoCHAid, an ultra-low-cost hearing aid built with a 3D-printed case and components that cost less than a dollar.

Image: 
Craig Bromley

Using a device that could be built with a dollar's worth of open-source parts and a 3D-printed case, researchers want to help the hundreds of millions of older people worldwide who can't afford existing hearing aids to address their age-related hearing loss.

The ultra-low-cost proof-of-concept device known as LoCHAid is designed to be easily manufactured and repaired in locations where conventional hearing aids are priced beyond the reach of most citizens. The minimalist device is expected to meet most of the World Health Organization's targets for hearing aids aimed at mild-to-moderate age-related hearing loss. The prototypes built so far look like wearable music players rather than traditional behind-the-ear hearing aids.

"The challenge we set for ourselves was to build a minimalist hearing aid, determine how good it would be and ask how useful it would be to the millions of people who could use it," said M. Saad Bhamla, an assistant professor in the School of Chemical and Biomolecular Engineering at the Georgia Institute of Technology. "The need is obvious because conventional hearing aids cost a lot and only a fraction of those who need them have access."

Details of the project are described September 23 in the journal PLOS ONE.

Age-related hearing loss affects more than 200 million adults over the age of 65 worldwide. Hearing aid adoption remains relatively low, particularly in low- and middle-income countries, where fewer than 3 percent of adults use the devices - compared to 20 percent in wealthier countries. Cost is a significant limitation, with the average hearing aid pair costing $4,700 in the United States and even low-cost personal sound amplification devices - which don't meet the criteria for sale as hearing aids - priced at hundreds of dollars globally.

Part of the reason for high cost is that effective hearing aids provide far more than just sound amplification. Hearing loss tends to occur unevenly at different frequencies, so boosting all sound can actually make speech comprehension more difficult. Because decoding speech is so complicated for the human brain, the device must also avoid distorting the sound or adding noise that could hamper the user's ability to understand.

Bhamla and his team chose to focus on age-related hearing loss because older adults tend to lose hearing at higher frequencies. Focusing on a large group with similar hearing losses simplified the design by narrowing the range of sound frequency amplification needed.

Modern hearing aids use digital signal processors to adjust sound, but these components were too expensive and power hungry for the team's goal. The team therefore decided to build their device using electronic filters to shape the frequency response, a less expensive approach that was standard on hearing aids before the processors became widely available.

"Taking a standard such as linear gain response and shaping it using filters dramatically reduces the cost and the effort required for programming," said Soham Sinha, the paper's first author, who was born in semi-rural India and is a long-term user of hearing aid technology.

"I was born with hearing loss and didn't get hearing aids until I was in high school," said Sinha, who worked on the project while a Georgia Tech undergraduate and is now a Ph.D. student at Stanford University. "This project represented for me an opportunity to learn what I could do to help others who may be in the same situation as me but not have the resources to obtain hearing aids."

The ability to hear makes a critical quality-of-life difference, especially to older people who may have less access to social relationships, said Vinaya Manchaiah, professor of speech and hearing sciences at Lamar University and another member of the research team. "Hearing has a direct impact on how we feel and how we behave," he said. "For older adults, losing the ability to hear can result in a quicker and larger cognitive decline."

The inexpensive hearing aid developed by Bhamla's team can obviously not do everything that the more expensive devices can do, an issue Manchaiah compares to "purchasing a basic car versus a luxury car. If you ask most users, a basic car is all you need to be able to get from Point A to Point B. But in the hearing aid world, not many companies make basic cars."

For Manchaiah, the issue is whether the prototype device provides sufficient value for the cost. The researchers have extensively studied the electroacoustic performance of their device, but the real test will come in clinical and user trials that will be necessary before it can be certified as a medical device.

"When we talk about hearing aids, even the lowest of technology is quite high in price for people in many parts of the world," he said. "We may not need to have the best technology or the best device in order to provide value and a good experience in hearing."

The electronic components of the LoCHAid cost less than a dollar if purchased in bulk, but that doesn't include assembly or distribution costs. Its relatively large size allows for low-tech assembly and even do-it-yourself production and repair. The prototype uses a 3D-printed case and is powered by common AA or lithium ion coin-cell batteries designed to keep costs as low as possible. With its focus on older adults, the device could be sold online or over-the-counter, Bhamla said.

"We have shown that it is possible to build a hearing aid for less than the price of a cup of coffee," he said. "This is a first step, a platform technology, and we've shown that low cost doesn't have to mean low quality."

Among the device's drawbacks are its large size, an inability to adjust frequency ranges, and an expected lifetime of just a year and a half. The cost of batteries is often a hidden burden for hearing aid users; the LoCHAid's AA batteries are expected to last up to three weeks, an improvement over the 4-5 day life expectancy of the common zinc-air batteries in current hearing aids.

The researchers are now working on a smaller version of the device that will boost the bulk component cost to seven dollars and require a sophisticated manufacturer to assemble. "We'll no longer be able to solder them ourselves in the lab," said Bhamla, whose research focuses on frugal science. "This is a labor of love for us, so we will miss that."

Credit: 
Georgia Institute of Technology

Routine blood test predicts increased mortality risk in patients with COVID-19

BOSTON - A standard test that assesses blood cells can identify which patients who are admitted to the hospital with COVID-19 face a high risk of becoming critically ill and dying. This discovery, which is described in JAMA Network Open, was made by a team of investigators at Massachusetts General Hospital (MGH) based in the MGH Center for Systems Biology.

"We wanted to help find ways to identify high-risk COVID patients as early and as easily as possible--who is likely to become severely ill and may benefit from aggressive interventions, and which hospitalized patients are likely to get worse most quickly," said senior author John M. Higgins, MD, an investigator in the Department of Pathology at MGH and an associate professor of Systems Biology at Harvard Medical School (HMS).

Higgins noted that early reports from China indicated that the body's inflammatory response was extremely intense in some patients and very mild in others. His own group's previous work revealed that certain changes in the numbers and types of blood cells during inflammation are associated with poor health outcomes in patients with diseases such as heart disease, cancer, and diabetes. "We quickly re-focused our computational infrastructure towards analysis of the COVID-19 patient cohort that was growing rapidly in the Boston area last spring," explained first author Brody Foy, DPhil, a research fellow in Systems Biology at MGH and HMS.

Their analysis included all adults diagnosed with SARS-CoV-2 infection and admitted to one of four hospitals in the Boston area between March 4 and April 28, 2020. Before looking for complicated changes in circulating blood cells in the 1,641 patients included in the study, the scientists first searched for patterns using currently available blood tests that are routinely performed. "We were surprised to find that one standard test that quantifies the variation in size of red blood cells--called red cell distribution width, or RDW--was highly correlated with patient mortality, and the correlation persisted when controlling for other identified risk factors like patient age, some other lab tests, and some pre-existing illnesses," said co-author Jonathan Carlson, MD, PhD.

Patients who had RDW values above the normal range when they were admitted to the hospital had a 2.7 times higher risk of dying, with a mortality rate of 31 percent compared with 11 percent in patients with normal RDW values. Also, a subsequent increase in RDW after admission was associated with an even higher risk of dying, indicating that RDW could be tracked during hospitalization to help determine whether patients are responding to treatment or getting worse.
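As a rough check on the figures above, the unadjusted risk ratio can be recovered directly from the two quoted mortality rates. A minimal sketch (the paper's 2.7 figure presumably reflects adjustment for other risk factors, so the raw ratio differs slightly):

```python
# Unadjusted relative risk of death: elevated vs normal RDW at admission.
# Rates are those quoted in this release; this is illustrative arithmetic,
# not the study's adjusted analysis.

mortality_high_rdw = 0.31    # mortality, elevated RDW on admission
mortality_normal_rdw = 0.11  # mortality, normal RDW on admission

relative_risk = mortality_high_rdw / mortality_normal_rdw
print(f"unadjusted relative risk ≈ {relative_risk:.1f}")
# → unadjusted relative risk ≈ 2.8
```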

The investigators are currently seeking to uncover the mechanisms that cause RDW elevations in severe COVID-19 cases. "Such discoveries could point to new treatment strategies or identify better markers of disease severity," noted co-author Aaron Aguirre, MD, PhD, an MGH Cardiologist and Critical Care Physician.

Credit: 
Massachusetts General Hospital

Risk factors for hospitalization, mechanical ventilation or death among patients with SARS-CoV-2

What The Study Did: This observational study used data from the Department of Veterans Affairs (VA) health care system to examine what risk factors are associated with hospitalization, mechanical ventilation and death among patients with SARS-CoV-2 infection.

Authors: George N. Ioannou, B.M.B.Ch., M.S., of the Veterans Affairs Puget Sound Healthcare System and the University of Washington in Seattle, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2020.22310)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

#  #  #

Media advisory: The full study is linked to this news release.

Embed this link to provide your readers free access to the full-text article. This link will be live at the embargo time: http://jamanetwork.com/journals/jamanetworkopen/fullarticle/10.1001/jamanetworkopen.2020.22310?utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_term=092320

About JAMA Network Open: JAMA Network Open is the new online-only open access general medical journal from the JAMA Network. On weekdays, the journal publishes peer-reviewed clinical research and commentary in more than 40 medical and health subject areas. Every article is free online from the day of publication.

Credit: 
JAMA Network

Early treatment for leg ulcers leads to better outcomes for patients

Venous leg ulcers are common and distressing, affecting around 1 in 300 adults in the UK. They are open, often painful, sores on the leg that take months to heal and can develop after a minor injury. People with enlarged veins known as varicose veins are at high risk of developing venous leg ulcers, as they have persistently high pressure in the veins leading to skin damage.

In a clinical trial, led by researchers at Imperial College London and clinicians at Imperial College Healthcare NHS Trust, 450 patients with venous leg ulcers were treated with early surgical interventions. This resulted in faster healing and a reduced risk of the condition coming back compared with current methods of treating patients with compression stockings and delayed surgical interventions.

The researchers behind the study, published in JAMA Surgery, suggest that current guidelines on treating leg ulcers should be revised to include early assessment of varicose veins and surgical treatment of leg ulcers to deliver clinical benefits and cost savings for the NHS. They suggest that this early treatment intervention could save the NHS an estimated £100 million per year.

Lead author of the study Professor Alun Davies, Professor of Vascular Surgery at Imperial College London and a Consultant Surgeon at Imperial College Healthcare NHS Trust, said:

"Venous leg ulcers cause enormous physical and mental distress to patients as well as having a financial impact on the NHS. Our study is the first to show that early surgical treatment of leg ulcers leads to faster healing and the reduced risk of the ulcer coming back compared to current methods.

The NHS spends around 2 per cent of its budget on managing lower limb wounds and there is an urgent need to find more effective treatments. We believe that the current guidelines should be changed so that patients with leg ulcers are treated with surgery at an earlier stage. This approach will lead to better outcomes and improve patients' quality of life."

The main treatment for leg ulcers is compression bandages or stockings, to improve the vein function in the legs. There are also surgical treatments such as endovenous ablation - a 'keyhole' treatment to close varicose veins. The treatment, under local or general anaesthetic, involves a small fibre passed through a catheter and positioned at the top of the varicose vein. The fibre delivers bursts of energy that heat up the vein and seal it closed. However, under current guidelines this treatment is not usually offered until the ulcer has been present for many months, if at all. Furthermore, if the underlying cause of the ulcer is not treated there's a high risk of the ulcer coming back after treatment.

The researchers wanted to see whether performing endovenous ablation to treat varicose veins at an earlier stage can lead to faster healing and reduce the risk of venous leg ulcers returning, requiring further treatment.

Researchers recruited 450 patients with venous leg ulcers from October 2013 to September 2016. All patients had leg ulcers that were less than six months old and were treated at 20 hospitals in the UK, including Imperial College Healthcare NHS Trust hospitals. Two hundred and twenty-four patients were randomly assigned to receive endovenous ablation within two weeks of randomisation, followed by wearing compression stockings. The rest of the patients were given compression stockings, but the endovenous ablation treatment was delayed by six months or until the ulcer was healed. The researchers then followed participants over a period of five years to compare how quickly the ulcers healed and the rate of leg ulcer recurrence after treatment.

Of the 426 participants whose leg ulcer had healed, 121 experienced at least one recurrence during follow-up. In the early-intervention group, 56 patients experienced recurrence during follow-up. In comparison, 65 participants in the delayed-intervention group experienced recurrence during follow-up. The rate of recurrent ulcers was 60 per cent higher in the deferred-intervention group (0.16 per year of follow-up, compared to 0.1 per year in the early-intervention group). They also found that healing time was shorter in the early-intervention group than in the deferred-intervention group.
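The recurrence figures above are internally consistent; a quick sketch using only the numbers quoted in this release:

```python
# Recurrence comparison from the trial's reported figures (illustrative check).
recur_early = 56     # recurrences, early-intervention group
recur_deferred = 65  # recurrences, deferred-intervention group
total = recur_early + recur_deferred  # matches the 121 recurrences reported

rate_early = 0.10     # recurrences per year of follow-up, early group
rate_deferred = 0.16  # recurrences per year of follow-up, deferred group
excess = rate_deferred / rate_early - 1
print(f"deferred group recurrence rate {excess:.0%} higher")
# → deferred group recurrence rate 60% higher
```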

The team compared the cost of early surgical intervention with delayed intervention over three years and found that early intervention was, on average, the less costly strategy over three years.

Credit: 
Imperial College London

Mathematics: Modelling the timings of a COVID-19 second wave in Europe

How a second wave of COVID-19 infections may evolve across Europe over the next few months, using data on infection rates and travel within and between European countries, is modelled in a Scientific Reports paper. The findings suggest that a second wave in Europe will occur between July 2020 and January 2021 and that the precise timing of peaks in infection rates for each country could be controlled via social distancing, control of local hotspots and border control measures.

Using data from the first wave, but allowing for a variation of 15% in infection rates, Giacomo Cacciapaglia and colleagues showed that the timing of second wave peaks is strongly dependent on infection rates, with sooner peaks expected for countries with higher rates of infection. Social distancing measures and responsible individual behaviour, if implemented early on, can have a strong effect on when peaks occur. Taking into account the current situation in Europe where ten countries - Belgium, Bosnia, Croatia, Czechia, Greece, the Netherlands, Serbia, Slovakia, Slovenia and Spain - showed the beginnings of a second wave in early August, the authors modelled the temporal dynamics of a second wave for all countries within Europe and created a video simulation of when a second wave is likely to peak in each country.

The results show that peaks are likely to occur between July 2020 and January 2021, but the precise timing for each country could potentially be controlled via border control and social distancing measures, as well as control of local hotspots. The model, which can be easily adjusted as new data becomes available, may be a useful tool for governments, financial markets, industry and individual citizens to prepare in advance and possibly counter the threat of recurring pandemic waves, the authors suggest.

Credit: 
Scientific Reports

Study identifies weight-loss threshold for heart health in patients with obesity, diabetes

video: A Cleveland Clinic study shows that 5 to 10 percent of surgically induced weight loss is associated with improved life expectancy and cardiovascular health.

Image: 
Cleveland Clinic

CLEVELAND: A Cleveland Clinic study shows that 5 to 10 percent of surgically induced weight loss is associated with improved life expectancy and cardiovascular health. In comparison, about 20 percent weight loss is necessary to observe similar benefits with a non-surgical treatment. The findings also show that metabolic surgery may contribute health benefits that are independent of weight loss. The study is published in the October issue of Annals of Surgery.

This large observational study looked at 7,201 Cleveland Clinic patients: 1,223 patients with obesity and type 2 diabetes who underwent metabolic surgery (bariatric or weight loss surgery) were matched to 5,978 patients who received usual medical care. About 80 percent of the patients had hypertension, 74 percent had dyslipidemia (elevated triglycerides and cholesterol), and 31 percent were taking insulin to treat their diabetes.

Using different statistical models, the effects of weight loss were studied to identify the minimum weight loss needed to decrease the risk of death and of experiencing major adverse cardiovascular events, such as coronary artery events, cerebrovascular events, heart failure, kidney disease, and atrial fibrillation.

"Following metabolic surgery, the risk of death and major heart complications appears to decrease after about 5 percent and 10 percent weight loss, respectively. Whereas, in the nonsurgical group, both the risk of death and major cardiovascular complications decreased after losing approximately 20 percent of body weight," said Ali Aminian, M.D., director of Cleveland Clinic's Bariatric & Metabolic Institute, and lead author of the study.

"This study suggests greater heart disease benefits are achieved with less weight loss following metabolic surgery than medical weight loss using lifestyle interventions. The study findings suggest that there are important benefits of metabolic surgery independent of the weight loss achieved," said Steven Nissen, M.D., Chief Academic Officer of the Heart, Vascular & Thoracic Institute at Cleveland Clinic, and the study's senior author.

The groundbreaking STAMPEDE study showed metabolic surgery's beneficial effects on blood glucose control. Since then, additional studies have observed health benefits other than weight loss following metabolic surgery. In fact, this research is a secondary analysis of a large study that showed weight-loss surgery is associated with a 40 percent reduction in risk of death and heart complications in patients with type 2 diabetes and obesity.

Researchers continue to study the physiological changes in the surgically modified gastrointestinal tract, the impact on hormone secretion and the microbiome. Those beneficial changes may contribute to the cardiovascular and survival benefits of metabolic surgery, independent of weight loss. More research is needed to better understand the underlying mechanisms for the health benefits of metabolic surgery in patients who have obesity and type 2 diabetes.

Credit: 
Cleveland Clinic

HIV drugs could prevent diabetes, study suggests

image: Jayakrishna Ambati, MD, and colleagues found that patients taking drugs called NRTIs to treat HIV and hepatitis B had a 33% lower risk of developing diabetes.

Image: 
UVA Communications

A group of drugs used to treat HIV and hepatitis B could be repurposed to prevent type 2 diabetes, a new study suggests.

Researchers found that patients taking the drugs had a 33% lower risk of developing diabetes. The scientists say that the risk reduction makes sense based on how the drugs are known to work, and noted that one of the drugs, lamivudine, improved insulin sensitivity significantly in human cell samples and in a mouse model of diabetes. (In type 2 diabetes, the body loses the ability to use insulin, a hormone, to control blood sugar effectively.)

"The fact that the protective effect against the development of diabetes was replicated in multiple databases in studies from multiple institutions enhances confidence in the results," said researcher Jayakrishna Ambati, MD, of the University of Virginia School of Medicine. "We are grateful to the UVA Strategic Investment Fund for enabling us to demonstrate the power of Big Data Archeology to rapidly identify existing, approved drugs to repurpose for diseases that have an enormous impact both globally and in Virginia."

The Diabetes Pandemic

Nearly 500 million people worldwide have diabetes - primarily type 2 diabetes - and that number is expected to soar in the coming years. This carries a tremendous health burden, as diabetes is associated with many chronic medical conditions, including heart disease, atherosclerosis (hardening of the arteries), nerve damage, vision loss and impaired wound healing.

The urgency of the situation has scientists desperately hunting for better ways to prevent and manage diabetes. To determine if drugs known as nucleoside reverse-transcriptase inhibitors (NRTIs) might help, Ambati and colleagues from multiple institutions analyzed five databases encompassing a diverse group of 128,861 patients with HIV-1 or hepatitis B. The primary database was that of the Veterans Health Administration, the largest integrated health system in the United States, and covered the years 2000 to 2017.

The scientists found that patients taking NRTIs were more than 30% less likely to develop diabetes. Based on their analysis, the researchers predict there is a 95% chance that the drugs would reduce diabetes risk by 29% in a clinical trial.

To better understand the findings, the researchers examined the effect of lamivudine and two other drugs from the class in human cell samples. All three proved beneficial, prompting the scientists to conclude that the class as a whole is likely helpful in preventing diabetes. (Notably, the research identified a connection between diabetes and dysregulation of the inflammasome, previously linked to both Alzheimer's disease and macular degeneration.)

"The large scale of these clinical data and the size of the protective effect provide evidence that inflammasome inhibition in humans is beneficial," Ambati said. "We are hopeful that prospective clinical trials will establish that inflammasome inhibitors known as Kamuvudines, which are less-toxic derivatives of NRTIs, will be effective not only in diabetes but also in macular degeneration and Alzheimer's disease."

Credit: 
University of Virginia Health System

Flood risks: More accurate data due to COVID-19

image: The parking lot at a supermarket in Boston where the measurements were taken.

Image: 
© MassDot/NGS/CORS

The emerging use of Global Navigation Satellite Systems (GNSS) makes it possible to continuously measure shallow changes in the elevation of the Earth's surface. A study by the University of Bonn now shows that the quality of these measurements may have improved significantly during the pandemic, at least at some stations. The results show which factors should be considered in the future when installing GPS antennas. More precise geodetic data are important for assessing flood risks and for improving earthquake early warning systems. The journal Geophysical Research Letters now reports on this.

A number of countries went into a politically decreed hibernation at the onset of the Covid-19 pandemic. Many of those affected by the lockdowns suffered negative economic and social consequences. Geodesy, the branch of Earth science that studies Earth's shape and gravity field, has, on the other hand, benefited from the drastic reduction in human activity. At least that is what the study now published in Geophysical Research Letters shows. The study, carried out by geodesists from the University of Bonn, investigated the site of a precise GNSS antenna in Boston (Massachusetts) as an example.

GNSS receivers can determine their positions to an accuracy of a few millimetres. They do this using the US GPS satellites and their Russian counterparts, GLONASS. For some years now, it has also been possible to measure the distance between the antenna and the ground surface using a new method. "This has recently allowed our research group to measure elevation changes in the uppermost soil layers without installing additional equipment," explains Dr. Makan Karegar from the Institute of Geodesy and Geoinformation at the University of Bonn. Researchers can, for instance, measure the wave-like propagation of an earthquake or the rise and fall of a coastal area.

The measuring method exploits the fact that the antenna does not only pick up the direct satellite signal. Part of the signal is reflected by the nearby environment and objects, and so travels a longer path and reaches the GNSS antenna with a delay. When superimposed on the directly received signal, it forms interference patterns. These patterns can be used to calculate the distance between the antenna and the ground surface, which can change over time. To calculate the risk of flooding in low-elevation coastal areas, it is important to know this change - and thus the subsidence of the Earth's surface - precisely.
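The interference-based height measurement can be sketched numerically. The following is a hypothetical illustration of the general idea (often called GNSS interferometric reflectometry), not the Bonn group's actual processing: the reflected signal makes the signal-to-noise ratio oscillate as a function of sin(elevation) with a frequency proportional to 2h/λ, where h is the antenna height above the reflecting surface, so h can be recovered from the dominant oscillation frequency. All numbers are illustrative.

```python
import math
import random

# GPS L1 carrier wavelength in metres.
L1_WAVELENGTH = 0.1903

def simulate_snr(h, elevations_deg, noise=0.3, seed=42):
    """Detrended SNR oscillation for a reflecting surface h metres
    below the antenna, plus Gaussian measurement noise."""
    rng = random.Random(seed)
    return [math.cos(4 * math.pi * h / L1_WAVELENGTH
                     * math.sin(math.radians(e)))
            + rng.gauss(0.0, noise)
            for e in elevations_deg]

def estimate_height(elevations_deg, snr, h_min=0.5, h_max=5.0, step=0.01):
    """Brute-force periodogram over trial heights: the trial height
    whose oscillation frequency best matches the data wins."""
    xs = [math.sin(math.radians(e)) for e in elevations_deg]
    best_h, best_power = h_min, -1.0
    h = h_min
    while h <= h_max:
        omega = 4 * math.pi * h / L1_WAVELENGTH
        re = sum(s * math.cos(omega * x) for s, x in zip(snr, xs))
        im = sum(s * math.sin(omega * x) for s, x in zip(snr, xs))
        power = re * re + im * im
        if power > best_power:
            best_h, best_power = h, power
        h += step
    return best_h

elev = [5 + 20 * k / 399 for k in range(400)]  # low-elevation arc, degrees
snr = simulate_snr(h=2.0, elevations_deg=elev)
print(round(estimate_height(elev, snr), 2))    # close to the true 2.0 m
```

Multipath from parked cars, in this picture, corrupts the clean oscillation with extra reflections, which is why the spectral peak - and hence the height estimate - becomes noisier.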

This method works well if the surrounding ground is flat, like the surface of a mirror. "But many GNSS receivers are mounted on buildings in cities or in industrial zones," explains Prof. Dr. Jürgen Kusche. "And they are often surrounded by large parking lots - as is the case with the antenna we investigated in Boston."

Cars cause disturbance

In their analysis, the researchers were able to show that parked cars significantly reduce the quality of the elevation data: Parked vehicles scatter the satellite signal and cause it to be reflected several times before it reaches the antenna, like a cracked mirror. This not only reduces the signal intensity, but also the information that can be extracted from it: It's "noisy." In addition, because the "pattern" of parked cars changes from day to day, these data cannot be easily corrected.

"Before the pandemic, measurements of antenna height had an average accuracy of about four centimeters due to the higher level of noise," says Karegar. "During the lockdown, however, there were almost no vehicles parked in the vicinity of the antenna; this improved the accuracy to about two centimeters." A decisive improvement: The more reliable the values, the smaller the elevation fluctuations that can be detected in the upper soil layers.

In the past, GNSS stations were preferably installed in sparsely populated regions, but this has changed in recent years. "Precise GNSS sensors are often installed in urban areas to support positioning services for engineering and surveying applications, and eventually for scientific applications such as deformation studies and natural hazards assessment," says Karegar. "Our study recommends that we should try to avoid installing GNSS sensors next to parking lots."

Credit: 
University of Bonn

Persons with Parkinson's disease can have a brighter future

image: Schematic illustration of the two parallel tracks (a fundamental one and an applied one) that need to be followed simultaneously to create a better future for persons living with Parkinson's disease.

Image: 
Dr. Bastiaan R. Bloem and Dr. Patrik Brundin

Amsterdam, NL, September 23, 2020 - Well over six million people globally have been diagnosed with Parkinson's disease, which has an enormous impact on the lives of patients and their families and incurs mounting costs for society. In this special supplement to the Journal of Parkinson's Disease, experts review common and vexing issues affecting people with Parkinson's disease as well as emerging concerns such as the importance of personalized care management.

Two of the world's leading experts in Parkinson's disease and Journal of Parkinson's Disease Editors-in-Chief guided the development of this landmark supplement: Bastiaan R. Bloem, MD, PhD, Department of Neurology, Radboud University Nijmegen Medical Center, Donders Institute for Brain, Cognition and Behavior, Center of Expertise for Parkinson & Movement Disorders, Nijmegen, The Netherlands; and Patrik Brundin, MD, PhD, Center for Neurodegenerative Science, Van Andel Research Institute, Grand Rapids, MI, USA.

Dr. Bloem and Dr. Brundin define the way forward as two parallel tracks. Efforts that are part of the first track aim to further unravel the etiology and pathophysiology underlying Parkinson's disease, as a basis for development of new therapies that will slow down or even arrest disease progression.

However, they note that, "Encouraging as these developments may be, we realize that this road is paved with tremendous challenges." They stress the importance of also investing in the second track, which aims to develop better care for the many patients who currently experience the impact of Parkinson's disease.

This supplement focuses entirely on care management issues. Pragmatic in scope, the contributions offer practical advice to physicians that they can apply in their own clinical settings.

"There are exciting developments in fundamental research that will ultimately lead to novel treatments that may slow down the progression of Parkinson's, but while we are waiting for the arrival of such treatments, this special supplement highlights a range of very important management strategies that can help to improve the quality of life of patients living with Parkinson's disease today," explained Dr. Brundin.

"Although Parkinson's disease is a complex and debilitating condition, it is also a treatable one if a comprehensive, integrated and multidisciplinary approach is applied," noted Dr. Bloem. "Of course, modern management means much more than just caring for our patients; we increasingly realize that optimal care can only be achieved by working closely together with patients and their families, in a process of co-creation and participatory care, with obviously different but equally important contributions by both professionals and patients."

To underline the importance of patients, carers and their physicians working together, the supplement begins with a contribution by two patient advocates, John Andrejack and Soania Mathur, to reflect the "voice of the customer" as the ultimate stakeholder to optimize the care for and with patients with Parkinson's disease. The authors stress the importance of education for themselves, their families and healthcare professionals on the disease, its symptoms, how to recognize them, new treatment options and coping strategies, and available lifestyle changes.

The prevalence of Parkinson's disease rises sharply with age, but it also affects a significant number of young persons (under the age of 50 years) who have specific issues that merit the attention of the multidisciplinary care team. Examples include employment, sexuality, and children. Bart Post, MD, PhD, Department of Neurology, Radboud University Medical Center, Donders Institute for Brain, Cognition and Behavior, Center of Expertise for Parkinson and Movement Disorders, Nijmegen, The Netherlands, and colleagues describe the distinction between young-onset Parkinson's disease (YOPD) and late-onset disease. There are both genetic differences (genetic forms are more common in people with YOPD) and clinical differences. Dystonia (a movement disorder in which a person's muscles contract uncontrollably) and levodopa-induced dyskinesias (movement disorders that are characterized by involuntary muscle movements) are more common in YOPD.

Moreover, people with YOPD tend to have different family and societal engagements compared to individuals with late-onset Parkinson's disease. "These unique features have implications for clinical management and call for a tailored multidisciplinary approach involving shared-decision making," explained Dr. Post. "Genetic testing can be considered in YOPD but should be done at centers that have proven experience in the clinical aspects, counseling dilemmas, and genetic pitfalls of testing the genes associated with Parkinson's disease." Evaluation of mood is of particular importance, and the authors recommend the use of a mindmap to start exploring the needs of persons with YOPD.

Other topics covered in this supplement are:

The benefits of exercise

Management of pain

Management of visual dysfunction

Sleep disorders

Management of orthostatic hypotension

Choice between different advanced treatments and timing

Challenges in the management of late-stage Parkinson's disease

Multimorbidity and frailty

"Making sure that healthcare professionals are optimally trained requires continuous attention, and we hope that this special supplement will contribute to the lifelong learning process of professionals who have committed themselves to helping persons with Parkinson's disease," commented Dr. Bloem and Dr. Brundin. "At the same time, optimal care also requires active involvement of patients, and this supplement also highlights the key wishes and needs of individuals living with Parkinson's disease."

Parkinson's disease is a slowly progressive disorder that affects movement, muscle control and balance. It is the second most common age-related neurodegenerative disorder affecting about 3% of the population by the age of 65 and up to 5% of individuals over 85 years of age.

Credit: 
IOS Press

Decreased protein degradation in cerebellum leads to motor dysfunction

image: Motor function was evaluated in control mice (⚪) and mice with reduced CMA activity (⚫) by a beam walking test. A higher fault rate and shorter walking distances indicate lower motor function.

Image: 
Associate Professor Takahiro Seki

A research team from Kumamoto University, Japan has developed an animal model that reproduces motor dysfunction and cerebellar neurodegeneration similar to that in spinocerebellar ataxia (SCA) by inhibiting chaperone-mediated autophagy (CMA) in cerebellar neurons. Since CMA activity is reduced in cells expressing SCA-causing proteins, CMA is expected to become a new therapeutic target for SCA, a disease that currently has no fundamental treatment.

SCA is an inherited, intractable neurological disorder caused by any of several genes. It causes atrophy and neurodegeneration of the cerebellum and is characterized by progressive motor dysfunction such as staggering and dysarthria (a disorder in which brain damage causes a loss of control of the muscles used in speech). It is classified into 48 different types according to the causative gene, and the common mechanisms that lead to cerebellar atrophy and ataxia in SCA are not yet understood.

The cells of living organisms have a mechanism called autophagy that maintains homeostasis by breaking down intracellular components, such as unnecessary proteins, and eliminating pathogenic microorganisms. Chaperone-mediated autophagy (CMA), a type of autophagy, transports intracellular components to lysosomes for degradation using Hsc70, a molecular chaperone, and LAMP2A, a lysosomal membrane protein. Recent research has shown that CMA is involved in the maintenance of neuronal protein homeostasis and that reduced CMA activity is involved in the pathogenesis of Parkinson's disease. The relationship of CMA with neurodegenerative diseases like Parkinson's has attracted much attention.

Kumamoto University researchers developed an original method for assessing CMA activity in cells and investigated its relationship with the pathogenesis of neurodegenerative diseases. After finding that reduced CMA activity was observed in cells expressing several SCA causing proteins, they hypothesized that reduced CMA activity might be a common part of the pathogenesis of SCA.

They generated mice with reduced CMA activity in cerebellar neurons by administering adeno-associated viral vectors. These vectors were able to introduce a gene that specifically diminishes LAMP2A into the mouse cerebellum which resulted in progressive motor dysfunction. At early stages of motor dysfunction, histological analysis showed no effect on the morphology of cerebellar neurons or cerebellar structure. However, glial cells, such as astrocytes and microglia, became activated in the cerebellum. Glial activation is a hallmark of neuroinflammation and causes neurodegeneration in various neurodegenerative disorders. At a considerably worse stage of motor dysfunction, glial cell activation as well as cerebellar neurodegeneration and associated atrophy of the cerebellar cortex became prominent.

These effects (progressive motor deficits and neurodegeneration with associated atrophy of the cerebellar cortex) observed in mice with reduced CMA activity in cerebellar neurons are consistent with observations in patients with SCA. Moreover, early glial cell activation has been observed in various SCA mouse models. Since CMA activity has been found to be reduced by the proteins responsible for SCA, the results of this study strongly suggest that reduced CMA activity in cerebellar neurons is a common molecular mechanism in SCA pathogenesis.

"We have shown that CMA is a novel therapeutic target for the treatment of spinocerebellar ataxia and may lead to the development of a fundamental treatment that has not yet been established," said Associate Professor Takahiro Seki, leader of this study. "For SCA, the presence or absence of a causative gene can be determined by genetic diagnosis. However, even if a causative gene is found, there is no way to prevent the onset of the disease. If a safe compound that activates CMA can be developed, it could be a very effective treatment and preventive agent for SCA."

Credit: 
Kumamoto University

NTU Singapore scientists devise 'Trojan horse' approach to kill cancer cells without using drugs

image: (Left to Right) Members of the NTU research team include Assistant Professor Dalton Tay from the School of Materials Science and Engineering, Research associate Kenny Wu and Associate Professor Tan Nguan Soon from the Lee Kong Chian School of Medicine.

Image: 
NTU Singapore

Cancer cells are killed in lab experiments and tumour growth reduced in mice, using a new approach that turns a nanoparticle into a 'Trojan horse' that causes cancer cells to self-destruct, a research team at the Nanyang Technological University, Singapore (NTU Singapore) has found.

The researchers created their 'Trojan horse' nanoparticle by coating it with a specific amino acid - L-phenylalanine - that cancer cells rely on, along with other similar amino acids, to survive and grow. L-phenylalanine is known as an 'essential' amino acid as it cannot be made by the body and must be absorbed from food, typically from meat and dairy products.

Studies by other research teams have shown that cancer tumour growth can be slowed or prevented by 'starving' cancer cells of amino acids. Scientists believe that depriving cancer cells of amino acids, for example through fasting or through special diets lacking in protein, may be viable ways to treat cancer.

However, such strict dietary regimes would not be suitable for all patients, including those at risk of malnutrition or those with cachexia - a condition arising from chronic illness that causes extreme weight and muscle loss. Furthermore, compliance with the regimes would be very challenging for many patients.

Seeking to exploit the amino acid dependency of cancer cells but avoid the challenges of strict dietary regimes, the NTU researchers devised a novel alternative approach.

They took a silica nanoparticle designated as 'Generally Recognized As Safe' by the US Food and Drug Administration and coated it with L-phenylalanine, and found that in lab tests with mice it killed cancer cells effectively and very specifically, by causing them to self-destruct.

The anti-cancer therapeutic nanoparticle is ultrasmall, with a diameter of 30 nanometres, or approximately 30,000 times smaller than a strand of human hair, and is named "Nanoscopic phenylalanine Porous Amino Acid Mimic", or Nano-pPAAM.

Their findings, published recently in the scientific journal Small, may hold promise for future design of nanotherapies, said the research team.

Assistant Professor Dalton Tay from the School of Materials Science and Engineering, lead author of the study, said: "Against conventional wisdom, our approach involved using the nanomaterial as a drug instead of as a drug carrier. Here, the cancer-selective and killing properties of Nano-pPAAM are intrinsic and do not need to be 'activated' by any external stimuli. The amino acid L-phenylalanine acts as a 'Trojan horse' - a cloak to mask the nanotherapeutic on the inside."

"By removing the drug component, we have effectively simplified the nanomedicine formulation and may overcome the numerous technological hurdles that are hindering the bench-to-bedside translation of drug-based nanomedicine."

Intrinsic anti-cancer therapeutic properties of Nano-pPAAM

As a proof of concept, the scientists tested the efficacy of Nano-pPAAM in the lab and in mice and found that the nanoparticle killed about 80 per cent of breast, skin, and gastric cancer cells, which is comparable to conventional chemotherapeutic drugs like Cisplatin. Tumour growth in mice with human triple negative breast cancer cells was also significantly reduced compared to control models.

Further investigations showed that the amino acid coating of Nano-pPAAM helped the nanoparticle enter the cancer cells through the amino acid transporter LAT1. Once inside the cancer cells, Nano-pPAAM stimulates excessive production of reactive oxygen species (ROS) - a type of reactive molecule in the body - causing cancer cells to self-destruct while remaining harmless to healthy cells.

Co-author Associate Professor Tan Nguan Soon from NTU's Lee Kong Chian School of Medicine said: "With current chemotherapy drug treatment, a common issue faced is that recurrent cancer becomes resistant to the drug. Our strategy does not involve the use of any pharmacological drugs but relies on the nanoparticles' unique properties to release catastrophic levels of reactive oxygen species (ROS) to kill cancer cells."

Providing an independent view, Associate Professor Tan Ern Yu, a breast cancer specialist at Tan Tock Seng Hospital said, "This novel approach could hold much promise for cancer cells that have failed to respond to conventional treatment like chemotherapy. Such cancers often have evolved mechanisms of resistance to the drugs currently in use, rendering them ineffective. However, the cancer cells could potentially still be susceptible to the 'Trojan horse' approach since it acts through a completely different mechanism - one that the cells will not have adapted to."

The scientists are now looking to further refine the design and chemistry of the Nano-pPAAM to make it more precise in targeting specific cancer types and achieve higher therapeutic efficacy.

This includes combining their method with other therapies such as immunotherapy which uses the body's immune system to fight cancer.

Credit: 
Nanyang Technological University

Why some cancers may respond poorly to key drugs discovered

Patients with BRCA1/2 mutations are at higher risk for breast, ovarian and prostate cancers that can be aggressive when they develop - and, in many cases, resistant to lifesaving drugs. Now scientists at The University of Texas at Austin and Ajou University in South Korea have identified a driver of the drug resistance that can make a life or death difference for patients with these cancers.

"A major issue with cancer treatments is the development of resistance," said Kyle Miller, a UT Austin associate professor of molecular biosciences. "When treatments stop working for patients, it's incredibly demoralizing and it's been a huge drive in research to understand these resistance mechanisms."

In a paper published today in the journal Molecular Cell, the researchers describe a protein that may help doctors predict which patients will become resistant to a class of drugs frequently used to treat BRCA1/2-deficient tumors. The finding could help create more effective treatment plans for these patients.

The scientists identified a protein called PCAF that promotes DNA damage in BRCA1/2-mutated cancer cells. Patients with low levels of this protein are likely to have poor outcomes and to develop resistance to a type of drug used to treat BRCA-deficient tumors, called a PARP inhibitor.

"PARP inhibitors are an important breakthrough in treating these aggressive cancers," Miller said. "What we found is that when levels of PCAF are low, it actually protects the cancer cells from this drug. By testing biopsy samples and using PCAF as a molecular marker for PARP inhibitor response, doctors may be able to tell what treatment may work best for a patient."

Fortunately, there is already another class of drugs on the market, called HDAC inhibitors, that can boost the effectiveness of the PCAF protein. HDAC inhibitors and PARP inhibitors have the potential to be prescribed as a combination therapy.

"Previous studies have shown that these two drugs work well together," Miller said. "We believe we've found the reason why."

It is possible to test for PCAF levels in biopsy or tissue samples, Miller said, and in the future, the test could be included on a standard panel for cancer testing.

But unlocking the workings of PCAF doesn't just offer clues to combating cancer. Because this protein is responsible for modifying chromatin - the material that packages roughly 6 feet of DNA into the nucleus of each of our cells - PCAF may also offer important clues about cell replication.

"The focus in my lab is on understanding chromatin and its impact on replicating DNA, protecting DNA and controlling access to DNA," Miller said. "Our goal is to understand how every molecule is interacting inside our cells, as this gives clues to what is going wrong in human diseases."

Credit: 
University of Texas at Austin