A UOC team develops a neural network to identify tiger mosquitoes

Researchers in the Scene Understanding and Artificial Intelligence (SUNAI) research group, of the Universitat Oberta de Catalunya's (UOC) Faculty of Computer Science, Multimedia and Telecommunications, have developed a method that can learn to identify mosquitoes from the large number of images that volunteers took with their mobile phones and uploaded to the Mosquito Alert platform.

Citizen science to investigate and control disease-transmitting mosquitoes

As well as being annoying because of their bites, mosquitoes can be carriers of pathogens, and rising temperatures worldwide are facilitating their spread. This is the case with the tiger mosquito, Aedes albopictus, and other species in Spain and around the world. As these species spread, so does the science dedicated to combating the problems associated with them. This is how Mosquito Alert was set up: a citizen science project coordinated by the Centre for Research on Ecology and Forestry Applications, the Blanes Centre for Advanced Studies and the Universitat Pompeu Fabra, to which UOC researchers have contributed.

This project brings together information collected by volunteer citizens, who use their mobile phones to capture images of mosquitoes and of their breeding sites in public spaces. Along with the photo, the location of the observation and other information needed to identify the species are also collected. These data are then processed by entomologists and other experts to confirm the presence of a potentially disease-carrying species and alert the relevant authorities. In this way, with a simple photo and an app, citizens can help to generate a map of the mosquitoes' distribution all over the world and help to combat them.

"Mosquito Alert is a platform set up in 2014 to monitor and control disease-carrying mosquitoes," says Gereziher Adhane, who worked on the study with Mohammad Mahdi Dehshibi and David Masip. "Identifying the mosquitoes is fundamental, as the diseases they transmit continue to be a major public health issue. The greatest challenge we encountered in identifying the type of mosquito in this study was due to images taken in uncontrolled conditions by citizens," he comments. He explains that an image may not be shot in close-up and may contain additional objects, which could reduce the performance of the proposed method. Even when the images were taken up close, they were not necessarily at an angle that would let entomologists identify the species quickly, or, because the images were of dead mosquitoes, the body patterns were deformed.

"Entomologists and experts can identify mosquitoes in the laboratory by analysing the spectral wave forms of their wing beats, the DNA of larvae and morphological parts of the body," Adhane points out. "This type of analysis depends largely on human expertise, requires the collaboration of professionals, is typically time-consuming, and is not cost-effective given the possible rapid propagation of invasive species. Moreover, this way of studying populations of mosquitoes is not easy to adapt to identify large groups in experiments carried out outside the laboratory or with images obtained in uncontrolled conditions," he adds. This is where neural networks can play a role as a practical solution for controlling the spread of mosquitoes.

Deep neural networks, cutting-edge technology for identifying mosquitoes

Neural networks consist of a complex combination of interconnected neurons. Information is entered at one end of the network and numerous operations are performed on it until a result is obtained. One feature of neural networks is that they can be trained in a supervised, semi-supervised or unsupervised manner to process data, guiding the network towards the type of result being sought. Another important characteristic is their ability to process large amounts of data, such as the images submitted by volunteers participating in the Mosquito Alert project. A neural network can be trained to analyse images, among other data types, and detect small variations that experts may find difficult to perceive.

"Manual inspection to identify the disease-carrying mosquitoes is costly, requires a lot of time and is difficult in settings outside the laboratory. Automated systems to identify mosquitoes could help entomologists to monitor the spread of disease vectors with ease", the UOC researcher emphasizes.

Conventional machine learning algorithms are not efficient enough for the kind of big data available on the Mosquito Alert platform, because the images contain many details and there is a high degree of similarity between the morphological structures of different mosquito species. In the study, however, the UOC researchers showed that deep neural networks can distinguish between morphologically similar species of mosquito using the photographs uploaded to the platform. "The neural network we have developed can perform as well or nearly as well as a human expert and the algorithm is sufficiently powerful to process massive amounts of images," says Adhane.

How does a deep neural network work?

"When a deep neural network receives input data, information patterns are learned through convolution, pooling, and activation layers which ultimately arrive at the output units to perform the classification task," the researcher tells us, describing the complex process hidden behind this model.
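The convolution, pooling and activation steps the researcher describes can be illustrated with a toy example. The sketch below is not the network used in the study; the image, kernel values and layer sizes are all illustrative assumptions, and it shows only a single pass through one convolution, one ReLU activation and one max-pooling layer.

```python
# Minimal sketch of the convolution -> activation -> pooling pipeline
# described above. All values and layer sizes are illustrative assumptions,
# not the architecture used in the study.

def conv2d(image, kernel):
    """Valid 2D convolution (cross-correlation, as in most deep learning libraries)."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            # Slide the kernel over the image and sum element-wise products.
            s = sum(image[i + a][j + b] * kernel[a][b]
                    for a in range(kh) for b in range(kw))
            row.append(s)
        out.append(row)
    return out

def relu(feature_map):
    """Activation layer: keep positive responses, zero out the rest."""
    return [[max(0.0, v) for v in row] for row in feature_map]

def max_pool(feature_map, size=2):
    """Pooling layer: keep the strongest response in each size x size patch."""
    h, w = len(feature_map), len(feature_map[0])
    return [[max(feature_map[i + a][j + b]
                 for a in range(size) for b in range(size))
             for j in range(0, w - size + 1, size)]
            for i in range(0, h - size + 1, size)]

# A toy 4x4 grayscale "image" with a vertical edge, and a 3x3 edge-detecting kernel.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 0, 1],
          [-1, 0, 1],
          [-1, 0, 1]]

features = relu(conv2d(image, kernel))  # 2x2 feature map of edge responses
pooled = max_pool(features, size=2)     # 1x1 summary after 2x2 pooling
print(pooled)  # -> [[3]]
```

In a real classifier, many such learned kernels are stacked in successive layers, and the final pooled features feed into output units that score each mosquito species.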

"For a neural network to learn there has to be some kind of feedback, to reduce the difference between real values and those predicted by the computing operation. The network is trained until the designers determine that its performance is satisfactory. The model we have developed could be used in practical applications with small modifications to work with mobile apps," he explains. Although there is still much development work to do, the researcher concludes that "using this trained network it is possible to make predictions about images of mosquitoes taken using smartphones efficiently and in real time, as has happened with the Mosquito Alert project."
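The feedback loop the researcher describes can be sketched with the simplest possible trainable model: a single weight adjusted by gradient descent so that the gap between predictions and real values shrinks. The data, learning rate and iteration count below are illustrative assumptions, not values from the study.

```python
# Minimal sketch of the training feedback loop described above: the
# prediction error is fed back to nudge the model's weights until
# performance is satisfactory. A single-weight model stands in for a full
# network; data and hyperparameters are illustrative assumptions.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # inputs with true outputs y = 2x

w = 0.0    # the model's single trainable weight
lr = 0.05  # learning rate: how strongly the feedback adjusts the weight

for epoch in range(200):
    grad = 0.0
    for x, y in data:
        pred = w * x                 # forward pass: compute a prediction
        grad += 2 * (pred - y) * x   # feedback: gradient of the squared error
    w -= lr * grad / len(data)       # adjust the weight to reduce the error

print(round(w, 3))  # converges toward the true slope, 2.0
```

A deep network is trained the same way, except the error signal is propagated backwards through millions of weights across all of its layers (backpropagation) rather than through a single one.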

Credit: 
Universitat Oberta de Catalunya (UOC)

People with ADHD and multiple psychiatric diagnoses stop their ADHD treatment more often

A research study from The Lundbeck Foundation Initiative for Integrative Psychiatric Research (iPSYCH) shows that people with ADHD who also have another psychiatric diagnosis are more likely to stop taking their ADHD medicine.

ADHD is one of the most common psychiatric disorders in childhood and is commonly treated with medication. ADHD medicine can be divided into two groups: medicine that has a stimulating effect - also known as stimulants - and non-stimulants, which are often used if a person does not respond well to the other form of medicine.

The medication can be an effective way of reducing symptoms, increasing the individual's ability to focus and reducing hyperactivity and impulsivity.

A research result from iPSYCH now shows that people with ADHD who also have another psychiatric diagnosis have a higher risk of stopping their medication than people who 'only' have an ADHD diagnosis.

Fifty per cent stop taking the medication

According to research, although stimulants work for the majority of people with ADHD, the medications can sometimes cause side effects, and up to fifty per cent of patients stop taking their ADHD medication within two years of beginning treatment.

"We discovered that people who have another psychiatric diagnosis in addition to ADHD, for example tics, anxiety, bipolar disorder or some form of substance abuse, to a greater extent stop taking their stimulant medication or switch to a non-stimulant ADHD medication," says researcher at iPSYCH Isabell Brikell, who led the study.

According to the researcher, this could be due to a higher risk of side effects or a lower effect of stimulants in people with several diagnoses. For example, stimulants can in rare cases lead to tics, which may cause the person to stop taking their stimulant medication or try non-stimulants.

Positive effect on important areas

"It's important to understand why so many stop taking their ADHD medication. Prior research has shown that the treatment can have positive effects on important parameters such as school performance and a lower risk of accidents and injuries for people with ADHD," says Isabell Brikell.

The researchers also found evidence that a higher genetic risk of schizophrenia and bipolar disorder is associated with an increased risk of stopping stimulant medications.

The results have just been published in the scientific journal American Journal of Psychiatry, and the study is the largest of its kind to date.

The researchers collected information from 9,133 people diagnosed with ADHD in Denmark since 1995, along with their prescriptions for ADHD medications. The study attempted to identify the genetic, clinical (age at time of ADHD diagnosis and other diagnoses) and socio-demographic factors, such as the parents' education, income and psychiatric history, which may affect the risk of a person stopping their ADHD medication.

"Our findings confirm previous smaller studies by showing that certain psychiatric comorbidities may have a negative effect on the treatment outcomes for ADHD," says Isabell Brikell, and emphasises that the results are particularly relevant for clinicians working with ADHD, as they make treatment decisions when meeting new patients with ADHD.

"With more knowledge about why and who has an increased risk of interrupting their treatment, we can better equip clinicians to give these people more targeted treatment, monitoring and support," she says.

The researchers are currently in the process of analysing genetic information on more than 20,000 people with ADHD to test whether the results of the study can be confirmed and extended.

Credit: 
Aarhus University

Why we need to talk openly about vaccine side effects

image: "When communication about vaccines is not transparent, it triggers uncertainty and people feel they may be misled," says Michael Bang Petersen, professor of political science at Aarhus BSS, Aarhus University.

Image: 
Aarhus University

Concerns have been raised about the AstraZeneca and Johnson & Johnson vaccines regarding very rare but potentially fatal side effects related to low blood platelet counts and blood clots. Recently, reports also emerged that the Pfizer-BioNTech vaccine may cause a rare yet serious side effect: heart inflammation. Concerns about side effects may trigger vaccine hesitancy, which the WHO considers one of 'Ten threats to global health'. Securing sufficient acceptance of vaccines is a key challenge in defeating the coronavirus pandemic, both now and in the future.

How can health authorities and politicians help ensure public acceptance of vaccines, which - their rare side effects aside - have proven effective in preventing serious Covid-19 disease? The best way to do this is to talk openly about all aspects of the vaccines including potential negative aspects such as side effects.

"How to communicate about the vaccines is a real dilemma. Politicians have a desire to stop the pandemic as quickly as possible, and this may give them an incentive to tone down the negative sides of the vaccines in order to vaccinate as many people as possible," says Michael Bang Petersen, professor of political science at Aarhus BSS, Aarhus University.

"But our research shows that it does not foster support for vaccination when communication about the vaccines is reassuring, but vague. On the contrary, vague communication weakens people's confidence in the health authorities, and feeds conspiracy theories. When communication is not transparent, it triggers uncertainty and people feel they may be misled," says Michael Bang Petersen.

Together with colleagues from Aarhus BSS at Aarhus University, he has studied the effect of different ways of communicating about vaccines. The study included 13,000 participants, half of them Americans and the other half Danes, and the results have just been published in the widely recognized journal Proceedings of the National Academy of Sciences of the United States of America (PNAS).

Vague communication feeds conspiracies

The results of the study show that open communication fosters support for the vaccines if it transparently describes neutral and positive facts about the vaccines. However, the willingness to be vaccinated declines when the communication is open about negative features of the vaccine.

"Transparency about the negative features of a vaccine creates hesitancy. But this hesitancy is reason-based, and accordingly health authorities still have the possibility of communicating with citizens and explaining to them why it may still be advisable to accept the vaccine," says Michael Bang Petersen.

On the other hand, vague or reassuring communication, where negative features of the vaccines are toned down, lowers acceptance of vaccines. The reason is that vague communication creates a sense of hesitancy and uncertainty, and this in turn feeds conspiracy theories and reduces confidence in the health authorities.

Trust is essential

The advantage of open communication - also about the negative features - is that it prevents conspiracy theories from spreading while at the same time boosting trust in the health authorities. According to the researchers, this is key to defeating the coronavirus pandemic.

"Maintaining trust in the health authorities is extremely important because this is the most crucial factor in securing public support for the vaccines. Communicating transparently about vaccines secures the single most important factor for sustaining vaccine acceptance," says Michael Bang Petersen, and he continues:

"Openness ensures long-term trust, and this is crucial if we are to be revaccinated, or in relation to the next major health crisis."

Facts about the research:

The new findings are part of a large-scale data-driven research project entitled HOPE - How Democracies Cope with COVID-19 (https://hope-project.dk/#/). The project is financed by the Carlsberg Foundation and headed by Professor Michael Bang Petersen.

Method: Pre-registered experimental studies with a total of more than 13,000 participants, half of them Danes and the other half Americans.

The research has been published in the article "Transparent communication about negative features of COVID-19 vaccines decreases acceptance but increases trust" in Proceedings of the National Academy of Sciences of the United States of America (PNAS): https://www.pnas.org/content/118/29/e2024597118

Authors: Professor Michael Bang Petersen, Postdoc Alexander Bor, Postdoc Frederik Jørgensen and Research Assistant Marie Fly Lindholt from the Department of Political Science at Aarhus BSS, Aarhus University

Credit: 
Aarhus University

Multimodal analgesia: The new 'standard of care' for pain control after total joint replacement

July 8, 2021 - Until relatively recently, opioids were a mainstay of treatment for pain following total hip or knee replacement. Today, a growing body of evidence supports the use of multimodal analgesia - combinations of different techniques and medications to optimize pain management while reducing the use and risks of opioids, according to a paper in The Journal of Bone & Joint Surgery. The journal is published in the Lippincott portfolio in partnership with Wolters Kluwer.

"Multimodal analgesia has become the standard of care for total joint arthroplasty as it provides superior analgesia with fewer side effects than opioid-only protocols," write Javad Parvizi, MD, FRCS, of Rothman Orthopaedic Institute at Thomas Jefferson University, Philadelphia, and coauthors. They provide an update on multimodal analgesia for patients undergoing total joint arthroplasty (TJA), including the protocol utilized at their institution.

Following TJA, combination techniques improve pain control - and reduce opioid risks

Good pain management is critical to achieving the best possible outcomes in hip or knee replacement. Traditionally, pain following TJA was treated with opioid-based regimens, especially patient-controlled analgesia with intravenous opioids. However, opioids have substantial adverse effects, including confusion, nausea and vomiting, and respiratory depression - in addition to the well-known risks of opioid addiction, abuse, and misuse.

"Multimodal analgesia involves the use of various agents with different mechanisms of action, thus maximizing benefit while minimizing side effects," Dr. Parvizi and colleagues write. Although the exact medications and techniques may vary by hospital, multimodal combinations are now preferred over opioid-based approaches.

Ideally, multimodal analgesia starts before surgery and continues during and after the procedure, including after the patient is discharged from the hospital. Recent studies have shown that multimodal analgesia can improve pain scores and side effects while reducing opioid use, compared to opioid-based regimens.

In their review, Dr. Parvizi and coauthors provide an update on the use of multimodal analgesia for TJA, including:

Analgesic Medications. Familiar drugs with good safety characteristics, such as acetaminophen and nonsteroidal anti-inflammatory drugs, which provide excellent pain control. Other useful medications include certain antiseizure medications, which are typically used for the treatment of nerve pain, and corticosteroids such as dexamethasone, which are highly effective in reducing inflammation. The "weak" opioid drug tramadol is sometimes used, although questions remain about its safety.

Local Anesthetics. Multimodal analgesia may also include various local anesthetic techniques. These include anesthetics given via local infiltration: similar to local anesthesia for dental procedures, these techniques prevent pain by numbing the nerves in a specific area. Local anesthetics can also be used to perform specific types of nerve blocks, especially for knee replacement surgery.

Nondrug Strategies. Nonpharmacologic techniques such as electrotherapy, acupuncture, or cryotherapy, which can be as simple as applying an ice pack, have shown promising results. However, there are questions about the quality of the current studies on these modalities. New surgical techniques, such as those that avoid the use of a tourniquet, may help reduce the need for opioids.

Dr. Parvizi and coauthors look at some emerging treatments for pain control, such as the local delivery of pain medications via implants, a procedure using cold to temporarily block pain transmission by nerves (cryoneurolysis), and the stimulation of peripheral nerves. Cannabis is being explored as a possible multimodal treatment, although studies so far have shown no reduction in pain or opioid use after TJA.

The article includes a table summarizing the Rothman Institute protocol for multimodal analgesia for TJA - from pre-emptive analgesia before surgery through follow-up care after discharge. Dr. Parvizi and colleagues conclude, "[M]ultimodal pain management is essential to guarantee proper perioperative pain control in TJA and optimize surgical outcome and postoperative recovery, while minimizing the use of opioids."

Credit: 
Wolters Kluwer Health

Precision medicine helps identify "at-risk rapid decliners" in early-stage kidney disease

Diabetes is the leading cause of kidney failure in the United States, but identifying type 1 or type 2 diabetes patients at high risk for progressive kidney disease has never been an exact science.

Historically, assessing kidney function meant looking at estimated glomerular filtration rate, a calculation that determines how well blood is filtered by the kidneys, and urine albumin excretion, a urine test to detect the amount of the protein albumin, which is filtered by the kidneys. However, both tests have limited predictive power in early stage diabetes when kidney function is normal.

The therapeutic approach to both type 1 and type 2 diabetic kidney disease also follows a similar strategy, despite having different biological causes.

Now, with more advanced technology available, a multi-institution international research team set out to understand which lipid biomarkers may predict the progression of diabetic kidney disease and how these predictors differ between type 1 and type 2 diabetes.

The case-control study, published in Diabetes Care, included more than 800 patients with type 1 diabetes, since patients with type 2 diabetes have been the target of previous studies published by the lead author, Farsad Afshinnia, M.D., a nephrologist at University of Michigan Health, part of Michigan Medicine, and at the Michigan O’Brien Kidney Translational Center, part of the University of Michigan Medical School.

The case group comprised patients with a rapid decline in their kidney function (as measured by a substantial fall in their estimated glomerular filtration rate), compared with control subjects who showed preserved kidney function (minimal decline of their estimated glomerular filtration rate) over a four-year follow-up period.

Notably, Afshinnia and his team found differences in lipid predictors of diabetic kidney disease between type 1 and type 2 patients, and these biomarkers could be identified in early-stage diabetes, when standard markers of kidney function, such as glomerular filtration rate and urine albumin excretion, are normal.

“These findings not only provide a platform for risk stratification, but also suggest the underlying mechanism of progressive kidney disease,” said Afshinnia, who also works in the University of Michigan JDRF Center of Excellence. “Understanding the mechanism may be a future target for therapeutic intervention.”

Testing a hypothesis

In earlier studies of patients with type 2 diabetes, Afshinnia and senior author Subramaniam Pennathur, M.D., also the chief of nephrology at University of Michigan Health and director of the Michigan O’Brien Kidney Translational Center, have shown that certain lipids affect the progression of diabetic kidney disease in those with type 2 diabetes.

Unique to type 2, though, lipid alterations can be mediated by insulin resistance.

Since patients with type 1 diabetes and normal kidney function are typically sensitive to insulin, the research team anticipated that the biomarkers of kidney disease progression in type 1 would be different than those in type 2.

Specifically, Afshinnia, Pennathur and their research team hypothesized that there would be a unique pattern of circulating free fatty acids, acylcarnitines and glycerolipids between type 1 and type 2 diabetes, making clear what biomarkers in the body can cause rapid or slow decline of kidney function.

“Both types of diabetes can result in diabetic kidney disease, later requiring dialysis or transplantation,” said Pennathur. “The current standard of care for patients with diabetes is to optimize their blood sugar and blood pressure control, but better biochemical markers of kidney function decline in patients with normal kidney function are lacking.”

Using precision medicine in this way could accurately identify patients with either type 1 or type 2 diabetes who are vulnerable to future loss of kidney function.

Discovering an underlying mechanism

After quantifying more than 300 lipids in the more than 800 study participants with type 1 diabetes, the researchers found 47 lipids that were significantly different between rapid and slow kidney function decliners.

Credit: 
Michigan Medicine - University of Michigan

Reporting of adverse effects in drug trials has only improved slightly in 17 years, new study shows

Researchers, including academics from the University of York, analysed systematic reviews of 1,200 Randomised Controlled Trials (RCTs) to assess whether reporting had improved over time.

However, the information the researchers needed to assess which adverse effects were reported (and how they were reported) was included in less than half of the RCTs they analysed.

Co-author Dr Su Golder from the Department of Health Sciences, said: "Drug trials are conducted to give clinicians information on the benefits and adverse effects of treatments. Our study shows that, disappointingly, there's only been a slight improvement in reporting the adverse effects in trials over the last 17 years."

The study argues that many trials focus on the benefits, rather than the adverse effects of the drug being trialled.

"There is also a tendency to focus only on those harms that are either common or defined as serious, such as those causing hospitalisation, disability or death. Yet other seemingly minor harms which may be important to patients - everything from diarrhoea and insomnia to rashes, coughs and muscle aches - may be important to capture, especially since they may stop people taking medication," Dr Golder added.

Randomised Controlled Trial authors were also at times selective about which harms they reported, the study went on to say.

Dr Golder added: "We also need to know if a particular drug affected people differently, for example if it affected females more than males, or if a particular harm increased with age."

The study concluded that the lack of reporting, or selective reporting, of adverse effects in published clinical trials can promote a false impression of safety and misinform clinical and policy decisions. The NHS, policy makers and patients all need reliable information about the benefits and adverse effects of treatments to make good, informed decisions.

In 2004, major new guidelines on reporting Randomised Controlled Trials (RCTs) were published, with the aim of improving the reporting of adverse effects in trials.

Credit: 
University of York

Early blood-sugar levels in type 2 diabetes crucial for future prognosis

People who get type 2 diabetes need to gain control of their blood-sugar levels -- fast. The years immediately after diagnosis are strikingly critical in terms of their future risk for heart attacks and death. This is shown by a joint study from the Universities of Gothenburg and Oxford.

In a collaboration between the University of Gothenburg in Sweden and the University of Oxford in the UK, the significance of blood sugar levels from the time type 2 diabetes is diagnosed for the risk of heart attacks and death has been studied. The project was led jointly by Professor Marcus Lind in Gothenburg and Professor Rury Holman in Oxford.

The research was based on a key trial in type 2 diabetes, the UK Prospective Diabetes Study (UKPDS). This new analysis examined the role of blood-sugar levels in the first years after type 2 diabetes was diagnosed for the prognosis of myocardial infarction and death 10-20 years later.

The results, presented in the scientific journal Diabetes Care, show that blood-sugar levels early in the course of the condition have a much greater impact on the future prognosis than had been thought previously. They show that targeting blood-sugar levels according to treatment guidelines (HbA1c 52 mmol/mol or lower) from the time of diagnosis was associated with an approximately 20 per cent lower risk of death 10-15 years later, compared with targeting a higher blood-sugar level (HbA1c 63 mmol/mol). In addition, it showed that delaying the introduction of good blood-sugar levels until 10 years after diagnosis was associated with only a 3 per cent lower risk of death.

"These latest results are evidence that proper early blood-sugar treatment in type 2 diabetes is crucial to optimise diabetes care. Previously we haven't performed this kind of analysis, or understood just how important early blood-sugar control is for the prognosis. They also mean that there is a need for a greater focus on detecting type 2 diabetes at the earliest opportunity to prevent people living with undetected high blood-sugar levels for several years," says Professor Marcus Lind.

Professor Rury Holman, from the Radcliffe Department of Medicine at the University of Oxford, said: "These new results provide a mechanistic explanation for the glycaemic 'legacy effect', first identified by the UKPDS, whereby instituting good blood-sugar control in newly-diagnosed type 2 diabetes was shown to reduce the risks of diabetic complications and death for up to 30 years. The discovery of the 'legacy effect' has led treatment guidelines worldwide to recommend achieving good blood-glucose control as soon as possible."

Credit: 
University of Gothenburg

Research encourages re-evaluation of special nerve treatment for chronic pain

video: Dr. Eldon Loh explains new research that assessed the use of a specialized treatment for chronic pain and its impact on health care use and opioid prescribing.

Image: 
Lawson Health Research Institute

LONDON, ON - Hospital researchers from Lawson Health Research Institute have published a recent study that assessed the use of a specialized treatment for chronic pain and its impact on health care use and opioid prescribing.

Paravertebral blocks (PVBs) belong to a broader group of procedures called "nerve blocks." A recent Toronto Star report noted that OHIP has been billed $420 million for nerve block procedures since 2011. PVBs involve injecting medication around the nerves where they exit the bones of the spine, at different locations depending on the patient and the chronic pain they are experiencing.

The regular use of these procedures has been questioned by health care providers due to the high cost and limited evidence of their benefit in reducing chronic pain. While the effectiveness of PVBs has been examined in trauma, cancer pain and regional anesthesia during surgery, they have not been evaluated for use in chronic pain despite widespread use in Ontario.

It is estimated that one in five Canadians live with chronic pain. Pain that persists can affect all aspects of someone's life and health, particularly when it is not being managed.

This new study from London researchers found that 66,310 patients had a PVB between July 2013 and March 2018, and 47,723 patients were included in the study. In the year after a patient's first PVB, there was a significant increase in the number of physician visits. Additional PVBs were frequently performed after the first treatment, with over 26 per cent of patients receiving a PVB ten or more times in one year, and almost eight per cent receiving 30 or more. No overall change was found in opioid dosage in the year after PVB was initiated compared to the year before.

"Frequent use of PVB is common. Initiating treatment with PVBs is associated with marked increases in health care utilization, which includes physician visits and other injection procedures," explains Dr. Eldon Loh, Lawson Associate Scientist and Physiatrist at St. Joseph's Health Care London.

This research provides a broad perspective on the use of PVBs in Ontario, and on the use of nerve blocking treatments in general. There has been concern for several years about the overuse of these procedures; however, this is the first study to systematically document the impact on health care utilization and opioid use.

"We hope that from this study, the appropriate use of PVBs and other pain interventions will be re-evaluated at a provincial level to ensure the use of health resources is being properly managed and we achieve the best outcome for patients," Dr. Loh adds.

Credit: 
Lawson Health Research Institute

Biomaterial vaccines ward off broad range of bacterial infections and septic shock

image: This illustration shows how a ciVAX infection vaccine against a pathogenic E. coli strain is produced and applied. First, carbohydrate-containing surface molecules (PAMPs) of killed bacteria are captured with magnetic beads coated with FcMBL. The beads are then combined with mesoporous silica (MPS) rods, immune cell-recruiting GM-CSF and immune cell-activating CpG adjuvant to form the complete ciVAX vaccine. Upon injection under the skin of mice, the ciVAX vaccine forms a permeable scaffold that recruits immature dendritic cells (DCs), educates them to present PAMP-derived antigens, and then activates and releases them again. The reprogrammed DCs migrate to draining lymph nodes where they orchestrate a complex immune response, including reactive T cells and antibody-producing B cells, against the E. coli pathogen.

Image: 
Wyss Institute at Harvard University.

(BOSTON) — Current clinical interventions for infectious diseases are facing increasing challenges due to the ever-rising number of drug-resistant microbial infections, epidemic outbreaks of pathogenic bacteria, and the continued possibility of new biothreats that might emerge in the future. Effective vaccines could act as a bulwark to prevent many bacterial infections and some of their most severe consequences, including sepsis. According to the Centers for Disease Control and Prevention (CDC), “each year, at least 1.7 million adults in America develop sepsis. Nearly 270,000 Americans die as a result of sepsis [and] 1 in 3 patients who dies in a hospital has sepsis.” However, for the most common bacterial pathogens that cause sepsis and many other diseases, no vaccines are yet available.

Now, as reported in Nature Biomedical Engineering, a multi-disciplinary team of researchers at Harvard’s Wyss Institute for Biologically Inspired Engineering and John A. Paulson School of Engineering and Applied Sciences (SEAS) developed a biomaterial-based infection vaccine (ciVAX) approach as a solution that could be broadly applied to this pervasive problem. ciVAX vaccines combine two technologies that are currently in clinical development for other applications, and that together enable the capture of immunogenic antigens from a broad spectrum of pathogens and their incorporation into immune cell-recruiting biomaterial scaffolds. Injected or implanted under the skin, ciVAX vaccines then reprogram the immune system to take action against pathogens.

“The protective powers of the vaccines that we have designed and tested so far and the immune responses they stimulated are extremely encouraging, and open up a wide range of potential vaccine applications ranging from sepsis prophylaxis to rapid measures against future pandemic threats and biothreats, as well as new solutions to some of the challenges in veterinary medicine,” said corresponding author David Mooney, Ph.D., who is a Founding Core Faculty member at the Wyss Institute and leads the Institute’s Immuno-Materials Platform. He is also the Robert P. Pinkas Family Professor of Bioengineering at SEAS.

In their study, the researchers successfully tested ciVAX technology as a protective measure against the most common causes of sepsis, including Gram-positive S. aureus and Gram-negative E. coli strains. Highlighting the technology’s potential, they found that a prophylactic ciVAX vaccine protected all vaccinated mice against a lethal challenge with an antibiotic-resistant E. coli strain, while only 9% of unvaccinated control animals survived. In a pig model of septic shock induced by a different human E. coli isolate, a ciVAX vaccine prevented the development of sepsis in all four animals, while four unvaccinated animals developed severe and sudden sepsis within 12 hours. Finally, using an approach that mimicked a ring vaccination protocol in human or animal populations, a ciVAX vaccine, when loaded with pathogen-derived material isolated from animals infected with one lethal E. coli strain, was able to cross-protect animals against a different lethal E. coli strain.

“Our method captures the majority of glycoprotein (and glycolipid) antigens from the pathogens, and presents these in their native form to the immune system, giving us access to a much larger spectrum of potential antigens than vaccines consisting of single or mixtures of recombinant antigens,” said co-first author and Wyss Lead Senior Staff Scientist Michael Super, Ph.D. “ciVAX vaccines against known pathogens can be fabricated and stored, but additionally, all components except the bacterial antigens can be pre-assembled from shelf-stable cGMP products. The complete vaccines can then be assembled in less than an hour once the antigens are available, which gives this technology unique advantages over other vaccine approaches when rapid responses are called for.” Super conceived the ciVAX concept with co-first author Edward Doherty, a former Lead Senior Staff Scientist who worked with Mooney on the Wyss’ Immuno-Materials Platform on biomaterials-based vaccines for cancer applications.

Super and Wyss Founding Director Donald Ingber, M.D., Ph.D., who also authored the study, previously developed the pathogen capture technology used in ciVAX, which is based on a native human pathogen-binding opsonin, Mannose-Binding Lectin (MBL), that they fused to the Fc portion of an immunoglobulin to generate FcMBL. Recombinant FcMBL binds to more than 120 different pathogen species and toxins, including bacteria, fungi, viruses and parasites. In earlier efforts, the team applied FcMBL to multiple diagnostic problems, and the technology is currently being tested in a clinical trial by the Wyss startup BOA Biomedical as part of a new sepsis treatment.

The second technology component of ciVAX, the biomaterials-based vaccine technology, was developed as a conceptually new type of cancer immunotherapy by Mooney and his group at the Wyss Institute and SEAS, together with clinical collaborators at the Dana-Farber Cancer Institute. Validated in a clinical trial in human cancer patients, a specifically designed cancer vaccine stimulated significant anti-tumor immune responses. Novartis is currently working to commercialize the vaccine technology for certain cancer applications, and a related biomaterials-based vaccine approach is being pursued by the Wyss startup Attivare Therapeutics, with Doherty and former Wyss researchers Benjamin Seiler and Fernanda Langellotto, Ph.D., who also co-authored this study, as founding members.

To assemble ciVAX vaccines, the team used FcMBL on magnetic beads to capture inactivated bacterial carbohydrate-containing molecules, known as Pathogen Associated Molecular Patterns (PAMPs), from the pathogen of choice, and then simply mixed the complexes with particles of mesoporous silica (MPS) and immune cell-recruiting and activating factors. Under the skin, MPS forms a permeable, biodegradable scaffold that recruits dendritic cells (DCs) of the immune system, reprograms them to present fragments of the captured PAMPs, and releases them again. The DCs then migrate to nearby draining lymph nodes where they orchestrate a broad immune response against the bacterial pathogen. The team found that ciVAX vaccines rapidly enhanced the accumulation and activation of DCs at injection sites and the numbers of DCs, antibody-producing B cells, and different T cell types in draining lymph nodes, and thereby engineered effective pathogen-directed immune responses.

“Beyond the potential of reducing the risk for sepsis in and out of hospitals, our ciVAX vaccine technology has the potential to save the lives of many individuals threatened by a multitude of pathogens, in addition to potentially preventing the spread of infections in animal populations or livestock before they reach humans. It is a terrific example of how Wyss researchers from different disciplines and experiences self-assemble around medical problems that urgently need to be solved to create powerful new approaches,” said Ingber, who is also the Judah Folkman Professor of Vascular Biology at Harvard Medical School and Boston Children’s Hospital, and Professor of Bioengineering at the Harvard John A. Paulson School of Engineering and Applied Sciences.

Credit: 
Wyss Institute for Biologically Inspired Engineering at Harvard

A summary of myocarditis cases following COVID-19

Myocarditis, or inflammation of the heart muscle, has been reported in some patients with COVID-19. After searching the medical literature, researchers have now summarized the results of 41 studies describing myocarditis in 42 patients with COVID-19.

The analysis, which is published in the International Journal of Clinical Practice, notes that the median age of patients with myocarditis following COVID-19 infection was 43.4 years, with 71.4% of patients being male.

Fever was the most prevalent symptom, seen in 57% of patients. Hypertension was the most pervasive comorbidity. Markers of cardiac health were altered in most patients, and cardiac imaging tests showed evidence of injury.

Antivirals and corticosteroids were the most frequently used medications. Among the 42 patients, 67% recovered and eight died.

"Myocarditis is becoming a more prevalent complication in COVID-19 disease as more studies are being published. Due to the risk of a sudden worsening of patients' conditions, knowledge of this cardiac complication of COVID-19 disease is crucial for healthcare professionals," said lead author Sawai Singh Rathore, MBBS, of Dr. Sampurnanand Medical College, in India.

Credit: 
Wiley

Has the COVID-19 pandemic lessened bullying at school?

Students reported far higher rates of bullying at school before the COVID-19 pandemic than during the pandemic across all forms of bullying (general, physical, verbal, and social), except for cyberbullying, where differences in rates were less pronounced. The findings come from a study published in Aggressive Behavior.

The study surveyed 6,578 Canadian students in grades 4 to 12. Certain patterns seen in previous reports also held:

Girls were more likely to report being bullied than boys.

Boys were more likely to report bullying others than girls.

Elementary school students reported higher bullying involvement than secondary school students.

And gender-diverse and LGBTQ+ students reported being bullied at higher rates than students who identified as gender binary or heterosexual.

"Most pandemic studies suggest notable threats to the wellbeing and learning outcomes of children and youth. Our study highlights one potential silver lining--the reduction of bullying," said lead author Tracy Vaillancourt, PhD, of the University of Ottawa. "Reducing bullying is important because it negatively affects all aspects of functioning, both in the immediate and in the long-term. Given the strikingly lower rates of bullying during the pandemic, we should seriously consider retaining some of the educational reforms used to reduce the spread of COVID-19 such as reducing class sizes and increasing supervision."

Credit: 
Wiley

Study: Hospitalizations for eating disorders spike among adolescents during COVID

ANN ARBOR, Mich. - The number of adolescents admitted to the hospital for severe illness from eating disorders has increased significantly during the COVID-19 pandemic, new research suggests.

At one center, the number of hospital admissions among adolescents with eating disorders more than doubled during the first 12 months of the pandemic, according to the study, published ahead of print in Pediatrics.

The 125 hospitalizations among patients ages 10-23 at Michigan Medicine in those 12 months reflect a significant increase over previous years, as admissions related to eating disorders during the same timeframe between 2017 and 2019 averaged 56 per year.

"These findings emphasize how profoundly the pandemic has affected young people, who experienced school closures, cancelled extracurricular activities, and social isolation. Their entire worlds were turned upside down overnight," said lead author Alana Otto, M.D., M.P.H., an adolescent medicine physician at University of Michigan Health C.S. Mott Children's Hospital.

"For adolescents with eating disorders and those at risk for eating disorders, these significant disruptions may have worsened or triggered symptoms."

Findings may be the tip of the iceberg

But the numbers may represent only a fraction of those with eating disorders affected by the pandemic, researchers said, as they only included young people whose severe illness led to hospitalization.

"Our study suggests that the negative mental health effects of the pandemic could be particularly profound among adolescents with eating disorders," Otto said. "But our data doesn't capture the entire picture. These could be really conservative estimates."

The study also suggests the rate of admissions at the institution steadily increased over time during the first year of the pandemic. The highest rates of admissions per month occurred between nine and 12 months after the pandemic began, with rates continuing to climb when the study period ended in March 2021.

Restrictive eating disorders include anorexia nervosa and may be marked by dietary restriction, excessive exercise and/or purging to lose weight.

Genetics, psychological factors, and social influences have all been linked to the development of eating disorders, and adolescents with low self-esteem or depressive symptoms are at especially high risk.

Changes to adolescents' day-to-day lives during the pandemic, such as school closures and cancellation of organized sports, may also disrupt routines related to eating and exercise, and be an impetus for developing unhealthy eating behaviors among those already at risk, Otto said.

"A stressful event may lead to the development of symptoms in a young person at risk for eating disorders," she said.

"During the pandemic, the absence of routine, disruptions in daily activities and a sense of a loss of control are all possible contributing factors. For many adolescents, when everything feels out of control, the one thing they feel they can control is their eating."

Some patients also reported that limitations in playing sports and other physical activities made them worry about gaining weight, leading to unhealthy dieting or exercise. Increased social media use during the pandemic may also expose young people to more negative messaging about body image and weight.

There could be indirect connections to the pandemic as well, Otto said. For example, an adolescent with significant eating disorder symptoms and severe malnutrition may have only come to medical attention when they moved back in with their parents after their college closed unexpectedly during the shutdowns.

Increased demand but limited access to care

Another potential factor may be delayed care for non-COVID-19 conditions, including eating disorders, and fewer in-person visits as part of measures to reduce transmission risks, authors noted.

Adolescents with eating disorders may be particularly impacted by reduced availability of in-person care, Otto said. Assessment and management of patients with malnutrition generally requires measuring weight and vital signs and may involve a full physical examination or lab tests.

Confidentiality, a critical component of clinical care for adolescents, may also be limited in virtual settings.

While the study is limited by its small sample size, it comes as international reports indicate increases in both outpatient referrals to child and adolescent eating disorder services and inpatient admissions related to anorexia nervosa among adolescents, Otto said.

"Although our findings reflect the experience of a single institution, they're in line with emerging reports of the pandemic's potential to have profound negative effects on the mental and physical health of adolescents across the globe," Otto said.

"Adolescents may be particularly vulnerable to negative effects of societal upheaval related to the pandemic and to developing eating disorders during the COVID-19 era. Providers who care for adolescents and teens should be attuned to these risks and monitor patients for signs and symptoms of an eating disorder."

Patient demographics were similar before and during the pandemic, according to the study. But patients admitted during the COVID-19 pandemic were less likely than those admitted prior to the pandemic to have public insurance, something that should be studied further, authors said.

Otto noted that for adolescents with eating disorders, medical admission is often the beginning, not the end, of treatment, which can be a long journey. Among the biggest barriers to care are a shortage of qualified providers and insurance coverage gaps.

"Access to care was already limited before the pandemic and now we're seeing an increased demand for these services. As we see a wave of young people coming to the hospital for urgent medical concerns related to eating disorders, we need to be prepared to continue to care for them after they leave the hospital," Otto said.

"I'm hopeful that as adolescents are able to go back to school and engage with friends and activities that are meaningful to them, we will see admissions decrease," she added. "But it takes time for these symptoms to develop and eating disorders generally last for months or years.

"We expect to see downstream effects of the pandemic on adolescents and young people for some time."

Credit: 
Michigan Medicine - University of Michigan

Newborn screening for epilepsy in sight through the discovery of novel disease biomarkers

The door has finally opened on screening newborn babies for pyridoxine-dependent epilepsy (PDE), a severe inherited metabolic disorder. This screening promises to enable better and earlier treatment of the disease. To identify new biomarkers that can be used in the newborn screening protocol, also known as the neonatal heel prick, researchers at the Radboud University Medical Center joined forces with scientists at the Radboud University's FELIX laser laboratory. They published their findings in The Journal of Clinical Investigation.

The discovery and identification of the new biomarkers could lead to an important addition to worldwide newborn screening protocols. Currently, there are over a thousand known inborn metabolic diseases (IMD), but only 2% of them can be detected through the neonatal heel prick. While these are relatively rare as individual disorders, in the Netherlands, every other day a child is born with an IMD. These disorders have severe health consequences for patients and are currently one of the leading causes of early death among children in the Netherlands.

Technologies combined

"Using new techniques in our clinical laboratory where we study the products of chemical processes (metabolomics), we were able to detect the presence of compounds in body fluids of patients that are not present in persons unaffected by PDE - that was a great first step. However, we could only identify the exact structure of these compounds, the new PDE biomarkers, using the infrared laser at FELIX", says Karlien Coene, laboratory specialist and researcher at the Translational Metabolic Laboratory of the Radboud University Medical Center. This is the first time that an infrared free electron laser - of which are only a hand full in the world - is combined with these clinical experiments.

Pyridoxine-dependent epilepsy (PDE) is an inherited metabolic disorder that is primarily characterized by intractable seizures that do not respond to conventional antiepileptic medications. Seizures are often controlled by daily high doses of vitamin B6; nevertheless, 80% of affected children suffer developmental delay and intellectual disability.

Early screening for metabolic diseases is critical for optimal treatment. That is why researchers are constantly looking for new ways to detect more metabolic diseases earlier in life via the heel prick. These diseases can be identified by looking for the presence of small molecules in the blood that are unique to the disease, also called "biomarkers".

Circumvent bottlenecks

Biomarker discovery and identification is a well-known bottleneck in research on metabolic diseases. "To overcome this hurdle, we decided to combine advanced analytical instrumentation with the infrared laser of the FELIX laboratory," says Jonathan Martens, researcher at Radboud University's FELIX Laboratory. "The measurements obtained using the unique FELIX laser give us information about the bonds between the atoms and lead us to the precise molecular structure. With this information, we ultimately managed to synthesize the molecules, and this allowed us to further investigate their role in the disease."

In addition to new possibilities in newborn screening, this finding has also revealed fundamental insights about the disease, which could ultimately lead to optimized treatment and better chances to prevent cognitive disability.

Martens: "Now that we have demonstrated that this new combination of techniques really works, we are actively applying our method in research on a range of other (metabolic) diseases for which biomarkers are currently lacking."

Credit: 
Radboud University Nijmegen

Tool helps predict who will respond best to targeted prostate cancer therapy

LOS ANGELES - A new prognostic tool developed by researchers from the UCLA Jonsson Comprehensive Cancer Center and five other institutions helps predict which men with advanced metastatic prostate cancer will respond favorably to a novel targeted therapy.

The tool, described in a study published today in Lancet Oncology, analyzes a wide spectrum of imaging and clinical data and is intended to assist physicians considering treating patients with Lutetium-177 prostate-specific membrane antigen, or LuPSMA.

LuPSMA, which binds to PSMA proteins and delivers targeted radiation to prostate cancer tissue, offers a new option to men with PSMA-positive metastatic cancer that is castration-resistant, meaning it has stopped responding to hormone therapy. LuPSMA is currently pending approval by the U.S. Food and Drug Administration.

Candidates for the therapy are typically screened using a technique pioneered by UCLA and UC San Francisco called PSMA PET imaging, which combines positron emission tomography with a PET-sensitive drug to detect prostate cancer throughout the body and verify PSMA-expression in the tumors.

The new research demonstrates that a combination of clinical characteristics and PSMA PET imaging characteristics can be used to predict which patients will have improved progression-free survival (slower disease progression) and improved overall survival (life expectancy) as a result of the treatment.

"Until now, there has been no validated tool to adequately predict the response of patients with advanced metastatic prostate cancer to LuPSMA treatment," said lead author Dr. Andrei Gafita, a postdoctoral scholar in the Ahmanson Translational Theranostics Division of the department of molecular and medical pharmacology at the David Geffen School of Medicine at UCLA. "Most important, this study suggests that screening with PSMA PET imaging could help to select patients who are most likely to benefit from this treatment."

Creating the 'nomogram'

The predictive tool, commonly called a nomogram, was developed by researchers from institutions across Europe, Australia and the U.S. who analyzed data from 270 prostate cancer patients who underwent LuPSMA treatment in clinical trials or via compassionate use.

The research team used eight parameters, including PSMA PET parameters and factors such as the time since initial diagnosis, chemotherapy status and hemoglobin levels, to create the mathematical formula that allows the tool to predict survival.

Based on the nomogram, the researchers also created a web-based risk calculator that forecasts the probability of overall and progression-free survival in response to LuPSMA. The predictions are used to stratify men into either high-risk or low-risk groups.

In the men studied, those identified by the tool as low risk had longer overall survival (24 months) and progression-free survival (6 months) than those classified as high risk (6 months and 2 months, respectively).
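The mechanics of this kind of risk stratification can be sketched in a few lines of code. Note that the sketch below is purely illustrative: the feature names, coefficients, and cut-off are invented for demonstration and are not the published eight-parameter LuPSMA nomogram, which was fitted to the 270-patient cohort.

```python
# Toy illustration of nomogram-style risk stratification.
# All weights, feature names, and the threshold are hypothetical,
# NOT the published LuPSMA nomogram coefficients.

WEIGHTS = {                          # hypothetical coefficients
    "time_since_diagnosis_yrs": -0.10,
    "prior_chemotherapy": 0.80,      # 1 if prior chemo, else 0
    "hemoglobin_g_dl": -0.25,
    "psma_pet_tumor_volume_ml": 0.01,
}
THRESHOLD = 0.0                      # hypothetical cut-off between groups

def risk_score(features, weights):
    """Linear combination of patient features, as in a nomogram."""
    return sum(weights[name] * value for name, value in features.items())

def stratify(features):
    """Assign a patient to a high- or low-risk group via the cut-off."""
    return "high risk" if risk_score(features, WEIGHTS) > THRESHOLD else "low risk"

patient = {
    "time_since_diagnosis_yrs": 6,
    "prior_chemotherapy": 1,
    "hemoglobin_g_dl": 13.0,
    "psma_pet_tumor_volume_ml": 150,
}
print(stratify(patient))  # prints "low risk" for this example patient
```

In the actual tool, each parameter's contribution is read off a calibrated scale and the summed score maps to predicted overall and progression-free survival probabilities rather than a single binary label.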

The findings from the research are encouraging, Gafita said, and may provide the foundation for patient selection for LuPSMA therapy. Nevertheless, he notes, until its clinical validity is demonstrated in prospective trials, the tool should be used cautiously and should not replace the clinical judgement of treating physicians.

"Our validated tool can find application particularly at institutions where LuPSMA is just being introduced as a novel therapeutic option," he said.

Credit: 
University of California - Los Angeles Health Sciences

Young South Asian heart attack patients more likely to be obese, use tobacco

A new study examining why young South Asian heart attack patients have more adverse outcomes found this patient population was often obese, used tobacco products, and had a family history of heart disease or risk factors that could have been prevented, monitored for or treated before heart attacks happen. The study will be presented at the ACC Asia 2021 Together with SCS 32nd Annual Scientific Meeting Virtual being held July 9-11, 2021.

"South Asians tend to have multiple co-morbidities including diabetes and obesity at younger ages which is different from the white population," said Salik ur Rehman Iqbal, MBBS, Cardiologist, Aga Khan University Hospital in Karachi, Pakistan, and the study's lead investigator. "This can impact the complexity of coronary lesions and success of revascularization. Moreover, due to lack of awareness and system delays, a significant proportion of patients present to the hospital late translating into adverse outcomes."

Researchers examined heart attack patients less than 45 years old who underwent primary percutaneous coronary intervention between 2013 and 2019. Patients with previous heart attack or revascularization were excluded, leaving a total of 165 patients. The patient population was:
* 90.3% male
* 48.3% obese
* 45% tobacco users
* 48.4% positive family history of ischemic heart disease

For patients with delayed presentation at the hospital (more than four hours), 27.3% experienced delayed discharge of more than five days. Thirty-day all-cause mortality was seen in six patients. According to the researchers, learning more about the common clinical, prognostic features and differences in young South Asian heart attack patients could have important clinical, as well as quality of life implications, for this patient population. Iqbal said more than 90% of the young patients with STEMI in this study were males, who are often the sole breadwinners of their families.

"This can translate into a significant impact on their families and dependents," Iqbal said. "Moreover, these same patients will be at risk for recurrent heart attacks and other cardiovascular events. This will also put a great burden on our health budget. Targeting these modifiable risk factors, creating awareness and decreasing system delays should be our goals towards reducing the cardiovascular risk in this population."

Iqbal noted another important aspect particular to South Asians is the presence of abnormal lipids, due to mutations, like higher Lipoprotein-A and Apo B-100. Studies focusing on these abnormal lipids are lacking and identification and treatment of dyslipidemia may be a significant future step, he said.

In another study being presented by the investigators at ACC Asia 2021, the researchers examined 23 young South Asian women who presented with heart attack at Aga Khan University Hospital between 2013 and 2020. The median age was 41 years, 53% had uncontrolled Type 2 diabetes and 50% were obese. A positive family history of ischemic heart disease was found in a third of the patients, as was a history of high blood pressure. No patients reported smoking.

Credit: 
American College of Cardiology